Monday 24 August 2020

TO A DIFFERENT DRUMMER

Dick Pountain / Idealog 306 / January 6th 2020

My Christmas present to myself this year was a guitar, an Ibanez AS73 Artcore. This isn't meant to replace my vintage MIJ Strat but rather to complement it in a jazzier direction. 50-odd years ago I fell in love with blues, ragtime and country finger-picking, then slowly gravitated toward jazz via Jim Hall and Joe Pass, then to current Americana-fusionists like Bill Frisell, Charlie Hunter and Julian Lage (none of whom I'm anywhere near skill-wise). It's a long time since I was in a band and I play mostly for amusement, but I can't escape the fact that all those idols work best in a trio format, with drums and bass. My rig does include a Hofner violin bass, a drum machine and a looper pedal to record and replay accompaniments, and I have toyed with apps like Band-in-a-Box, or playing along to Spotify tracks, but find none of these really satisfactory -- too rigid, no feedback. Well, I've mentioned before in this column my project to create human-sounding music by wholly programmatic means. The latest version, which I've named 'Algorhythmics', is written in Python and is getting pretty powerful. I wonder: could I use it to write myself a robot trio?

Algorhythmics starts out from native MIDI format, treating pitch, time, duration and volume data as four separate streams, each represented by a list of ASCII characters. In raw form this data just sounds like a hellish digital musical-box, and the challenge is to devise algorithms that inject structure, texture, variation and expression. I've had to employ five levels of quasi-random variation to achieve something that sounds remotely human. The first level composes the data lists themselves: manipulating, duplicating, reversing and reflecting them, and seeding them with randomness. The second level employs two variables I call 'arp' (for arpeggio) and 'exp' (for expression) that alter the way notes from different MIDI tracks overlap, to control legato and staccato. A third level produces tune structure by writing functions called 'motifs' to encapsulate short tune fragments, which can then be assembled like Lego blocks into bigger tunes with noticeably repeating themes. Motifs alone aren't enough though: if you stare at wallpaper with a seemingly random pattern, you'll invariably notice where it starts to repeat, and the ear has the same ability to spot (and become bored by) literal repetition. Level four has a function called 'vary' that subtly alters the motifs inside a loop at each pass, and applies tables of chord/scale relations (gleaned from online jazz tutorials and a book on Bartok's composing methods) to harmonise the fragments. Level five is the outer loop that generates the MIDI output, in which blocks of motifs are switched on and off under algorithmic control, like genes being expressed in a string of DNA.
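To give a flavour of the motif-and-vary idea: the sketch below is purely illustrative, not Algorhythmics' actual code (which I haven't printed here), so all the names and numbers are hypothetical. A motif is just four parallel lists -- pitch, time, duration, volume -- and 'vary' nudges them quasi-randomly on each pass so the repeats never sound literal.

```python
import random

# Hypothetical sketch: a motif as four parallel data streams.
# Pitch is MIDI note numbers, time is start offsets in ticks,
# duration is in ticks, volume is MIDI velocity (0-127).

def motif():
    """A short tune fragment -- the Lego block of the scheme."""
    return {
        "pitch":    [60, 62, 64, 67],
        "time":     [0, 120, 240, 360],
        "duration": [110, 110, 110, 230],
        "volume":   [90, 80, 85, 100],
    }

def vary(m, amount=2, rng=random):
    """Subtly alter a motif so each repeat differs from the last."""
    return {
        # Nudge some pitches up or down by a scale-step-sized amount.
        "pitch":    [p + rng.choice([-amount, 0, amount]) for p in m["pitch"]],
        "time":     m["time"],
        # Shorten or lengthen notes a little (never below 30 ticks).
        "duration": [max(30, d + rng.randint(-20, 20)) for d in m["duration"]],
        # Humanise the dynamics, clamped to the legal MIDI range.
        "volume":   [min(127, max(1, v + rng.randint(-10, 10))) for v in m["volume"]],
    }

# Assemble a bigger tune: the same motif four times, varied on each pass,
# so the theme recognisably repeats without being identical.
tune = [vary(motif()) for _ in range(4)]
```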

So my robot jazz trio is a Python program called TriBot that generates improvised MIDI accompaniments -- for Acoustic Bass and a General MIDI drum kit -- and plays them into my Marshall amplifier. The third player is of course me, plugged in on guitar. The General MIDI drum kit feels a bit too sparse, so I introduced an extra drum track using ethnic instruments like Woodblock, Taiko Drum and Melodic Tom. TriBot lets me choose tempo, key and scale (major, minor, bop, blues, chromatic, various modes) through an Android menu interface, and my two robot colleagues will improvise away until halted. QPython lets me save new TriBot versions as clickable Android apps, so I can fiddle with its internal works as ongoing research.
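The key-and-scale menu boils down to tables of intervals. Again, this is a sketch of the idea rather than TriBot's real code -- the scale tables and function names are my illustration, though the interval patterns themselves are standard:

```python
import random

# Hypothetical scale tables: semitone offsets from the chosen key note.
SCALES = {
    "major":     [0, 2, 4, 5, 7, 9, 11],
    "minor":     [0, 2, 3, 5, 7, 8, 10],   # natural minor
    "blues":     [0, 3, 5, 6, 7, 10],      # minor pentatonic plus flat 5
    "dorian":    [0, 2, 3, 5, 7, 9, 10],   # one of the 'various modes'
    "chromatic": list(range(12)),
}

def bass_line(key=36, scale="blues", beats=8, rng=random):
    """Improvise a bar for the Acoustic Bass: one scale note per beat.

    key is a MIDI note number (36 = low C, bass register).
    """
    degrees = SCALES[scale]
    return [key + rng.choice(degrees) for _ in range(beats)]

# One bar of blues bass in C; the menu choices just change the arguments.
line = bass_line(key=36, scale="blues", beats=8)
```

In the real program the chosen notes would be merged into the pitch stream and sent out as MIDI, but the menu's job is essentially this: pick a table, pick a root, and let the quasi-random machinery loose inside it.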

It's still only a partial solution, because although drummer and bass player 'listen' to one another -- they have access to the same pitch and rhythm data -- they can't 'hear' me and I can only follow them. In one sense this is fair enough, as it's what I'd experience playing alongside much better live musicians. At brisk tempos TriBot sounds like a Weather Report tribute band on crystal meth, which makes for a good workout. But my ideal would be what Bill Frisell described in this 1996 interview with a Japanese magazine (https://youtu.be/tKn5VeLAz4Y, at 47:27): a trio that improvises all together, each player leaving 'space' for the others. That's possible in theory, using a MIDI guitar like a Parker or a MIDI pickup for my Artcore. I'd need to make TriBot work in real time -- it currently saves MIDI to an intermediate file -- then merge in my guitar's output, translated back into Algorhythmic data format, so drummer and bass could 'hear' me too and adjust their playing to fit. A final magnificent fantasy would be to extend TriBot so it controlled an animated video of cartoon musicians. I won't have sufficient steam left to do either; maybe I'll learn more just trying to keep up with my robots...

[ Dick Pountain recommends you watch this 5 minute video, https://youtu.be/t-ReVx3QttA, before reading this column ]

