Sunday, March 15, 2009

Listen with your eyes.


I've spent most of the weekend hacking away at the detection grid project for creating generative ambient music from video input. As is usual with these kinds of experiments, the end result is quite a long way from what I expected, though fortunately in a pleasing way.

Snap To Grid

My first challenge was one of granularity. To create pleasing tonal compositions I needed to restrict the number of possible active notes, and after a bit of trial and error I figured that around 16 was optimal. However, using only a 4x4 grid across the video didn't provide enough granularity for detection, while increasing the size of the matrix really slows down the frame rate and quickly adds too many notes into the mix.
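One way to square that circle is to detect on a finer grid but fold the cells back down into a fixed pool of 16 note slots. In plain JavaScript the idea is roughly this (a simplified sketch rather than the actual patch wiring; the 8x8 cell count and the 0.5 threshold are just example values, not what the patch uses):

// Fold an 8x8 motion grid down to 16 note slots (4x4 regions).
// Each detection cell holds a motion amount between 0 and 1.
var DETECT = 8;      // detection cells per side (finer than 4x4 for accuracy)
var NOTES  = 4;      // note slots per side (4x4 = 16 possible notes)
var THRESH = 0.5;    // example trigger level

function activeNotes(cells) {          // cells: DETECT*DETECT motion values
  var notes = [];
  for (var ny = 0; ny < NOTES; ny++) {
    for (var nx = 0; nx < NOTES; nx++) {
      // Average the detection cells that fall inside this note region.
      var sum = 0, span = DETECT / NOTES;
      for (var dy = 0; dy < span; dy++)
        for (var dx = 0; dx < span; dx++)
          sum += cells[(ny * span + dy) * DETECT + (nx * span + dx)];
      if (sum / (span * span) > THRESH)
        notes.push(ny * NOTES + nx);   // note slot 0..15 fires
    }
  }
  return notes;
}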

Short And Snappy

The next issue was that the notes themselves were very short and often repeated very quickly. This is great for quick glitchy noises but not so great for the ambient soundscape I was trying to create. I tried making patches with a very short (effectively zero) attack phase and a long release, and these worked well for sparse hits, but the rapidly repeating notes still didn't sound right.
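The obvious brute-force filter would be a per-note retrigger gate, something along these lines in plain JavaScript (just a sketch; the 500 ms gap is a made-up figure, and in the end I went a different way, as described below):

// Ignore retriggers of the same note inside a minimum interval.
var lastFired = {};            // note number -> time it last fired (ms)
var MIN_GAP = 500;             // example value only

function shouldFire(note, now) {
  if (lastFired[note] !== undefined && now - lastFired[note] < MIN_GAP)
    return false;              // too soon: swallow the repeat
  lastFired[note] = now;
  return true;
}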

Messing around a bit more, I fired up my favorite Ableton arpeggiators and tried feeding the notes into a held pattern. This smooths out the displeasing rapid note repeats, but it leads to a continually sounding arpeggio.
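Conceptually the held pattern is just a latch: each triggered note joins an active set, stays there until it falls silent for a while, and the arpeggiator cycles through whatever is currently held. Roughly this, in plain JavaScript (a sketch of the behaviour, not the actual Live device; the two-second hold time is an example value):

// Notes stay "held" until they go quiet, and the arp cycles what is held.
var held = {};                 // note number -> last time it was triggered (ms)
var HOLD_MS = 2000;            // example hold time, not a tuned value

function trigger(note, now) { held[note] = now; }

function arpStep(now) {
  var active = [];
  for (var n in held) {
    if (now - held[n] < HOLD_MS) active.push(Number(n));
    else delete held[n];       // fell silent: drop it from the pattern
  }
  active.sort(function (a, b) { return a - b; });
  return active;               // the arp plays these in order, one per step
}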

Less Is More

The original composer patch I used as the base for my work used a bit of JavaScript to restrict the notes to a specific scale. This worked well, but I found it more pleasing to have the generator output the full chromatic scale and use the Ableton Scale plugin to filter the notes. This lets me tweak the scale in use and its tonal qualities in realtime in Ableton should I want to. It also means I can optimise the Quartz composition to use math expression patches to calculate the note output rather than JavaScript, which runs significantly slower.
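For comparison, the two approaches boil down to something like this in plain JavaScript (a sketch of the idea rather than the exact patch code; the pentatonic scale table is just an example):

// Original approach: quantise each cell into a scale table in JavaScript.
var SCALE = [0, 3, 5, 7, 10];                 // example: minor pentatonic offsets
function scaleNote(cell, base) {
  var octave = Math.floor(cell / SCALE.length);
  return base + octave * 12 + SCALE[cell % SCALE.length];
}

// Chromatic approach: each cell maps straight to a semitone and Ableton's
// Scale device filters downstream, so the composition only needs the
// equivalent of a single math expression: note = base + cell.
function chromaticNote(cell, base) { return base + cell; }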

Invoking Reich

At some point in the proceedings I ended up with an arpeggio and scale that produced a repeating composition with a period equal to that of the video clip. Its sound and mood were immediately reminiscent of Steve Reich, so I tweaked the hell out of it to make it more so. Unfortunately, while I was recording it Quartz Composer crashed and I hadn't saved in ages! I managed to get most of the way back to where I was, and decided that since the whole point of this was to be generative I shouldn't sweat it too much.


Didcot Plays Reich from Rick Hawkins on Vimeo.

This video shows four movements of Didcot Plays Reich (a working title). The frame rate sucks ass, as performing the analysis, generating the OSC messages and rendering to disk all at once strains even my new MacBook Pro. This means that the video and soundtrack have become a bit out of sync in the final render. I have ideas on how to optimise the processing by swapping out JavaScript for math expressions and writing my own Quartz plugin in Cocoa to do the grid analysis, reducing the load on the iterator macro in Quartz. As it is intended to be a "live" piece (if generative can be called that) I am not too worried about it right now. You will just have to trust me that it looks and sounds pretty cool in realtime.
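The grid analysis itself is simple enough that it should port well to a native plugin: per frame it is essentially one pass over the pixels, accumulating a motion value per cell. In plain JavaScript the idea looks roughly like this (a simplified sketch working on raw greyscale pixel arrays, not the structures the composition actually passes around):

// Per-frame grid analysis: accumulate how much each cell has changed
// since the previous frame. prev/curr are greyscale pixel arrays.
function gridMotion(prev, curr, width, height, grid) {
  var cells = new Array(grid * grid);
  for (var i = 0; i < cells.length; i++) cells[i] = 0;
  for (var y = 0; y < height; y++) {
    for (var x = 0; x < width; x++) {
      var idx = y * width + x;
      var cell = Math.floor(y * grid / height) * grid + Math.floor(x * grid / width);
      cells[cell] += Math.abs(curr[idx] - prev[idx]);
    }
  }
  // Normalise by cell area so values are comparable across grid sizes.
  var area = (width / grid) * (height / grid);
  for (var c = 0; c < cells.length; c++) cells[c] /= area;
  return cells;
}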

Although it is a long way from being perfected, I am very happy with the results of this weekend's hacking and am looking forward to working up my improvements.

1 comment:

Anonymous said...

I really like it.