Jitter and M4L proof of concept.

Since my beginnings in electronic music I have always been interested in the potential of digital sounds and images to interact. In the past this required a lot of workarounds: running multiple programs simultaneously and figuring out ways to send MIDI and audio data between them (fairly easy on Mac OS X using the built-in IAC bus and the JACK audio routing software). Inevitably, this became frustrating and taxing on my CPU, although some of that can be attributed to my lack of experience with efficient programming practices. Recently I have been investigating ways of doing these types of experiments (yes, I am calling them experiments because I really don't know what I am doing yet) entirely inside of Live using Jitter, the graphical, matrix-processing side of M4L.

I think that Live’s modular environment is the perfect host for a collection of small, usable, and flexible visual devices that generate and modify visuals based on incoming MIDI and audio data from the music I perform in Live.

The following video is a small demo of me using four devices to generate visuals, albeit very primitive ones, completely in real time inside of Live.

I am using some very simple operations and objects to visualize the audio data and to do some light processing on the resulting image.
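To give a rough idea of what this kind of audio-to-matrix visualization looks like at the data level, here is a minimal sketch in Python with NumPy. This is purely illustrative, not code from my devices (which are Max patches): the matrix size, gain, and test tone are arbitrary values I picked for the example. The first function is loosely analogous to Jitter's jit.catch~ writing signal data into a jit.matrix, and the second to a per-cell jit.op multiply.

```python
# Illustrative sketch only: folds an audio buffer into a 2D grayscale
# matrix, then applies a per-cell transform. Sizes and values are
# arbitrary examples, not settings from any real M4L device.
import numpy as np

def audio_to_matrix(samples: np.ndarray, width: int = 64, height: int = 64) -> np.ndarray:
    """Fold a mono audio buffer into a 2D matrix, row by row (the same
    idea as capturing a signal into a jit.matrix)."""
    buf = np.resize(samples, width * height)   # pad/trim to fill the matrix
    pixels = (buf * 0.5 + 0.5).clip(0.0, 1.0)  # map [-1, 1] audio to [0, 1] brightness
    return pixels.reshape(height, width)

def brighten(matrix: np.ndarray, gain: float = 1.5) -> np.ndarray:
    """A per-cell multiply, conceptually like jit.op with @op *."""
    return (matrix * gain).clip(0.0, 1.0)

# Usage: one "frame" built from a synthetic 440 Hz test tone.
t = np.linspace(0, 0.1, 64 * 64, endpoint=False)
frame = brighten(audio_to_matrix(np.sin(2 * np.pi * 440 * t)))
print(frame.shape, frame.min(), frame.max())   # (64, 64), values in [0, 1]
```

In the actual devices this all happens on Jitter matrices inside Live, but the flow is the same: incoming audio fills a matrix, and simple operations reshape that matrix into an image.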
