The device I made that leverages some new API calls we got in Live 9.2 has gotten some good updates over the past month or so. I’ve included support for the new Launchpad Pro, as well as compatibility with the very awesome Launchpad95 script.
If anyone does anything cool with this tool I’d love to see/hear it!
I finally got around to cleaning this M4L device up and getting it to be stable enough to share with people. Check out a demo:
It’s a four-channel Looper that can capture audio on any track in Live. It then chops the loop into 16 slices that you can trigger with the Push’s grid. There are hardware controls for playback speed, a stuttering effect, forward/reverse, one-shot mode, and pattern recording. There are also a number of loop and quantization settings available on the device itself.
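To make the slicing idea concrete, here’s a rough sketch in plain JavaScript (the language you’d use in a Max [js] object). The function names, the beats-based timing, and the 4x4 pad layout are my own assumptions for illustration, not the device’s actual code:

```javascript
// Hypothetical sketch: divide a captured loop into 16 equal slices.
function sliceStarts(loopLengthBeats, numSlices) {
  var starts = [];
  var step = loopLengthBeats / numSlices;
  for (var i = 0; i < numSlices; i++) {
    starts.push(i * step); // start position of slice i, in beats
  }
  return starts;
}

// Map a pad press in an assumed 4x4 region of the grid to a slice index,
// reading left to right, top to bottom.
function padToSlice(row, col) {
  return row * 4 + col;
}
```

So a 4-beat loop yields slice starts every quarter beat, and the pad at row 1, column 2 fires slice 6.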
I’ve had this idea floating around since I got the Push a while ago, and it feels nice to have something usable. I used this in a performance of my piece South Hudson last weekend in Troy, NY (thanks for having me out Mr. Ryan Ross Smith!), and it proved to be stable and more importantly a lot of fun to use in a performance setting.
You can download the devices HERE. There’s more development and documentation on the way!
You can check out the SECOND INSTALLMENT in a series on the Live API. We look at iterating over objects and running checks in the API through M4L. This expands the Session Box control that we created in the previous article.
In this series of three articles I get into accessing the Control Surface part of the Live API. We create an M4L device that allows MIDI control of any script that has a Session ‘redbox’.
You can catch the first part right HERE.
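For a taste of the iterate-and-check pattern the series covers: the LiveAPI object only exists inside a Max [js] object, so this sketch factors the loop into a pure function that takes a lookup callback. Inside Max, that callback would be something like `function (i) { return new LiveAPI("control_surfaces " + i); }`. This is an illustration of the pattern, not code from the articles:

```javascript
// Iterate-and-check sketch: walk numbered objects until a path fails to
// resolve. In M4L, a LiveAPI object whose path resolves to nothing
// reports an id of 0, which is the usual stopping check.
// `lookup` stands in for constructing LiveAPI objects (a Max-only API).
function collectSurfaces(lookup) {
  var found = [];
  for (var i = 0; ; i++) {
    var cs = lookup(i);
    if (!cs || cs.id === 0) break; // id 0: no object at this path
    found.push(cs);
  }
  return found;
}

// Stubbed usage, pretending two control surfaces are active:
var fakeSurfaces = [{ id: 3 }, { id: 7 }];
var result = collectSurfaces(function (i) {
  return fakeSurfaces[i] || { id: 0 };
});
// result now holds the two fake surface objects
```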
I’ve been doing some more writing for the KMI blog, mostly on Live and Max for Live. You can catch the second part of ‘Controlling the Controllers’ right HERE.
Since my beginnings in electronic music I have always been interested in the potential of digital sounds and images to interact. In the past this required a lot of workarounds: using multiple programs simultaneously and figuring out ways to send MIDI and audio data between them (fairly easy on Mac OS X using the built-in IAC bus and the JACK audio routing software). Inevitably, this became frustrating and taxing on my CPU, although some of that can be attributed to my lack of experience with efficient programming practices. Recently I have been investigating ways of doing these types of experiments (yes, I am calling them experiments because I really don’t know what I am doing yet) entirely inside of Live using Jitter, the graphical, matrix-processing side of M4L.
I think that Live’s modular environment is the perfect host for a collection of small, usable, and flexible visual devices that generate and modify visuals based on incoming MIDI and sound data from the music that I perform using Live.
The following video is a small demo of me using four devices to generate visuals, albeit very primitive ones, completely in real time inside of Live.
I am using some very simple operations and objects to visualize the audio data and do some small processing on the resulting image.
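As a rough idea of what ‘simple operations’ can mean here: the mapping can be as basic as scaling incoming audio amplitudes into cell values you’d write into a Jitter matrix. A hand-rolled sketch, where the rectify/scale/clamp choices are mine, not those of the devices in the video:

```javascript
// Map a buffer of audio amplitudes (-1..1) to 8-bit grayscale cells,
// the sort of trivial transform you might feed into a jit.matrix.
function amplitudesToCells(samples) {
  return samples.map(function (s) {
    var v = Math.round(Math.abs(s) * 255); // rectify, then scale to 0..255
    return Math.min(255, v);               // clamp in case of clipping
  });
}
```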
At Mills College I help run a bi-weekly concert event called Thursday Night Special (TNS), a forum for graduate students (and a few brave undergrads) to share their art with the Mills community. This past TNS, my friend Moni decided to include a ‘sample of the week’ (which can be found HERE) and challenge any performer brave enough to include it as part of their piece. Well, I took this challenge seriously and decided to make an entire track in 6 hours using just a 25-second sample of the dial-up internet sound. I did cheat a little, using drum samples and Operator to make the bass line. My friend Sean Price also took on the challenge and did his track live with his Octatrack sampling instrument, and it was DOPE.
I have been in the habit lately of documenting my musical work, and thought I’d just spout for a while about how I approached this. I use Ableton Live and Max/MSP for all of my sound creation, and these two tools are absolutely perfect for this kind of endeavor. As with any sample-based composition, I always start by listening to the sound itself for a few minutes and finding sections that I think would take well to some mangling. Then the fun begins. Time stretching with the idea of producing artifacts always works well for me, and I find the aesthetic it creates provides a good starting point for more ‘real’ sounding things. Live’s different warp modes are great for stretching convincingly, but can also be abused to produce some VERY nice effects.
So, after mucking about with some of the ‘tones’ from the sample I had a decent intro created, and it was time to move on to the actual tune. I wanted to reference the source material blatantly, so I grabbed a phrase of the actual dialing sound that I lined up on eighth notes. I also wanted to get some convincing melodies, and Live’s Sampler instrument is an absolute beast for taking any source material and making something usable out of it. It’s all about setting the proper loop points (at least for me), finding interesting modulation routings to really bring the material to life, and finally making sure it is tuned properly. You can do this by ear, with a sine tone, or with a digital tuning plugin if the sound is pretty weird. I have a few instances of sample-based synths running throughout, as well as some one-shot glitched-out phrases made completely in Live’s arrange view.
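On the tuning step: if you’d rather work by number than by ear, the arithmetic for turning a detected fundamental frequency into the nearest note plus a cents correction (which you could then dial into Sampler’s transpose/detune controls) is straightforward. A sketch, assuming the standard A4 = 440 Hz reference:

```javascript
// Convert a detected frequency to the nearest MIDI note and the cents
// offset from it. Positive cents means the sample is sharp.
function tuneInfo(freqHz, refA4) {
  refA4 = refA4 || 440;
  var midiFloat = 69 + 12 * Math.log2(freqHz / refA4);
  var nearest = Math.round(midiFloat);
  var cents = Math.round((midiFloat - nearest) * 100);
  return { midiNote: nearest, centsOff: cents };
}

// tuneInfo(440) -> { midiNote: 69, centsOff: 0 }  (A4, already in tune)
```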
I’ve decided to include the Live Pack, if anyone wants to have some fun with this.
Here’s the performance: