Stepah – Live Performance Environment

A project for the Advanced A/V Programming module of my MA at Goldsmiths.
A Max/MSP/Jitter project.

The goal I set for this project was to create an audio-visual environment for live performance, capable of taking a small set of elements/source material and forming it into a developed musical piece, while providing a range of methods for manipulating the sequencing of audio events, which would in turn drive corresponding visual elements.

The results can be seen/heard in the demo video below.

The piece was captured live and has not been mixed or mastered; it comprises impulse samples and short two-second video clips, and describes a walk on a calm late-winter morning.

Beginnings – The Patch
The basis for my patch is an 8-channel, 4-bar, 16-step sequencer. I chose this particular arranging model because my musical output is heavily influenced by Juke/Footwork (originally hailing from Chicago, USA), a genre that is itself centred, technologically, almost entirely on the Roland TR-808.
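For illustration, here is a minimal sketch of how such a grid could be modelled, written as JavaScript for Max's js object (the naming is mine, and the actual patch builds this out of Max objects rather than code):

```javascript
// grid.js - a minimal model of the 8-channel, 4-bar, 16-step grid.
// Illustrative only: the patch itself is made of Max objects, not js.
inlets = 1;
outlets = 1;

var CHANNELS = 8;
var STEPS = 4 * 16; // 4 bars of 16 steps each

// steps[channel][step] is 1 when that channel's sample fires on that step.
var steps = new Array(CHANNELS);
for (var c = 0; c < CHANNELS; c++) {
    steps[c] = new Array(STEPS);
    for (var s = 0; s < STEPS; s++) steps[c][s] = 0;
}

// Toggle one cell, e.g. in response to a pad press: "toggle 2 5".
function toggle(channel, step) {
    steps[channel][step] = 1 - steps[channel][step];
}

// On each clock tick, report which channels fire on this step.
function tick(s) {
    for (var c = 0; c < CHANNELS; c++) {
        if (steps[c][s]) outlet(0, c); // channel index triggers its sample player
    }
}
```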

Much like the TR-808, I required the ability to store patterns and recall them at will. This would allow for quick rhythmic changes across multiple sample-based instruments, serving as a means of developing musical themes rhythmically.

However, unlike the TR-808 (which is purely a drum synthesiser), the sequencer would play back audio samples, allowing a far wider sonic and aesthetic scope. As I intended to create whole pieces of music, the patch would also need a way to manipulate pitch.

To top it off, the patch would also sequence and manipulate short video clips in time with the music, providing a complementary visual component.

Building – Audio
I built the sequencer using the multislider object (combined with tempo) as the event driver. Although this was fine initially, it ultimately proved too unstable as a timing mechanism and was later replaced with a phasor~ to provide accurate timing. At this stage I was also using gizmo~ (a frequency-domain pitch shifter for pfft~) to manipulate the pitch of the samples. Unfortunately, as the patch grew in control data, my laptop eventually became unable to process pitch shifting in this manner, and I settled on the transratio method instead.
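To make those two replacements concrete: a phasor~'s 0-to-1 ramp can be quantised into step indices, and a transposition ratio turns a semitone offset into a playback-rate multiplier of 2^(n/12), so +12 semitones doubles the speed. A rough sketch of both conversions in Max's js object, under my own naming, with a groove~-style sample player assumed:

```javascript
// timing.js - a sketch of the two conversions described above.
inlets = 2;   // inlet 0: phasor~ position 0..1, inlet 1: semitone offset
outlets = 2;  // outlet 0: current step index, outlet 1: playback rate

var STEPS = 16;
var lastStep = -1;

// Quantise the phasor~ ramp into discrete step indices, firing once per step.
function msg_float(phase) {
    if (inlet == 0) {
        var s = Math.floor(phase * STEPS) % STEPS;
        if (s != lastStep) {
            lastStep = s;
            outlet(0, s);
        }
    }
}

// Transposition ratio: n semitones scales playback speed by 2^(n/12),
// e.g. +12 gives 2.0 (an octave up on a groove~-style player).
function msg_int(semitones) {
    if (inlet == 1) outlet(1, Math.pow(2, semitones / 12));
}
```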

The next step *badumbumtssch* was to create a mechanism (which I dubbed ‘Target Mode’) that would allow sequences to be stored and recalled across the different channels using the keyboard; this was achieved with a jit.cellblock and a series of gates/switches routed to corresponding select objects. At this point it became untenable to keep using the laptop keyboard (too many keys needed!), so I switched to MIDI hardware.
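The idea behind ‘Target Mode’ can be sketched in Max's js object as follows; the store/recall logic here is hypothetical, as the patch itself does this with gates, switches, selects and a jit.cellblock:

```javascript
// targetmode.js - store and recall sequences by key/pad; a sketch only.
inlets = 2;   // inlet 0: key or pad number, inlet 1: the current sequence (a list)
outlets = 1;  // recalled sequence, sent back to the sequencer

var slots = {};      // pattern storage, keyed by pad/key number
var pending = null;  // a sequence waiting to be stored

// A sequence arriving on the right inlet is held until a slot is chosen.
function list() {
    if (inlet == 1) pending = arrayfromargs(arguments);
}

// A key press either stores the pending sequence or recalls a stored one.
function msg_int(key) {
    if (inlet == 0) {
        if (pending != null) {
            slots[key] = pending;      // store mode
            pending = null;
        } else if (slots[key] != undefined) {
            outlet(0, slots[key]);     // recall mode
        }
    }
}
```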

After an aborted attempt at mapping the control data to my Behringer BCR2000 (the control processing was too fiddly and sensitive), I ended up acquiring a Novation Launchpad Mini, which was almost the perfect tool for the job. Its three-colour LED pads provided the visual feedback that had been lacking, making for a user-friendly and intuitive device. It is worth mentioning that, after several hours of trying to understand how to program the LEDs on and off, I consulted the Novation website and downloaded their example patches (found here); my conceptual understanding of how to address the device from Max came from there.
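For reference, Novation's scheme boils down to sending note-on messages in which the velocity encodes the colour. A sketch of that idea in Max's js object, assuming the original Launchpad/Launchpad Mini protocol (note = row * 16 + column, velocity = green * 16 + red + flags):

```javascript
// launchpad.js - set a pad's LED colour; a sketch of the Novation scheme.
// Assumes the original Launchpad/Mini protocol; the outlet feeds a noteout.
inlets = 1;
outlets = 1;

// Grid pads are addressed as note = (row * 16) + column, rows/columns 0-7.
// Red and green brightness are each 0-3; the +12 sets the standard
// copy/clear flag bits (a velocity of 12 alone turns the LED off).
function pad(row, col, red, green) {
    var note = row * 16 + col;
    var vel = green * 16 + red + 12;
    outlet(0, [note, vel]); // pitch/velocity pair for noteout
}

// e.g. "pad 0 3 3 0" lights row 0, column 3 full red;
//      "pad 0 3 0 0" turns that LED off again.
```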

The mapping/storage and the recollection of the LED display between channel changes on the pad became the most difficult and time-consuming part of the build. Building the mechanism to recall where the pads had been, once you had changed from one channel to another and back again, was hard work. Once the individual channel sequences were sorted, the ‘Target Mode’ control was the next challenge. The solution I ultimately settled on sent the note-on/off MIDI data via matrixctrl objects to both the device and the screen, exchanging it between channel changes. I kept the jit.cellblock method as a means of storing patterns across different uses of the patch, allowing the user to build their own library of recallable rhythms. Additionally, pitch-control mechanisms were built so that I could use my MIDI keyboard to set pitches for the selected channels, with its faders controlling the volume of each individual channel.
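A sketch of the recall idea in Max's js object (names and layout are my own; the patch implements this with matrixctrl objects and routing rather than code):

```javascript
// channels.js - restore each channel's pad state when switching channels.
inlets = 2;   // inlet 0: channel select 0-7, inlet 1: cell edits [col, row, state]
outlets = 1;  // "col row state" triples, e.g. for a matrixctrl

var current = 0;
var grids = new Array(8); // one 8x8 pad grid per channel
for (var c = 0; c < 8; c++) {
    grids[c] = new Array(64);
    for (var i = 0; i < 64; i++) grids[c][i] = 0;
}

// Store edits coming from the pad into the current channel's grid.
function list(col, row, state) {
    if (inlet == 1) grids[current][row * 8 + col] = state;
}

// On a channel change, replay the stored grid so the LEDs and the
// on-screen matrixctrl reflect where that channel's pads were left.
function msg_int(ch) {
    if (inlet == 0) {
        current = ch;
        for (var i = 0; i < 64; i++) {
            outlet(0, [i % 8, Math.floor(i / 8), grids[ch][i]]);
        }
    }
}
```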

Building – Visual
Originally I textured the video clips onto OpenGL planes and manipulated the planes in time with the music; this allowed for all sorts of interesting creative uses (especially switching between erasing and not erasing the render each frame). Lamentably, however, this process was far too CPU intensive and would cause playback to slow significantly or, worse still, crash. Instead I raided the Jitter tutorials and found less costly video-manipulation processes in the form of mixing and colour mixing, whose parameters were then hooked to the master playhead of the patch and to the individual step outputs of the channels to keep playback synchronised.
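As an example of that kind of hook-up, here is a sketch that derives a 0-to-1 crossfade value (such as jit.xfade's xfade attribute) from the master playhead; the triangle mapping is my own choice for illustration, not necessarily the patch's:

```javascript
// xfade.js - derive a crossfade amount from the master playhead.
inlets = 1;  // phasor~ position 0..1 (e.g. sampled via snapshot~)
outlets = 1; // "xfade" messages 0..1, e.g. for a jit.xfade object

function msg_float(phase) {
    // Triangle shape: fade up over the first half of the loop, down over the second.
    var x = phase < 0.5 ? phase * 2 : (1 - phase) * 2;
    outlet(0, "xfade", x);
}
```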

Source Material 
The audio samples I used were either created in Logic Pro or drawn from my personal sample library, with the exception of the ambient loop, which was taken from here.

The video clips were wonderfully provided by David C. Montgomery, and can be found under the Pohutukawa and Dandelion Free Culture Libraries on his website.

The Outcome
The outcome (as demonstrated in the above video) allows for interesting results, and given more time to grow accustomed to the patch and its human-computer interaction flow, I think the resulting music could become highly complex and well polished. On the other hand, the visual-creation process could do with tweaking, as it currently feels a little too automated and not as clearly correlated with the sound. However, considering I had never worked with video prior to this, that was not entirely unexpected. The patch itself is also a very good starting point from which I can (and will) develop more robust and improved performance tools (it could certainly be more efficient in its data handling, as some processes feel convoluted).

To conclude, I certainly achieved the goals I set at the beginning of this project, learning a great deal about data handling and MIDI mapping, not to mention the complexities of working with video manipulation. The knowledge and experience gained will definitely be put to further use.

References
Apart from the source material and the example code from the Novation website referenced above, the only other point of reference was the Cycling74 website.

Tech
2007 17-inch MacBook Pro (4 GB RAM)
Novation Launchpad Mini
Evolution MIDI keyboard
Max 6