GAIT/GATE Installation at Test Signal Exhibition

Shown as part of the Test Signal exhibition, 11–13 September.

GAIT/GATE is a real-time, immersive, interactive, motion-driven sound installation.

[Image: GAIT02]

Capturing biometric data from participants as they enter and as they move through the installation, GAIT/GATE explores the relationship between one's body, motion, music and the auditory environment. Drawing on kinesiology to identify salient reference points, the software maps user input to a corresponding musical equivalent, generating a live piece of music that in turn affects the user's subsequent behaviour.

The decision to interpret the user's body movement as musical (twelve-tone equal temperament) events, rather than literal sonic events, was deliberate: the conditional requirements music places on the interaction between sound, movement and our cognitive function are more substantial, yet less well understood, than those of non-musical events. Numerous works have appropriated the raw physical sound of our bodies via the phonomyogram, a technique for measuring the force of muscle contraction (for example Marco Donnarumma's Muscular Interactions: Combining EMG and MMG sensing for musical practice), whereas, conceptual work aside, considerably fewer projects have studied movement in a musical context.

[Image: GAIT06]

Within systematic musicology, studies in embodied music cognition consider the body the mediator between our musical intentions, their implied significance and our environment. Given the corporeal way humans ascribe meaning to music (many people associate particular music with corresponding actions, temporal events or moods), it is fair to assume that body movement has a profound impact on musical understanding.

With this in mind, I aimed to provide an interactive framework for examining the music-movement relationship of individuals, and whether subtle differences in their body movement would translate into correspondingly distinctive musical output. As a counterpoint, I also intended for participants to receive feedback on their musical movements in real time, inviting and encouraging them to explore their own movement in a musical context. This would also afford the chance to observe the behavioural effects music has on one's movement, and whether changing discrete musical factors can have a pronounced effect on how we move through a space.

[Image: GAIT05]

Given these intentions, it was exceedingly important for the piece to be accessible, easy to use and non-invasive: it's harder to move naturally when you have items strapped to your limbs! In an ideal scenario the piece would need only your body movement within a set area and would require no calibration from the user. Regrettably, due to financial constraints, the only skeleton-tracking option available to me was off-the-shelf consumer hardware in the form of multiple Microsoft Kinects (three in total, unfortunately, to cover an area large enough to afford participants a worthwhile experience).

The inspiration to pursue the topic was largely born of curiosity about the widely accepted notion that our gaits (the manner and pattern of limb movement in locomotion over a solid substrate) are unique. Gait also proved a useful starting point for applying a variety of musical contexts, as it is easy to identify, extrapolate and characterise by the dominant emotional context it represents (one seldom skips or prances when angry).

[Image: GAIT01]

As gaits are typically classified by footfall patterns, they afford an opportunity to draw parallels with musical rhythms. By identifying peak points in locomotion, such as when a foot lands or lifts, rhythmic intervals can be deduced and a matching rhythm generated accordingly. After a period of experimentation I took the idea further and incorporated the individual's step motion and impact as a mechanism to control a step-sequencer (by sheer coincidence, aptly named). By mapping characteristics of stride length to different percussive sounds, such as a long, high stride to a powerful snare with a long envelope release, the individual can generate a drum groove comprised of different samples.
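To make the footfall-to-rhythm idea concrete, here is a minimal Python sketch of deducing rhythmic intervals from peak points in locomotion. It is not the installation's actual code; the floor threshold, frame rate and data format are all assumptions chosen for illustration.

```python
# A minimal sketch (not the installation's code) of deriving rhythmic
# intervals from footfall peaks. Assumes a stream of per-frame foot heights
# from a skeleton tracker; threshold and frame rate are illustrative.

FLOOR_THRESHOLD = 0.05   # metres above floor below which a foot counts as "down"
FRAME_RATE = 30.0        # assumed skeleton stream rate, frames per second


def footfall_intervals(foot_heights):
    """Return inter-onset intervals (seconds) between successive footfalls."""
    onsets = []
    was_down = False
    for frame, height in enumerate(foot_heights):
        is_down = height < FLOOR_THRESHOLD
        if is_down and not was_down:          # rising edge: foot has just landed
            onsets.append(frame / FRAME_RATE)
        was_down = is_down
    return [b - a for a, b in zip(onsets, onsets[1:])]


# Example: a steady walk that lands a foot every 15 frames (0.5 s).
heights = ([0.3] * 10 + [0.02] * 5) * 4
print(footfall_intervals(heights))   # -> [0.5, 0.5, 0.5]
```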

The triggered rhythmic events and the step-sequencer are temporally matched by using the peak footstep values as a tap-tempo control. To maintain the important sensation of locomotion, the playback speed of each individual sample in the step-sequencer is referenced against the incoming tempo. If no further kinetic events arrive, the sequencer winds down, mimicking the conservation of energy. Although it was not an easy task (it was computationally expensive to run on the equipment I had access to), the resulting process works well: the rhythms generated are accurate, complement the user's dynamic motion and sound organic, though perhaps otherworldly to some ears.
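The tap-tempo and wind-down behaviour could be sketched roughly as follows. The class, decay rate and reference tempo are hypothetical, chosen only to show the shape of the idea rather than the values used in the piece.

```python
# A hedged sketch of the tap-tempo idea: footfall intervals set the sequencer
# tempo, and the tempo decays ("winds down") when no kinetic events arrive.
# All names and constants are assumptions, not the installation's code.

class TapTempo:
    def __init__(self, decay_per_second=0.90, floor_bpm=20.0):
        self.bpm = 0.0
        self.decay = decay_per_second   # fraction of tempo kept per idle second
        self.floor = floor_bpm          # below this the sequencer stops

    def on_footfall(self, interval_seconds):
        """A footfall arrived; treat the inter-onset interval as a tap."""
        self.bpm = 60.0 / interval_seconds

    def on_idle(self, elapsed_seconds):
        """No kinetic events: let the tempo wind down like lost momentum."""
        self.bpm *= self.decay ** elapsed_seconds
        if self.bpm < self.floor:
            self.bpm = 0.0              # sequencer halts entirely

    def playback_rate(self, reference_bpm=120.0):
        """Per-sample playback speed, referenced against the incoming tempo."""
        return self.bpm / reference_bpm if self.bpm else 0.0


tempo = TapTempo()
tempo.on_footfall(0.5)                # brisk half-second strides -> 120 BPM
tempo.on_idle(5.0)                    # participant pauses; the groove slows
print(round(tempo.bpm, 1), round(tempo.playback_rate(), 2))   # 70.9 0.59
```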

[Image: GAIT07]

The first tonal instrumentation I introduced into the system was a stereo-split granular synthesiser, used to control and modulate the bass. As bass is often a grounding feature of music, I opted to tie it to the participant's centre of mass, using their Z-axis movement as an amplitude control. To generate the changes and intervals in pitch, I measured the velocity of the knees' movement; the bass lines generated are therefore intrinsically tied to step distance, potentially resulting in literal giant steps (like John Coltrane!), which is very pleasing. Additionally, I mapped the user's height to the key, and used the position of the shoulders relative to the centre of mass to set the scale quality, so that if the user hunches over, the piece shifts into a minor key.
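The bass mappings described above could be sketched like this. Every range, threshold and scale choice here is an assumption made for the sake of a runnable example; the actual synthesiser parameters were tuned by ear in the installation.

```python
# An illustrative mapping sketch: centre-of-mass depth -> amplitude,
# height -> key, shoulder drop -> major/minor quality, knee velocity -> interval.
# Values and formulas are hypothetical, not the installation's.

MAJOR = [0, 2, 4, 5, 7, 9, 11]
MINOR = [0, 2, 3, 5, 7, 8, 10]


def bass_parameters(com_z, knee_speed, shoulder_y, com_y, user_height):
    # Amplitude from depth: Z-axis movement toward the speakers raises the
    # level. Assumes a tracked zone roughly 4 m deep.
    amplitude = max(0.0, min(1.0, 1.0 - com_z / 4.0))

    # The user's height picks the key (a root note around C2); illustrative only.
    root = 36 + int(user_height * 6) % 12

    # Hunching (shoulders sinking toward the centre of mass) -> minor quality.
    scale = MINOR if shoulder_y - com_y < 0.45 else MAJOR

    # Faster knee movement selects wider intervals: bigger steps, bigger leaps.
    degree = min(int(knee_speed * 4), len(scale) - 1)
    return amplitude, root + scale[degree]


print(bass_parameters(com_z=2.0, knee_speed=1.2,
                      shoulder_y=0.9, com_y=0.5, user_height=1.8))
```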

To generate melodic content I proposed two methods, split across the stereo field, in an effort to accurately capture upper-body limb motion. I opted to equate hand motion with melody, as our hands have a higher resolution of dexterity, capable of the finer and subtler movements typically found in the melodic instruments of Western music. The first method duplicated the interval-selection process of the bass/granular synthesiser. The second was an arpeggiator. The latter worked exceptionally well with the flowing 'swing' of the hands' motion, enriching the movement. Unsurprisingly, hand motion is not a unique solution to melodic control: from the theremin in 1928 right up to Imogen Heap's recent gloves, the method has been tried with varying success. I would suggest my attempt gives an improved sense of organicity in its approach to movement and music compared to a theremin, but still lacks depth compared to more recent efforts. However, its simplicity does lend itself to greater accessibility, with improved clarity for users correlating action with sound.
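A minimal arpeggiator in the spirit of the second method might look like the following: the hand's vertical position limits how far up a chord's arpeggio the pattern sweeps, so a flowing swing of the arm widens and narrows the melodic span. The chord voicing and scaling are hypothetical.

```python
# A hedged arpeggiator sketch: hand height in [0, 1] bounds the arpeggio span.
# The chord and mapping are assumptions, not the installation's settings.

CHORD = [48, 52, 55, 59, 62]   # MIDI notes of an illustrative Cmaj9 voicing


def arpeggio_note(hand_height, tick):
    """Higher hand -> wider sweep through the chord; tick advances the pattern."""
    span = max(1, int(hand_height * len(CHORD)))
    return CHORD[tick % span]


# A hand rising over eight sequencer ticks widens the arpeggio as it goes.
for tick, height in enumerate([0.1, 0.2, 0.4, 0.5, 0.7, 0.8, 0.9, 1.0]):
    print(arpeggio_note(height, tick), end=" ")   # -> 48 48 48 52 52 52 55 55
```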

[Image: GAIT03]

The spatialisation of sound was also an important task: as the body moves through space, the effect of motion in sound would be lacking if the sound did not also travel the same distance. Although there was the option to make the resulting music conceptual (in terms of the relationship between movement and the sound heard), I made a conscious decision to avoid, or at least minimise, such a methodology, pursuing instead an authentic representation with greater accuracy.
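One simple way to realise position-tracked spatialisation, sketched below under stated assumptions, is an equal-power stereo pan driven by the participant's position across the tracked area, so the sound travels the same path as the body. The source does not specify the panning law or speaker layout used, so treat this purely as an illustration.

```python
# A hedged spatialisation sketch: x position in the tracked area drives an
# equal-power stereo pan. The area width and panning law are assumptions.

import math

AREA_WIDTH = 3.0   # metres covered by the tracking zone (an assumption)


def stereo_gains(x_position):
    """Equal-power pan: x=0 is hard left, x=AREA_WIDTH is hard right."""
    pan = max(0.0, min(1.0, x_position / AREA_WIDTH))   # normalise to [0, 1]
    angle = pan * math.pi / 2.0
    return math.cos(angle), math.sin(angle)             # (left, right) gains


left, right = stereo_gains(1.5)          # standing at the centre of the space
print(round(left, 3), round(right, 3))   # -> 0.707 0.707
```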

Last but not least, the overall aesthetic I chose was designed to situate the work in a retro-futuristic sci-fi landscape: a familiar feeling, but in an unrecognisable world. 'Could this be us?!', an overarching theme of mid-twentieth-century sci-fi. The cave-like primordial lair the user walks down into reflects the primitive connection between the two main foci of the work: music and movement. Much like the distant worlds visited in pop-culture sci-fi, the barren landscapes are almost always uninhabited, steeped in obvious metaphor for the world around us that we don't necessarily always see. The decision to use synthesis, rather than recordings of acoustic instruments, compounds this lineage of thought, not to mention suiting the nature of the subject matter. It wasn't that long ago that studies in cognitive science or embodied music cognition were considered the realm of 'the future': a period of exciting, optimistic new technologies that would amaze and inspire wonder.

The toughest challenges in this project were largely technical. Wide-area skeleton tracking is not something the Microsoft Kinect is built for, and the people within the already small community who have achieved satisfactory results with the technology are few and far between. The problem was compounded by the now-offline status of openni.org (since April 2014), the community formerly dedicated to providing drivers and software for hacking the devices. It was also difficult and time-consuming to find control methods that could be easily discovered and used yet still yield an engaging experience for the participant, as identifying natural movements and representing them with a degree of authenticity is no small feat. Overall I am pleased with the outcome, and have, to some extent, been successful in achieving my original aims for the piece. In future, however, I would consider scaling down to focus on a specific type of motion or gesture, in order to achieve a higher resolution of understanding and provide a more detailed and useful dataset.

[Image: GAIT04]