Our Macrophonics artist collective has just commenced an Open Source media technologies residency with Sydney-based physical theatre company Legs on the Wall at their Red Box development space. Background information on the project can be found here.
The residency sees the group examine a range of interface technologies and approaches to explore the nexus between theatrical performers and a four-member live media ensemble. The objective is to build a responsive performance environment that allows the theatre performers direct gestural input into the media control system, defining new relationships between the performance ensemble, the media design elements and the media ensemble.
We’ve broadly structured our investigation across two approaches. The first is video tracking/computer vision techniques (using the cv.jit suite of objects for Jitter), in which the stage area is analysed and moving objects are tracked.
The second area of investigation is wearable garment interfaces that sense movement directly at each performer (accelerometers, pressure and flex sensors, light sensors etc.) and send data wirelessly to the media ensemble. We are using the LilyPad Arduino platform for this work. The LilyPad is a small, flat implementation of the Arduino platform that can be sewn into garments. Conductive ‘thread’ can then be sewn in ‘tracks’ into the garment to form the links between the device and the attached sensors.
The LilyPad can be run from a battery source and can, with some additional radio hardware, transmit data wirelessly over an ad hoc wireless network.
We’ve just been advised that our Macrophonics collective will be one of two artist groups in residence at Legs On The Wall in Sydney in Nov/Dec 2012 as part of their Open Source program.
This project is supported by the Australia Council for the Arts and provides an opportunity for artists working with media technologies to undertake a month-long residency with the company to explore an aspect of digital media in theatre/performance contexts.
For Open Source, the group proposes to build on the work and momentum of Macrophonics’ 2011 Brisbane Festival project by establishing a collaborative framework for exploring sensing technologies and instruments in a theatrical context. Each artist brings a different skill set; in combination, they represent an extremely broad capability across sound and visual media technologies and physical computing.

Much of our work to date has focused on sensing in an ‘instrumental music’ fashion, drawing on musical models of performer/technology interaction. In this project we would examine ways of working with live theatre performers as agents within a multi-performer sensing project. Video tracking along with position, distance and touch sensors would be used to create a responsive space for performance: the space would output data in response to the performers’ movement, and the Macrophonics team would draw on that data to create a responsive sound and media scape.

The objective would be to augment the expressive range of the theatre performers by situating them in a responsive environment: specifically, to explore and define new direct relationships between theatre performers, musicians and media artists, who traditionally work through a network of indirect or interpretive relationships with one another.