Category Archives: Performance

Ghost Ships Video Documentation Online

We’ve just completed some video documentation of our mediatized performance work ‘Ghost Ships’, which we shot in the studios at Macquarie University in Sydney.

‘Ghost Ships’ is a work for wireless wearable performance interface and intelligent lighting systems. The work uses an unencumbered wireless performance interface to drive all media elements via free-air gestures from a single on-stage performer. Through a wearable microcontroller and sensor system designed for the project by the artists, a single performer is able to play the entire theatrical space (light, sound, video) through arm and hand gestures. This includes an intelligent moving light system that repositions theatrical lights and changes lighting parameters in response to the performer’s gestures.
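As a rough illustration of the gesture-to-light idea, here is a minimal Python sketch that maps a wrist tilt angle onto an 8-bit DMX-style pan value. The angle range and the single-channel mapping are assumptions for illustration only; the actual mappings live in the project’s Max patches.

```python
def tilt_to_dmx(angle_deg, min_deg=-90.0, max_deg=90.0):
    """Map a wrist tilt angle to an 8-bit DMX channel value (0-255).

    Hypothetical mapping: the real system maps many sensor streams
    to many light parameters, not one angle to one channel.
    """
    # Clamp out-of-range readings, then normalise to 0..1.
    clamped = max(min_deg, min(max_deg, angle_deg))
    norm = (clamped - min_deg) / (max_deg - min_deg)
    return round(norm * 255)

# A level wrist centres the pan; full tilt swings it to either extreme.
print(tilt_to_dmx(0.0), tilt_to_dmx(-90.0), tilt_to_dmx(90.0))
```

A real rig would send the resulting value out on a DMX interface each frame; the point here is just the clamp-normalise-scale shape of the mapping.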

This is an evolving prototype of an interface we’ve been working on for a couple of years now. You can read about some of the R&D work here.

Software/media programming: Julian Knowles

Wearable interface design and additional programming: Donna Hewitt

Music: Julian Knowles

Performance improvisations: Donna Hewitt

Macrophonics Open Source 2012 photo gallery

Vivid Digital Workshop. Museum of Contemporary Art Australia

Here are a few snaps of the recent Macrophonics digital workshop for Vivid Sydney at the Museum of Contemporary Art in Sydney. Donna Hewitt and Julian Knowles delivered a 3-hour presentation/workshop on approaches to gesture-controlled audio and video.

Museum of Contemporary Art Australia

The workshop introduced participants to the Arduino platform and various approaches to designing gestural sensing systems. Below you can see our most recent wearable interface prototype (built as part of our ongoing creative R&D with Legs on the Wall), which has been extended to include a glove-based interface in addition to the flex and accelerometer sensors.

Wearable prototype #2 and joystick
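On the software side, a host machine typically receives the Arduino’s sensor readings as lines of text over a serial link and unpacks them into named values. Here is a small Python sketch of that step; the `name:value` comma-separated framing and the field names are an assumed format, not the actual protocol used by the Macrophonics wearable.

```python
def parse_sensor_line(line):
    """Parse a serial line like 'flex:512,ax:301,ay:280,az:400'
    into a dict of integer sensor readings.

    The framing and field names are illustrative assumptions.
    """
    reading = {}
    for field in line.strip().split(","):
        name, _, value = field.partition(":")
        reading[name] = int(value)
    return reading

sample = "flex:512,ax:301,ay:280,az:400"
print(parse_sensor_line(sample))
```

In practice the same unpacking happens inside a Max patch (e.g. with a serial object and list parsing), but the shape of the job is the same: one line in, a set of named controller values out.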

The second prototype has been designed so that it can be retrofitted to performers of different sizes. Donna used a neoprene and Velcro system to mount the sensors, with locking connectors for each sensor so that individual sensors can be replaced or swapped. The workshop also provided an opportunity to give an overview of the Max/Jitter patching environment and how gestural data is mapped in the process of achieving successful and engaging creative outcomes.

Donna Hewitt showing wearable prototype #1

The workshop participants had a broad range of prior knowledge (from little, to quite a lot), but all seemed to get a lot out of the workshop process. And besides… it’s such a lovely building.

Vivid Ideas talk: The Making of the Future of Performance

Vivid Sydney. Museum of Contemporary Art Australia


Vivid Ideas: The Making of the Future of Performance

Speakers: Patrick Nolan, Julian Knowles, Donna Hewitt, Timothy Ohl

Museum of Contemporary Art, Australia

6 June, 2013. 9:00 pm – 10:00 pm


General – $20.00
Industry – $15.00
Student/Educator – $10.00

Donna Hewitt – Macrophonics

In late 2012 the artist collective Macrophonics visited internationally-recognised physical theatre company, Legs On The Wall, to experiment with sensor-based technology and how it can be adapted specifically for live performances. What emerged was a tool for creating live art, one that uses both the body of a performer and the space they move through. Now Macrophonics and Legs On The Wall offer audiences a glimpse into what comes next – the creation of a new work that transforms the dancer into the conductor, the musician into the actor, and the stage into the interface.

This presentation will feature Professor Julian Knowles of Macquarie University’s Department of Media, Music, Communication and Cultural Studies and Patrick Nolan, Artistic Director of Legs On The Wall, and a demonstration of the Macrophonics technology with performer Tim Ohl.


In a series of fast-paced, one-hour tag-team sessions, creative teams from the worlds of television, gaming, music, and animation take us behind the scenes to introduce us to the people and processes that make huge creative projects a reality.

These sessions explore strategies for making international real-time collaborations work, reveal which tools are essential for coordinating multi-part projects, and introduce some of the world’s most interesting companies and creatives.

Radio Interview with Julian Knowles – 2SER Sydney, Australia (from 17:20 onwards)

Legs on the Wall – Open Source Residency Blog 4

The accelerometer and wearable work progressed smoothly, without significant hitches, and Donna’s wearable was completed in the final week. A few issues are worth mentioning here. She found the conductive thread and ribbon somewhat problematic: great care was needed to keep threads from touching and shorting out (which reset the boards), and their high resistance also posed issues. This led Donna to replace most of the conductive thread with conventional insulated hookup wire. Though she was initially concerned about its appearance, the hookup wire actually looked great. Take a look.

Donna – showing off finished wearable prototype with insulated wire

A second issue was the total power draw of the sensors and XBee wireless system. Donna found that a single AA battery could not supply enough power to run the system reliably. With insufficient time to source and fit a higher-capacity battery system before the public showing of our work, she decided to power the system from a USB cable and make the necessary modifications to the battery system after the residency.
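To give a sense of why a single cell struggles here, the following back-of-envelope estimate works out the current the cell itself must supply when a boost converter steps 1.5 V up to 3.3 V. All figures (XBee transmit current, sensor/microcontroller draw, converter efficiency) are assumed round numbers for illustration, not measurements from the actual wearable.

```python
# Assumed, illustrative figures -- not measured from the real system:
xbee_ma = 45.0        # XBee draw while transmitting (typical order)
logic_ma = 10.0       # microcontroller + sensors
boost_efficiency = 0.8
v_out, v_cell = 3.3, 1.5

draw_ma = xbee_ma + logic_ma
# Boosting the voltage multiplies the current drawn from the cell,
# and converter losses add on top of that.
cell_draw_ma = draw_ma * (v_out / v_cell) / boost_efficiency
print(round(cell_draw_ma))  # ~150 mA sustained from one AA cell
```

Sustaining a continuous draw of that order through a small single-cell boost board is marginal: transmit bursts can sag the supply enough to brown-out the radio, even when the cell’s total capacity looks adequate on paper.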

To everyone’s surprise, when we connected the wearable to our test audio and video patches for the first time, the results exceeded our expectations. The wearable was highly responsive and ‘playable’, and Donna reported a fine sense of control over the media: she felt immersed, and the interface was highly intuitive, providing a rich set of possibilities for gestural control. Here is a video showing the initial hookup of the wearable interface to a test audio patch, with the triaxial accelerometer and flex sensor mapped to audio filter parameters.

Macrophonics – first wearable trial
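A mapping like the one in the video can be sketched in a few lines of Python. An exponential curve keeps the sweep perceptually even across octaves; the 10-bit input range matches a typical Arduino ADC, but the frequency bounds are assumptions, not the values used in the actual Max patch.

```python
def sensor_to_cutoff(value, in_min=0, in_max=1023,
                     f_min=80.0, f_max=8000.0):
    """Map a 10-bit sensor reading to a filter cutoff in Hz.

    Exponential mapping: each equal step of the sensor moves the
    cutoff by an equal musical interval. Ranges are illustrative.
    """
    clamped = min(max(value, in_min), in_max)
    norm = (clamped - in_min) / (in_max - in_min)
    return f_min * (f_max / f_min) ** norm

print(round(sensor_to_cutoff(0)))     # bottom of the sweep (80 Hz)
print(round(sensor_to_cutoff(1023)))  # top of the sweep (8000 Hz)
```

A linear mapping would cram most of the audible effect into the top of the sensor’s travel; the exponential version is the usual choice for frequency-type parameters.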

In the final week we experimented with a range of sensing techniques for picking up the position of performers on a stage. We tried the cv.jit externals and the Cyclops system within Max/MSP. Both proved excellent; however, as video tracking systems they are inherently light-dependent, so their output and behaviour are fundamentally affected by changing light states and conditions.

Cyclops patch detail
Cyclops grid and hotspot ‘zones’

All of this is manageable with precise light control and programming (or the use of infrared cameras); however, in the absence of more sophisticated lighting and camera resources, we decided to use ultrasonic rangefinders on the stage to locate the position of performers.

Ultrasonic rangefinder

Two of these devices were placed on either side of the stage, each outputting a stream of continuous controller data based on the performers’ proximity to it. The sensors were connected to Tim’s computer via an Arduino Uno, giving us a simple proximity sensing system on stage. The system was robust with respect to light, but suffered from occasional random noise/jitter that would last for a few seconds without obvious cause. This meant we had to apply heavy signal conditioning to the source data in Max/MSP to smooth out these ‘errors’, which in turn introduced a significant amount of latency. Given these constraints, we used these sensors to drive events that did not have critical timing dependencies and could ramp in and out more gradually.
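The kind of conditioning described above can be sketched as a sliding-window median filter. This is an illustration of the trade-off, not the actual Max/MSP patch: a larger window rejects longer noise bursts, but adds roughly half a window of delay, which is exactly the latency cost mentioned.

```python
from collections import deque
from statistics import median

class MedianSmoother:
    """Sliding-window median filter for a noisy controller stream.

    Sketch of the 'heavy signal conditioning' idea: the median
    rejects short glitch spikes outright, at the cost of roughly
    window/2 samples of latency.
    """
    def __init__(self, window=5):
        self.buf = deque(maxlen=window)

    def feed(self, sample):
        self.buf.append(sample)
        return median(self.buf)

s = MedianSmoother(window=5)
readings = [100, 101, 900, 99, 100, 100, 101]  # 900 is a glitch spike
smoothed = [s.feed(r) for r in readings]
print(smoothed[-1])  # the spike never reaches the output
```

For timing-critical triggers this delay is unacceptable, which is why the rangefinders ended up driving slowly ramping events instead.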

Julian spent a number of days in the final week programming the relationships between the sensing systems and the video elements, working with ‘retro’ 70s scanline/raster-style video synth processing and time-domain manipulations of QuickTime movies. His computer also operated as a kind of ‘data central’: all incoming sensor data arrived there and was then mapped and directed out to the other computers from a central Max/MSP + Jitter patch.

Detail of Jitter patch for Open Source residency
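The ‘data central’ idea boils down to a routing table: each named sensor stream is forwarded to the machine and parameter that should receive it. Here is a minimal Python sketch of that shape; all stream names, hosts, and parameters are hypothetical, and the real routing was done inside the central Max/MSP + Jitter patch.

```python
# Hypothetical routing table: sensor stream -> (destination, parameter).
ROUTES = {
    "wearable/flex":  ("audio-machine", "filter-cutoff"),
    "wearable/accel": ("video-machine", "raster-offset"),
    "stage/range-l":  ("video-machine", "movie-rate"),
    "stage/range-r":  ("audio-machine", "reverb-mix"),
}

def route(sensor_name, value):
    """Look up where a named sensor reading should be sent."""
    host, param = ROUTES[sensor_name]
    return host, param, value

print(route("wearable/flex", 512))
```

Centralising the table like this means a mapping can be changed in one place during rehearsal without touching the patches on the receiving machines.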