Ghost Ships Video Documentation Online

We’ve just completed some video documentation of our mediatized performance work ‘Ghost Ships’, which we shot in the studios at Macquarie University in Sydney.

‘Ghost Ships’ is a work for wireless wearable performance interface and intelligent lighting systems. The work uses an unencumbered wireless performance interface to drive all media elements via free-air gestures from a single on-stage performer. Through a wearable microcontroller and sensor system designed for the project by the artists, a single performer is able to play the entire theatrical space (light, sound, video) through arm and hand gestures. This includes an intelligent moving-light system that moves theatrical lights and changes light parameters in response to the performer’s gestures.
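To give a sense of what such a mapping involves, here is a minimal illustrative sketch (not the actual Ghost Ships software, which is not published here), assuming a hypothetical wearable sensor that streams arm orientation as (x, y, z) values in the range -1.0 to 1.0:

```python
# Hypothetical sketch of a gesture-to-lighting mapping. A real system
# would send the resulting values to a moving-light fixture over a
# protocol such as DMX512 or Art-Net.

def clamp(value, lo, hi):
    """Keep a value inside the legal range for a light parameter."""
    return max(lo, min(hi, value))

def gesture_to_light(x, y, z):
    """Map a normalised arm gesture to 8-bit moving-light parameters."""
    pan = clamp(int((x + 1.0) / 2.0 * 255), 0, 255)         # left/right sweep
    tilt = clamp(int((y + 1.0) / 2.0 * 255), 0, 255)        # raise/lower beam
    intensity = clamp(int(abs(z) * 255), 0, 255)            # energy of gesture
    return {"pan": pan, "tilt": tilt, "intensity": intensity}

# A level arm pointing forward centres the light at half intensity:
print(gesture_to_light(0.0, 0.0, 0.5))  # {'pan': 127, 'tilt': 127, 'intensity': 127}
```

The sensor ranges, parameter names and scaling are assumptions for illustration only; the actual interface maps many more gesture dimensions across light, sound and video simultaneously.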

This is an evolving prototype of an interface we’ve been working on for a couple of years now. You can read about some of the R&D work here.

Software/media programming: Julian Knowles

Wearable interface design and additional programming: Donna Hewitt

Music: Julian Knowles

Performance improvisations: Donna Hewitt

Macrophonics Open Source 2012 photo gallery

Vivid Ideas talk: The Making of the Future of Performance

Vivid Sydney. Museum of Contemporary Art Australia


Vivid Ideas: The Making of the Future of Performance

Speakers: Patrick Nolan, Julian Knowles, Donna Hewitt, Timothy Ohl

Museum of Contemporary Art, Australia

6 June, 2013. 9:00 pm – 10:00 pm


General – $20.00
Industry – $15.00
Student/Educator – $10.00

Donna Hewitt – Macrophonics

In late 2012 the artist collective Macrophonics visited internationally recognised physical theatre company Legs On The Wall to experiment with sensor-based technology and how it can be adapted for live performance. What emerged was a tool for creating live art, one that uses both the body of a performer and the space they move through. Now Macrophonics and Legs On The Wall offer audiences a glimpse into what comes next – the creation of a new work that transforms the dancer into the conductor, the musician into the actor, and the stage into the interface.

This presentation will feature Professor Julian Knowles of Macquarie University’s Department of Media, Music, Communication and Cultural Studies and Patrick Nolan, Artistic Director of Legs On The Wall, together with a demonstration of the Macrophonics technology by performer Tim Ohl.


In a series of fast-paced, one-hour tag-team sessions, creative teams from the worlds of television, gaming, music, and animation take us behind the scenes to introduce us to the people and processes that make huge creative projects a reality.

These sessions explore strategies for making international real-time collaborations work, reveal which tools are essential for co-ordinating multi-part projects, and introduce some of the world’s most interesting companies and creatives.

Radio Interview with Julian Knowles – 2SER Sydney, Australia (from 17:20 onwards)

Legs On the Wall – Open Source residency

Legs On The Wall



We’ve just been advised that our Macrophonics collective will be one of two artist groups in residence at Legs On The Wall in Sydney in Nov/Dec 2012 as part of their Open Source program.

Legs On the Wall – Open Source program

This project is supported by the Australia Council for the Arts and provides an opportunity for artists working with media technologies to undertake a month-long residency with the company to explore an aspect of digital media in theatre/performance contexts.

The group proposes that for Open Source we build on the work and momentum of Macrophonics’ Brisbane Festival project in 2011 by establishing a group collaboration framework to explore the use of sensing technologies and instruments in a theatrical context. Each artist brings a different skill set and, in combination, they represent an extremely broad capability across sound and visual media technologies and physical computing. Much of our work to date has focused on sensing in an ‘instrumental music’ fashion, drawing upon music models for performer/technology interaction. In this project we would examine the ways in which we could work with live theatre performers as agents within a multi-performer sensing project. Video tracking, position, distance and touch sensors would be used to create a responsive space for performance. The performance space would output data as a result of the performer’s movement, and this data would be drawn upon by the Macrophonics team to create a responsive sound and media scape. The objective would be to augment the expressive range of the theatre performer by situating them in a responsive environment. Specifically, we would explore and define new direct relationships between the theatre performers, musicians and media artists, who traditionally work through a network of indirect or interpretive relationships with one another.
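As a rough illustration of the responsive-space idea (a hypothetical sketch only, not the residency's actual software), several sensor streams could be fused into a single activity level for a sound/media engine to respond to. The sensor names, ranges and weights below are all assumptions:

```python
# Illustrative sensor-fusion sketch for a responsive performance space.
# Fuses hypothetical position, distance and touch readings into one
# 0.0-1.0 activity value that a media engine might map to sound/video.

def activity_level(position_delta, distance_cm, touched):
    """Fuse assumed sensor readings into a 0.0-1.0 activity value.

    position_delta: movement since last frame, in metres (video tracking)
    distance_cm:    proximity sensor reading (closer = more active)
    touched:        whether a touch sensor is currently pressed
    """
    movement = min(position_delta / 2.0, 1.0)        # 2 m per frame saturates
    proximity = max(0.0, 1.0 - distance_cm / 400.0)  # assumed 4 m sensor range
    touch = 1.0 if touched else 0.0
    # Weighted blend; the weights are arbitrary tuning choices.
    return round(0.5 * movement + 0.3 * proximity + 0.2 * touch, 3)

print(activity_level(1.0, 100.0, True))  # 0.675
```

In practice such a value would be one of many control streams, and the mappings would be shaped collaboratively in rehearsal rather than fixed in code.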

Read more about this project here.