
Choreographic Coding Lab

The Choreographic Coding Lab (CCL) format offers unique opportunities for exchange and collaboration among 'code savvy' digital media artists who have an interest in translating aspects of choreography and dance into digital form and in applying choreographic thinking to their own practice. Participants work with patterns in movement scores and structures, finding, generating and applying them, with results ranging from prototypes for artworks to new plug-ins for working with dance-related datasets. CCLs also seek to support a sustainable collaborative practice among their participants, encouraging ongoing exchange in a growing artistic research community.

CCLs are an outcome of Motion Bank, a four-year research project of The Forsythe Company focused on the creation of online digital scores with guest choreographers.

Project Info

 
Richard De Sousza, Peter Walker, Steven Kallili, Michael Havir and I participated as a team in the 2015 CCL X Motion Lab at Deakin University's Burwood Campus in Melbourne.
 
Together we developed a prototype for a real-time augmented motion capture visualization system that uses streamed motion capture data to visualize motion in 3D. The visualizations augment the performer, and the user can interact with them through the device where the augmentation takes place, experiencing the digital performance from various perspectives and with a level of control over certain aspects of the visualizations.
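As a rough illustration of the streaming side, the sketch below (plain Python, outside TouchDesigner) receives motion capture joint positions over OSC and keeps the latest frame for a renderer to read each frame. The address pattern, port and message layout are assumptions for the example, not the actual stream format we used at the lab.

# Minimal sketch: receive streamed mocap joint positions over OSC.
# Assumed message layout: "/mocap/joint" with (joint_name, x, y, z).
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

latest_frame = {}  # joint name -> (x, y, z), read by the render loop

def on_joint(address, joint_name, x, y, z):
    # Each OSC message is assumed to carry one joint's name and 3D position.
    latest_frame[joint_name] = (x, y, z)

dispatcher = Dispatcher()
dispatcher.map("/mocap/joint", on_joint)

if __name__ == "__main__":
    server = BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher)
    print("Listening for mocap data on UDP port 9000...")
    server.serve_forever()  # a separate render loop would draw latest_frame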

Correct perspective and scale alignment is achieved by tracking the positional information of the device running the app [in this case a tablet] and applying it to a 3D virtual camera that views the 3D visualizations of the motion capture data.
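To give a sense of what that alignment step could look like inside TouchDesigner, here is a hedged sketch of an Execute DAT callback that copies a tracked tablet pose onto a Camera COMP every frame. The operator and channel names ('tablet_track', 'cam1', tx/ty/tz, rx/ry/rz) are placeholders for this example, not the names from our project file.

# Sketch of the alignment step as a TouchDesigner Execute DAT callback:
# copy the tablet's tracked pose onto the virtual camera each frame so the
# rendered overlay matches the tablet's real viewpoint.

def onFrameStart(frame):
    track = op('tablet_track')   # CHOP carrying the tablet's tracked pose
    cam = op('cam1')             # Camera COMP rendering the visualization

    # Position (assumed to share units and origin with the mocap space).
    cam.par.tx = track['tx'].eval()
    cam.par.ty = track['ty'].eval()
    cam.par.tz = track['tz'].eval()

    # Orientation as Euler angles in degrees.
    cam.par.rx = track['rx'].eval()
    cam.par.ry = track['ry'].eval()
    cam.par.rz = track['rz'].eval()
    return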
 
The entire application was designed and developed in TouchDesigner, with each of us contributing our own specialist skills to the framework – Peter on data interpretation, Richard on the interactive interface [tablet], myself on integration and visualization, Michael on sound development, and Steven on performance.
 
At the end of the lab we had a functioning prototype for a new approach to a part-digital, part-physical interactive performance experience, along with a broader knowledge of the high-end motion capture industry and how to work with the data it produces.

 
Exploring the translation of choreography and dance into digital form.
 