winterFest::3D projection mapping content

Demo of 3D projection mapping content, commissioned for the daily light show at Monash University Clayton’s WinterFest, in collaboration with Kit Webster.
 
I designed and developed all of the 3D content in TouchDesigner, and Kit composited it with his own content in After Effects for the final installation – more info can be found here
 
Music by OCDANTAR
 
choreographic coding lab::augmented motion capture visualization application

The choreographic coding lab [ccl] format offers unique opportunities for exchange and collaboration between 'code savvy' digital media artists who have an interest in translating aspects of choreography and dance into digital form, and in applying choreographic thinking to their own practice – more info can be found here.
 
Richard De Sousza, Peter Walker, Steven Kallili, Michael Havir, and I participated as a team in the 2015 ccl X motion lab at Deakin University's Burwood Campus in Melbourne.
 
Together we developed a prototype for a real-time augmented motion capture visualization system that uses streamed motion capture data to visualize motion in 3D. The visualizations augment the performer, and the user can interact with them through the device where the augmentation takes place – experiencing the digital performance from various perspectives, with a level of control over certain aspects of the visualizations.
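As a rough sketch of that data flow (not the project's actual implementation, which lived in a TouchDesigner network), the snippet below assumes the motion capture stream arrives as OSC messages carrying one 3D position per joint; the port and address scheme are hypothetical:

```python
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

# Latest known 3D position for each joint, updated as packets arrive.
joints = {}

def on_message(address, *args):
    # Assumed message shape: "/skeleton/<joint_name>" with float x, y, z arguments.
    if address.startswith("/skeleton/") and len(args) == 3:
        joints[address.rsplit("/", 1)[-1]] = tuple(args)

dispatcher = Dispatcher()
dispatcher.set_default_handler(on_message)

# Listen for the streamed data; a render loop elsewhere would read `joints`
# each frame and draw the 3D visualization around the performer.
server = BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher)
server.serve_forever()
```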
 
Correct perspective and scale alignment is achieved by tracking the position of the device running the app [in this case a tablet] and applying that positional information to a 3D virtual camera that views the 3D visualizations of the motion capture data.
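A minimal illustration of that alignment step, assuming the tracked pose arrives as a position plus yaw/pitch/roll angles – the names and numbers here are illustrative only, and the prototype itself did this inside TouchDesigner:

```python
import numpy as np

def view_matrix(device_position, device_yaw_pitch_roll):
    """Build a world-to-camera matrix from the tracked device pose (metres, radians)."""
    yaw, pitch, roll = device_yaw_pitch_roll
    # Rotations about the Y (yaw), X (pitch) and Z (roll) axes.
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    rz = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])
    rotation = ry @ rx @ rz                  # camera-to-world rotation
    view = np.eye(4)
    view[:3, :3] = rotation.T                # invert (orthonormal) rotation
    view[:3, 3] = -rotation.T @ np.asarray(device_position)
    return view

# Each frame: read the tracked tablet pose, update the virtual camera, then
# project the mocap joints through it so perspective and scale match reality.
camera = view_matrix(device_position=(1.5, 1.2, 3.0),
                     device_yaw_pitch_roll=(0.1, -0.05, 0.0))
joint_world = np.array([0.2, 1.4, 1.0, 1.0])  # one mocap joint, homogeneous coords
joint_camera = camera @ joint_world
```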
 
The entire application was designed and developed in TouchDesigner, with each of us contributing our own specialist skills to the framework – Peter on data interpretation, Richard on the interactive interface [tablet], myself on integration and visualization, Michael on sound development, and Stephen on performance.
 
At the end of the lab we had a functioning prototype for a new approach to a part-digital, part-physical interactive performance experience, along with a broader knowledge of the high-end motion capture industry and how to work with the data it produces.
 
Here’s a bit of documentation from the week –