spark::diwali light festival 2015


 





Angus Muir Design was engaged by Spur to design and build an interactive Canopy of Lights for Spark as a way of celebrating Diwali with their staff, customers and the public.
 
The interactive canopy consisted of 864 custom LED lanterns and a surround sound audio system integrated into the lights, highlighting Spark's partnership with Spotify.
 
The installation was located at the main entrance of Spark’s head office, where the public could interact with it, controlling the lights and music.
 
My involvement in the project was to design and develop the interactive control interface, built in the Lemur scripting environment, and an audio-responsive pattern generator that determined the colors and brightness values displayed on the DMX matrix.
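 
As a rough illustration of the pattern generator idea, the sketch below maps a single incoming audio level onto per-lantern colour and brightness values for the DMX matrix. The cell count matches the 864 lanterns, but the palette, channel layout and scrolling behaviour are assumptions for illustration – not the installation's actual logic.

```python
import colorsys

MATRIX_CELLS = 864          # one lantern per cell, matching the installation

def pattern_frame(audio_level, time_s):
    """Return a flat list of DMX values (R, G, B per cell) for one frame."""
    frame = []
    for cell in range(MATRIX_CELLS):
        # Sweep the hue across the matrix and scroll it slowly over time.
        hue = (cell / MATRIX_CELLS + time_s * 0.05) % 1.0
        # The incoming audio level (0.0-1.0) drives overall brightness.
        r, g, b = colorsys.hsv_to_rgb(hue, 1.0, max(0.0, min(1.0, audio_level)))
        frame.extend([int(r * 255), int(g * 255), int(b * 255)])
    return frame
```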
 
The interactive control system let users bounce, pull and move balls around a touch device's screen, which in turn illuminated lights in the DMX matrix. The touch interface was also mapped to a series of sounds, allowing users to compose musical sequences as they interacted with the light matrix.
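 
To make the touch-to-light-and-sound mapping concrete, here is a hypothetical sketch of how a ball's on-screen position could be translated into a lantern index in the matrix and a playable note. The grid layout, scale and note range are illustrative assumptions, not the values used in the installation.

```python
GRID_W, GRID_H = 36, 24               # 36 x 24 = 864 lanterns (layout assumed)
PENTATONIC = [0, 2, 4, 7, 9]          # semitone offsets of a major pentatonic

def ball_to_lantern(x, y):
    """Map normalised touch coordinates (0.0-1.0) to a lantern index."""
    col = min(int(x * GRID_W), GRID_W - 1)
    row = min(int(y * GRID_H), GRID_H - 1)
    return row * GRID_W + col

def ball_to_note(x, base_note=48):
    """Map the ball's horizontal position to a MIDI note in the scale."""
    step = min(int(x * 10), 9)                 # ten playable columns
    octave, degree = divmod(step, len(PENTATONIC))
    return base_note + octave * 12 + PENTATONIC[degree]
```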
 
The result was an interactive and highly engaging digital experience.
 
Photos by Bas van Est
 
Video by Sophia Jenny


 

choreographic coding lab::augmented motion capture visualization application


 






The choreographic coding lab [ccl] format offers unique opportunities for exchange and collaboration among 'code savvy' digital media artists who have an interest in translating aspects of choreography and dance into digital form and applying choreographic thinking to their own practice – more info can be found here.
 
Richard De Sousza, Peter Walker, Steven Kallili, Michael Havir and I participated as a team in the 2015 ccl X motion lab at Deakin University’s Burwood Campus in Melbourne.
 
Together we developed a prototype for a real-time augmented motion capture visualization system that uses streamed motion capture data to visualise motion in 3D. The visualizations augment the performer, and by interacting with the device where the augmentation takes place, the user can experience the digital performance from various perspectives and exercise a degree of control over certain aspects of the visualizations.
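 
As a rough sketch of what "streamed motion capture data" can look like on the receiving end, the snippet below listens for joint positions over OSC and keeps the latest 3D point per joint for a renderer to draw. The address pattern, message layout and use of the python-osc library are assumptions for illustration – the lab prototype itself handled its data inside TouchDesigner.

```python
from pythonosc import dispatcher, osc_server

joints = {}   # latest 3D position per joint name

def on_joint(address, name, x, y, z):
    """Keep the most recent position for each joint; the renderer reads this."""
    joints[name] = (x, y, z)

disp = dispatcher.Dispatcher()
disp.map("/mocap/joint", on_joint)

server = osc_server.BlockingOSCUDPServer(("0.0.0.0", 9000), disp)
server.serve_forever()   # blocks; in practice this runs on its own thread
```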
 
Correct perspective and scale alignment is achieved by tracking the positional information of the device running the app [in this case a tablet] and applying it to a 3D virtual camera that views the 3D visualizations of the motion capture data.
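 
In essence, the alignment step copies the tablet's tracked transform onto the virtual camera, with a scale factor between the capture volume and the 3D scene. Below is a minimal, hypothetical sketch of that mapping; the attribute names and the identity scale factor are illustrative assumptions rather than the prototype's actual values.

```python
WORLD_TO_SCENE = 1.0   # metres in the capture volume per scene unit (assumed)

def update_virtual_camera(cam, device_pos, device_rot):
    """Copy the tablet's tracked transform onto the 3D virtual camera.

    cam        -- any object exposing tx/ty/tz and rx/ry/rz attributes
    device_pos -- (x, y, z) tablet position in the capture volume, metres
    device_rot -- (rx, ry, rz) tablet orientation as Euler angles, degrees
    """
    cam.tx, cam.ty, cam.tz = (c * WORLD_TO_SCENE for c in device_pos)
    cam.rx, cam.ry, cam.rz = device_rot
```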
 
The entire application was designed and developed in TouchDesigner, with each of us contributing our own specialist skills to the framework – Peter on data interpretation, Richard on the interactive interface [tablet], myself on integration and visualization, Michael on sound development, and Stephen on performance.
 
At the end of the lab, we had a functioning prototype for a new approach to a part-digital, part-physical interactive performance experience, as well as a broader knowledge of the high-end motion capture industry and how to work with the data it produces.
 
Here’s a bit of documentation from the week –