godComplex

godComplex was completed in June 2013 as a final design project for the Computer Graphics for Interaction Design Masters paper at Victoria University of Wellington School of Design.
 
godComplex is interactive software that explores techniques and practices from data visualisation, computational art, interaction design and the programmatic modelling of behavioural patterns in weather systems.
godComplex gives its audience the ability to attempt to control the weather, physically manipulating the cloud flow above Wellington by pushing the weather around with the Leap Motion 3D hand-tracking controller.
 
The velocity of flicks, waves and swipes is calculated and mapped against the motion of the cloud flow, introducing an element of control over the weather. However, nobody can truly control the weather, and the clouds eventually reform, leaving that sense of control ultimately unsatisfying.
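For a rough sense of the mechanic (a minimal C++ sketch with made-up parameters, not the project's actual Cinder code), a hand-velocity impulse nudges a toy 2D flow field, which then eases back toward the prevailing wind, just as the clouds do:

```cpp
// Minimal sketch: a hand-velocity impulse nudges a 2D flow field, which
// then relaxes back toward the prevailing wind. In the real project the
// impulse would come from the Leap Motion SDK, e.g. Leap::Hand::palmVelocity().
#include <array>
#include <cstdio>

struct Vec2 { float x = 0.f, y = 0.f; };

constexpr int W = 16, H = 16;            // toy flow-field resolution
std::array<Vec2, W * H> flow;            // per-cell wind velocity
const Vec2 prevailingWind { 1.f, 0.f };  // the "natural" weather state

// Inject a swipe: blend the hand velocity into cells near (cx, cy).
void addImpulse(int cx, int cy, Vec2 handVel, int radius = 3) {
    for (int y = cy - radius; y <= cy + radius; ++y)
        for (int x = cx - radius; x <= cx + radius; ++x) {
            if (x < 0 || x >= W || y < 0 || y >= H) continue;
            Vec2& c = flow[y * W + x];
            c.x += handVel.x; c.y += handVel.y;
        }
}

// Each frame the field eases back toward the prevailing wind, so whatever
// control the viewer gains is only temporary.
void relax(float rate = 0.05f) {
    for (Vec2& c : flow) {
        c.x += (prevailingWind.x - c.x) * rate;
        c.y += (prevailingWind.y - c.y) * rate;
    }
}

int main() {
    addImpulse(8, 8, { -40.f, 5.f });    // a hypothetical leftward flick
    for (int frame = 0; frame < 120; ++frame) relax();
    std::printf("cell(8,8) after 120 frames: %.2f %.2f\n",
                flow[8 * W + 8].x, flow[8 * W + 8].y);
}
```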
 
This project was a collaborative effort between Computer Science PhD student Andrew Chalmers and myself. The project was programmed in Cinder (C++) and GLSL. I led conceptual development, design and art direction. Cinder development was a pair-programming exercise between Andrew and myself, with Andrew directing the C++ and GLSL shader development.
 
This project has been exhibited as a finalist entry in the 2014 SCREENGRAB6 International Media Arts Award.
 
audio: Small Colin – Mutations


 

architectural intervention::volumetric portraiture

Architectural Intervention is an audiovisual art installation that explores videographic portraiture through 3D volumetric filmmaking techniques. The piece demonstrates exploration into live feedback loops, and research into illusory depth and visual perception.
 
architectural intervention 01
 
Volumetric portraits are created through the use of the RGBDToolkit (rgbdtoolkit.com/). An Xbox Kinect sensor is calibrated with a DSLR camera to capture both depth and high-resolution colour information, which is then translated into sequences of textured 3D captures of the subject matter.
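The core of the calibration idea can be sketched in a few lines of C++ (placeholder intrinsics and extrinsics, not RGBDToolkit's actual source): back-project each depth pixel into 3D with the depth camera's intrinsics, move it into the DSLR's frame with the calibrated rotation and translation, then project it with the DSLR's intrinsics to find its colour:

```cpp
// Hedged sketch of the depth-to-DSLR mapping. All intrinsic/extrinsic
// numbers below are placeholders standing in for calibration output.
#include <cstdio>

struct Vec3 { float x, y, z; };

// Depth camera intrinsics (focal lengths, principal point) -- placeholders.
const float dfx = 570.f, dfy = 570.f, dcx = 320.f, dcy = 240.f;
// DSLR intrinsics -- placeholders.
const float cfx = 3200.f, cfy = 3200.f, ccx = 1920.f, ccy = 1280.f;
// Extrinsics: rotation (row-major) and translation, depth -> DSLR frame.
const float R[9] = { 1,0,0, 0,1,0, 0,0,1 };
const float t[3] = { 0.025f, 0.f, 0.f };   // e.g. a 2.5 cm baseline

// Depth pixel (u,v) with metric depth z -> 3D point in depth-camera space.
Vec3 backProject(float u, float v, float z) {
    return { (u - dcx) * z / dfx, (v - dcy) * z / dfy, z };
}

// 3D point -> pixel in the DSLR image, i.e. the colour texture coordinate.
void projectToColour(Vec3 p, float& u, float& v) {
    Vec3 q = { R[0]*p.x + R[1]*p.y + R[2]*p.z + t[0],
               R[3]*p.x + R[4]*p.y + R[5]*p.z + t[1],
               R[6]*p.x + R[7]*p.y + R[8]*p.z + t[2] };
    u = cfx * q.x / q.z + ccx;
    v = cfy * q.y / q.z + ccy;
}

int main() {
    Vec3 p = backProject(400.f, 300.f, 1.5f);   // a depth sample 1.5 m away
    float u, v; projectToColour(p, u, v);
    std::printf("colour lookup at (%.1f, %.1f)\n", u, v);
}
```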
Sequences I composed with the toolkit were then taken into TouchDesigner for real-time audio reactivity and interactive control over 3D animation and compositing techniques.
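The kind of mapping wired up in TouchDesigner looks something like this sketch (written as plain C++ for illustration; the parameter ranges are hypothetical): an audio level between 0 and 1 drives the rotation and scale of each point in the volumetric capture:

```cpp
// Audio-reactive transform sketch: louder audio spins the capture faster
// and pushes it outward. Ranges are made up for illustration.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

// Rotate a point around the Y axis and scale it with the audio level.
Vec3 audioReactiveTransform(Vec3 p, float audioLevel, float timeSeconds) {
    float angle = timeSeconds * (0.2f + audioLevel * 2.0f); // louder = faster spin
    float scale = 1.0f + audioLevel * 0.3f;                 // louder = bigger
    float c = std::cos(angle), s = std::sin(angle);
    return { (c * p.x + s * p.z) * scale,
             p.y * scale,
             (-s * p.x + c * p.z) * scale };
}

int main() {
    Vec3 p = { 0.5f, 1.0f, 0.0f };                  // one point of the capture
    Vec3 q = audioReactiveTransform(p, 0.8f, 2.0f); // a loud moment at t = 2 s
    std::printf("%.2f %.2f %.2f\n", q.x, q.y, q.z);
}
```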
 
architectural intervention 02
 
The piece was installed in an out-of-commission commercial kitchen in Wellington, New Zealand, making use of the clean white surfaces in one of the large walk-in chillers.
The intention of the piece is to completely transform the sterile industrial setting and create a new immersive environment that offers the illusion of added depth and audiovisual synthesis.
 
architectural intervention 03
 
 
Audio by Alva Noto (alvanoto.com/)

 

design research::interactive projection mapping

Above is an interactive audio/visual installation in which live audio input generates evolving parameters for effects that a user can manipulate over wireless MIDI control.
The above video demonstrates the installation and setup process for this piece, as well as a 10-minute live performance to Alva Noto’s “Haliod Xerrox” (this particular audio was chosen out of inspiration from a projection mapping installation created by Alva Noto himself, which you can view here: derivative.ca/events/2012/Unidisplay/).
The content featured in this piece was sourced and selected based on a live video mixing technique dubbed the Eclectic Method approach (vdmx.vidvox.net/tutorials/how-to-do-an-eclectic-method-remix-overview): you set yourself a theme, source video material to mix with, and jam with the clips on a MIDI or OSC controller as if playing a drum machine or instrument. In this case, I set myself the task of hunting down various water and undersea-life clips from Creative Commons and public domain online archives.
Using VidVox’s VDMX5, I constructed and refined through experimentation an interface preset with all my desired clips and effects. Using Soundflower, a virtual audio device for the Mac, I was able to route audio from iTunes into VDMX to be analysed and run through filters to generate parameters for the effects built on each layer.
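For a sense of what that analysis stage amounts to (a minimal C++ sketch, not VDMX’s internals; the effect range is hypothetical), an envelope follower over the RMS level of each audio buffer yields a smooth value to drive an effect parameter:

```cpp
// Audio-reactivity sketch: measure the RMS level of a buffer of samples,
// smooth it over time, and map the result onto an effect parameter.
#include <cmath>
#include <cstdio>
#include <vector>

// Root-mean-square level of one audio buffer, 0..1 for full-scale input.
float rmsLevel(const std::vector<float>& samples) {
    double sum = 0.0;
    for (float s : samples) sum += double(s) * s;
    return samples.empty() ? 0.f : std::sqrt(float(sum / samples.size()));
}

int main() {
    float smoothed = 0.f;                 // envelope follower state
    const float attack = 0.5f, release = 0.05f;
    // Fake buffers standing in for audio routed from iTunes via Soundflower.
    std::vector<std::vector<float>> buffers = {
        std::vector<float>(512, 0.8f),    // loud passage
        std::vector<float>(512, 0.1f),    // quiet passage
    };
    for (auto& buf : buffers) {
        float level = rmsLevel(buf);
        // Fast attack, slow release keeps the visuals punchy but not jittery.
        float k = level > smoothed ? attack : release;
        smoothed += (level - smoothed) * k;
        float blurAmount = smoothed * 20.f;  // hypothetical 0..20 px effect range
        std::printf("level %.2f -> blur %.1f\n", level, blurAmount);
    }
}
```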
Once the VDMX interface was running smoothly and the effects and clip-selection tools were all up and running, I programmed my own MIDI interface on my iPad using Liine’s Lemur app. I based the interface on a mixture of drum machines and slider controllers I have seen used in live production, designing it to be as simple and easy to control as possible for live performance. The pads on the right of the interface represent clips in the VDMX media bins, with on/off switches for effects above them and audio-input parameter controls for the effects on the left.
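Under the hood the Lemur-to-VDMX link is plain MIDI. Here is a minimal sketch of how such messages decode (the pad note numbers and CC assignments are hypothetical, not my actual layout):

```cpp
// MIDI mapping sketch: parse a raw 3-byte MIDI message; note-ons trigger
// clips, control changes drive effect parameters.
#include <cstdint>
#include <cstdio>

void handleMidi(uint8_t status, uint8_t data1, uint8_t data2) {
    uint8_t type = status & 0xF0;
    if (type == 0x90 && data2 > 0) {            // note-on with velocity
        int clip = data1 - 36;                  // assume pads start at note 36 (C2)
        std::printf("trigger clip %d (velocity %d)\n", clip, data2);
    } else if (type == 0xB0) {                  // control change
        float value = data2 / 127.f;            // normalise to 0..1
        std::printf("effect param CC%d -> %.2f\n", data1, value);
    }
}

int main() {
    handleMidi(0x90, 38, 100);  // pad 3 hit
    handleMidi(0xB0, 1, 64);    // fader moved halfway
}
```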
Once up and running, it was time to project it onto something!
While digging in the recycling bin behind Dick Smith, I uncovered a heap of fresh cardboard boxes, which I used to build the structure to project onto, and built my own “ghetto” rig to mount the projector to the ceiling.
The live video output from VDMX was sent through to MadMapper via Syphon for mapping onto the boxes. I then mapped each face of the structure with three different layers of video.
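The math MadMapper handles interactively for each face is essentially a corner-pin: solving the homography that carries a video layer’s corners onto the projected corners of that face. A self-contained sketch with made-up corner coordinates:

```cpp
// Corner-pin sketch: solve the 3x3 homography mapping a video layer's
// corners onto a face's projector-space corners, then warp any point.
#include <cmath>
#include <cstdio>

// Solve the 8x8 linear system for h0..h7 (h8 = 1) by Gauss-Jordan
// elimination with partial pivoting. src: texture space; dst: projector space.
void solveHomography(const float src[4][2], const float dst[4][2], float h[9]) {
    double a[8][9] = {};
    for (int i = 0; i < 4; ++i) {
        double x = src[i][0], y = src[i][1];
        double X = dst[i][0], Y = dst[i][1];
        double r0[9] = { x, y, 1, 0, 0, 0, -X * x, -X * y, X };
        double r1[9] = { 0, 0, 0, x, y, 1, -Y * x, -Y * y, Y };
        for (int j = 0; j < 9; ++j) { a[2*i][j] = r0[j]; a[2*i+1][j] = r1[j]; }
    }
    for (int col = 0; col < 8; ++col) {
        int piv = col;                           // partial pivoting
        for (int r = col + 1; r < 8; ++r)
            if (std::fabs(a[r][col]) > std::fabs(a[piv][col])) piv = r;
        for (int j = 0; j < 9; ++j) {
            double tmp = a[col][j]; a[col][j] = a[piv][j]; a[piv][j] = tmp;
        }
        for (int r = 0; r < 8; ++r) {
            if (r == col) continue;
            double f = a[r][col] / a[col][col];
            for (int j = col; j < 9; ++j) a[r][j] -= f * a[col][j];
        }
    }
    for (int i = 0; i < 8; ++i) h[i] = float(a[i][8] / a[i][i]);
    h[8] = 1.f;
}

// Warp a texture coordinate through the homography into projector space.
void warp(const float h[9], float x, float y, float& X, float& Y) {
    float w = h[6]*x + h[7]*y + h[8];
    X = (h[0]*x + h[1]*y + h[2]) / w;
    Y = (h[3]*x + h[4]*y + h[5]) / w;
}

int main() {
    const float src[4][2] = { {0,0}, {1,0}, {1,1}, {0,1} };  // video layer quad
    const float dst[4][2] = { {120,80}, {410,95}, {430,330}, {100,310} }; // face
    float h[9]; solveHomography(src, dst, h);
    float X, Y; warp(h, 0.5f, 0.5f, X, Y);
    std::printf("centre of layer lands at (%.1f, %.1f)\n", X, Y);
}
```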
Voilà! The installation is up and running.
 

 
This clip is a sneak preview of an interactive installation I contributed to Makertorium, the Museum of New Zealand Te Papa’s first ever maker’s fair.
audio: The Oil Baron
James Crane
 

 
In this video James Crane and I perform an audiovisual investigation into live feedback loops. Koi (James Crane) is a culmination of experimental, industrial and grunge electronica, creating vast soundscapes that push your knowledge of sound into new dimensions. The live feedback loop I designed for this performance is a visual response to the dimensions into which Koi has pushed me.