jeals::up there::procedural graphics demo


 
A little demonstration of a highly stylized aesthetic I have been developing in my live procedural graphics workflow for visual performance.
 
All graphics are rendered in real time at 60fps and are completely procedural (except for the head model, which I made following a modelling tutorial a few years back).
 
The content is interactive and audio-responsive, and all controlled in real time using various MIDI controllers with (yep, you guessed it) TouchDesigner.
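For the curious, here’s a rough sketch of what one of those MIDI mappings can look like as a TouchDesigner Python callback. The CC numbers and operator/parameter names are made up for illustration – this isn’t the actual network, just the shape of the idea.

# Minimal sketch of a TouchDesigner MIDI In DAT callback. This runs inside
# TouchDesigner, where op() is a built-in; the CC numbers and operator or
# parameter names below are illustrative assumptions only.

CC_MAP = {
    1: ('geo1', 'scale'),        # mod wheel -> geometry scale
    74: ('feedback1', 'level'),  # filter knob -> feedback amount
}

def onReceiveMIDI(dat, rowIndex, message, channel, index, value, input, bytes):
    # Route incoming control-change values (0-127) onto operator
    # parameters as normalized 0-1 floats.
    if 'Control Change' not in message:
        return
    target = CC_MAP.get(index)
    if target:
        op_name, par_name = target
        setattr(op(op_name).par, par_name, value / 127.0)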
 
Big thanks to Jeals for letting me jam this clip together to his awesome track ‘Up There’.
 
I highly recommend listening to his SoundCloud page, or dropping him a line here: marcin.sulewski1@gmail.com
 
Hopefully I’ll get to perform live with this amazing artist one day!
 


 


interactive LED sculpture experiment


 





In an attempt to get my head around custom lighting electronics, microcontroller board configuration and Raspberry Pi development, I designed this interactive LED sculpture for the launch of the Steeplejack music label in Melbourne.
 
The sculpture is built from addressable RGB LED strip, custom-wired and soldered to a FadeCandy microcontroller board attached to a Raspberry Pi 2. The Raspberry Pi runs a server that receives pixel information over a wireless network and maps the pixel byte data to the addressable LEDs. The pixel information is sent over the network from my laptop by a piece of custom software, developed in Processing, that converts texture data to pixel byte data.
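FadeCandy’s server speaks Open Pixel Control over TCP, so the pixel stream itself is very simple. Here’s a minimal Python client to show the shape of one frame on the wire – the hostname and LED count are placeholder assumptions.

import socket

# Minimal Open Pixel Control (OPC) client - the sort of pixel stream the
# FadeCandy server (fcserver) listens for on TCP port 7890. Hostname and
# LED count below are placeholder assumptions.
FCSERVER = ('raspberrypi.local', 7890)
NUM_LEDS = 64

def opc_set_pixels(pixels, channel=0):
    # OPC message: channel byte, command byte (0 = set pixel colours),
    # 2-byte big-endian data length, then r,g,b bytes per LED.
    data = bytearray()
    for r, g, b in pixels:
        data += bytes((r, g, b))
    return bytes((channel, 0, len(data) >> 8, len(data) & 0xFF)) + data

sock = socket.create_connection(FCSERVER)
sock.sendall(opc_set_pixels([(255, 0, 128)] * NUM_LEDS))  # one test frame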
 
Audio-responsive and generative patterns run in TouchDesigner and are piped into the texture-to-byte-data conversion software via the Spout texture-sharing framework. The TouchDesigner setup is interfaced with via OSC and MIDI for live performance.
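The converter itself was written in Processing, but the core idea is simple enough to sketch in a few lines of Python: assign each LED a sample point in the shared texture and read a colour out per frame. The texture size and LED layout here are made up.

import numpy as np

# Sketch of the texture-to-pixel-byte step: each LED gets a sample point
# in the incoming texture, and a colour is read out per frame. Pairs with
# the OPC sketch above.
def texture_to_pixels(frame, sample_points):
    # frame: HxWx3 uint8 array; sample_points: one (x, y) per LED.
    return [tuple(int(c) for c in frame[y, x]) for x, y in sample_points]

points = [(x * 2, 64) for x in range(64)]        # 64 LEDs along one row
frame = np.zeros((128, 128, 3), dtype=np.uint8)  # stand-in for a Spout frame
pixels = texture_to_pixels(frame, points)        # -> feed to opc_set_pixels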




Big thanks to Paul Milne for helping with the construction of the sculpture and for helping figure out a lot of the nitty-gritty of the electrics and wiring. Another big thanks to Cale Sexton for letting me steal some of his tunes for the demonstration video above.


 


winterFest::3D projection mapping content







Demo of 3D projection mapping content, commissioned for the daily light show at Monash University Clayton’s WinterFest, in collaboration with Kit Webster.
 
All 3D content was designed and developed in TouchDesigner by me, and composited with Kit’s own content in After Effects by Kit for the final installation – more info can be found here.
 
Music by OCDANTAR
 

 

 

 

 
 

 

 

 

 

 

 

 



choreographic coding lab::augmented motion capture visualization application


 






The choreographic coding lab [ccl] format offers unique opportunities for exchange and collaboration to ‘code savvy’ digital media artists who have an interest in translating aspects of choreography and dance into digital form and applying choreographic thinking to their own practice – more info can be found here.
 
Richard De Sousza, Peter Walker, Steven Kallili, Michael Havir and I participated as a team in the 2015 ccl X motion lab at Deakin University’s Burwood Campus in Melbourne.
 
Together we developed a prototype for a real-time augmented motion capture visualization system that uses streamed motion capture data to visualize motion in 3D. The visualizations augment the performer, and the user can interact with them by handling the device where the augmentation takes place – experiencing the digital performance from various perspectives, with a level of control over certain aspects of the visualizations.
 
Correct perspective and scale alignment is achieved by tracking the positional information of the device running the app [in this case a tablet] and applying it to a 3D virtual camera viewing the 3D visualizations of the motion capture data.
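A stripped-down sketch of that alignment step, assuming the tracked tablet position arrives as a 3D point – the maths below is a standard look-at construction, not our actual TouchDesigner network, and the positions are placeholder values.

import numpy as np

# Build a world-to-camera view matrix from the tracked device position, so
# the virtual camera sees the mocap visualization from the tablet's real
# viewpoint in the room.
def camera_from_device(position, look_at, up=(0.0, 1.0, 0.0)):
    eye = np.asarray(position, dtype=float)
    forward = np.asarray(look_at, dtype=float) - eye
    forward /= np.linalg.norm(forward)
    right = np.cross(forward, up)
    right /= np.linalg.norm(right)
    true_up = np.cross(right, forward)
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = right, true_up, -forward
    view[:3, 3] = -view[:3, :3] @ eye
    return view

# each frame: feed the tracked tablet position into the virtual camera
view = camera_from_device(position=(1.2, 1.6, 3.0), look_at=(0.0, 1.0, 0.0))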
 
The entire application was designed and developed in TouchDesigner, with each of us contributing our own specialist skills to the framework – Peter on data interpretation, Richard on the interactive interface [tablet], myself on integration and visualization, Michael on sound development, and Steven on performance.
 
At the end of the lab, we had a functioning prototype for a new approach to a part-digital, part-physical interactive performance experience, and a broader knowledge of the high-end motion capture industry and how to work with the data it produces.
 
Here’s a bit of documentation from the week –
 

 

 

 

 

 

 

 



mmw::project swell







An interactive visual performance system designed and programmed in TouchDesigner, in collaboration with electronic musician Jay Bhattacharya [EMERSE], especially for Swell at Melbourne Music Week 2014.
 
Simple black-and-white barcode, binary and mathematical patterns were generated by interpreting data produced during EMERSE’s live electronic performance. The walls surrounding EMERSE as he played were projection-mapped with the generative patterns, altering how the space was perceived and creating an illusion of added depth.
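As an illustration of the flavour of pattern generation involved (not the actual patch), a few lines of Python can turn a stream of performance values into an on/off barcode. The bar count, threshold and random stand-in data are assumptions.

import numpy as np

# Quantize a stream of 0..1 control values from the live set into black
# and white vertical bars.
def barcode(values, bars=64, threshold=0.5):
    v = np.resize(np.asarray(values, dtype=float), bars)
    return (v > threshold).astype(np.uint8)   # 1 = white bar, 0 = black

pattern = barcode(np.random.rand(16))          # stand-in for performance data
print(''.join('#' if b else '.' for b in pattern))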
 
Melbourne Music Week 2014 Wrap Video
 



rat & co







Since late 2013, I have been collaborating with Melbourne band Rat & Co on various audio-visual performances: festival shows such as Sugar Mountain, galleries such as the National Gallery of Victoria, and the production of various local Melbourne shows.
 
Our collaborative creative vision for our live performances is to craft a seamlessly integrated and responsive audio-visual set, designed to take viewers’ senses on an interplanetary journey.
 
My live setup consists of real-time 3D graphics composited with a bank of traditional video footage I have collected and created over the years, plus a pipeline of real-time GLSL post-production effects, with seamless MIDI integration from both the band’s live Ableton set and my own MIDI devices for triggering visual elements and controlling effects. The performance setup is designed and developed in Derivative’s TouchDesigner, with a focus on portable visual banks, 3D graphics and a portable effects pipeline.
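As a rough standalone sketch of that MIDI routing (the real setup uses TouchDesigner’s MIDI operators), here’s the same idea using the mido library. The port name, note numbers and cue mapping are invented for illustration.

import mido

# Map notes from the band's Ableton set to visual cues, and controller CCs
# to effect parameters. All names and numbers are illustrative assumptions.
TRIGGER_NOTES = {36: 'kick_flash', 38: 'snare_strobe'}

with mido.open_input('Ableton Out') as port:
    for msg in port:
        if msg.type == 'note_on' and msg.velocity > 0:
            cue = TRIGGER_NOTES.get(msg.note)
            if cue:
                print('trigger visual:', cue, 'velocity', msg.velocity)
        elif msg.type == 'control_change':
            # hardware knobs scale GLSL effect parameters to 0..1
            print('effect cc%d -> %.2f' % (msg.control, msg.value / 127.0))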
 
The above video is some documentation from our show at Sugar Mountain Festival 2015.
 
Big thanks to Phoebe Powell for the great photographs from our show at Shadow Electric on 02.04.15.
 
The second series of images below is from a performance at Estonia House, in collaboration with the creators of Shadow Electric, in mid-2015.
 
 



google warming






Google Warming is a research scholarship project for Victoria University of Wellington, NZ, that extends work already undertaken by Dr. Leon Gurevitch on the relationship between Google Earth and the visualization of climate change. Work prior to the commencement of this project has been accepted for publication in A-rated international media journals and will form the foundation for extending such material into a 3D-printed science communication project.
 
 
Locating parts of the earth that have been environmentally damaged or altered, I captured data from Google Earth and various other online repositories, which was interpreted programmatically in Rhino3D and Grasshopper to generate parametric 3D data visualizations. From Rhino, I experimented with comparative, rapid-prototyped 3D-printed models to communicate the scale of environmental change taking place on Earth.
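A simplified, standalone sketch of that data-to-geometry step – the real pipeline ran in Rhino3D/Grasshopper, whereas this Python version just shows the principle of turning an elevation grid into a printable mesh. The grid here is random stand-in data, not captured terrain.

import numpy as np

# Turn a grid of elevation samples into a triangle mesh, with vertical
# exaggeration so small-scale change reads on a physical printed model.
def heightfield_mesh(z, cell=1.0, z_scale=1.0):
    h, w = z.shape
    xs, ys = np.meshgrid(np.arange(w) * cell, np.arange(h) * cell)
    verts = np.stack([xs, ys, z * z_scale], axis=-1).reshape(-1, 3)
    faces = []
    for i in range(h - 1):
        for j in range(w - 1):
            a, b = i * w + j, i * w + j + 1
            c, d = (i + 1) * w + j, (i + 1) * w + j + 1
            faces += [(a, b, c), (b, d, c)]   # two triangles per grid cell
    return verts, np.array(faces)

verts, faces = heightfield_mesh(np.random.rand(32, 32), cell=2.0, z_scale=10.0)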
 
From “before and after” mountaintop-removal data from the Appalachian Mountains in the United States to ocean acidification, deforestation in Brazil, CO2 emissions and the predicted rise in sea levels for Wellington, I created models that physically represent the scale of change taking place on the earth’s surface.
Using low-cost 3D printers to produce models cheaply and quickly, this research tested different modelling possibilities, from individually printed small-scale models through to larger, tiled prints.
 


 

sam hirschfelder::experiment 01







Experiment 01 is a live improvisational drum performance by Sam Hirschfelder. Sam also composed all the electronic elements backing the live percussion.
 
Blunk were commissioned to develop an interactive lighting system using live MIDI triggers placed in the drum kit. The system was developed in TouchDesigner: percussive hit data was programmed to trigger the projections used to light Sam while performing.
Bob and I (Blunk) had real-time control over patterns in the projections, so we could dynamically shape how the light responded to Sam’s improvisation.
 
The recording and film shoot took place at Smooch Records.
The track was mixed and mastered by the legendary Andrei Eremin.
The video was directed and filmed by Blunk.
Editing and post-production was carried out by yours truly.
 
experience Blunk
say hi to Josh at Smooch!
meet the master Andrei Eremin


 

blunk::channels







Channels by Blunk is an aesthetic exploration as well as a technical demonstration of the Spout video-sharing framework for Windows. Each channel is defined by a colour palette and a real-time graphics application, and the user switches between these channels using a TV remote control. The piece draws a thread between the new possibilities enabled by emerging technologies such as Spout and the new impossibilities imposed by disappearing infrastructure such as analog TV.
 
Under the hood, Resolume is used to switch between video sources received via Spout. These sources are real-time applications created in different frameworks – Processing, Max, openFrameworks, VVVV and VIZZable. The TV remote signals are converted into OSC using Max and forwarded to Resolume for channel switching, as well as to TouchDesigner for controlling the video transitions and the interactive, mapped projections. Two video feeds are sent from TouchDesigner: one to the analog TV display and the other to a projector that augments the TV and the space surrounding it.
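The original bridge was built in Max, but its shape is easy to sketch in Python with the python-osc library. The ports, OSC addresses and IR button codes below are assumptions for illustration, not the piece’s actual mapping.

from pythonosc.udp_client import SimpleUDPClient

# Forward decoded TV-remote button codes as OSC to Resolume (channel
# switching) and TouchDesigner (transitions and mapped projections).
resolume = SimpleUDPClient('127.0.0.1', 7000)
touchdesigner = SimpleUDPClient('127.0.0.1', 10000)

BUTTON_TO_CHANNEL = {0x10: 1, 0x11: 2, 0x12: 3}   # IR code -> channel

def on_remote_button(ir_code):
    channel = BUTTON_TO_CHANNEL.get(ir_code)
    if channel is None:
        return
    # switch the clip in Resolume and cue a transition in TouchDesigner
    resolume.send_message('/composition/layers/1/clips/%d/connect' % channel, 1)
    touchdesigner.send_message('/channel/select', channel)

on_remote_button(0x11)   # viewer presses 'channel 2' on the remote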
 
visit Blunk
visit Spout