jeals::up there::procedural graphics demo


 
A little demonstration of a very stylized aesthetic I have been developing with my live procedural graphics workflow for visual performance.
 
All graphics are rendered in real time at 60fps and are completely procedural (except for the head model, which I made following a modelling tutorial a few years back).
 
 
The content is interactive and audio responsive, and all controlled in real time using various MIDI controllers with (yep, you guessed it) TouchDesigner.
 
 
Big thanks to Jeals for letting me jam this clip together to his awesome track ‘Up There’.
 
I highly recommend listening to him on his SoundCloud page, or dropping him a line here: marcin.sulewski1@gmail.com
 
Hopefully I’ll get to perform with this amazing artist live one day!
 
 


 


interactive LED sculpture experiment


 

In an attempt to get my head around custom lighting electronics, microcontroller board configuration and Raspberry Pi development, I designed this interactive LED sculpture for the launch of the Steeplejack music label in Melbourne.
 
The sculpture is composed of addressable RGB LED strip, custom-wired and soldered to a FadeCandy microcontroller board attached to a Raspberry Pi 2. The Raspberry Pi runs a server that receives pixel information over a wireless network and maps the pixel byte data to the addressable LEDs. The pixel information is sent over the network from my laptop by a piece of custom software developed in Processing that converts texture data to pixel byte data.
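The wire format involved here is Open Pixel Control, the simple protocol FadeCandy's server speaks: a four-byte header followed by one byte per colour channel. As a rough sketch of what the laptop-side sender does (the host address and pixel count below are placeholders, not the sculpture's actual values):

```python
import socket

def opc_packet(pixels, channel=0):
    """Frame a list of (r, g, b) tuples as an OPC 'set pixel colours' message:
    channel byte, command byte 0, then a 16-bit big-endian payload length."""
    payload = bytes(c for px in pixels for c in px)
    header = bytes([channel, 0, len(payload) >> 8, len(payload) & 0xFF])
    return header + payload

def send_frame(pixels, host="192.168.1.50", port=7890):
    """Push one frame to the OPC server on the Pi (address is a placeholder;
    7890 is the default OPC port)."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(opc_packet(pixels))

# A 64-pixel frame, all dim red:
frame = [(64, 0, 0)] * 64
packet = opc_packet(frame)
```

Each frame is just re-sent in full, which keeps the Pi-side server stateless.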
 
 
Audio-responsive, generative patterns run in TouchDesigner and are piped into the texture-to-byte-data conversion software via the Spout texture-sharing framework. The TouchDesigner setup is interfaced with via OSC and MIDI for live performance.
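The texture-to-byte conversion at the heart of that pipeline boils down to sampling the incoming texture at each LED's mapped coordinate and flattening the result into the byte order the strip expects. A minimal illustration of the idea (the texture size and LED coordinates here are made up, not the sculpture's real mapping):

```python
def texture_to_bytes(texture, led_map):
    """Sample a 2D texture (rows of (r, g, b) tuples) at each LED's
    mapped (x, y) coordinate and flatten to one byte per colour channel."""
    out = bytearray()
    for x, y in led_map:
        r, g, b = texture[y][x]
        out.extend((r, g, b))
    return bytes(out)

# A 4x4 test texture with a single white pixel at (1, 2),
# and three LEDs mapped to arbitrary coordinates:
texture = [[(0, 0, 0)] * 4 for _ in range(4)]
texture[2][1] = (255, 255, 255)
led_map = [(0, 0), (1, 2), (3, 3)]
data = texture_to_bytes(texture, led_map)  # 9 bytes, middle LED white
```

Keeping the LED map as plain coordinates makes re-mapping the physical strip layout a data change rather than a code change.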


 


Big thanks to Paul Milne for helping with the construction of the sculpture and for helping figure out a lot of the nitty-gritty of the electrical work and wiring. Another big thanks to Cale Sexton for letting me steal some of his tunes for the demonstration video above.


 


spark::diwali light festival 2015


 





Angus Muir Design was engaged by Spur to design and build an interactive Canopy of Lights for Spark as a way of celebrating Diwali with their staff, customers and the public.
 
The interactive canopy consisted of 864 custom LED lanterns and a surround-sound audio system integrated into the lights, highlighting Spark’s partnership with Spotify.
 
Located at the main entrance of Spark’s head office, the public were able to interact with the installation, controlling the lights and music.
 
My involvement in the project was to design and develop the interactive control interface using the Lemur scripting environment, along with an audio-responsive pattern generator that determined the colours and brightness values displayed on the DMX matrix.
 
The interactive control system allowed users to bounce, pull and move balls around a touch device’s screen, illuminating lights in the DMX matrix. The touch interface was also mapped to a series of sounds, letting users compose musical sequences as they interacted with the light matrix.
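The core of that ball-to-light mapping can be sketched as quantising a ball's normalised touch position to a grid cell and writing a brightness value into that cell's channel in a flat DMX-style buffer. This is an illustrative reconstruction, not the original Lemur script; the 24 x 36 grid is chosen only so the cell count matches the 864 lanterns, and the single-channel-per-lantern layout is an assumption:

```python
def ball_to_dmx(ball_x, ball_y, brightness, cols=24, rows=36, frame=None):
    """Quantise a normalised ball position (0..1) to a grid cell and
    write its brightness into a flat DMX-style channel buffer."""
    if frame is None:
        frame = bytearray(cols * rows)
    col = min(int(ball_x * cols), cols - 1)  # clamp 1.0 to the last cell
    row = min(int(ball_y * rows), rows - 1)
    frame[row * cols + col] = brightness
    return frame

# Ball dead centre of the touch surface, full brightness:
frame = ball_to_dmx(0.5, 0.5, 255)
```

Several balls can share one frame by passing the same buffer back in for each ball before the frame is transmitted.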
 
The result was an interactive and highly engaging digital experience.
 
Photos by Bas van Est
 
Video by Sophia Jenny


 

abc television::the chasers media circus






While working for ABC Television on The Chaser’s Media Circus, my role was to design and develop interactive games and multimedia playback systems using TouchDesigner.
 
The following set of images gives an overview of one of the multimedia and games playback systems programmed in TD for live operation during studio record on set.
 
Based on the original design by Boris Bagattini (Chris Wilson), the resident TD programmer at The Chaser’s Media Circus, I decided to look for ways to improve the workflow of adding content to the show, devise a system for loading and unloading new games as they were needed, and design a user interface that other members of the production team could operate if either Boris or I was hit by a bus (apparently a common occurrence in Sydney).
 
Board of the News!
 
The premise of this playback system is to allow members of the game show panel to select an item from the board – usually the face of a politician in current media circulation – from which a series of video clips, images and, often, embedded interactive games relevant to the selected figure will flow.
 
On the left is a floating UI window for the controls of the BOTN system, with options to flip board objects and reveal their respective content, open a board item, randomize the position of board items, output to the window, and mute the audio signal. Additionally, there is an option to open a setup UI for each board item, for integrating and updating new content.
 
BOTN0001
 
BOTN0002
 
BOTN0003
 
Within the board setup UI, there are options to add or remove content items, control what content is displayed in the front of the board item before reveal, and control its display size.
 
BOTN0004
 
An option is also provided to open a separate TOX loading UI, where an interactive game can be loaded into the board item. The TOX loading UI provides options for path selection and folder checking, and the ability to kill and destroy the TOX so that, once a game is completed during record, it can be unloaded from memory. This frees GPU resources and keeps all content embedded within the BOTN framework. All TOP and CHOP outputs automatically link and unlink with the BOTN outputs, saving us from having to open the network editor during playback.
 
BOTN0005
 
BOTN0007
 
BOTN0008
 
BOTN0009
 
At the project level, a master ‘output’ system and UI was developed for master control over what is displayed during record.
Within this UI there are options for control over video and audio output, transformations, an additional TOX loader for when the system may need to be re-purposed for additional games or playback systems, and the option to load a static hold screen or triggerable ‘breaker’ animation.
 
BOTN0012
 
BOTN0013
 
BOTN0014
 
BOTN0015
 
BOTN0016
 
When the TD file is launched, UI elements automatically open, floating on the operation monitor. The floating, scalable UI elements were inspired by the need for a modular operation environment, where efficiency can be improved by closing unnecessary windows. All UI elements are mapped to key commands for easy opening and closing during performance-mode playback.
 
All in all, working for the ABC was great fun, having had no prior experience in television. I would do it again!
 



winterFest::3D projection mapping content







Demo of 3D projection mapping content, commissioned for the daily light show at Monash University Clayton’s WinterFest, in collaboration with Kit Webster.
 
All 3D content was designed and developed in TouchDesigner by myself, and Kit composited it with his own content in After Effects for the final installation – more info can be found here
 
Music by OCDANTAR
 

 

 

 

 
 

 

 

 

 

 

 

 



choreographic coding lab::augmented motion capture visualization application


 






The choreographic coding lab [ccl] format offers unique opportunities for exchange and collaboration among ‘code-savvy’ digital media artists who have an interest in translating aspects of choreography and dance into digital form and applying choreographic thinking to their own practice – more info can be found here.
 
Myself, Richard De Sousza, Peter Walker, Steven Kallili, and Michael Havir participated as a team in the 2015 ccl X motion lab at Deakin University’s Burwood Campus in Melbourne.
 
Together we developed a prototype for a real-time augmented motion capture visualization system that uses streamed motion capture data to visualise motion in 3D. The visualizations augment the performer, and the user can interact with them – experiencing the digital performance from various perspectives, with a level of control over certain aspects of the visualizations – by interacting with the device where the augmentation takes place.
 
Correct perspective and scale alignment is achieved by tracking positional information of the device where the app is running [in this case a tablet], and applied to a 3D virtual camera that is viewing 3D visualizations of the motion capture data.
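That alignment step essentially feeds the tracked tablet position into a look-at transform for the virtual camera each frame. A simplified, pure-Python sketch of the idea (no TouchDesigner objects; the Y-up axis convention and the example positions are assumptions for illustration):

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def look_at(eye, target, up=(0.0, 1.0, 0.0)):
    """Build the camera basis (right, up, forward) for a virtual camera
    placed at the tracked device position, looking at the capture volume."""
    forward = normalize(tuple(t - e for t, e in zip(target, eye)))
    right = normalize(cross(forward, up))
    cam_up = cross(right, forward)
    return right, cam_up, forward

# Tracked tablet at (0, 1.5, 3), looking at the stage origin at eye height:
right, up_vec, fwd = look_at((0.0, 1.5, 3.0), (0.0, 1.5, 0.0))
```

Because both the performer's skeleton and the tablet live in the same motion-capture coordinate space, scale falls out for free once the camera basis tracks the device.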
 
The entire application was designed and developed in TouchDesigner, with each of us contributing our own specialist skills to the framework – Peter on data interpretation, Richard on the interactive interface [tablet], Myself on integration and visualization, Michael on sound development, and Stephen on performance.
 
At the end of the lab we had a functioning prototype for a new approach to a part-digital, part-physical interactive performance experience, and a broader knowledge of the high-end motion capture industry and how to work with the data it produces.
 
Here’s a bit of documentation from the week –
 

 

 

 

 

 

 

 



mmw::project swell







Interactive visual performance system designed and programmed in TouchDesigner in collaboration with electronic musician Jay Bhattacharya [EMERSE] especially for Swell at Melbourne Music Week 2014.
 
Simple black-and-white barcode, binary and mathematical patterns were created by interpreting data generated during EMERSE’s live electronic performance. The walls surrounding EMERSE as he played were projection mapped with the generative patterns, altering how the space was perceived and creating an illusion of added depth.
 
Melbourne Music Week 2014 Wrap Video
 



rat & co







Since late 2013 I have been collaborating with Melbourne band Rat & Co on various audio-visual performances, including festivals such as Sugar Mountain, galleries such as the National Gallery of Victoria, and the production of various local Melbourne shows.
 
Our collaborative creative vision for our live performances is to craft a seamlessly integrated and responsive audio-visual set, designed to take viewers’ senses on an interplanetary journey.
 
My live setup consists of real-time 3D graphics composited with a bank of traditional video footage I have collected and created over the years, plus a pipeline of real-time GLSL post-production and effects, with seamless MIDI integration from both the band’s live Ableton set and my own MIDI devices for triggering visual elements and controlling effects. The performance setup has been designed and developed in Derivative’s TouchDesigner, with a focus on portable visual banks, 3D graphics and a portable effects pipeline.
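The MIDI side of a pipeline like that is essentially a lookup table from controller numbers to normalised effect parameters. A hand-rolled sketch of the pattern (the CC numbers and parameter names here are invented for illustration, not the actual show mapping):

```python
# Hypothetical MIDI CC numbers mapped to effect parameters and their ranges.
CC_MAP = {
    21: ("feedback_amount", 0.0, 0.95),
    22: ("bloom_intensity", 0.0, 2.0),
}

def handle_cc(params, cc, value):
    """Scale a 0-127 MIDI Control Change value into the target
    parameter's range and store it; unknown CCs are ignored."""
    if cc in CC_MAP:
        name, lo, hi = CC_MAP[cc]
        params[name] = lo + (value / 127.0) * (hi - lo)
    return params

params = {}
handle_cc(params, 21, 127)   # feedback_amount -> 0.95
handle_cc(params, 22, 64)    # bloom_intensity -> ~1.0
```

Keeping the mapping in one table means the same handler serves both the band's Ableton clock-synced CCs and the manual controllers.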
 
The above video is some documentation from our show at Sugar Mountain Festival 2015.
 
Big thanks to Phoebe Powell for the great photographs from our show at Shadow Electric on 02.04.15.
 
The second series of images below is from a performance at Estonia House in collaboration with the creators of Shadow Electric in mid-2015.
 
ratandco01
 
ratandco02
 
ratandco04
 
ratandco06
 
ratandco08
 
ratandco09
 
ratandco10
 
IMG_0036
 
IMG_0038
 
IMG_0069
 
IMG_0076
 
IMG_0095
 
IMG_0133
 
IMG_0159
 
IMG_9960
 



converse made by you






ConverseMobileStudio-3-2.jpg
 
To celebrate 100 years of the universally appealing Chuck Taylor All Star, Converse launched the Made by You campaign, highlighting the ways All Star owners around the world customise their favorite sneakers.
 
The advertising campaign takes the lived-in and loved kicks of the likes of Andy Warhol and Roy English, as well as everyday wearers, to demonstrate the various ways they make their Chuck Taylor All Stars into their own original works of art.
 
ConverseMobileStudio-7-2.jpg
 
I was commissioned to design and develop a custom software solution for a mobile photographic studio that can roam free, capturing the heart and soul of All Star owners in Australia.
 
The software’s functionality includes remotely controlling any Canon EOS camera over USB, capturing photographs, color correcting, composing to a printable PDF, collecting user data, emailing the composed lockups, printing, and finally sharing the composed imagery to Facebook.
 
HomeScreen_CameraSuccessful.jpg
 
ConverseMobileStudio-10-2.jpg
 
Social.jpg
 
FacebookUpload_Success.jpg
 
The system all runs out of the back of a van, where a miniature studio photography rig has been built for ultimate portability.
ConverseMobileStudio-6-2.jpg
 
Here is a clip Converse put together describing the global campaign: