An audio-reactive, realtime-rendered interactive visuals workspace


View a live performance demo of scenes in action

What is Dexterity?

Built in TouchDesigner, Dexterity is a creative workspace for exploring and compositing live-rendered 2D & 3D scenes that react to audio in realtime. In short, it makes visuals that react to sound. System master controls, scene selection & effects are driven via MIDI, while scene controls are mapped to a wireless joypad, allowing extensive creative freedom during performances. While Dexterity can handle video loops for artists who wish to have their own material displayed, I lean towards a procedural scene creation workflow, which means every aspect of the visual can be adjusted in realtime. Think of it as the difference between a DJ set and a live hardware set, but for visuals.

Scenes consist of particle simulations, audio spectrum analysis mapped to geometry, instanced objects & anything else I can imagine. Scenes are applied as layers with compositing ability; for example, one scene can control the opacity of another, allowing for intricate textures and interesting shapes. I have developed a range of effects, from fluid sims to tiling and feedback, all tweakable via the MIDI interface. All that is needed to get started is an audio signal input. Custom visualisers can be created on demand, with more minimal but effective options also available.
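The spectrum-to-geometry mapping described above can be sketched in plain Python (TouchDesigner's scripting language). This is a conceptual sketch only, not Dexterity's actual code: the function names, band count, gain, and clamp values are illustrative assumptions, and a real build would use TouchDesigner's audio CHOPs rather than a hand-rolled DFT.

```python
import cmath
import math

def band_magnitudes(samples, num_bands):
    """Split a mono audio frame into num_bands average magnitudes via a DFT.

    Bands are spaced linearly here for simplicity; a real build might use
    logarithmic spacing to better match perceived pitch.
    """
    n = len(samples)
    # Magnitude spectrum for the positive-frequency bins only.
    spectrum = [abs(sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n))) / n
                for k in range(n // 2)]
    band_size = len(spectrum) // num_bands
    return [sum(spectrum[b * band_size:(b + 1) * band_size]) / band_size
            for b in range(num_bands)]

def bands_to_scales(bands, base_scale=1.0, gain=4.0):
    """Map band energy to per-instance geometry scale, clamped to a sane range."""
    return [min(base_scale + gain * b, 3.0) for b in bands]

def bass_opacity(bands, gain=20.0):
    """Let the lowest band drive another layer's opacity (0..1)."""
    return max(0.0, min(1.0, bands[0] * gain))

# Synthetic test frame: a low-frequency sine, so the bass band carries the energy.
frame = [math.sin(2 * math.pi * 4 * t / 256) for t in range(256)]
bands = band_magnitudes(frame, num_bands=8)
scales = bands_to_scales(bands)
opacity = bass_opacity(bands)
```

In a performance patch these per-band values would update every frame, with the scale list fed to instanced geometry and the opacity value driving a compositing layer.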

What are the practical applications?

Dexterity is mostly used to provide audio-reactive visuals with fast feedback at music events. It runs from a small-form-factor PC that fits in a backpack: portable, quick to set up, and a joy to use. It also makes it easy to add a bit of extra flair to any event logo or artist title.

  • Live-rendered & audio-reactive
  • Highly customisable
  • Keeps reacting even when not under user control
  • Provides instant visual feedback
  • A far more intimate show than VJ loops

The system has been built to be interaction-friendly: there are no wrong answers! Audience participation is encouraged; users can control elements of the scenes using my wireless peripherals. This has led to great responses from crowd members at events, truly pushing the bounds of what is possible in the world of audio visualisations.


You can only judge a venue by the size of its disco ball

Chamber AZD

Live compositing of 3D renders with multi level lighting effects

The Lab

2000s-style sci-fi user interface


Bass-affected wire-shedding particles


Swirling triangular repetition, a DXO classic re-made for live performance

Urban Light Machine

Grid-locked particles set off course by user control

Aquamarine Moon

Orbital bodies with electric pulse atmospheres

Shadow Dust

Noise-advected particles distorted by bass frequencies


Instanced geometry manipulated in 3D live


'90s throwback to vapor vibes


19-band graphic EQ in triangular form


Full spectrum analysis in a grooverider flow

Particle Blaster

2D fluid sim on a 3D particle sim, with luminosity-based advection


Multi layered circle of fifths at high rpm