FRAY Studio was recently commissioned to create the visuals for Glass Animals' Tour of The Earth, the band's current tour promoting their new album 'I Love You So F***ing Much'. The initial brief challenged the team to create a deep, dark, punk-inspired journey through space. The concept required seamless transitions between calm, intimate settings and explosive, dynamic moments.
From the project's inception, FRAY Studio collaborated closely with Glass Animals' lead singer, Dave Bayley, and production designers Cassius Creative. There was a clear expectation from the outset that this show would incorporate live cameras and effects, not only in the IMAGs but in the content as well.
NVIDIA background removal
Our initial brief and creative development included many references to sci-fi figures, silhouette tunnels and outlines. During the creative development process, we discovered that NVIDIA background removal was the most effective way to achieve this vision for the camera content. The technique allowed the edge detection to pick up clean edges and create an androgynous-looking figure on screen. From this we could generate tunnels using cloners, which led to our chorus look for the song 'On The Run'.
We also utilised NVIDIA background removal in the song 'How I Learned to Love the Bomb'. This look replicated glitch block overlays, bad TV signals and exaggerated slit-scan effects to transition between verse and chorus. The background removal was then used throughout the song, first as a subtractive silhouette against a bad-signal background and then as a colourful, glitchy block silhouette.
How I Learned to Love The Bomb performed live in Atlanta.
The use of background removal to create intense graphic visualisations creates several issues, the biggest of course being reliability. Because NVIDIA background removal is an artificial intelligence (AI) tool, it comes with next to no external control: it either finds a body or it doesn't, and in the moments when it loses the figure, the whole effect is ruined. As you can imagine, that is a scary thought when an entire chorus during a live show relies on it working consistently. To increase the reliability of this effect, we worked very closely with lighting programmer Alex Noel and Cassius, who worked hard to provide a consistent, bright key light on stage, specifically focused on lead singer Dave.
Essential to the success of this effect was close collaboration with Camera Director Ed Coleman of Big Noise Films UK. We were working with two camera signals for the show: one that Ed would cut for the IMAG screens to provide the crowd with a dynamic overview of the show, and another cut specifically for the NVIDIA background removal moments, with front-on, full-body and mid shots of lead singer Dave.
During rehearsals, we tested many different camera shots to discover exactly what not to do; we found that any shot with the LED wall in the background could confuse the AI and cause a lot of inconsistency in the outline. Shots of other band members were never intended to be used, but we did discover that the effect was very inconsistent on the drummer, with so much of their body hidden behind the drum kit.
In the video below you can see the figure tunnel from the chorus of 'On The Run', working in Notch.
On The Run performed live in Atlanta.
MIDI, audio and real-time integration
Glass Animals' live performance style introduced an intriguing complexity to our content creation process. Unlike many acts, the band doesn't rely on timecode, which added a layer of spontaneity to their shows. Moreover, they extensively utilise MIDI-controlled instruments, including synths, keyboards, and electronic percussion. All of these are custom programmed by the band members themselves. This unique setup presented us with exciting opportunities to harness these MIDI signals, allowing us to generate and control real-time content onstage.
Interpreting the MIDI notes also required collaboration between our team, the band and their technicians. Fortunately, using Ableton and QLab, we were able to condense the information into simple 'on' and 'off' notes, translating to 1 and 0. Using accumulators, condition modifiers and a little bit of maths in Notch, we created visual representations of the beat, as well as creative accents and nuances corresponding to specific notes and rhythms. The content was as dynamic as the musicians on stage.
This technique was particularly effective for two songs, 'Wonderful Nothing' and 'Life Itself'. Both were built entirely in Notch, with each section featuring MIDI- or audio-driven elements. For instance, without MIDI input, a section of 'Wonderful Nothing' would appear as a mere straight green line. When the input is active, however, the straight green line turns into a square oscilloscope, with each beat adding to and subtracting from a point's vertical position to actively draw a pattern with the music.
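To give a feel for that logic, here is a minimal sketch in Python rather than the actual Notch node graph (which is built visually); the names and the reset value are assumptions, but the idea is the same: each incoming MIDI message becomes a 1/0 trigger, an accumulator sums the triggers, and a simple condition wraps the count so it can drive a visual parameter per beat.

```python
# Hypothetical sketch of the accumulator / condition-modifier idea -- not
# the actual Notch graph, just the same logic expressed in Python.

def midi_to_trigger(event):
    """Condense a MIDI message into a simple on/off value (1 or 0)."""
    return 1 if event["type"] == "note_on" and event["velocity"] > 0 else 0

class BeatAccumulator:
    """Sums triggers and applies a wrap-around condition, like an
    accumulator feeding a condition modifier."""

    def __init__(self, reset_at=4):
        self.count = 0
        self.reset_at = reset_at  # e.g. reset every 4 beats (assumed)

    def update(self, trigger):
        self.count += trigger
        if self.count >= self.reset_at:  # condition: wrap back to zero
            self.count = 0
        return self.count

# Example: five kick hits drive a visual parameter 1, 2, 3, 0, 1.
acc = BeatAccumulator(reset_at=4)
for event in [{"type": "note_on", "velocity": 100}] * 5:
    print("visual parameter:", acc.update(midi_to_trigger(event)))
```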
Notch demonstration of how the MIDI input drives the shape.
'Life Itself', by contrast, would display only a floor of blue blocks. When a MIDI input is active, we see different orange shapes appearing out of the floor and disappearing into space, with a different selection of shapes on each of the 1st, 2nd, 3rd and 4th beats before resetting back to the beginning.
The above videos demonstrate what the MIDI inputs are controlling in the first verse of 'Life Itself'.
'Wonderful Nothing' is a fully audio- and MIDI-driven song. It is divided into ten parts, each either audio-reactive, using audio direct from the band, or MIDI-driven. For this song we used all five MIDI channels, with notes coming from the keys, kick drum, snare drum and hi-hat cymbal.
The song opens with an audio-reactive point, flickering and scaling dramatically with the music's intensity. In the second part, a single line draws from left to right based on MIDI notes from the keyboard; the width of the line increases with the number of MIDI notes received until it reaches the edge of the canvas. The line begins to fade away if the band stop playing, creating a beautiful effect that visually represents the music's ebb and flow.
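As a rough illustration of that second part (the constants are assumptions, not values from our Notch setup): each keyboard note widens the line by a step, clamped at the canvas edge, and the width decays every frame that nothing is played, which is what makes the line fade with the band.

```python
# Hypothetical sketch of the growing / fading line in part two.
CANVAS_WIDTH = 1.0       # normalised canvas width
GROW_PER_NOTE = 0.05     # how much each keyboard note widens the line (assumed)
DECAY_PER_FRAME = 0.01   # how quickly the line fades when nothing is played (assumed)

def update_line_width(width, notes_this_frame):
    if notes_this_frame > 0:
        # more notes = a wider line, capped at the edge of the canvas
        return min(CANVAS_WIDTH, width + notes_this_frame * GROW_PER_NOTE)
    # no notes this frame: let the line ebb away
    return max(0.0, width - DECAY_PER_FRAME)

width = 0.0
for notes in [1, 2, 1, 0, 0, 0, 3, 0]:  # toy stream of per-frame note counts
    width = update_line_width(width, notes)
    print(round(width, 2))
```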
The third part kicks in with the pre-chorus, mimicking a traditional analogue oscilloscope. This is audio-driven, with Notch-edited sound modifiers reacting to various parts of the waveform for added interest.
The first chorus, part four, ramps things up with dual oscilloscope-like visuals, MIDI-controlled by the kick drum and snare inputs. This is an intense visual, accurately displaying every hit of the drumstick and every beat of the kick drum, matching the chorus's rapid development.
Two examples of MIDI notes from the kick, snare and hi-hat affecting the content.
As we shift to the verse, part five, the look changes to a sawtooth oscilloscope effect, again driven purely by MIDI. The audience sees a point in 3D space moving constantly, with the MIDI notes affecting its vertical position.
Note one adds to the position, notes two and three subtract from it, and note four finally resets the point back to the centre. Each of these movements, and the moment between each beat, creates a distinctive sawtooth wave pattern.
Parts six and seven of the song continue this movement but switch to the hi-hat MIDI input, progressively getting glitchier and noisier to visually match the building audio intensity.
This pattern can be seen in the demo videos of the incoming MIDI inputs.
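In pseudo-code form, a simplified sketch of that four-note mapping (the step size is an assumed value, and the real version lives in the Notch graph) looks like this:

```python
# Hypothetical sketch of the four-note sawtooth in part five.
STEP = 0.25  # vertical step per note (assumed value)

def sawtooth_position(y, note_index):
    """note_index is 1-4 within the bar: 1 adds, 2 and 3 subtract, 4 resets."""
    if note_index == 1:
        return y + STEP
    if note_index in (2, 3):
        return y - STEP
    return 0.0  # note four snaps the point back to the centre

y = 0.0
for bar in range(2):
    for note in (1, 2, 3, 4):
        y = sawtooth_position(y, note)
        print(f"bar {bar + 1}, note {note}: y = {y:+.2f}")
```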
During a musical pause, part eight, we reveal a single oscilloscope driven by the snare drum beat. The snare's MIDI note both creates and reveals the curve, which then decays to black until the next hit.
The final chorus, part nine, went through many iterations before we landed on a more 'pared back' look, pulling the video back to allow a show-stopping moment using awesomely intense lasers. This simpler look is made up of audio waveforms resembling heart monitors: five horizontal lines stacked vertically, each displaying a different band of the waveform.
We wrap up with the only non-audio, non-MIDI section, triggered manually on the final note: two points on screen draw horizontal lines that meet and fade, bringing the song to a close.
MIDI and rendered content
MIDI integration allowed us to create real-time content and to enhance rendered content. We achieved this through Notch-controlled overlays and a Disguise-based solution developed by the very talented programmer Ed White.
For 'Tokyo Drifting', we split the content into a rendered background and a real-time overlay to match the song's impactful beat. The kick-drum beat rotates the spaceship, while the snare triggers a target and spaceship movement.
The video above demonstrates how the MIDI inputs affect the Notch aspect of 'Tokyo Drifting'.
We applied a similar approach to other songs requiring precise beat modification. For 'A Tear In Space' we built a content layer that could be triggered to begin pulsing over the base layer. This pulse is then controlled via the kick and synth MIDI inputs. This ensured that reactive elements stayed in time with the performance, avoiding the potential misalignment of pre-rendered content.
The above video shows the MIDI input causing the flash in the content.
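To give a feel for how the overlay splits the two MIDI channels, here is a hypothetical sketch (the rotation step and names are assumptions; the real overlay is a Notch layer over the rendered plate): the kick accumulates into a rotation angle for the spaceship, while a snare hit fires a one-shot trigger for the target and ship movement.

```python
# Hypothetical sketch of the 'Tokyo Drifting' overlay logic.
ROTATION_PER_KICK = 15.0  # degrees added to the spaceship per kick hit (assumed)

class OverlayState:
    def __init__(self):
        self.ship_rotation = 0.0
        self.target_triggered = False

    def on_midi(self, channel):
        if channel == "kick":
            # each kick nudges the spaceship round, wrapping at 360 degrees
            self.ship_rotation = (self.ship_rotation + ROTATION_PER_KICK) % 360
        elif channel == "snare":
            self.target_triggered = True  # one-shot: fires the target + ship move

    def end_of_frame(self):
        self.target_triggered = False  # cleared once the renderer has used it

state = OverlayState()
for hit in ["kick", "kick", "snare", "kick"]:
    state.on_midi(hit)
    print(hit, "->", state.ship_rotation, state.target_triggered)
    state.end_of_frame()
```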
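Conceptually the pulse layer behaves like a small envelope: a kick or synth note pushes the overlay to full intensity, and it decays back towards the base layer each frame, so the flash always lands exactly on the performed beat. A rough sketch of that behaviour, with an assumed decay factor:

```python
# Hypothetical envelope driving the 'A Tear In Space' pulse overlay.
DECAY = 0.85  # per-frame decay factor (assumed)

def update_pulse(level, kick_or_synth_hit):
    if kick_or_synth_hit:
        return 1.0           # snap the overlay to full intensity on the beat
    return level * DECAY     # otherwise fade back towards the base layer

level = 0.0
for frame, hit in enumerate([True, False, False, False, True, False]):
    level = update_pulse(level, hit)
    print(f"frame {frame}: overlay intensity = {level:.2f}")
```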
Incorporating these musical influences enables visually accurate representations of the music. This means that if the band alters their tempo, extends a pause, or the drummer adds a new solo during a performance, the visual content dynamically adjusts in real-time. This approach connects the visual aspect of the show intimately with the band's performance, similar to how a lighting operator responds live to on-stage events. As a video content studio, we rarely have the opportunity for such dynamic interaction, making this project particularly exciting.
IMAG grading
Our first and final element of this show was IMAG grading. To keep the show consistent, we spent a lot of time creating grading overlays. These were originally created in After Effects and then recreated as closely as possible in Notch. This process involved extensive experimentation, leveraging our understanding of Notch's abilities and pushing the software to its limits. To achieve a specific filmic grain and texture without appearing overly digital, we focused on a combination of internal post-FX and external textures. These were then composited and overlaid onto the live camera feed in various ways, resulting in a cohesive and era-specific visual aesthetic for the show.
We created a master overlay of post-effects applied in various combinations to each look, incorporating elements like edge detect, VHS blur, and colour correction to achieve a subtle 16mm filmic grade. Beneath this, we developed diverse looks ranging from simple to complex grades with verse-chorus progression, some utilising MIDI inputs. For instance, in 'Tokyo Drifting', the camera view follows the spaceship's movement; the look itself is diamond-masked and duplicated to create a ghosting effect. This whole look is then rotated on screen from left to right by the kick drum MIDI input.
Demonstration in Notch of how the MIDI input affects the IMAG look.
On the left, the IMAG screen; on the right, the same moment on stage.
Contrastingly, our IMAG look for the show's opener 'What The Hell is Happening' focuses on the audience's initial engagement with the band; with a minimalistic cinematic grade and colour correction we are still able to create a beautiful look without overpowering the band's presence on stage. Beginning the show this way gives us, as visual content creators, as well as the band, room to grow and to take the audience on a visual and acoustic journey.