
[15507] Tempting Failure

[Performance serial #] [15507]
[Performance name / date] Tempting Failure, 6/4/13
[Performance location] The Island, Bristol
[Performance members] Lee Chaos, Adrian Giddings, Hellen Burroughs, Marcus Lanyon, Sarge, Audience
[Performance type] performance art + installation + audience participation + improvised performance
[Performance description]

temp0rary performed [15507] as the closing act of the extreme arts festival Tempting Failure at The Island in Bristol, the site of an old police station, on 6th April 2013. [15507] was a collaboration with performance artist Traumata.

The performance began as an installation piece in which an array of sonar sensors detected the proximity of audience members and triggered sound waves at the resonant frequency of the space, turning the entire building into an instrument. The audience were then given ‘light vessels’ which allowed them to further manipulate the sound of the environment through interaction with the light sensors embedded in the roof, leading to collaborative play. The audience’s movements were scanned in three dimensions with infra-red cameras and used as projected visual elements.

We then performed a full body suspension, at the same time reading the EEG brain waves of the suspended body and translating this into music and visual elements, whilst the audience and performers onstage worked together to create an evolving audiovisual experience.

At the finale of the piece, millions of electrostatically charged particles were released into a wind machine in order to play havoc with the sonar sensors, whilst strobe lights and lasers were triggered to further modulate the light sensors, resulting in an intense finale of sound and light.

[photographs]

[movies]

This video is from the second and third movements of the performance:

[sound]

Below is some of the music and sound design for [15507]. The performance itself lasted about an hour. During the first movement, the audience triggered sounds via the sonar and light sensors; later elements were triggered by reading EEG brain waves.


[development photos]

Below are the performance notes, production photographs and test videos for our performance [15507].


[production notes]

The design

After a couple of performance and technical meetings, the [15507] show began to take shape. It was defined as much by what we didn’t want it to be: we wanted to avoid the more obvious tropes of suspension (ritual; dominance and submission; medical or butcher metaphors) and instead show a shared consensual experience. This led to designs inspired by interconnection and collaboration, which tie into the audience participation aspects that temp0rary have been exploring in recent multimedia performances, and in turn to the idea of matching, interconnected uniforms (both visually and literally) and of wiring the five performers and the audience together to participate and share in the extreme state of mind required for full body suspension.

The EEG sensors

The EEG sensor worn during the suspension was a circuit-bent MindFlex toy. In its unadulterated state, the toy reads brain activity with a headset and transmits the data via Bluetooth to a base unit, which uses it to adjust the output of a fan, causing a small ball to levitate. The hack was conducted on the base unit, leaving the headset intact. The speed of the fan is transmitted to a Miditron circuit board with a custom patchbay, programmed with Max/MSP. The output from this board is a stream of continuous controller data, which gives low values during meditative states and higher values during concentration or excitement. This data is received by the master computer running Ableton Live and is used to determine the pitch of the main arpeggiated synthesiser line for the performance, as well as the relative levels of the four oscillators that make up the Operator synth patch.
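
As an illustration of the mapping stage only (the actual rig used a Miditron and Max/MSP), here is a minimal Python sketch using the mido and pyserial libraries. The serial port, the 0-1023 fan-speed reading and CC number 20 are all assumptions for illustration, not details from the production.

    # Minimal sketch of the EEG-to-MIDI mapping described above.
    # Assumptions (not from the original rig): the hacked MindFlex base
    # delivers a 0-1023 fan-speed reading per line over serial, and
    # Ableton listens on CC 20 for the arpeggio pitch.
    import serial   # pyserial
    import mido     # MIDI I/O

    ser = serial.Serial('/dev/ttyUSB0', 9600, timeout=1)  # hypothetical port
    out = mido.open_output()  # default MIDI output to the master computer

    while True:
        line = ser.readline().strip()
        if not line:
            continue
        raw = int(line)                   # 0 (meditative) .. 1023 (excited)
        cc = min(127, raw * 128 // 1024)  # scale to the MIDI CC range
        out.send(mido.Message('control_change', control=20, value=cc))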

The sonar sensors

The sonar rig was designed and created in collaboration with Simon Ogden (SiyTek). Using inexpensive sonar sensors designed for use with Arduino boards, housed in modified head-mounted torch enclosures, the rig allowed the audience to move around the space activating sounds specific to each movement of the performance. The sensors were spaced along the girders in the space and had an active range of just under a metre. The data was transmitted to the stage via a custom-built patchbay and into a specially commissioned circuit board built by SiyTek that translated it into MIDI continuous controller data. Although six sensors were used in this performance, sixteen inputs were available in the first prototype.
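
A sketch of how the distance-to-controller translation could look in Python with mido and pyserial. The real translation happened on SiyTek's custom board; the serial format (one "sensor distance_cm" line per reading) and the CC numbers here are illustrative assumptions.

    # Sketch of one way to turn sonar distances into MIDI CCs.
    # Assumes the board prints "<sensor> <distance_cm>" lines over serial;
    # CC numbers 30-35 (one per sensor) are hypothetical.
    import serial
    import mido

    RANGE_CM = 100            # active range of just under a metre
    ser = serial.Serial('/dev/ttyACM0', 115200, timeout=1)
    out = mido.open_output()

    while True:
        try:
            sensor, dist = map(int, ser.readline().split())
        except ValueError:
            continue                              # ignore malformed lines
        if dist >= RANGE_CM:
            value = 0                             # nobody in range
        else:
            value = 127 - (dist * 127 // RANGE_CM)  # closer = louder
        out.send(mido.Message('control_change', control=30 + sensor,
                              value=value))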

The light sensors

The light sensors were a development of the technology created for a previous temp0rary performance at the Mayan Apocalypse Survivor’s Party at the end of 2012. The sensors are housed in plastic capsules, and for this event were positioned above the audience in a ‘reverse chandelier’ configuration. Their data was transmitted to the Miditron board alongside the MindFlex data and was configured in Ableton Live to add extreme distortion effects to the sounds triggered by the sonar sensors, encouraging the audience to work collaboratively to uncover the different sound combinations. The audience were then given light vessels: small pots with custom art inserts by Adrian Giddings, each containing a simple LED circuit with which to activate the sensors.
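
A similar hedged sketch for the light capsules, assuming each one reports a 0-1023 light level over serial. The per-capsule ambient-light calibration shown here is our own illustrative addition, not a documented feature of the rig.

    # Sketch of a light-vessel mapping: extra light above the ambient
    # baseline drives a distortion-amount CC (40 + capsule number, both
    # the format and the CC range being assumptions).
    import serial
    import mido

    ser = serial.Serial('/dev/ttyUSB1', 9600, timeout=1)
    out = mido.open_output()
    ambient = {}                                  # per-capsule baseline

    while True:
        try:
            capsule, level = map(int, ser.readline().split())
        except ValueError:
            continue
        base = ambient.setdefault(capsule, level)  # first reading = ambient
        boost = max(0, level - base)               # extra light from a vessel
        cc = min(127, boost * 127 // 512)
        out.send(mido.Message('control_change', control=40 + capsule,
                              value=cc))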

The suit sensors

The suits were designed so that they could be further connected into the sensor rigs. Each suit featured a light sensor and a drum trigger, constructed from a piezo mic, on the front right lapel. This allowed the performers wearing them to temporarily patch their data outputs into the patchbay to further interact with each other and with the lighting within the performance space.
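
A sketch of how a piezo drum trigger of this kind is typically converted to MIDI notes: threshold the reading, debounce it, and scale the peak into velocity. The threshold, note number and serial format are assumptions.

    # Piezo drum-trigger sketch: assumes the suit's piezo arrives as a
    # 0-1023 reading per serial line; threshold and note 38 are illustrative.
    import time
    import serial
    import mido

    THRESHOLD = 200
    ser = serial.Serial('/dev/ttyUSB2', 9600, timeout=1)
    out = mido.open_output()
    last_hit = 0.0

    while True:
        try:
            level = int(ser.readline())
        except ValueError:
            continue
        now = time.time()
        if level > THRESHOLD and now - last_hit > 0.05:   # 50 ms debounce
            velocity = min(127, level * 127 // 1023)      # hit strength
            out.send(mido.Message('note_on', note=38, velocity=velocity))
            out.send(mido.Message('note_off', note=38))
            last_hit = now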

The VR glove

The P5 Virtual Reality glove was used to send MIDI information into Ableton to create an additional performance layer. Each of the flex sensors on the fingers played a different note when the finger was extended. Left and right movement was translated into the pan placement of the synthesised tone so that it could be matched to the movement of the suspension, upward movement caused a filter to open, and moving the glove closer towards the receiver created additional echoes of the audio.
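
The glove mapping could be sketched like this in Python, showing only the MIDI side (reading the P5 hardware itself needs a driver that is out of scope here). The note set, axis ranges and the use of standard CCs 10 (pan), 74 (filter cutoff) and 91 (effect depth) are assumptions.

    # Glove-to-MIDI mapping sketch: five flex sensors gate five notes,
    # and x/y/z position drives pan, filter and echo depth.
    import mido

    NOTES = [48, 52, 55, 59, 62]   # one note per finger (hypothetical)
    FLEX_ON = 100                  # flex value counted as "extended"

    out = mido.open_output()
    held = [False] * 5

    def update(flex, x, y, z):
        """flex: five 0-255 bend values; x/y/z: -512..511 position."""
        for i, bend in enumerate(flex):
            extended = bend < FLEX_ON        # low bend = straight finger
            if extended and not held[i]:
                out.send(mido.Message('note_on', note=NOTES[i], velocity=100))
            elif not extended and held[i]:
                out.send(mido.Message('note_off', note=NOTES[i]))
            held[i] = extended
        out.send(mido.Message('control_change', control=10,   # pan from x
                              value=(x + 512) * 127 // 1024))
        out.send(mido.Message('control_change', control=74,   # filter from y
                              value=(y + 512) * 127 // 1024))
        out.send(mido.Message('control_change', control=91,   # echo: closer
                              value=127 - (z + 512) * 127 // 1024))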

The other instruments

The performance also employed other electronic toys and instruments which were manipulated further through the software effects in Ableton. These included a circuit-bent Speak & Math, a circuit-bent Fisher Price ABC Robot, a BugBrand WOM noise generator and a Korg Monotron Delay, all performed and controlled live.

The projections

The projection rig was based around three projectors: one at the rear of the space pointing towards the audience and casting the suspension in shadow, and two at stage left and right, one of which was mirrored. These were pointed at the audience for the beginning of the performance and angled onto the suspension for the latter part. The rear projector received data from a laptop running Arkaos, with its video clips triggered by MIDI from the master audio laptop, ensuring it was synchronised to the music. During the installation phase, the left and right projections displayed data from a Kinect which detected the audience’s position, allowing the performers to determine how many people were in the audience and pace the performance accordingly. Once the suspension began, the projectors were rotated and the visuals were performed and manipulated live using the suspended body as the canvas. Loops for both performance rigs were primarily created in Blender or with hand-drawn animations.
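
The music-to-video sync described above amounts to firing MIDI notes at scene changes and letting Arkaos launch whichever clip is mapped to each note. A minimal sketch, with hypothetical note numbers and port name:

    # Video-sync sketch: the master audio laptop cues a MIDI note per
    # scene change; Arkaos plays the clip mapped to that note.
    import mido

    video = mido.open_output('Arkaos In')   # hypothetical port name

    SCENES = {'installation': 36, 'suspension': 37, 'finale': 38}

    def cue(scene):
        note = SCENES[scene]
        video.send(mido.Message('note_on', note=note, velocity=127))
        video.send(mido.Message('note_off', note=note))

    cue('installation')   # e.g. at the top of the first movement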

The screens

The projection screen was double-layered, using a partially opaque plastic layer to intentionally obscure some of the suspension rigging process, which was then removed to reveal a black net curtain which gave the impression of the projections suspended in mid-air between the audience and the performers. In addition, the venue had mirrored partitions which we were able to use to bounce the projected light back into the performance space and create further distorted projection elements.

The static

For the finale of the piece, a wind machine was turned on at the back of the stage and 20 cubic feet of polystyrene beads were released, creating light refractions in the performance space and giving the impression of ‘physical static’. As these beads became electrically charged, they began to create intentional interference with the sensors and to coat the equipment, leading to an unpredictable, chaotic end to the performance.

Further developmental videos of the work in progress are available on our YouTube channel.
