SpiralSet

SpiralSet is a sound toy with an animated visual interface designed for inexperienced users. The sound toy's primary theme is user/player-influenced exploration of sound spectra. The project could therefore also be considered an open-form spectral work.

A simulated physics system and the user's physical interactions determine the motion of three balls contained within a virtual 3D structure of interconnected pipelines, which symbolically represent sonic pathways. The motion (speed, direction and route) and position of the balls within the pipeline dynamically control the sonic output, allowing the user to manipulate the spectral characteristics of the sound and shape the structure of the piece. The player engages with the work using the sensor interface or a mouse to tilt and rotate the pipeline structure, subsequently affecting the degree of motion in both the visual and audio domains.

The project was originally developed as an interactive installation piece for presentation at the Sonic Arts Network Expo 2008 in Brighton, UK. A custom-made sensor interface is used to control the system in an installation context. A home edition has subsequently been designed that uses a standard computer mouse as the input device, allowing the project to be more widely disseminated.

User interactions are designed to be quickly accessible whilst maintaining scope for sonic variation. The intended accessibility of the project informs the nature of the physical interactions implemented, which require little or no instruction to operate. The project is built with the game engine Unity 3D, used in conjunction with a real-time additive synthesis sound engine constructed in Max/MSP/Jitter.


LINKS TO RELATED PAPERS


NIME 2009 - New Interfaces for Musical Expression (2 Page Paper)


GIC 2009 - Games Innovation Conference (More Detailed Paper)


OVERVIEW


The SpiralSet virtual pipeline structure contains three coloured balls, each with a related additive synthesis voice. The player tilts the structure to influence the simulated physics of the three balls, with ball coordinate data monitored in Max and mapped to recall spectral data in the synthesis engine. Each section of the SpiralSet pipeline has its own sound characteristic, allowing different combinations of sound materials to be explored when a ball enters a different spiral section (sound zone). There are eleven sound zones in total, representing optional sonic pathways: the three linking rings, the four upper spirals, and a further four lower spirals. When a ball enters a new zone the sound type changes, with the ball's motion and position directly recalling spectral data for this new sound type in the additive synthesis engine. When a ball stops moving, the timbre of its corresponding synthesis voice remains static; when a ball moves more quickly, the timbre transitions more rapidly. The shape of the pipeline allows the player to capture and hold a ball within different sections of each spiral, allowing sounds to be 'frozen' at a single point in time.
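
As an illustrative sketch of this zone and motion mapping (the zone names, scaling constant, and callback below are assumptions for demonstration, not the actual Max/MSP implementation):

    # Illustrative sketch of the zone/motion-to-timbre mapping described above.
    # Zone identifiers and the speed-to-rate constant are placeholder assumptions.

    SOUND_ZONES = (
        ["ring_%d" % i for i in range(1, 4)] +          # three linking rings
        ["upper_spiral_%d" % i for i in range(1, 5)] +  # four upper spirals
        ["lower_spiral_%d" % i for i in range(1, 5)]    # four lower spirals
    )                                                   # eleven zones in total

    def timbre_step(ball_speed, speed_to_rate=0.5):
        """Faster ball motion -> faster progression through spectral time slices.
        A stationary ball returns 0, freezing the timbre at its current slice."""
        return ball_speed * speed_to_rate

    def on_zone_change(ball_id, new_zone):
        """Called when a ball crosses into one of the eleven sound zones; in the
        real system this would recall that zone's spectral data in the synth engine."""
        return {"ball": ball_id, "zone": new_zone}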


Each of the three balls within the pipeline emits a coloured light (red, green and blue). The lights allow each ball to be identified by the player, and generate additively mixed colours that refract around the transparent structure according to the balls' positions and proximity to one another. The lighting effects represent the mixing and varied additive combinations of spectra heard in the sound world, reflecting the synthesis techniques utilised in the project.


SOUND MATERIALS & SOUND ENGINE


The sound materials are derived from the spectral analysis of both acoustic recordings and synthesised tones. Vaguely recognisable timbres, such as a re-synthesised saxophone motif, coexist alongside more abstract electronic sounds. In the development stage of the project, the software application SPEAR was used for sound analysis to create SDIF files that provide the frequency and amplitude data used in the synthesis engine. The frequency and amplitude data are then transferred to Jitter matrices for more efficient access to the large data sets during game-play, and to facilitate spectral transformations through Jitter matrix operations.
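
As a rough sketch of this storage layout, with a 2D NumPy array standing in for a Jitter matrix (the partial count, frame count, and interleaved column layout are illustrative assumptions rather than the project's actual format, and SDIF parsing is not shown):

    # Sketch of the storage layout: each analysis time slice becomes one row
    # (y-axis = time), with frequency/amplitude pairs for every partial laid
    # out along the row (x-axis). Values are placeholders; real data would
    # come from SPEAR's SDIF output.
    import numpy as np

    N_PARTIALS = 64      # assumed number of partials per analysis frame
    N_FRAMES = 500       # assumed number of analysis time slices

    # columns alternate frequency, amplitude for each partial
    spectral_matrix = np.zeros((N_FRAMES, N_PARTIALS * 2), dtype=np.float32)

    def store_frame(frame_index, freqs, amps):
        """Interleave one time slice's partial frequencies and amplitudes."""
        spectral_matrix[frame_index, 0::2] = freqs
        spectral_matrix[frame_index, 1::2] = amps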


All three balls' coordinates and progress through the different sections of the pipeline are monitored within Max, with ball position mapped to recall each successive spectral data time slice relating to the sound zone the ball is currently within. The data is stored in a Jitter matrix, with time represented on the y-axis of the matrix, and the data for all frequency components of each time slice stored along the x-axis. Altering the read position of the matrix gives access to the spectral component data for every individual time slice. Mapping ball-coordinate-derived data to the matrix read position means that the rate of timbre progression, or sound evolution time, is controlled directly by the ball's position and motion within the virtual pipeline structure. Matrix data specific to each of the eleven sound zones is dynamically recalled by the sound synthesis engine once a ball enters the related zone.
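
A minimal sketch of this read-position mapping, assuming a ball's progress through its current zone has been normalised to the 0-1 range and reusing the interleaved layout from the previous sketch (function and variable names are illustrative):

    # Sketch of the read-position mapping: a ball's normalised progress through
    # its current sound zone selects which time slice (matrix row) is passed to
    # the additive synthesis voice.

    def read_time_slice(spectral_matrix, zone_progress):
        """zone_progress is assumed to be 0.0-1.0, derived from ball coordinates."""
        n_frames = spectral_matrix.shape[0]
        progress = max(0.0, min(1.0, zone_progress))   # clamp to the zone's extent
        row = int(progress * (n_frames - 1))           # matrix read position (y-axis)
        slice_data = spectral_matrix[row]
        freqs = slice_data[0::2]    # partial frequencies for this time slice
        amps = slice_data[1::2]     # partial amplitudes for this time slice
        return freqs, amps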


The Jitter spectral matrices are multiplied by further scaling matrices, allowing the attenuation and transposition of individual frequency components or groups of components. The values of each scaling matrix are predetermined by the composer/designer as part of the development process and stored. These stored settings are automatically recalled when a ball enters the appropriate sound zone. Each ball may sometimes recall different matrix scaling data within the same sound zone, creating sonically transformed variations of the sound type.
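
The element-wise scaling idea could be sketched as follows (the scaling values and per-partial layout are assumptions chosen for illustration, not the composer's stored settings):

    # Sketch of the scaling-matrix idea: element-wise multiplication attenuates
    # selected partials (amplitude scaling) or transposes them (frequency scaling).
    import numpy as np

    def apply_scaling(freqs, amps, freq_scale, amp_scale):
        """freq_scale/amp_scale are per-partial multipliers, e.g. 2.0 to shift a
        partial up an octave, or 0.0 to silence it entirely."""
        return freqs * freq_scale, amps * amp_scale

    # Example: transpose the lowest 8 partials up an octave and halve their level.
    n_partials = 64
    freq_scale = np.ones(n_partials); freq_scale[:8] = 2.0
    amp_scale = np.ones(n_partials);  amp_scale[:8] = 0.5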


IR (INFRARED) USER INTERFACE


The player tilts the 3D pipeline structure using the IR (infrared) distance sensor interface to influence the simulated physics of the three balls that roll around inside. The height of the player's hands above each IR sensor determines the direction and rate of rotation. The interface design shares aesthetic similarities with the virtual pipeline structure. The resulting patterns of motion and positions of each of the three balls within the pipeline subsequently control the progression and development of the output sound.
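
As a hedged sketch of this kind of sensor-to-rotation mapping (the centre point, usable sensor range, and maximum rotation rate are illustrative values, not those of the actual interface):

    # Sketch of the sensor-to-rotation mapping: hand height over each IR distance
    # sensor sets the direction and rate of tilt about one axis of the structure.

    def tilt_rate(sensor_distance_cm, centre_cm=30.0, range_cm=20.0, max_rate=45.0):
        """Distances above the centre point tilt one way, below it the other;
        the further from the centre, the faster the rotation (degrees/second)."""
        offset = (sensor_distance_cm - centre_cm) / range_cm
        offset = max(-1.0, min(1.0, offset))    # clamp to the usable sensor range
        return offset * max_rate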


SYSTEM STRUCTURE

Bidirectional communication between the game engine software and the synthesis engine is achieved through an internal UDP network connection. The visual domain is constructed within Unity 3D, while all sound synthesis and sensor data management is handled within Max/MSP and Jitter. Max communicates the sensor data to Unity, influencing the simulated physical behaviour of the balls, and the resulting ball coordinate data is communicated from the game engine back to Max for tracking.
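
A minimal sketch of such a bidirectional UDP link, seen from the synthesis-engine side (the ports, addresses, and plain-text message format are assumptions for illustration; the actual system defines its own messaging between Max and Unity):

    # Minimal sketch of the bidirectional UDP link: sensor data goes out to the
    # game engine, ball coordinates come back for tracking.
    import socket

    UNITY_ADDR = ("127.0.0.1", 9000)   # assumed port the game engine listens on
    MAX_PORT = 9001                    # assumed port the synthesis side listens on

    send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    recv_sock.bind(("127.0.0.1", MAX_PORT))

    def send_sensor_data(tilt_x, tilt_z):
        """Forward the current tilt values to the game engine."""
        send_sock.sendto(("tilt %f %f" % (tilt_x, tilt_z)).encode(), UNITY_ADDR)

    def receive_ball_coordinates():
        """Block until the game engine reports a ball position update."""
        data, _ = recv_sock.recvfrom(1024)
        return data.decode()           # e.g. "ball 1 x y z" in this sketch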

andy@dysdar.org.uk                                                                                                                                                                                                  © 2014