Sound and three-dimensional forms
This article discusses a prototype that explores the simultaneous manipulation of three-dimensional digital forms and sound. Our multimedia study examines the aesthetic affordances of tight parameter couplings between digital three-dimensional objects and sound objects, based on notions of process and user-machine interaction. It investigates how an effective cohesion between the visual, the spatial and the sonic might be established through changes perceived in parallel: what Michel Chion refers to as 'synchresis'. Drawing on Mike Blow's work On the Simultaneous Perception of Sound and Three-Dimensional Objects and on processual art, the prototype uses computer technology to form and mediate a creative practice involving 3D animation, sound synthesis, digital signal processing and programming. Our practice-based approach entails rendering a three-dimensional digital object in Processing whose form changes over time according to specific actions. Spatial data is sent via Open Sound Control (OSC) to Max/MSP in real time, where sound is synthesised and then manipulated. Sonic parameters such as amplitude, spectral density/width and timbre are controlled by selected spatial parameters of the three-dimensional object. Sound processing is driven by changes to the three-dimensional object over time through basic actions such as splitting, distorting, cutting, shattering and rotating. We use digital technology to look beyond basic synchronisation of sound and vision towards a more complex cohesion of percepts, based on changes to myriad sonic and visual parameters experienced concurrently.
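The article does not include the prototype's source. As a minimal sketch of the pipeline it describes, the following Python fragment shows (a) a hypothetical mapping from spatial parameters of the changing object to the sonic parameters named above (amplitude, spectral width, timbre), and (b) how the result could be packed into an OSC 1.0 binary message for delivery to Max/MSP over UDP. The address `/form/params`, the parameter names and the mapping formulas are illustrative assumptions, not the authors' implementation; only the OSC wire format follows the published specification.

```python
import math
import struct

def spatial_to_sonic(vertex_spread, rotation_rate, fragment_count):
    """Hypothetical mapping from spatial parameters of the 3D object
    to sonic parameters. The formulas are illustrative assumptions:
    - amplitude follows the object's vertex spread, clamped to [0, 1]
    - spectral width grows as the object shatters into more fragments
    - timbre index follows the rotation phase, normalised to [0, 1)"""
    amplitude = max(0.0, min(1.0, vertex_spread))
    spectral_width = 1.0 - math.exp(-fragment_count / 8.0)
    timbre = (rotation_rate % (2 * math.pi)) / (2 * math.pi)
    return amplitude, spectral_width, timbre

def encode_osc_message(address, *floats):
    """Encode an OSC message with float32 arguments (OSC 1.0 format):
    NUL-terminated strings padded to 4-byte boundaries, then a type-tag
    string (',' plus one 'f' per argument), then big-endian floats."""
    def pad(b):
        return b + b"\x00" * (4 - len(b) % 4)  # always at least one NUL
    msg = pad(address.encode("ascii"))
    msg += pad(("," + "f" * len(floats)).encode("ascii"))
    for f in floats:
        msg += struct.pack(">f", f)
    return msg

# Example: one frame of the object's state, mapped and packed.
packet = encode_osc_message("/form/params",
                            *spatial_to_sonic(0.7, 1.2, 16))
```

In practice the `packet` bytes would be sent each frame over a UDP socket to the port on which a `[udpreceive]` object listens in the Max/MSP patch; the Processing side of the actual prototype would more likely use an OSC library such as oscP5 rather than hand-packing messages.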