Sound Applet Presentation

I presented my hastily rebuilt PD sound applet, along with short documentation of the project. Below is an excerpt from that documentation.

Sound1

Objectives

Being new to the field of electro-acoustics, I used this assignment as a research exercise in testing and developing skills in a number of key areas and technologies. The outcome of the project was to provide a basis for further development of the applet and extension of those technologies. It was also intended to give me a grounding in electro-acoustic and tactile interface theory which may be of use in future projects.

Technologies

The hardware data is interpreted through PureData, the open source visual programming language.

Using the GEM external, this input data is interpreted through a colourful 3D visual interface. As an experiment in electronic sound, I also assigned a couple of oscillators to the input data. Using a [line] object, the input values are ramped to provide a smooth transition through frequencies.
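The ramping behaviour of PD's [line] object amounts to simple linear interpolation between the current value and a new target over a given time. The sketch below is a hypothetical Python analogue of that behaviour, not part of the patch itself; the function name and the 20 ms grain size are my own assumptions for illustration.

```python
def line_ramp(start, target, ramp_ms, step_ms=20):
    """Linearly interpolate from start to target over ramp_ms milliseconds,
    emitting one value per step_ms grain, like PD's [line] object."""
    steps = max(1, int(ramp_ms / step_ms))
    delta = (target - start) / steps
    return [start + delta * i for i in range(1, steps + 1)]

# Ramp an oscillator frequency from 220 Hz to 440 Hz over 100 ms.
values = line_ramp(220.0, 440.0, 100)
```

Each emitted value would be sent to the oscillator's frequency inlet in turn, so the pitch glides rather than jumps.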

The hardware controller itself, unlike my previous experiments with computer mice, provides two analogue inputs. By connecting these inputs to separate LDRs, varying numeric data can be generated through body movement and environmental effects.
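Raw LDR readings need rescaling before they are musically useful. The fragment below is an illustrative Python sketch, assuming a 10-bit input range (0 to 1023); the actual range of my controller's inputs may differ.

```python
def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly map value from [in_lo, in_hi] to [out_lo, out_hi],
    clamping to the input range first."""
    value = min(max(value, in_lo), in_hi)
    t = (value - in_lo) / (in_hi - in_lo)
    return out_lo + t * (out_hi - out_lo)

# Map a 10-bit LDR reading onto an oscillator range of 100-1000 Hz.
freq = scale(512, 0, 1023, 100.0, 1000.0)
```

The clamp guards against out-of-range sensor noise, so a sudden spike cannot push the oscillator outside its intended band.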

To allow the LDRs to receive light, I moulded a couple of faux handles to connect to the back of the controller. Whilst aesthetically wanting, they afforded some protection for the device and more comfort when holding it.

Conclusions

The use of PD rather than Max/MSP presented a steep learning curve for this project. Coupled with my inexperience in the subject, my objective to explore the various theories and technologies meant that the resultant patch was mostly a platform on which several different experiments converged.

The applet has certainly given me a clearer idea of the potential of PD as an interface between devices, and I hope in future to explore the possibilities of outputting data to physical devices. Ideally, this would take the form of a reactive installation space in which people engage through movement or gesture. One of the key problems I need to solve in order to develop this prototype is the MIDI out functionality of PD on OS X, and what devices can be used.

I have begun exploring potential ideas with the Arduino board, which appears to offer USB-to-MIDI conversion and various hardware outputs for this purpose.
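Whatever device ends up on the receiving end, the MIDI messages themselves have a simple byte format. The sketch below is illustrative Python, not tied to any particular board or driver, showing how a standard note-on message is assembled as it would travel over a serial or USB-MIDI link.

```python
def note_on(channel, note, velocity):
    """Build a 3-byte MIDI note-on message: status byte 0x90 OR'd with the
    channel number, followed by 7-bit note number and velocity."""
    assert 0 <= channel <= 15 and 0 <= note <= 127 and 0 <= velocity <= 127
    return bytes([0x90 | channel, note, velocity])

# Middle C at moderate velocity on the first channel (zero-indexed as 0).
msg = note_on(0, 60, 100)
```

Mapping the LDR-derived values onto the note and velocity bytes would be the natural next step for driving external hardware.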

PD Applet & Full Documentation (.zip)

Full Documentation (.pdf)