I’ve been thinking a lot about making a sound-based installation in Adobe Flash, using sensors and switches, and I’ve gotten around to making a few small experiments/prototypes as research, which I’m planning to share on this site later.
But I also found this old experiment I wrote in Flash 5 (!) and wanted to share it. It is a visual sequencer (*) / sound toy that lets you drag icons onto a “soundstage”, each icon representing a sample. When you press play, a line starts moving vertically, and as the line hits one of the icons, the corresponding sound is played. You can also click, drag and hold an icon, and move it on top of the moving line to trigger the sound. There are two types of sounds: the yellow icons trigger different “wet finger on glass” sounds, and the grey-white icons trigger sonar ping sounds.
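The core trigger logic is simple enough to sketch outside Flash. Here is a rough TypeScript version of the idea (all names are my own, not from the original .fla): each tick the line advances, and any icon the line has just passed over fires its sample once.

```typescript
// One sample-triggering icon on the soundstage.
interface Icon {
  pos: number;      // icon's position along the line's axis of travel
  sample: string;   // which sample it triggers
}

// Advance the moving line by `step` and fire every icon it crossed.
// Returns the line's new position.
function tick(
  playhead: number,                 // line position before this tick
  step: number,                     // distance the line moves per tick
  icons: Icon[],
  play: (sample: string) => void,   // sound-playback callback
): number {
  const next = playhead + step;
  for (const icon of icons) {
    // Trigger exactly once: only when the line passes over the icon
    // during this tick, not on every frame it sits beyond it.
    if (icon.pos > playhead && icon.pos <= next) {
      play(icon.sample);
    }
  }
  return next;
}
```

Dragging an icon onto the moving line falls out of the same check: moving an icon so that its `pos` lands inside the next tick’s interval makes it fire on that tick.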
The code is pretty old, and there are WAY better ways of making something like this with AS3, but still, here is the source code (fla) for it (it also includes the samples, which you are free to use in any way you like).
I still think it is a pretty nice little project, but I am toying with the idea of replacing the click and drag with a webcam mounted in the ceiling, letting people who move around on the floor trigger the sounds. I have quite a lot to learn before I can make something like that, but I’ll get there!
* Ok, so I guess it is a stretch to call this a sequencer, but I wanted to create a fun, easy and interesting way to generate a sound collage.
I like the idea of triggering different sounds depending on where a person is in a space, especially if this is done in a large outdoor arena, like a park. As you move to different positions in the park, the soundscape changes, and you literally use your body as a DJ would use their hands to remix a track.
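One simple way to realise that body-as-remixer idea (this is my own sketch, not any existing project’s method): place each sound source at a point in the space and let its volume fall off with the listener’s distance, so walking around continuously re-mixes the layers.

```typescript
// A fixed sound source somewhere in the space.
interface Source {
  x: number;
  y: number;
  sample: string;
}

// Map the walker's position to a volume level (0..1) per sample,
// using a linear falloff: full volume at the source, silence at `radius`.
function mixLevels(
  listenerX: number,
  listenerY: number,
  sources: Source[],
  radius: number,   // beyond this distance a source is inaudible
): Map<string, number> {
  const levels = new Map<string, number>();
  for (const s of sources) {
    const d = Math.hypot(s.x - listenerX, s.y - listenerY);
    levels.set(s.sample, Math.max(0, 1 - d / radius));
  }
  return levels;
}
```

Standing halfway between two sources blends them equally; stepping toward one fades the other down, which is exactly the DJ-crossfader feel described above.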
Mapamp uses existing structures and systems (architecture of a city, navigation and radio systems) to layer an artificial acoustic space over the original one.
The participant walks the streets wearing a special vest that allows him/her to navigate through different sound data fields. These virtual spaces differ from the geographic cityscape. By changing his/her position, the walker can pick up and mix the sounds, which connect with the architectural features of the public space: the noise of the surroundings, distant radio stations and abstract sound samples intermingle in the space, depending upon the position, direction and velocity of the visitors.
In its simplest form SonicWireSculptor is a novel 3D drawing tool and a unique musical instrument, but perhaps most importantly, it’s just fun to play with. The project started out as a personal instrument for Pitaru to perform on. During concerts, audience members often inquired whether they could experience the tool first hand. This encouraged Pitaru to transform the software into an immersive public installation. The installation included enhancements to the original work, allowing a wider range of users to intuitively interact with the environment. Gallery visitors would enter a dark room with a surround-sound system, a projection and a unique drawing station. Opening nights for these exhibits would often double as performance and workshop events where the audience and Pitaru explore the tool together. Participants would be encouraged to add their work to a steadily growing collection of beautiful and surprising sonic sculptures. Today, this collection includes work from professional illustrators, poets, 9-year-olds and their parents, musicians of various genres, as well as Pitaru’s own personal compositions (which he considers to be the least interesting in the collection).
The software was designed and optimized to run at 120fps (or better) on a regular household Dell and a home-theater 7.1 surround system. It was important to have the system deployable in small kiosks as well as fully immersive surround-sound environments. To do so, the software was written in C++, using OpenGL for nVidia/ATI optimization and the FMOD sound library with optimization for Audigy sound cards. A pressure/tilt-sensitive Wacom Cintiq driver was written for the preferred input device, although a regular mouse can be used as well. An RF telecommunication API was written to enable a gallery attendant to save audience work at the touch of a button via remote control. All code was then ported to Mac OSX for flexible deployment.
3D matrix math was written at a low-level to allow the novel interactive experience of the tool. This interaction method has proven efficient in several other applications, including medical imaging and commercial 3D modeling tools.
The project SEVEN MILE BOOTS is a pair of interactive shoes with audio. One can wear the boots and walk around as a flâneur, simultaneously in the physical world and in the literal world of the internet. By walking in the physical world one may suddenly encounter a group of people chatting in real time in the virtual world. The chats are heard as spoken text coming from the boots. Wherever you are with the boots, the physical and the virtual worlds will merge together.