Taal is my final project for the Algorithmic Composition class at ITP. The purpose of the class is to create music through algorithms, which basically means programming. The main platforms we used in the class were CSound, miniAudicle, Max/MSP, and Processing.
For my final project, I wanted to create an installation that would allow users to create music through body movements and gestures.
The project consists of a projected display and a camera that captures the user's movements. The way it works is that each frame from the camera is captured in Processing and subtracted from the previous frame, thereby detecting any movement in the scene. The entire screen is divided into a grid, and each square of the grid is mapped to a unique sound. The top squares play different “soors” (notes) of a tabla and the bottom ones those of a santoor. Whichever square motion is detected in, that square's sound is played.
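The frame-differencing and grid-mapping step described above can be sketched roughly as follows. This is an illustrative reconstruction in plain Java, not the original Processing sketch; the class name, grayscale pixel arrays, and threshold value are all assumptions.

```java
// Hypothetical sketch: detect per-grid-cell motion by differencing
// two grayscale frames (row-major int arrays of brightness values).
public class MotionGrid {
    final int cols, rows, width, height, threshold;

    public MotionGrid(int cols, int rows, int width, int height, int threshold) {
        this.cols = cols;
        this.rows = rows;
        this.width = width;
        this.height = height;
        this.threshold = threshold; // assumed brightness-change cutoff
    }

    // Counts, for each grid cell, the pixels whose brightness changed
    // by more than `threshold` between the previous and current frame.
    // A cell with a high count would trigger its mapped sound.
    public int[][] diff(int[] prev, int[] curr) {
        int[][] counts = new int[rows][cols];
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int i = y * width + x;
                if (Math.abs(curr[i] - prev[i]) > threshold) {
                    counts[y * rows / height][x * cols / width]++;
                }
            }
        }
        return counts;
    }
}
```

In the installation, each cell of the resulting count matrix would correspond to one square of the sound grid, with the top rows mapped to tabla notes and the bottom rows to santoor notes.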
Hence, once users get used to the system, they can start composing music through their movements and gestures.
On the technical side, since Processing isn't well equipped for dealing with sound, I had it pass UDP messages to miniAudicle, which would then produce the mapped sound in a separate thread. This allowed several sounds to play together, almost simultaneously.
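The Processing-to-miniAudicle link can be sketched like this: a small UDP sender that fires off a message naming the grid cell whose sound should play. The host, port, and plain-text message format ("cell row col") are illustrative assumptions, not the original protocol.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;

// Hypothetical sketch of the Processing-side UDP sender; on the other
// end, a ChucK script in miniAudicle would listen on the same port and
// spork a new shred to play the mapped sound for each message received.
public class SoundTrigger {
    private final DatagramSocket socket;
    private final InetAddress host;
    private final int port;

    public SoundTrigger(String host, int port) throws Exception {
        this.socket = new DatagramSocket();
        this.host = InetAddress.getByName(host);
        this.port = port;
    }

    // Builds the plain-text message naming the triggered grid cell.
    // The "cell row col" format is an assumption for illustration.
    public static String message(int row, int col) {
        return "cell " + row + " " + col;
    }

    // Sends one fire-and-forget UDP datagram; UDP suits this use because
    // a dropped trigger matters less than added latency would.
    public void send(int row, int col) throws Exception {
        byte[] data = message(row, col).getBytes(StandardCharsets.UTF_8);
        socket.send(new DatagramPacket(data, data.length, host, port));
    }
}
```

Because each message is handled in its own thread (a shred, in ChucK terms) on the receiving side, overlapping triggers produce overlapping sounds rather than cutting each other off.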
The installation was shown at the Exit Art gallery for the final NIME and algorithmic composition show on the 16th of December 2007.
The video below shows a screen capture of the display playing different sounds based on movement. The movement is shown through green pixels, and the triggered grid areas are depicted using gradient colors from yellow to red.