Taal – Algorithmic Composition Final Project

Taal is my final project for the Algorithmic Composition class at ITP. The goal of the class is to create music through algorithms, i.e. through programming. The main platforms we used were CSound, miniAudicle (the ChucK IDE), Max/MSP, and Processing.

For my final project, I wanted to create an installation that lets users create music through body movements and gestures.
The project comprises a projected display and a camera that captures the user's movements. The way it works is that each frame from the camera is captured in Processing and subtracted from the previous frame, which reveals any movement in the scene. The entire screen is divided into a grid of sounds, with each square of the grid mapped to a unique sound. The top squares play different "soors" (notes) of the tabla and the bottom ones play those of a santoor. Whichever square motion is detected in, that square's sound is played.
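The frame-subtraction and grid-mapping step can be sketched roughly as follows. The original was a Processing sketch; this is an illustrative Python version, and the grid dimensions, threshold value, and function names are my assumptions, not the original code.

```python
# Illustrative sketch of the motion-to-grid logic. Frames are flat lists of
# grayscale pixel values; grid size and threshold are assumed, not original.

GRID_ROWS, GRID_COLS = 2, 8   # e.g. top row -> tabla notes, bottom row -> santoor
THRESHOLD = 40                # per-pixel difference needed to count as motion

def detect_motion_cells(prev_frame, curr_frame, width, height):
    """Subtract the previous frame from the current one and report which
    grid cells contain moving pixels."""
    cell_w = width // GRID_COLS
    cell_h = height // GRID_ROWS
    active = set()
    for y in range(height):
        for x in range(width):
            i = y * width + x
            if abs(curr_frame[i] - prev_frame[i]) > THRESHOLD:
                # Map the moving pixel to its grid cell (clamped to the grid).
                active.add((min(y // cell_h, GRID_ROWS - 1),
                            min(x // cell_w, GRID_COLS - 1)))
    return active
```

Each active cell would then trigger its mapped sample: row 0 for the tabla notes, row 1 for the santoor.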

Once users get used to the system, they can start composing music through their movements and gestures.

On the technical side, since Processing isn't well equipped for dealing with sound, I had it pass UDP messages to miniAudicle, which would then produce the mapped sound in a separate thread. This allowed several sounds to play together, almost simultaneously.
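The Processing-to-miniAudicle link was a simple UDP send; a minimal Python sketch of the idea is below. The message format, host, and port here are hypothetical stand-ins for whatever the ChucK listener actually expected.

```python
# Hypothetical sketch of the video-to-sound link: fire a small UDP datagram
# naming the triggered grid cell. UDP is connectionless and non-blocking,
# so the video loop never waits on the sound engine.
import socket

CHUCK_HOST, CHUCK_PORT = "127.0.0.1", 6449  # assumed address of the ChucK listener

def send_trigger(sock, row, col):
    """Tell the sound engine which grid cell was triggered."""
    sock.sendto(f"PLAY {row} {col}".encode(), (CHUCK_HOST, CHUCK_PORT))

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
```

On the receiving side, a ChucK shred would parse each message and spork the mapped sample in its own thread, which is what lets several sounds overlap.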

The installation was put up at the Exit Art gallery for the final NIME and algorithmic composition show on December 16, 2007.

The video below shows a screen capture of the display playing different sounds based on movement. Movement is shown as green pixels, and the grid cells triggered are highlighted with gradient colors from yellow to red.


RoboLamp – Physical Computing Final Project

Robolamp was my final project for the physical computing class at ITP.

The idea behind the project was to bring everyday inanimate objects around us to life and make interacting with them more interesting. Here, the user can control, or rather direct, the position of the lamp by moving his/her hands around it.

The image below shows the first prototype, which had two degrees of freedom, and was manually commanded to move to specific positions.

The image below shows the final working prototype of Robolamp. The user gently moves his/her hand around the lamp to position it.

On the technical side, it is equipped with four range sensors, one on each side of the head, which detect the presence of a person's hand and command the servo motors to move the lamp in the opposite direction. It has one servo motor in the base for left-right movement and two in the arms for up-down movement, making it possible to place the head of the lamp almost anywhere.
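The move-away-from-the-hand behavior can be sketched as a small update rule. The thresholds, step size, and angle ranges below are illustrative assumptions, not the actual firmware values.

```python
# Illustrative sketch of the sensor-to-servo rule: nudge the lamp head away
# from whichever side a hand is detected on. Threshold and step are assumed.

NEAR = 30   # cm; a range reading closer than this counts as "hand present"
STEP = 5    # degrees to move per update

def update_angles(pan, tilt, left, right, top, bottom):
    """Given four range readings (cm), return new (pan, tilt) servo angles,
    clamped to a 0-180 degree range."""
    if left < NEAR:
        pan = min(pan + STEP, 180)    # hand on the left -> swing right
    if right < NEAR:
        pan = max(pan - STEP, 0)      # hand on the right -> swing left
    if top < NEAR:
        tilt = max(tilt - STEP, 0)    # hand above -> dip down
    if bottom < NEAR:
        tilt = min(tilt + STEP, 180)  # hand below -> lift up
    return pan, tilt
```

Called once per loop iteration, this makes the head drift smoothly away from a hovering hand rather than jumping; the two arm servos would share the resulting tilt angle.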

The video below shows the final working prototype.