Semantic Structure, Meta Language, Dance, Music
This project first emerged as a series of points drawn on the floor to give my dance students a visual system to follow. Seeing where to place their hands and feet really helped the learning process by allowing them to learn visually. Tactile learners benefited from this as well.
The second stage of its evolution was to build a mat with resistance sensors embedded in it, beneath the points corresponding to the visual system mentioned above. As students stepped on the points, they triggered sounds: "BOOM, CLACK, B-BOOM, CLACK". This helped my students understand their body movements within the same logical structure as music: if they performed a movement correctly, they heard the right rhythm. This also allowed students to learn by ear.
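The pad logic described above can be sketched in a few lines. This is an illustrative mock-up, not the project's actual software: the point names, sound names, and threshold value are all assumptions standing in for whatever the real sensor hardware reports.

```python
# Hypothetical sketch of the sensor mat: each resistance sensor sits under
# one point of the visual system and maps to a percussion sound. Point
# names, sounds, and the threshold are illustrative assumptions.

SOUND_MAP = {
    "front_left": "BOOM",
    "front_right": "CLACK",
    "back_left": "B-BOOM",
    "back_right": "CLACK",
}

PRESSURE_THRESHOLD = 0.5  # assumed normalized sensor reading, 0.0 to 1.0


def triggered_sounds(readings):
    """Return the sound for every sensor pressed past the threshold."""
    return [SOUND_MAP[point]
            for point, value in readings.items()
            if value >= PRESSURE_THRESHOLD]
```

A correctly executed step sequence would then produce the intended rhythm, e.g. a firm press on the front-left and back-left points yields `BOOM` and `B-BOOM` while a light brush of the other points yields nothing.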
The third stage of this project was to mount a small camera at the front of the pad. The pad alone could only gather data from movement on the horizontal plane of the floor, and it did little to motivate my students to move their arms. The camera sends its feed to software that analyzes differences from frame to frame in order to detect motion; motion in different areas of the frame triggers different sounds. Now my students could swing their arms in different directions to activate sounds as well.
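The frame-to-frame analysis can be sketched as follows. This is a minimal frame-differencing example under assumed parameters (grid size, pixel threshold, motion fraction), not the actual camera software: it compares two grayscale frames and reports which grid zones changed enough to count as motion.

```python
import numpy as np

# Minimal sketch of frame-difference motion detection over a grid of zones.
# Threshold values and the 2x2 grid layout are illustrative assumptions.

DIFF_THRESHOLD = 30      # per-pixel change (0-255 grayscale) counted as motion
MOTION_FRACTION = 0.05   # fraction of a zone's pixels that must change


def active_zones(prev_frame, curr_frame, rows=2, cols=2):
    """Return (row, col) indices of grid zones where motion was detected."""
    # Cast to int before subtracting to avoid uint8 wraparound.
    diff = np.abs(curr_frame.astype(int) - prev_frame.astype(int))
    h, w = diff.shape
    zones = []
    for r in range(rows):
        for c in range(cols):
            block = diff[r * h // rows:(r + 1) * h // rows,
                         c * w // cols:(c + 1) * w // cols]
            if (block > DIFF_THRESHOLD).mean() > MOTION_FRACTION:
                zones.append((r, c))
    return zones
```

Each returned zone index could then be mapped to a sound, so an arm swung through the upper-left region of the camera's view triggers a different sample than one swung through the lower right.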
Once I was able to capture a full range of motion (without having to actually wear any sensors), I programmed the software to recognize BPM, record and loop multiple phrases, and switch the sound banks. In addition, all the loop and switch points can be automated, so as long as the dancer keeps up with the choreography, they will get the desired result, as shown in the video below.
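One simple way BPM recognition like this could work is to time the intervals between triggers and convert the typical interval to beats per minute. This is an illustrative approach under that assumption, not the project's actual algorithm; the median is used so one hesitated step doesn't skew the tempo.

```python
import statistics

# Rough sketch of tempo recognition from trigger timestamps: the median
# interval between successive triggers, converted to beats per minute.
# This method is an assumption for illustration, not the actual software.


def estimate_bpm(timestamps):
    """Estimate tempo from a sorted list of trigger times in seconds."""
    intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return 60.0 / statistics.median(intervals)
```

For example, steps landing every half second work out to 120 BPM, which the software could then use to schedule loop and sound-bank switch points on the beat.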