iIgMV International Research Exchange
The interactive Interface for generative Music and Visualization (iIgMV) is a modular system for generating music and visualizations through movements and gestures in real time.
ACCAD/Dance Faculty and Staff Researchers:
Maria Palazzi, Oded Huberman
Exchange students from the Fachhochschule Duesseldorf, Germany (University of Applied Sciences):
Felix Hofschulte, Martin Kutz, Michael Kutz
Students from the Ohio State University Department of Dance:
Sophie Ann Clemmensen, Rachel Barker, Lauren Bedal
The Motion Lab facilitated research by exchange students from Germany who were interested in collaborating with technology-savvy dancers who could help test and evolve their Interactive Interface for Generative Music and Visualizations (iIgMV). The iIgMV is a modular system that generates music and visualizations from movements and gestures in real time. Because of its scalability, the students who created it hope that its applications could range from small public installations to professionally conceived live performances.

The system consists of algorithmically generated visualizations and musical compositions. The real-time interface between these components is provided by the movement and gesture detection of Microsoft's Kinect. The flexibility and compatibility of the system the students are creating enable many different potential visualizations (3D, 2D, illustrations, video embedding, collages) and musical genres (supporting all MIDI-compatible instruments, interfaces, and sound generators).
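To illustrate the kind of mapping such a system performs, here is a minimal sketch of turning a tracked joint position (as Kinect skeletal tracking provides) into a MIDI note-on message. The function names, value ranges, and note span are illustrative assumptions, not the students' actual code; any MIDI-compatible sound generator could consume the resulting bytes.

```python
def hand_to_midi(hand_y, y_min=0.0, y_max=1.0, low_note=48, high_note=84):
    """Map a normalized vertical hand position to a MIDI note number.

    Assumes hand_y is a normalized coordinate (0.0 = lowest tracked
    position, 1.0 = highest); the note span is an arbitrary choice.
    """
    # Clamp into the tracked range, then scale linearly onto the note span.
    t = min(max((hand_y - y_min) / (y_max - y_min), 0.0), 1.0)
    return round(low_note + t * (high_note - low_note))


def note_on_message(note, velocity=100, channel=0):
    """Build the raw 3-byte MIDI note-on message (status, note, velocity)."""
    # 0x90 is the MIDI note-on status byte; the low nibble is the channel.
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])
```

For example, a hand at mid-height (`hand_to_midi(0.5)`) lands in the middle of the note span; sending the resulting bytes to a MIDI output would trigger that pitch on the connected instrument. The gesture side of the iIgMV would feed continuously updated joint coordinates through mappings of roughly this shape.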