It is amazing what you can achieve with a class of advanced interaction design students, fuelled by 12 pizzas and a couple of toys (Wii Bluetooth remote controllers). In our class, students design spatial interaction projects that can take the form of an art installation, an informative sonification/visualisation, or an augmented hyper-instrument (a gestural performance interface).

We look at a variety of input technologies used with Max/MSP: iSight webcams for motion/colour/outline/centre tracking with Pelletier's computer vision externals, the Gypsy MIDI suit, the La Kitchen Kroonde wireless radio-frequency receiver, and motion sensors (gyroscopic, accelerometer, binary). These can in turn be applied to other physical objects or embedded in spaces to radically alter the interaction context. An interesting example designed by a skateboarder is 'the real simon', which uses something similar to our RF or WiFi wireless sensors to transmit motion data from the skateboard to sound controllers. Better-known examples include work by IRCAM and NIME designers in which dancers intricately control soundscapes for live music performance.

Of course, the technologies themselves are not the goal, though they can influence the design of the interaction. Translating spatial experiences into an abstract or distinctive mode of interaction, differentiated from conventional screen-based, mouse-driven intervention, is something students still find challenging to fully achieve. It is tempting, for instance, to use a Wii remote or a coloured glove as a pointer. One of our ambitions is to encourage students to really engage with the physical and spatial freedom afforded by unwired, large-scale modes of interaction. Spatial interaction, generative design and digital experiences of engagement distinguish this design 'problem' from more conventional interfaces and contexts of digital design.
Many sources of inspiration, and background against which to distinguish one's work, are posted by online communities: YouTube videos, musical performances, and open-source projects, e.g. Create Digital Music's use of the Wii as a sound controller.
I have been experimenting with the Wii remote to manipulate attributes of the IRCAM SOGS granular synthesis and loop processing examples, building on Cycling '74 Max examples and Masayuki Akamatsu's Universal Binary objects. It remains difficult to extract mono-dimensional data streams from the Wii because it is extremely sensitive, and it is therefore difficult to tilt or rotate it about one axis (pitch, roll or yaw) without simultaneously affecting the others. The direction buttons and other controller buttons allow for an extra set of discrete controls, e.g. the A and B buttons to load and sustain sounds, and the arrow buttons to move through sample loops.
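One common way to tame the cross-axis bleed described above is to smooth each accelerometer stream and ignore small fluctuations before mapping it to a synthesis parameter. The sketch below is a minimal illustration of that idea in Python, not the actual Max patch: the function name, the filter coefficient and the dead-zone threshold are all hypothetical choices, and in Max the same effect would typically be built from objects like a slide or line filter.

```python
# Hypothetical sketch: isolating one axis of a noisy accelerometer stream
# with an exponential moving average (low-pass) plus a dead zone, so small
# cross-axis wobble does not leak into the mapped synthesis parameter.

def make_axis_filter(alpha=0.2, deadzone=0.02):
    """Return a stateful per-axis filter.

    alpha    -- smoothing factor (0 < alpha <= 1); smaller = heavier smoothing
    deadzone -- values whose smoothed magnitude falls below this are zeroed
    """
    state = {"y": 0.0}

    def step(x):
        # Exponential moving average: damp high-frequency jitter.
        state["y"] += alpha * (x - state["y"])
        # Dead zone: suppress residual wobble near rest position.
        return 0.0 if abs(state["y"]) < deadzone else state["y"]

    return step

# Example: run a raw (normalised) tilt stream through the filter.
roll = make_axis_filter(alpha=0.2, deadzone=0.02)
samples = [0.0, 0.5, 0.6, 0.55, 0.02, 0.01]
smoothed = [roll(s) for s in samples]
```

Each axis gets its own filter instance, so a deliberate roll gesture passes through while incidental pitch and yaw jitter on the other axes is flattened to zero by their dead zones.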