A look at collaborative game mechanics, relevant for the design of serious games.
Formula D interactive turns 10 this year, and to celebrate we’re creating an interactive dancefloor experience for guests to play with at our 10-year birthday celebration. Naturally, we’ve been on the hunt for inspiration, and rather serendipitously, Google released an incredible Doodle for Oskar Fischinger’s 117th birthday.
Oskar Fischinger, Kinect and the Interactive Dancefloor
Oskar Fischinger was a German-American filmmaker, painter and animator, and a pioneer in the realm of motion graphics and animation, remarkably, decades before the existence of computer-aided graphical manipulation. Fischinger was famous for creating, by hand, beautiful abstract animations synchronized to music. The Doodle paid homage to him by allowing users to compose their own piece of music using a simple music-sequencer-like interface.
Users could click on diamonds in a grid to enable notes, and a 16-beat score would play back in real time. Additionally, the instrument, tempo and key could be changed via UI panels. From these very simple parameters, a wonderfully engaging and often surprisingly beautiful piece of music emerges.
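To make that mechanic concrete, here is a minimal standalone sketch of the same idea, a grid of on/off cells stepped through by a metronome, where each enabled cell in the current column contributes a note. This is plain C++ for illustration only, not the Doodle’s actual source; the row count and the simple chromatic row-to-pitch mapping are assumptions.

```cpp
// Minimal sketch of a Doodle-style step sequencer (illustrative, not the Doodle's code).
#include <array>
#include <iostream>
#include <vector>

constexpr int kSteps = 16;  // 16-beat score, as in the Doodle
constexpr int kRows  = 8;   // assumed number of pitches per step

struct Sequencer {
    // cells[row][step] == true means that note is enabled at that step
    std::array<std::array<bool, kSteps>, kRows> cells{};

    // Toggle a cell, as a mouse click on a diamond would
    void toggle(int row, int step) { cells[row][step] = !cells[row][step]; }

    // Return the notes to trigger when the metronome reaches `step`
    std::vector<int> notesAt(int step, int rootNote = 60) const {
        std::vector<int> notes;
        for (int row = 0; row < kRows; ++row)
            if (cells[row][step]) notes.push_back(rootNote + row);  // simple chromatic mapping
        return notes;
    }
};

int main() {
    Sequencer seq;
    seq.toggle(0, 0);  // enable two cells by way of example
    seq.toggle(4, 8);
    for (int step = 0; step < kSteps; ++step)
        for (int note : seq.notesAt(step))
            std::cout << "step " << step << " -> note " << note << "\n";
}
```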
This Doodle was a good starting point, and I set to work applying a few simple rules and structures inspired by it to experiment with generative music influenced by dancers in real time.
As with all good forms of inspiration, the evolution and repurposing of this piece for our interactive dance floor seemed to flow very naturally: we would project a grid similar to the diamond one onto the floor, and users would enable notes with their bodies instead of mouse clicks. Gestures would change instruments, and further down the line other tracking data could be mapped to additional variables.
The system will leverage the robust Kinect tracking system we created for River of Grass, a large-scale audiovisual interactive installation about the Everglades’ ecosystem at the Frost Museum of Science in Miami, Florida.
Our tracking software draws data from seven networked Kinect motion-tracking sensors and captures, processes and communicates the presence and location of bodies, feet, hands, gestures and physical props across the entire interactive area.
Our software was created in a modular, extensible and generic manner, so that one can dynamically configure the number of Kinect sensors, the tracking data we are interested in, and how that data is applied. This makes it easy to repurpose the system for our interactive dance floor, where we will use body and gesture tracking in the first iteration, and possibly more in later iterations.
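As a rough illustration of what “dynamically configurable” means here, the sketch below treats the sensor list and the tracked features as plain data, so pointing the system at a new installation changes configuration rather than code. This is a hypothetical simplification, not our production software; the host/port fields and feature names are assumptions for the example.

```cpp
// Hypothetical sketch of a data-driven tracker configuration (not production code).
#include <iostream>
#include <string>
#include <vector>

struct SensorConfig {
    std::string host;  // network address of the machine driving this Kinect
    int port;          // port its tracking data is published on
};

struct TrackingConfig {
    std::vector<SensorConfig> sensors;   // sensor count is just the length of this list
    std::vector<std::string> features;   // e.g. "bodies", "hands", "gestures"
};

int main() {
    TrackingConfig cfg{
        {{"192.168.0.11", 9001}, {"192.168.0.12", 9001}},  // two of up to seven sensors
        {"bodies", "gestures"}                              // first dance-floor iteration
    };
    std::cout << "tracking " << cfg.sensors.size() << " sensors, "
              << cfg.features.size() << " feature streams\n";
}
```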
The dance floor application is built in openFrameworks and receives tracking data from the Kinect tracking application. The visuals consist of a grid of randomly coloured blocks, plus a moving red rectangle that represents the metronome. Each column in the grid is mapped to a set of instruments that we set up in Ableton Live, and each row within a column is mapped to a different note. When a person is detected standing in a particular block for longer than 5 seconds, that block is enabled (in the same way a user would enable a diamond with a mouse click in the Doodle), and when the metronome passes an enabled block, a message is sent from our app to Ableton to trigger the corresponding instrument and note. This requires participants to work together, arranging themselves in a particular configuration on the grid to generate the music.
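The sketch below is a simplified, standalone approximation of that logic, not the actual openFrameworks app: a cell becomes enabled once a body has dwelt on it for five seconds, and the metronome fires the enabled cells of whichever column it is passing. The function sendNoteToAbleton is a hypothetical stand-in for whatever actually bridges to Ableton (for example MIDI or OSC), and the grid dimensions are placeholders.

```cpp
// Simplified standalone sketch of the dance-floor dwell-and-trigger logic.
#include <algorithm>
#include <chrono>
#include <iostream>
#include <optional>
#include <utility>
#include <vector>

using Clock = std::chrono::steady_clock;

constexpr int    kCols         = 16;   // columns -> instruments / beats (placeholder)
constexpr int    kRows         = 8;    // rows within a column -> notes (placeholder)
constexpr double kDwellSeconds = 5.0;  // stand here this long to enable a block

struct Cell {
    bool enabled = false;
    std::optional<Clock::time_point> occupiedSince;  // when a body entered the block
};

// Hypothetical stand-in for the bridge to Ableton (e.g. a MIDI note or OSC message).
void sendNoteToAbleton(int column, int row) {
    std::cout << "trigger instrument " << column << ", note " << row << "\n";
}

struct DanceFloor {
    std::vector<std::vector<Cell>> grid =
        std::vector<std::vector<Cell>>(kCols, std::vector<Cell>(kRows));

    // Called per frame with the (column, row) blocks currently occupied by tracked bodies.
    void update(const std::vector<std::pair<int, int>>& occupied) {
        auto now = Clock::now();
        for (int c = 0; c < kCols; ++c) {
            for (int r = 0; r < kRows; ++r) {
                Cell& cell = grid[c][r];
                bool isOccupied = std::find(occupied.begin(), occupied.end(),
                                            std::make_pair(c, r)) != occupied.end();
                if (!isOccupied) { cell.occupiedSince.reset(); continue; }
                if (!cell.occupiedSince) cell.occupiedSince = now;
                double dwell =
                    std::chrono::duration<double>(now - *cell.occupiedSince).count();
                if (dwell >= kDwellSeconds) cell.enabled = true;  // block stays on once earned
            }
        }
    }

    // Called when the metronome rectangle reaches `column`.
    void onMetronome(int column) {
        for (int r = 0; r < kRows; ++r)
            if (grid[column][r].enabled) sendNoteToAbleton(column, r);
    }
};

int main() {
    DanceFloor floor;
    floor.update({{3, 2}});  // a body steps onto block (column 3, row 2)
    floor.onMetronome(3);    // nothing plays yet: the 5-second dwell has not elapsed
}
```

In the real application the update/metronome pair would be driven by the openFrameworks frame loop and by the tracking data arriving from the Kinect system.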
We set up a demo in our boardroom and tested what the interaction felt like. It was satisfying to try out different arrangements with a group of people and to have to work together to produce a decent-sounding piece of music. The possibility of composing something as a team and then performing that standing arrangement/composition on the night of the party was exciting, but we immediately recognised the need to add more gesture and free-movement-based interaction to the experience. Interactive Dance Floor version 2.0 will definitely explore modulating sounds with arm gestures and large free movements across the dancefloor, possibly transforming the tightly structured grid into something more organic, with larger, less rigid regions of interaction. Watch this space for IDF 2.0!