The boss is enraging at 7% health and you're locked on target, hunched over your keyboard in a white-knuckled frenzy to squeeze every last drop of DPS from your avatar. Finally, the beast succumbs to your assault, and you sit back, exquisitely aware of the tension crumpling your neck and shoulders and radiating into your fingertips. As you pull in a deep, shuddering breath of relief, you wonder if perhaps it might be more natural to simply stand in front of your screen and show the computer, using gestures similar to those of your character, what to do.
Now, you can.
Dr. Skip Rizzo, associate director at the University of Southern California's Institute for Creative Technologies, heads a research project that's applying the same kind of technology used in the Xbox Kinect to World of Warcraft. The aim of the project, however, is not so much to turn games like WoW into virtual tarantellas of movement and gesture but to make games more accessible to disabled players and to open new avenues for rehabilitation, therapy and even education. The project's Flexible Action and Articulated Skeleton Toolkit (FAAST) middleware integrates full-body control with games and virtual reality applications, using tools like the PrimeSensor and the Kinect on the OpenNI framework.
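The core trick is simpler than it sounds: FAAST watches the tracked skeleton and emulates ordinary keyboard input when the body crosses a configured threshold, so the game itself needs no modification. Here is a minimal sketch of that idea in Python; the joint names, axes, thresholds and key bindings are hypothetical illustrations, not FAAST's actual configuration format:

```python
# Sketch of FAAST-style pose-to-key mapping (hypothetical rules, not
# FAAST's real config syntax). Each rule reads one joint's offset from
# a calibrated rest pose and fires a key while the threshold is crossed.

# (joint, axis, comparison, threshold in cm, key to emit)
RULES = [
    ("left_hand",  "y", ">",  20, "a"),   # raise left hand -> strafe left
    ("right_hand", "y", ">",  20, "d"),   # raise right hand -> strafe right
    ("torso",      "z", "<", -15, "w"),   # lean forward -> walk
]

def active_keys(skeleton):
    """Return the set of keys whose gesture rules currently fire.

    `skeleton` maps joint names to per-axis offsets in cm, e.g.
    {"left_hand": {"y": 30.0}}. Missing joints default to 0.
    """
    keys = set()
    for joint, axis, op, threshold, key in RULES:
        value = skeleton.get(joint, {}).get(axis, 0.0)
        fired = value > threshold if op == ">" else value < threshold
        if fired:
            keys.add(key)
    return keys
```

For example, a player leaning forward with the left hand raised (`{"left_hand": {"y": 30.0}, "torso": {"z": -20.0}}`) would hold down both `a` and `w`, just as if those keys were pressed on a keyboard; a real middleware layer would then inject those key events into the operating system's input queue.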