“The goal of any game design is to immerse users in a digital world,” says Jeff Bellinghausen, CTO of technology firm Sixense, which is partnering with peripherals manufacturer Razer to bring next-generation motion-sensing to desktop computers. “Interaction is what really engages players … and a true one-to-one motion-tracking system can simplify design in important ways. However, gameplay design for these systems also becomes much more physical. Designers who are more experienced with toy design or mechanical engineering will find that their experience maps directly to the space.”
One benefit (and challenge) of motion gaming is the idea of making a player’s body the controller. “The range of movement human bodies can manage is huge compared to traditional gamepads,” says Mike Nichols, head of Softkinetic Studios, which designs games that use a proprietary 3D real-time camera-tracking interface. “But full-body reaction time is also slower than thumbs alone, and available physical space and field of view can present limitations.”
The Technology Behind the Movements
Even though there are hurdles, there’s also growing incentive for game developers to surmount them.
Take Razer and Sixense’s new dual-remote-control-like device, which noticeably enhances play in first-person shooters such as Left 4 Dead 2, letting fans hurl grenades or hack up zombies with laser-like precision. By employing magnetic fields, it can track movements to within 1 millimeter of actual physical position and 1 degree of spatial orientation. Compatible with physics engines such as Havok and PhysX, and middleware platforms like Unity and Unreal, it offers an edge in real-time strategy games, run-’n’-gun outings, and titles that rely on pixel-perfect accuracy.
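To make that precision concrete, here is a minimal sketch of how a game might consume one frame of six-degree-of-freedom tracking data and map it onto an in-game hand. The TrackerPose struct, the pollTrackerPose call, and the scale and offset values are all hypothetical stand-ins for illustration, not Sixense’s or Razer’s actual SDK.

```cpp
// Minimal sketch: read one frame of 6DOF tracking data and map it onto
// an in-game "hand". All API names and values here are hypothetical.
#include <cstdio>

struct Vec3  { float x, y, z; };          // position, in millimeters
struct Euler { float yaw, pitch, roll; }; // orientation, in degrees

struct TrackerPose {
    Vec3  position;     // reported to roughly 1 mm resolution
    Euler orientation;  // reported to roughly 1 degree resolution
};

// Stand-in for a real SDK call that would return the latest controller pose.
TrackerPose pollTrackerPose() {
    return TrackerPose{{120.0f, 950.0f, -300.0f}, {15.0f, -5.0f, 0.0f}};
}

// Convert tracker millimeters into the game's world space (meters), with a
// simple offset so the player's desk maps onto the avatar's reach.
Vec3 toWorldSpace(const Vec3& mm, float scale, const Vec3& offset) {
    return Vec3{mm.x * scale + offset.x,
                mm.y * scale + offset.y,
                mm.z * scale + offset.z};
}

int main() {
    const float mmToMeters = 0.001f;
    const Vec3  avatarShoulder{0.0f, 1.5f, 0.0f};

    TrackerPose pose = pollTrackerPose();   // one frame's sample
    Vec3 hand = toWorldSpace(pose.position, mmToMeters, avatarShoulder);

    std::printf("hand at (%.3f, %.3f, %.3f) m, yaw %.1f deg\n",
                hand.x, hand.y, hand.z, pose.orientation.yaw);
    return 0;
}
```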
Likewise, PlayStation Move uses an accelerometer- and gyroscope-equipped controller to track motion and orientation, plus a camera that follows the controller’s glowing sphere to pinpoint its position. The technology turns swipes into Frisbee throws in Sports Champions, and jabs into punches in The Fight: Lights Out. Kinect goes a step further by using a built-in infrared projector and depth sensor to create a 3D representation of the play environment, as well as to detect users and objects. Players are then assigned virtual skeletons, which are used to track each joint’s movement, velocity, direction and position within the playfield, eliminating the need for handheld hardware entirely.
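As a rough illustration of how that joint tracking translates into numbers a designer can use, the sketch below estimates each joint’s speed and direction of travel from two consecutive frames of joint positions. The toy three-joint skeleton, the frame rate, and the sample coordinates are assumptions for the example, not output from any real Kinect SDK.

```cpp
// Rough sketch: estimate per-joint speed and movement direction from two
// consecutive frames of tracked joint positions. The three-joint skeleton
// and the sample data are invented for illustration.
#include <array>
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

constexpr int kNumJoints = 3;   // toy skeleton: head, left hand, right hand
const char* kJointNames[kNumJoints] = {"head", "left_hand", "right_hand"};

using Skeleton = std::array<Vec3, kNumJoints>;  // joint positions, in meters

int main() {
    // Two consecutive depth-sensor frames, assumed 1/30 of a second apart.
    Skeleton previous = {{{0.0f, 1.7f, 2.0f}, {-0.3f, 1.0f, 1.9f}, {0.3f, 1.0f, 1.9f}}};
    Skeleton current  = {{{0.0f, 1.7f, 2.0f}, {-0.3f, 1.2f, 1.8f}, {0.3f, 1.0f, 1.9f}}};
    const float dt = 1.0f / 30.0f;

    for (int i = 0; i < kNumJoints; ++i) {
        // Frame-to-frame displacement gives the direction of travel;
        // dividing its length by the frame time gives speed.
        Vec3 d{current[i].x - previous[i].x,
               current[i].y - previous[i].y,
               current[i].z - previous[i].z};
        float speed = std::sqrt(d.x * d.x + d.y * d.y + d.z * d.z) / dt;
        std::printf("%s: speed %.2f m/s, displacement (%.2f, %.2f, %.2f) m\n",
                    kJointNames[i], speed, d.x, d.y, d.z);
    }
    return 0;
}
```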
Challenges of Moving from 2D to 3D Motion Design
Whichever platform game developers opt to work with, making the transition from 2D to 3D motion-tracking design can present significant obstacles.
“You’d be hard-pressed to find anyone on our team who hasn’t needed to learn new skills, or evolve old ones, to design for motion controls,” says Senior Designer Dean Tate of Dance Central. “Challenges include the need to find ways to prove out design theories and prototype gameplay when a title’s underlying technology is still massively in flux.”
Motion-gaming development can also involve more physical work than traditional games. “Typically designers, animators and developers work closely together, but sit behind computer screens,” says Takashi Yamaguchi, producer of snowboarding-inspired outing Adrenalin Misfits, which, like Dance Central, utilizes Microsoft’s gesture-tracking Kinect. “We spent more time getting people to try our game out. Creating motion-based games is less about coding and more about understanding the player. Rather than resort to motion capture, we spent painstaking hours watching people enjoying early builds to see how they moved. Honestly, we just had the production and QA teams play over and over until they collapsed.”
Movements That Substitute for Button Pressing
Another challenge of motion-gaming development is building a range of simple movements that can do the same job as button presses, while still accommodating players of every shape and size. Rapid prototyping is also more vital in 3D motion development, given the need to create intuitive experiences that physically challenge users without exhausting them. As a result, programmers must act more like traditional product designers when working with these systems. And because players expect instant reactions, animators have to rely on procedurally generated animation rather than canned sequences.
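As a concrete example of a movement standing in for a button, the sketch below turns a forward push of the tracked hand into a one-shot trigger, with a re-arm threshold so the gesture does not fire repeatedly. The distance thresholds and the per-frame samples are invented for illustration and would need tuning against players of different sizes and reaches.

```cpp
// Minimal sketch of a gesture standing in for a button: a forward "push" of
// the tracked hand past a distance threshold fires the same action a button
// press would. Thresholds and sample data are made up for illustration.
#include <cstdio>
#include <vector>

class PushGesture {
public:
    // fireAt/resetAt are distances (meters) of the hand in front of the torso.
    PushGesture(float fireAt, float resetAt) : fireAt_(fireAt), resetAt_(resetAt) {}

    // Returns true exactly once per push, like a single button press.
    bool update(float handForwardOffset) {
        if (!armed_ && handForwardOffset < resetAt_) armed_ = true;  // hand pulled back: re-arm
        if (armed_ && handForwardOffset > fireAt_) { armed_ = false; return true; }
        return false;
    }

private:
    float fireAt_;
    float resetAt_;
    bool  armed_ = true;
};

int main() {
    PushGesture attack(0.45f, 0.25f);   // fire at 45 cm reach, re-arm under 25 cm

    // Simulated per-frame hand offsets: one full push, a retract, then another push.
    std::vector<float> samples = {0.1f, 0.2f, 0.3f, 0.5f, 0.6f, 0.4f, 0.2f, 0.5f};
    for (float s : samples) {
        if (attack.update(s)) std::printf("push detected at offset %.2f m -> trigger action\n", s);
    }
    return 0;
}
```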
“The biggest difference from 2D controls is that after nearly 40 years, we have a well-established language for what buttons and joysticks do,” explains Hardley Baldwin White-Wiedow, design director for PlayStation Move Heroes. “But with motion controls, every time someone adds new functionality, expectations are up in the air again.”
Accessibility, simplicity and user-friendliness are vital to solving these dilemmas, says Frederic Blais, director of technology for Your Shape Fitness Evolved. At the same time, he adds, producers must consider entirely new design factors, such as the play space available in users’ living rooms.
Fortunately, insiders largely concur that the growing pains of motion gaming will soon be worked out. “Motion-controlled gaming is here to stay, and it will quickly become a new central component of interactive entertainment,” says Kinectimals executive producer Jorg Neumann. “In a few years, we’ll look back on gaming without body movement and think of it as old and limited.”