When Microsoft introduced the Kinect sensor in 2010, the company said the motion-capture system would transform gaming. That was only partially true; gamers could do novel things like swing an imaginary golf club or dance, but the Kinect wasn't sensitive enough to register intricate maneuvers. The system, however, has become most popular among hackers, who have used it to build smart shopping carts and gesture-controlled quadrocopters. In November, the company will launch an upgraded Kinect with the Xbox One console. With that release, Microsoft could finally disrupt gaming at the level it had originally intended, changing not only how we interact with games but also how games interact with us.
Successful videogames have one thing in common: immersion. When drawn in, players lose track of time, their pulse rises, and they become unaware of their surroundings; according to a recent study at University College London, they even have difficulty returning to reality. In short, their point of view shifts from the real world to the virtual one. But while it's easy to identify an immersive game (or a scene within a game) after the fact, developers have never had real-time feedback on a player's engagement.
With the new Kinect, reams of information will flow from the gamer, and that data will be granular enough to capture extremely subtle signals. A high-speed 1080p camera can detect minute movements, including eye blinks, wrist twists, and muscle flexes. By combining the camera's color feed with its active-infrared feed, the Kinect can also pick up fluctuations in a gamer's facial blood flow to estimate heart rate.
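The article doesn't describe Microsoft's actual algorithm, but the general technique (often called remote photoplethysmography) can be sketched simply: average the brightness of the face region in each frame, then look for the dominant frequency in the range of plausible pulse rates. The function name and inputs below are illustrative assumptions, not the Kinect SDK.

```python
import numpy as np

def estimate_heart_rate(brightness, fps):
    """Estimate heart rate (BPM) from per-frame face-region brightness.

    `brightness` is a 1-D sequence of the average pixel intensity over the
    face region in each frame; `fps` is the camera frame rate. The cardiac
    pulse shows up as a tiny periodic fluctuation in skin color, so we take
    the dominant frequency in the 0.7-4.0 Hz band (42-240 BPM).
    """
    signal = np.asarray(brightness, dtype=float)
    signal -= signal.mean()                     # remove the DC offset
    spectrum = np.abs(np.fft.rfft(signal))      # magnitude spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)      # plausible pulse range
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0                       # Hz -> beats per minute
```

In practice the raw signal is far noisier than this sketch assumes, which is why a combined color-plus-infrared feed helps: the infrared channel is less sensitive to ambient lighting changes.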
Developers could mine that data to change the way games unfold. Along with a player's skills—response time, shooting accuracy—his reactions could factor into gameplay. For example, the intensity of a game could ratchet up as a player leaned forward or his heart began to race. Games could even respond to facial expressions. Granted, precise emotions are hard to nail down (intense fear and intense joy both raise the heart rate). For that reason, applications may be basic at first—adjusting difficulty based on a player's posture, for instance.
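As a concrete (and entirely hypothetical) illustration of such a basic application, a game loop might nudge difficulty using only two coarse signals, heart rate relative to a resting baseline and whether the player is leaning forward:

```python
def adjust_difficulty(difficulty, resting_bpm, current_bpm, leaning_forward):
    """Toy adaptive-difficulty rule; not an actual Kinect or Xbox API.

    Ratchet intensity up when the player seems engaged (leaning forward
    with an elevated pulse) and ease off when they seem disengaged.
    """
    engaged = leaning_forward and current_bpm > 1.15 * resting_bpm
    relaxed = not leaning_forward and current_bpm < 1.05 * resting_bpm
    if engaged:
        return min(10, difficulty + 1)   # crank it up, capped at 10
    if relaxed:
        return max(1, difficulty - 1)    # back off, floored at 1
    return difficulty                    # ambiguous signals: hold steady
```

The thresholds here are arbitrary; the point is that even without decoding precise emotions, crude posture and pulse cues are enough to close a feedback loop between the player's body and the game.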
That probably won't be the case for long, as sensors become more powerful, affordable, and easily integrated into devices. Already, Israeli company Umoove has created compact head- and eye-tracking systems that could adjust a player's viewpoint based on head movements. And Irish start-up Galvanic has developed a prototype skin-conductivity sensor that can better correlate a player's stress level and in-game performance. Consoles with such heightened senses will allow for games that are progressively more immersive—and blur the once stark line between the real world and the virtual one.
This article originally appeared in the September 2013 issue of Popular Science.
>> Kinect can also pick up fluctuations in a gamer’s facial blood flow to estimate heart rate.
>> Developers could mine that data to change the way games unfold...
There's also a very useful, serious-game-like application for such a cost-effective controller in health care and in monitoring for senior citizens living alone (or anyone, really), if an amped-up version of the Kinect 2 were placed to cover rooms in households.
From the hard-science novel Memories with Maya:
..."Eventually he spoke again: “So, maybe the AI accesses the current frame, references time of day, and using a gestalt subroutine, figures something is not right with a person in a slumped position. The skeletal overlay could do that–”
I interrupted him. “Uh huh. Yeah, I'll leave the jargon to you. In essence what I mean is if the AI sees a person lying askew on the floor it can figure something's not right by accessing feedback from the cameras.
Facial expressions can easily be isolated from a stereo pair and perhaps pulse irregularity can be gleaned from image brightness, right?”
“Pulse reading from image brightness would depend on the fidelity of the image, but facial expressions, yes,” he said. “And if someone's collapsed on the floor, chances are there will be a skeletal mismatch with the superimposed human IK rig.”
You said: "The system, however, has become most popular among hackers, who used it to build smart shopping carts and gesture-controlled quadrocopters." Could you please link me to such "smart shopping cart" applications you know of?