

Put the stereotypes out of your mind. Forget the zits, the Cheetos, the smell of too much time on a couch with the curtains drawn. Today's videogames draw on sophisticated science like biomechanics, fluid dynamics and computational geometry to be lifelike and exciting. Here are the 10 greatest challenges of making them. See for yourself: it's virtual reality, but it's real work.


1. Processing Power

*Like re-creating the ceiling of the Sistine Chapel with a couple of magic markers*

**Problem:** If a computer can't keep up with the instructions the game issues, the image stutters, ruining the experience. That threshold is the number-one headache of making games. It compounds every challenge listed here, from building artificial intelligence to obeying physics.

**Status:** Although games now take advantage of "multi-core processing," the simultaneous use of multiple central-processing or graphics-processing units, programmers are always seeing their fantasies throttled by the need to budget processing capacity. (The PlayStation 3's Cell chip contains eight 3.2-gigahertz processor cores, but few designers yet know how to program for them.) The budgetary mindset is so entrenched that programmers use the language of accountants ("ugh, this rain's too expensive") to describe the problem.

**What's next:** Moore's law, which states that the number of transistors on a chip doubles roughly every two years, means that there will always be more processing power available. (Nvidia, a leading manufacturer of graphics chips, claims it has managed to do Moore one better, doubling its chips' processing power in less than a year.) But programmers' ambitions will always outstrip the hardware because, as one designer puts it, "the more we can do, the more excited we get, and the more we want to do." With every jump in CPU and GPU capacity, that frustration only mounts. -Jacob Ward

*Image: The PS3's Cell processor*
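To see what that accountant's mindset looks like in practice, here is a minimal sketch of a per-frame time budget in Python. The function names, the 60-frames-per-second target, and the 10 percent safety margin are all invented for illustration; a real engine budgets at far finer grain.

```python
import time

FRAME_BUDGET = 1.0 / 60.0  # 60 frames per second leaves ~16.7 ms per frame

def run_frame(update_world, draw_scene, optional_effects):
    """Run one frame, dropping optional effects once the budget is blown."""
    start = time.perf_counter()
    update_world()                      # simulation: input, AI, physics
    draw_scene()                        # mandatory rendering work
    spent = time.perf_counter() - start
    for effect in optional_effects:     # rain, extra particles, etc.
        if spent > FRAME_BUDGET * 0.9:  # keep a 10% safety margin
            break                       # "ugh, this rain's too expensive"
        effect()
        spent = time.perf_counter() - start
```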

2. Water

*Like painting the sea, while it's moving*

**Problem:** Whole careers in mathematics have been built on calculating the tiniest movements of less than a square millimeter of fluid. Videogames have to show an entire raging ocean's worth of the stuff. "Less than a year ago, there wasn't enough processing power to dynamically generate the movement of water in games," says Lee Bamber, a programmer for 20 years and founder of The Game Creators, Ltd.

**Status:** "Viscosity is the difficulty," says Ron Fedkiw, an associate professor of computer science at Stanford University who worked on the effects for movies like *Star Wars: Episode III - Revenge of the Sith* and *Transformers* and now works with Industrial Light & Magic. "High viscosity," as with solid objects, "is easy to do. Clay, with a slightly lower viscosity, is harder, and water is harder still." The math is there, Fedkiw says, but it takes supercomputers to pull it off.

**What's next:** Game developers are experimenting with particle systems, in which groups of particles respond to events while observing certain rules. And slowly, turbulence modeling (the computational-physics equivalent of a ballpark estimate), along with improved processors and algorithms, is beginning to allow for even more lifelike splashes, bubbles and waves. -J.W.

*Image: Riding atop a dragon in Lair is made more thrilling by realistic water, a major mathematical challenge.*
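Here is a toy version of the particle-system idea: a thousand droplets that each obey two simple rules, gravity and a damped bounce. Every constant and the droplet count are illustrative, not drawn from any shipping engine.

```python
import random

GRAVITY = -9.8   # m/s^2
DAMPING = 0.5    # fraction of speed kept after hitting the "floor"

class Droplet:
    def __init__(self):
        self.y = random.uniform(1.0, 2.0)    # height above the floor, meters
        self.vy = random.uniform(-1.0, 0.0)  # vertical velocity, m/s

    def step(self, dt):
        # Rule 1: every particle accelerates under gravity.
        self.vy += GRAVITY * dt
        self.y += self.vy * dt
        # Rule 2: particles bounce off the floor, losing energy.
        if self.y < 0.0:
            self.y = 0.0
            self.vy = -self.vy * DAMPING

splash = [Droplet() for _ in range(1000)]
for _ in range(100):                # simulate ~1.6 seconds at 60 steps/s
    for p in splash:
        p.step(1.0 / 60.0)
```

Add inter-particle rules (pressure, surface tension) and the same loop starts to behave like fluid, which is exactly where the processing budget runs out.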

3. Human Faces

*Like trying to impersonate a living person using a finger puppet*

**Problem:** Decoding and re-creating the subtlest and most familiar aspect of human expression has dogged artists and scientists for centuries, and now game makers, seeking to make believable human characters, are stuck trying to improve on those efforts. If a character's face is too close to human, players will reject it, a psychological phenomenon known as the "uncanny valley": objects more familiar to the human eye are inspected with greater scrutiny, leading to a drop-off in acceptance as the simulated object nears the point of being lifelike. "Things look creepy as they get too real," explains Bill Van Buren, a producer for Valve's *Half-Life 2*. "You really notice each little detail that's off."

**Status:** To generate rules for the endless combinations of expression and emotion that game characters need to show, the Valve developers relied on a taxonomy of 60 basic facial actions defined by psychologists Paul Ekman and Wallace V. Friesen to study anatomy and human communication. The proper gaze, it turns out, is important for a natural-looking face. "If the eyes are wrong, the characters look dead," says Valve senior software engineer Ken Birdwell. He spent a year studying the anatomy of the eyeball, learning things like how the corneal bulge affects light reflection and how characters appear cross-eyed if their pupils aren't set four degrees off-center.

**What's next:** Crossing the uncanny valley is hard work. Programmers are working to code variety into human expression and to add the sheen of skin stretched across facial muscles. Of course, as game makers delve deeper into recognition psychology and human-robot research, processor speed will have to catch up to what they learn. -Doug Cantor

*Image: The faces in BioWare's Mass Effect are so realistic, they skirt the "uncanny valley."*
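A rough sketch of the pupil trick Birdwell describes: compute each eye's line of sight to a target, then nudge the pupil a fixed four degrees so the converged eyes don't read as cross-eyed. The axis and sign conventions here are our guesses; Valve's actual eye model is far more involved.

```python
import math

PUPIL_OFFSET_DEG = 4.0  # the off-center pupil setting the article cites

def eye_yaw(eye_x, eye_z, target_x, target_z, is_left_eye):
    """Yaw (degrees) pointing one eye at a target on the ground plane."""
    # Straight line of sight from this eye to the target.
    yaw = math.degrees(math.atan2(target_x - eye_x, target_z - eye_z))
    # Nudge each pupil slightly outward to counter the cross-eyed look
    # (the sign/axis convention is an assumption for this sketch).
    offset = -PUPIL_OFFSET_DEG if is_left_eye else PUPIL_OFFSET_DEG
    return yaw + offset

# Eyes 6 cm apart, looking at a point 2 m straight ahead.
print(eye_yaw(-0.03, 0.0, 0.0, 2.0, is_left_eye=True))   # ~ -3.1 degrees
print(eye_yaw(+0.03, 0.0, 0.0, 2.0, is_left_eye=False))  # ~ +3.1 degrees
```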

4. Artificial Intelligence

*Like teaching 1,000 kids to think for themselves overnight*

**Problem:** Once upon a time, the bad guys in videogames wandered around mindlessly, shooting at you while they waited to die. That doesn't cut it anymore. Players demand sophisticated enemies to fight and reliable in-game allies with which to fight them. Thing is, it's freaking complicated, and it eats up processor speed. "We're faking just enough smarts to make it work," says Mathieu Mazerole, lead engineer on Ubisoft's *Assassin's Creed*.

**Status:** Imbuing characters in a game with lifelike decision-making ability involves employing the kind of high-level logic theories (learning decision trees, mobile navigation, finite-state machine models) used by top robotics engineers. In games like *Assassin's Creed*, such calculations cause your pursuers to form squadrons, scale buildings, and dash across rooftops to get you. "Sometimes they find paths we've never thought of," says Mark Bresner, the game's lead AI programmer. "You really feel threatened." It takes particular skill to make the AI seem unskilled. "Missing a target believably is really hard," says game programmer Neil Johnson, of the World War II game *Brothers in Arms: Hell's Highway*. "They automatically shoot straight along the sight line, deadly accurate every time, so we have to adjust for dramatic spurts of fire at the player's feet, things like that."

**What's next:** The next level of AI, Mazerole predicts, is dynamic, independent interactions between characters in a game. "Two of them move and bump into one another, and the two happen to be aggressive, and they begin to fight. That's when we'll have turned a corner." -J.W.

*Image: In Assassin's Creed, passersby act independently, lashing out or shrinking away when provoked.*
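For a sense of the machinery, here is a bare-bones finite-state machine of the kind the article mentions, plus the deliberate aim error Johnson describes. The states, distance thresholds, and spread values are made up for illustration; shipping games layer navigation and planning on top of this skeleton.

```python
import random

class GuardAI:
    """Tiny finite-state machine: patrol -> chase -> attack."""

    def __init__(self):
        self.state = "patrol"

    def update(self, dist_to_player):
        if self.state == "patrol" and dist_to_player < 30.0:
            self.state = "chase"
        elif self.state == "chase":
            if dist_to_player < 10.0:
                self.state = "attack"
            elif dist_to_player > 50.0:
                self.state = "patrol"   # lost the player, give up
        elif self.state == "attack" and dist_to_player >= 10.0:
            self.state = "chase"

    def aim(self, target_x, target_y):
        # Perfect aim looks robotic. Inject spread, biased low, so
        # early shots kick up dirt at the player's feet instead.
        spread = 2.0  # meters of deliberate error
        return (target_x + random.uniform(-spread, spread),
                target_y + random.uniform(-spread, 0.0))
```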

5. Light and Shadows

*Like trying to reenact the first few verses of Genesis*

**Problem:** The jungle is not for the faint of heart, especially if you're a game developer. Creating miles of playspace covered by dense vegetation, Crytek, the company behind the first-person shooter *Crysis*, had to add realistic light to a nearly infinite array of moving surfaces. And unlike movie animators, who can labor interminably over a single frame and leave the computer to render it overnight, the Crysis team had to keep the design processor-friendly enough to be rendered in real time. "Usually," says Crytek CEO Cevat Yerli, "photorealism and productivity are mutually exclusive."

**Status:** The solution is Polybump 2, a Crytek shortcut that turns a complex surface into relatively few polygons. The developers also came up with new technologies to make the player's view more true-to-life. "We can simulate eye behavior based on how bright a light is," Yerli says. Walk around in the dark, and your onscreen eye adapts, revealing more features of your surroundings. But when you emerge, your avatar's vision blanks out, blinded by the onscreen sunlight. That's usually when the gunfire begins.

**What's next:** Crytek has made better visual effects possible, but there's a limit to the visual magic any processor can handle. As processor speed improves, lighting effects will filter through ice, rain and falling objects across enormous vistas. -D.C.

*Image: Combat adds complexity to games like Crysis, but it's light and shadow that take time and manpower.*
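The eye-adaptation effect Yerli describes boils down to exposure chasing scene brightness over time. A minimal sketch, where the luminance-to-exposure mapping and the adaptation speed are invented for illustration rather than taken from Crytek's engine:

```python
def adapt_exposure(exposure, scene_luminance, dt, speed=1.5):
    """Move the virtual eye's exposure toward the scene brightness.

    Exponential smoothing: in the dark, exposure slowly rises and
    detail emerges; step into sunlight and the frame blows out
    until the eye catches up.
    """
    target = 1.0 / max(scene_luminance, 1e-4)  # brighter scene -> lower exposure
    return exposure + (target - exposure) * min(speed * dt, 1.0)

exposure = 1.0                      # dark-adapted eye steps into sunlight
for frame in range(240):            # four seconds at 60 fps
    exposure = adapt_exposure(exposure, scene_luminance=8.0, dt=1 / 60)
print(round(exposure, 3))           # ~0.127, close to the target 1/8
```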

6. Fire

*Like holding air in your hands*

**Problem:** Playing with fire is no joke in videogames. Programming its behavior is just like programming the behavior of water, except that a) fire is a faster-moving and more intricate element, and b) it has to be able to consume other objects. In the past, games used pat animations to represent flame, but modern games deal in vorticity and viscosity to handle dynamic smoke and heat.

**Status:** Ask a programmer about fire, and talk inevitably turns to hardware, a sure sign that a programmer's up against something hard. Currently, fire tends to be brief and explosive; raging infernos are too hard to program. "We just don't have the processing power to really do its physics accurately," says The Game Creators' Lee Bamber. He mentions that graphics-processing units have, in recent years, begun taking over from CPUs when it comes to crunching the calculations for elements like fire. He points to Nvidia's Quad SLI, a graphics system that ropes together four standalone GPUs, as an example of the kind of hardware improvement that fire has forced videogame developers to engineer.

**What's next:** Stanford's Ron Fedkiw has created algorithms related to vorticity and viscosity that allow for many kinds of fire: smoky gas fires, burning paper, even oil on water. But researchers don't face the processor problems that game programmers do. "We'll have great visuals in the next five years," Bamber says, "but it'll take longer to get fire right." -J.W.

*Image: Fire, in games like Dark Sector, is in high demand, yet it's the most difficult effect to make physically realistic.*
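In practice, real-time games often fall back on a cheap particle approximation of flame rather than solving the full fluid physics. Here is a toy version (buoyancy plus cooling), emphatically not Fedkiw's method, with every constant invented for illustration:

```python
import random

class FlameParticle:
    def __init__(self):
        self.y = 0.0                           # height above the fuel source
        self.heat = random.uniform(0.7, 1.0)   # 1.0 = white-hot, 0 = smoke

    def step(self, dt):
        # Hot gas is buoyant: the hotter the particle, the faster it rises.
        self.y += self.heat * 2.0 * dt
        # Particles cool as they climb, fading from flame into smoke.
        self.heat -= 0.8 * dt
        return self.heat > 0.0                 # dead particles get recycled

fire = [FlameParticle() for _ in range(500)]
for _ in range(60):                            # one second at 60 steps/s
    fire = [p for p in fire if p.step(1.0 / 60.0)] + \
           [FlameParticle() for _ in range(8)]  # fresh fuel each frame
```

What this sketch cannot do is exactly what the article says is hard: swirl realistically (vorticity) or spread to consume nearby objects.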

7. Material Physics

*Like predicting how sand will spill from a broken hourglass*

**Problem:** It's no longer enough to make buildings look realistic. Now videogame makers have to be able to knock them down realistically, too. As graphics expectations skyrocket for next-generation console games, designers are looking for ways to improve on the old methods of depicting destruction (swapping the building for several preformulated broken chunks, or playing a set animation of it falling down) and break things in real time.

**Status:** Pixelux Entertainment has created Digital Molecular Matter (DMM), a physics engine that approximates how objects should crumble, crack, twist and tear. First it turns the building into a 3-D collection of hundreds of tiny tetrahedrons. When, say, a flying boulder makes contact, the projectile displaces the tetrahedrons at the impact point, and the math takes into account the material's assigned density, strength and mass to determine how those tetrahedrons should shuffle and collide with surrounding ones, creating a ripple effect through the damaged structure. "When we first simulated glass breaking, I played with it for half an hour, just throwing people through windows and watching them break in totally realistic fashion each time," says Pixelux software engineer Vik Sohal.

**What's next:** Each object in a DMM-controlled world must follow physics principles such as Young's modulus, which governs an object's rigidity, and Poisson's ratio, whose parameters describe how squishy to make, say, a balloon or a soda can. DMM also determines how the forces add up inside an object, so as to break things at their weakest point, such as where a tree branch joins the trunk. Down the road, Sohal says, DMM will be applied to character modeling, so 300-pound offensive linemen will jiggle as realistically (and repulsively) as they do on the gridiron. -Bjorn Carey

*Image: Using DMM, a brick wall breaks and falls realistically. Its characteristics are controlled by the sliders at right.*
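The core fracture test can be sketched with Hooke's law: stress equals Young's modulus times strain, and an element fails once stress exceeds the material's strength. This is a crude per-element version in the spirit of a DMM-style engine, not Pixelux's actual solver; the brick numbers are rough textbook values.

```python
def element_breaks(strain, youngs_modulus, strength):
    """Crude fracture test for one tiny tetrahedral element.

    Hooke's law: stress = E * strain (pascals). When stress in an
    element exceeds the material's strength, it fails, and the
    engine would split the mesh there.
    """
    stress = youngs_modulus * strain
    return stress > strength

# Brick: stiff (high E) but brittle (modest strength), so a small
# deformation from a flying boulder is enough to crack it.
BRICK_E = 2.0e10        # Young's modulus, Pa (rough textbook value)
BRICK_STRENGTH = 2.0e6  # tensile strength, Pa (rough)
print(element_breaks(strain=0.001,
                     youngs_modulus=BRICK_E,
                     strength=BRICK_STRENGTH))  # True: 2e7 Pa > 2e6 Pa
```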

8. Realistic Movement

*Like teaching a rag doll to play dodgeball*

**Problem:** As a Jedi, the thrill of mowing through Stormtroopers is lost if, as you slash and hack with your lightsaber, each of them collapses in a standardized, repetitive way. Because of a lack of processing power and an overreliance on set animations, that's always been the case. Human movement in games just doesn't look real.

**Status:** Software developer NaturalMotion has created Euphoria, a simulation-based AI engine that makes game characters react dynamically to danger and injury. Euphoria endows each character with a virtual anatomy: nerves, muscles, bones and mass. In combination, these attributes give each character an active, individualized response to stimuli, so that some will try to dodge a flying crate while others will throw their hands up to brace for the impact. If the crate hits the character, his anatomy settings determine how he stumbles upon impact. "It's all based on some really crazy algorithms derived from a deep understanding of natural human biomechanics," says Haden Blackman, LucasArts's project leader on *The Force Unleashed*, the first game expected to run Euphoria. Real-time simulations also free up the memory budget, allowing developers to add other visual flourishes without worrying that characters will end up slow and wooden as a result. The benefit is unique gameplay every time.

**What's next:** Better virtual nervous systems. When two Stormtroopers fall over a railing and one of them catches onto the edge while the other grabs for his friend's feet, "it really heightens the level of authenticity and raises the level of surprise," Blackman says. With new physics systems like Euphoria, developers will be continually looking to offer new surprises in onscreen behavior. -B.C.

*Image: Thanks to the Euphoria system, LucasArts's Stormtroopers cringe, crumple and fall realistically.*
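In spirit, the per-character response looks something like the sketch below. The attribute names and thresholds are invented for illustration; NaturalMotion's actual solver runs full biomechanics, not a weighted coin flip.

```python
import random

def react_to_crate(character, crate_speed):
    """Pick a reaction from a character's virtual anatomy, Euphoria-style.

    'reflexes' and 'mass' stand in for the nerve/muscle/bone/mass
    attributes the article describes; values here are hypothetical.
    """
    # Quick reflexes and a light body favor dodging outright.
    dodge_chance = character["reflexes"] / (1.0 + character["mass"] / 80.0)
    if random.random() < dodge_chance:
        return "dodge"
    # Too slow to dodge: brace if the crate is slow, stumble if not.
    return "brace" if crate_speed < 10.0 else "stumble"

trooper = {"reflexes": 0.4, "mass": 90.0}  # mass in kg
print(react_to_crate(trooper, crate_speed=12.0))
```

Because each character carries its own attributes, a crowd hit by the same crate produces a mix of dodges, braces and stumbles rather than one canned animation.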

9. True-to-Life Simulation

*Like cramming the sum of all automotive engineering knowledge into a joystick*

**Problem:** Gone are the days of *Pole Position*, when all it took to draw motorheads in was a steering wheel and something vaguely shaped like a car. Now programmers across all game types try to pack so many realistic elements into their work (historically accurate weaponry, detailed flight controls) that drivers might even look for in-game fast food when they're hungry.

**Status:** To achieve wind-in-the-hair realism, developers are studying automotive engineering. For Electronic Arts's *Need for Speed: ProStreet*, the development team laboriously extracted telemetry data from real cars, going so far as to measure professionally driven vehicles at Porsche's test track in Leipzig, Germany. With the aid of GPS and the automaker's software, EA's team was able to record, every 40 milliseconds, how each input to the car (steering angle, throttle, clutch, brake, gear) affected its reaction on the road (position, speed, acceleration, yaw). The info plugs directly into the game, so damage and erratic driving cause the same problems as they do in real life.

**What's next:** Refining the competitive experience. EA has been studying drivers' behavior so that the game's AI-based opponents exhibit different styles rather than running perfect laps every time. It's easier said than done: "The Porsche drivers couldn't do bad laps," says Jacques Kerner, an EA software engineer, "so we got the engineers to do it." -D.C.

*Image: Real-world telemetry data went into replicating every burnout and crash in Need for Speed: ProStreet.*
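The logging side of such a rig is conceptually simple: poll the car's inputs and reactions on a fixed 40-millisecond clock and write them out as rows. A sketch, where the two callbacks are hypothetical placeholders for the real instrumentation:

```python
import csv
import time

SAMPLE_INTERVAL = 0.040  # seconds: one row every 40 ms, as in EA's rig

def log_telemetry(read_inputs, read_outputs, duration, path="lap.csv"):
    """Record car inputs and reactions at a fixed sampling rate.

    read_inputs()  -> (steer, throttle, clutch, brake, gear)
    read_outputs() -> (x, y, speed, accel, yaw)
    Both are placeholders for real sensor hookups.
    """
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["t", "steer", "throttle", "clutch", "brake", "gear",
                         "x", "y", "speed", "accel", "yaw"])
        start = time.perf_counter()
        while (t := time.perf_counter() - start) < duration:
            writer.writerow([round(t, 3), *read_inputs(), *read_outputs()])
            time.sleep(SAMPLE_INTERVAL)  # good enough for a sketch;
                                         # real rigs correct for drift
```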

10. Motion Capture

*Like training a computer to see the world as humans do*

**Problem:** The old method of getting realistic human movement into a computer (dress a person up in a clunky ping-pong-ball-studded suit) is out of fashion. It takes hours to calibrate the computer to recognize the markers, which can fall off as the person moves. Worse yet, if one of the markers becomes obscured by a moving body part, the computer can no longer connect the dots to create the entire stick-figure skeleton, so high-priced engineers must spend long hours redrawing a lost leg or arm.

**Status:** With Organic Motion's Stage, the first-ever markerless motion-capture system, subjects step in front of the camera in their street clothes, and instantly their avatar forms onscreen. The system uses input from 10 to 14 cameras to create a 3-D cloud of data points. The computer triangulates the location of each point, and the cloud takes the shape of the person, providing designers with full-body motion and 3-D texture that flows into existing animation software for easy manipulation. "And nothing gets lost," says developer and CEO Andrew Tschesnok. "We still know where the elbow and arm are if the person's hand goes behind something."

**What's next:** As developers adopt the tech, Stage has the potential to make entire games more realistic. With traditional motion capture, you might have the time and money only to record the ball-handling skills of a couple of star players and then translate their attributes to other players. "Now," Tschesnok says, "you could conceivably record every player in the league." When systems like Stage can capture an entire field's worth of movement, videogame action will be indistinguishable from the real thing. -B.C.

*Image: Old motion-capture systems involved special clothing. The Stage system grabs info without a costume change.*
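Triangulating from many cameras is, at bottom, a classic least-squares problem: find the 3-D point closest to every camera's sight ray. A compact sketch of that step, with toy camera poses; a production system like Stage also has to match points across views before it can triangulate anything.

```python
import numpy as np

def triangulate(origins, directions):
    """Least-squares 3-D point closest to every camera ray.

    Each camera contributes a ray (origin o, direction d). We solve
    for the point x minimizing the summed squared distance to all
    rays: sum_i (I - d_i d_i^T) (x - o_i) = 0.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)  # projects onto plane normal to d
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

# Two toy cameras whose rays both pass through the point (1, 1, 1).
origins = [np.array([0.0, 0.0, 0.0]), np.array([2.0, 0.0, 0.0])]
directions = [np.array([1.0, 1.0, 1.0]), np.array([-1.0, 1.0, 1.0])]
print(triangulate(origins, directions))  # ~ [1. 1. 1.]
```

With 10 to 14 cameras instead of two, the same math over-determines each point, which is why an occluded hand doesn't get lost: the remaining rays still pin it down.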