Tuesday, September 2, 2008 -- Bristol, CT -- EA Sports Virtual Playbook -- Merril Hoge on the set. (Photo: John Atashian)

With each iteration, the Madden video game has inched closer to reality. Now reality is starting to embrace the virtual. This season ESPN introduced the EA Sports Virtual Playbook to its NFL coverage, using green-screen technology to bring life-size 3D Madden players into the studio. Here’s a look at the inner workings.

For several years, part of ESPN’s coverage consisted of middle-aged anchors standing in the studio, demonstrating the specific skills, formations, or schemes expected in a key matchup.

“It used to be just Tom Jackson and the other guys playing flag football. It was four guys in their suits who were in their heyday 20 years ago,” said an EA spokesman. “This gives a whole new reality.”

That whole new reality lets the anchors walk through specific motions and schemes alongside the actual players of interest, critiquing and commenting on the players’ movements as they go.

In the week before the show, producers request situations with specific players from the EA campus in Orlando, Florida. Engineers in Orlando then create that situation with the Madden 3D engine, down to a specific spin, stunt, or swim move, using nothing more than the game you play every Saturday night. An EA employee will literally start up Madden ’08, pick the appropriate teams, call the relevant play, and pull the trigger or press the appropriate buttons on the controller to move the player accordingly.
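In essence, the weekly request is structured data: which teams, which play, which player, which move. Here is a minimal sketch of what such a request might look like, in Python; every field name here is our own invention for illustration, not EA’s actual format.

```python
from dataclasses import dataclass

# Hypothetical sketch of a producer's weekly capture request.
# Field names and structure are illustrative; EA's real format is not public.
@dataclass
class CaptureRequest:
    offense: str          # team with the ball
    defense: str          # opposing team
    play_call: str        # play selected from Madden's playbook menu
    featured_player: str  # the player the anchors will analyze
    move: str             # animation to trigger: "swim", "spin", "stunt", ...

request = CaptureRequest(
    offense="Colts",
    defense="Texans",
    play_call="HB Dive",
    featured_player="Mario Williams",
    move="swim",
)
print(f"Capture {request.featured_player} doing a {request.move} move "
      f"on '{request.play_call}' ({request.offense} vs. {request.defense})")
```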

“We have all those moves at our disposal. If it’s Mario Williams doing a swim move, then there’s a player with a controller in his hand using certain buttons to create that motion,” said Jason Parker, a software engineer with EA. “If you want the quarterback to roll out of the pocket, we rely on the actual human player to make the quarterback do that.”

What’s recorded, though, is not video footage but the data describing the three-dimensional state of each player in the game. The same kind of data is stored on your Xbox at home, which is what lets gamers watch instant replays from any angle and at any speed. That file is then sent to the next step in the process.
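Conceptually, a replay like this is just a per-frame log of every player’s position and animation state, which a renderer can play back from any viewpoint. Here is a rough sketch of that idea in Python; the structure and names are our own illustration, not EA’s replay format.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical sketch of replay capture: instead of video frames, record
# each player's 3D state once per simulation tick. Names are illustrative.
@dataclass
class PlayerState:
    player_id: str
    position: tuple        # (x, y, z) on the field
    orientation: float     # facing angle, degrees
    animation: str         # current animation clip, e.g. "swim_move"
    animation_time: float  # seconds into the clip

def record_frame(tick: int, players: list) -> dict:
    """One replay frame: a timestamp plus every player's state."""
    return {"tick": tick, "players": [asdict(p) for p in players]}

replay = [
    record_frame(0, [PlayerState("DE90", (30.0, 26.5, 0.0), 180.0, "stance", 0.00)]),
    record_frame(1, [PlayerState("DE90", (30.4, 26.5, 0.0), 178.0, "swim_move", 0.03)]),
]

# Because the file stores state rather than pixels, playback can re-render
# the scene from any camera angle and at any speed.
with open("replay.json", "w") as f:
    json.dump(replay, f)
```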

Two off-the-shelf Xboxes are linked to two standard in-studio cameras. The play is started on the Xboxes, and the players appear in the studio via basic green-screen technology; the anchors watch TV monitors around the set to see where the virtual players are standing. Where the Virtual Playbook goes beyond the weather maps on your local news is in how the in-studio cameras interact with the virtual players. If a producer wants to zoom in on a player’s left hand, the cameraman simply looks at his screen (which shows the superimposed composite image) and zooms toward the virtual hand. The camera’s position data is transmitted to the Xbox, which immediately renders the zoomed-in view of the virtual hand. The cameras essentially become oversized Xbox controllers, giving producers complete, seamless freedom without any additional effort. The anchors can pause, rewind, and spin around the players as if they were real.
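The key trick is that the physical camera’s tracked pose drives the game’s virtual camera every frame, so the real and rendered views stay locked together. A rough sketch of that loop in Python; the tracking interface and all names here are invented for illustration, not ESPN’s or EA’s actual system.

```python
from dataclasses import dataclass

# Hypothetical sketch of the camera-as-controller idea: the studio camera's
# tracked pose is forwarded to the renderer's virtual camera each frame,
# keeping the real and virtual views aligned. All names are illustrative.
@dataclass
class CameraPose:
    pan: float    # degrees
    tilt: float   # degrees
    zoom: float   # focal length, mm
    x: float      # studio position, meters
    y: float
    z: float

def read_tracked_pose() -> CameraPose:
    """Stand-in for the encoder data a tracked studio pedestal reports."""
    return CameraPose(pan=12.5, tilt=-3.0, zoom=70.0, x=4.2, y=1.6, z=6.8)

def update_virtual_camera(engine, pose: CameraPose) -> None:
    """Push the physical camera pose into the renderer's virtual camera."""
    engine.set_camera(position=(pose.x, pose.y, pose.z),
                      rotation=(pose.pan, pose.tilt),
                      focal_length=pose.zoom)

class StubEngine:
    """Minimal stand-in for the renderer, so the sketch runs end to end."""
    def set_camera(self, position, rotation, focal_length):
        self.pose = (position, rotation, focal_length)
    def render(self):
        print(f"rendering from {self.pose}")

# Per-frame loop: zooming the real camera zooms the virtual players too,
# because both views are rendered from the same pose.
def run_frame(engine):
    update_virtual_camera(engine, read_tracked_pose())
    engine.render()

run_frame(StubEngine())
```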

Currently the producers must choose the plays they want analyzed days in advance; the ability to analyze a key move or play from a game that same day would bring another level of reality to the system. There are also plans to ditch the Xbox and run everything off a PC, which would allow even higher-resolution virtual players. The collaboration is part of a larger push by EA to take its technology out of the game and bring it into reality. Think the inverse of the classic tagline: “If it’s in the game, it’s in the game.”