‘Interactive Dynamic Video’ Could Make Pokémon Go Even More Engaging

Sure, Pokémon Go has taken over the world (and our website), but what comes next?

Researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have developed an algorithm that lets objects in simple video clips be manipulated at will. A few seconds of footage of a tree swaying in a breeze is enough to let a viewer pull the tree’s branches into almost any motion. Ents, eat your heart out.

In the video above, released today, Abe Davis, lead author of the research, shows how videos of a wire figure on a vibrating surface, a bush blown by the wind, and even YouTube clips can be manipulated using this technique, which Davis and his colleagues call Interactive Dynamic Video.

“If you want to model how an object behaves and responds to different forces, we show that you can observe the object respond to existing forces and assume that it will respond in a consistent way to new ones,” Davis said in a CSAIL press release.
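For the technically curious, that idea maps onto classic modal analysis: the footage reveals an object’s vibration modes, and each mode can then be driven like an independent damped spring. The Python sketch below illustrates that principle only in outline. It assumes the mode shapes, frequencies, and damping ratios have already been extracted from the video (the hard part, not shown here), and every function and variable name is hypothetical rather than taken from the CSAIL code.

```python
import numpy as np

# Minimal sketch of modal superposition, the physical idea behind
# Interactive Dynamic Video. Assumes the video-analysis step has
# already produced per-pixel mode shapes, natural frequencies, and
# damping ratios; all names here are illustrative, not CSAIL's code.

def simulate_modal_response(mode_shapes, frequencies, dampings,
                            impulse, dt=1.0 / 30.0, steps=120):
    """Return a list of (H, W, 2) per-pixel displacement fields.

    mode_shapes: (M, H, W, 2) image-space displacement field per mode
    frequencies: (M,) natural frequencies in rad/s
    dampings:    (M,) damping ratios (fraction of critical damping)
    impulse:     (M,) a user's "poke" projected onto each mode
    """
    q = np.zeros(len(frequencies))                    # modal coordinates
    q_dot = np.asarray(impulse, dtype=float).copy()   # impulse sets initial velocity
    frames = []
    for _ in range(steps):
        # Each mode evolves as an independent damped harmonic oscillator:
        #   q'' = -2 * zeta * omega * q' - omega^2 * q
        q_ddot = -2.0 * dampings * frequencies * q_dot - frequencies**2 * q
        q_dot += dt * q_ddot   # semi-implicit Euler step
        q += dt * q_dot
        # Superpose the modes into one image-space displacement field.
        frames.append(np.tensordot(q, mode_shapes, axes=1))
    return frames

# Toy usage: two made-up modes on a 4x4 pixel grid.
rng = np.random.default_rng(0)
shapes = rng.normal(size=(2, 4, 4, 2))
frames = simulate_modal_response(
    mode_shapes=shapes,
    frequencies=np.array([2 * np.pi * 1.0, 2 * np.pi * 3.0]),  # 1 Hz, 3 Hz
    dampings=np.array([0.05, 0.08]),
    impulse=np.array([1.0, 0.3]),
)
print(len(frames), frames[0].shape)  # 120 frames of (4, 4, 2) displacements
```

Warping the original frame by each displacement field would then render the animation; because everything happens in image space, no 3D model of the object is ever needed.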

Eventually, the technique could be used to blend augmented reality animations more convincingly with the ‘reality’ part of augmented reality, to help engineers model how structures will react when different forces are applied, or to create special effects in movies less expensively (think Jurassic Park, with fewer green screens and less CGI).

The paper detailing the research was published earlier this year after being presented at SIGGRAPH Asia last November. The research is also part of Davis’ PhD dissertation.

And yes, someday it could be adapted to work with Pokémon Go too.

The dynamic video research comes out of the same lab that produced the ‘visual microphone’ algorithm in 2014, which allowed researchers to listen in on conversations by watching the tiny vibrations of a potato chip bag.