MIT and Google’s New Algorithm Removes Reflections and Fences From Photos

All the better to #trend with

There’s something that Little League baseball games, the zoo, and the top of the Freedom Tower all have in common. If you want to take a picture, you’ll probably have something obstructing your view, whether it be glass or a fence.

Soon, smartphone cameras may have a way around these barriers. A team from Google and MIT CSAIL has created an algorithm that specifically looks to remove reflections and obstructions (like chain-link fences) from images. Or rather, from short videos.

Photographers have been able to take obstructions out of images for years with programs like Photoshop—a few quick clicks with the clone tool, and a pesky streetlight or telephone wire is gone. The problem with this technique is it’s not real. It’s a fabrication.

That’s why the Google/MIT algorithm is so neat. Given five frames from a short video in which the camera pans slightly, the algorithm detects the obstruction in the foreground and tracks it across the frames, using the fact that the foreground and background move differently. Once the two layers are separated, the computer fills in the pixels missing from the background with pixels borrowed from the other frames.

Google and MIT’s algorithm uses movement, edges, and color to distinguish foreground and background elements. Tianfan Xue / Screenshot

The program can output both layers, the obstruction and the final image, because it finds and tracks each one separately. It’s the closest we’ve come so far to recreating reality, and it’s because we’re borrowing from another reality captured a split second later.
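
For the curious, here is a very rough sketch, in Python with NumPy, of why that layer trick works. This is not the researchers’ actual algorithm, which jointly estimates both layers and their motions through an optimization over movement, edges, and color; the toy below assumes the background’s shift in each frame is already known and simply votes the obstruction out with a per-pixel median. The function remove_obstruction and the synthetic “fence” are invented purely for illustration.

```python
# Toy illustration only -- not the Google/MIT method, which jointly optimizes
# both layers and their motions. Here the background's shift in each frame is
# assumed to be known, and a per-pixel median across the aligned frames votes
# out anything (a fence, a reflection) that moves at a different rate.
import numpy as np

def remove_obstruction(frames, bg_shifts):
    """frames: list of HxW grayscale arrays from a short panning video.
    bg_shifts: (dy, dx) offsets that align each frame's background to the
    first frame. Returns an estimate of the background layer."""
    aligned = [np.roll(f, shift=(-dy, -dx), axis=(0, 1))
               for f, (dy, dx) in zip(frames, bg_shifts)]
    stack = np.stack(aligned)          # shape: (num_frames, H, W)
    return np.median(stack, axis=0)    # obstruction pixels get outvoted

# Five synthetic frames: the camera pans right, so the background slides by
# 3 pixels per frame while a closer "fence" slides by 7 (parallax).
rng = np.random.default_rng(0)
background = rng.random((120, 160))
frames, shifts = [], []
for i in range(5):
    dy, dx = 0, 3 * i
    frame = np.roll(background, shift=(dy, dx), axis=(0, 1))
    frame[:, 40 + 7 * i::20] = 1.0     # bright fence posts every 20 pixels
    frames.append(frame)
    shifts.append((dy, dx))

clean = remove_obstruction(frames, shifts)   # estimated background layer
fence = frames[0] - clean                    # rough obstruction layer
```

Reflections are harder than fences in practice because they are semi-transparent rather than opaque, which is part of why the real system solves a full layer-separation problem instead of a simple pixel vote like this.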

This idea isn’t new, but it’s the most sophisticated version to date. Samsung actually shipped a similar feature, called Eraser Mode, beginning with its Galaxy S4. That mode tracks a single moving object across a series of frames and uses the background from the other frames to fill in the missing pixels. The difference is scale: Eraser Mode couldn’t take out something as detailed and complex as a reflection covering the entire photo, or a chain-link fence.

Google and MIT have created another iteration of an interesting tool with a wide variety of uses, especially for the smartphone photographer. But we have to be careful with technology like this, as we are with any technology that alters our perception of the world, because it blurs the line between “real” and “fake” photography. It isn’t going to make all your photos look incredible, but it does have the power to remove what we see as nuisances. And like so many other image-altering influences, from Photoshop to beauty ads, that can have a lasting effect on how we see our world.

While you chew on that bit of philosophical rumination, you can watch a video on how the algorithm works.