
Last summer, a group of MIT scientists debuted a new video amplification algorithm that exaggerates slight changes in movement or color, like a magnifying glass for moving images. Since then, they’ve made the open-source code available and begun letting anyone upload videos to see the effect for themselves. The New York Times got inside the lab to see what the project is doing in this video.

Scientists at the Computer Science and Artificial Intelligence Laboratory (CSAIL) at MIT first presented “Eulerian video magnification” last year at SIGGRAPH, a computer graphics conference. Originally, the system (which lets you tell from afar whether a person is breathing, how fast their heart is beating, and where blood is traveling in their body) was designed to monitor the vital signs of neonatal infants without having to touch them.

It measures the color intensity of each pixel and then amplifies any change in that intensity, registering, for example, the slight reddening of your face with each heartbeat. You can also apply the system retroactively to existing footage: a scene from your favorite Batman movie, perhaps.
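To make that idea concrete, here is a minimal sketch of the amplify-the-intensity-changes step in Python. It is not the researchers’ actual code: the published method also decomposes each frame into a spatial pyramid before filtering, and the function name, the 0.8–1.0 Hz pass band (roughly a resting heart rate), and the gain of 50 are all illustrative assumptions.

```python
# A minimal sketch of the temporal amplification behind Eulerian video
# magnification, under simplifying assumptions; not the CSAIL implementation.
import numpy as np
from scipy.signal import butter, filtfilt

def magnify_color(frames, fps, low_hz=0.8, high_hz=1.0, gain=50.0):
    """Amplify tiny periodic intensity changes in a video.

    frames: float array of shape (num_frames, height, width), one color
    channel normalized to [0, 1]. All parameter values are illustrative.
    """
    # Design a second-order temporal band-pass filter bracketing the
    # frequency of interest (here, roughly a resting pulse rate).
    nyquist = fps / 2.0
    b, a = butter(2, [low_hz / nyquist, high_hz / nyquist], btype="band")
    # Filter each pixel's intensity over time (axis 0), isolating the
    # faint oscillation, e.g. skin reddening with each heartbeat.
    variation = filtfilt(b, a, frames, axis=0)
    # Exaggerate that variation, add it back, and clamp to a valid range.
    return np.clip(frames + gain * variation, 0.0, 1.0)
```

Because filtfilt runs the filter forward and backward, the amplified signal stays aligned in time with the original frames, so the boosted color changes land on the same moments as the heartbeats that caused them.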

The researchers posted their code online in August, making it available to anyone for non-commercial use, though running the program was somewhat complicated. Now you can upload your own videos to the website of Quanta Research Cambridge, a CSAIL sponsor, and watch the system work its magic.
