Two MIT researchers have cracked some fundamental problems in high-resolution 3-D imaging using a novel gelatinous interface and computer-vision algorithms. In tandem, the two easily and portably deliver imaging resolutions that were previously possible only with large, expensive laboratory gear. The resulting high-quality 3-D models can be manipulated on a computer screen for purposes ranging from quality control to criminal forensics to dermatology.

Described simply (you can get the more in-depth description via the video below), the system's key component is a slab of transparent, synthetic rubber coated on one side with a metallic paint composed of very tiny particles. When the coated side is pressed against an object, even an object with very small features like the ink on a piece of paper (see image above), the metallic paint deforms to capture those features.

Cameras set at various angles then capture that deformation from all sides, and computer-vision algorithms turn those views into a 3-D image. Contrast that with what obtaining a 3-D image at similar resolution usually requires (expensive and sensitive microscopes, vibration-isolation tables, high-powered computers), and GelSight, as the system is known, looks like a pretty big leap forward in both resolution and sheer simplicity.
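The article doesn't spell out the algorithms, but turning images of a deformed reflective surface into a height map is conceptually similar to classic photometric stereo: with several images of the same surface lit (or viewed) from known directions, you can solve for a surface normal at every pixel. The sketch below is an illustrative assumption, not GelSight's actual pipeline; the function name, the synthetic flat patch, and the Lambertian-shading model are all hypothetical.

```python
import numpy as np

def estimate_normals(images, light_dirs):
    """Recover unit surface normals from >= 3 images under known lighting.

    images: (k, h, w) stack of grayscale intensities.
    light_dirs: (k, 3) unit light-direction vectors.
    Solves I = L @ (albedo * n) per pixel in the least-squares sense.
    """
    k, h, w = images.shape
    I = images.reshape(k, -1)                            # (k, h*w)
    g, *_ = np.linalg.lstsq(light_dirs, I, rcond=None)   # (3, h*w), albedo * normal
    albedo = np.linalg.norm(g, axis=0)                   # per-pixel reflectance
    n = g / np.maximum(albedo, 1e-12)                    # normalize to unit vectors
    return n.T.reshape(h, w, 3), albedo.reshape(h, w)

# Synthetic check: a flat Lambertian patch tilted so its normal is (0, 0.6, 0.8),
# imaged under three known light directions.
true_n = np.array([0.0, 0.6, 0.8])
lights = np.array([[0.0, 0.0, 1.0],
                   [0.8, 0.0, 0.6],
                   [0.0, 0.8, 0.6]])
h = w = 4
images = np.stack([np.full((h, w), max(l @ true_n, 0.0)) for l in lights])

normals, albedo = estimate_normals(images, lights)
print(np.allclose(normals, true_n, atol=1e-6))  # → True
```

In a full reconstruction, the recovered normal field would then be integrated into a depth map; the appeal of the gel-plus-paint interface is that it hands this step a surface with uniform, well-behaved reflectance.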

GelSight also gets around a key problem in 3-D imaging. By translating an object's most minuscule features through the gel to the metallic paint (it can measure features less than one micrometer deep and roughly two micrometers across), it presents the cameras with a single, uniform optical surface. That circumvents the imaging problems introduced by the varied optical properties of different materials: an opaque gel or a clear crystalline object, for instance, each interacts with light very differently than a solid object that lets no light pass through.

Potential applications range from distinguishing moles from cancerous growths to quickly and cheaply inspecting manufactured goods to matching spent bullet casings to the firearms that fired them. See GelSight in action below.

MIT News