In its three-year development phase, the Oculus Rift headset has been a sandbox for coders interested in creating immersive virtual reality experiences. And when developers have three years to play with a piece of technology, they put together some outrageous mashups, like a gaming exoskeleton or an omnidirectional treadmill. So far, these innovative experiences have been just that: virtual, confined to the inner world of the Oculus Rift headset. But that’s beginning to change, thanks to a new project that lets you see and manipulate virtual objects in the air in front of you as though they were tangible holograms.

The project relies on the Leap Motion controller, a small puck-like device that tracks the motion of your arms and hands. Oculus and Leap Motion started supporting each other’s software last year, making it easier for developers to build exactly this kind of application.
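For a sense of what developers are working with, here is a minimal sketch, assuming the Leap Motion v2 SDK’s Python bindings (the `Leap` module) are installed: it polls the controller for tracked hands and prints palm positions, using the head-mounted policy the SDK provides for VR use.

```python
# Minimal sketch of reading hand data from a Leap Motion controller.
# Assumes the Leap Motion v2 SDK Python bindings ("Leap" module) are installed.
import time
import Leap

controller = Leap.Controller()
# Tell the tracking service the device is mounted on a headset.
controller.set_policy(Leap.Controller.POLICY_OPTIMIZE_HMD)

while True:
    frame = controller.frame()      # latest tracking frame
    for hand in frame.hands:
        pos = hand.palm_position    # millimeters, relative to the device
        side = "left" if hand.is_left else "right"
        print("%s palm at (%.1f, %.1f, %.1f) mm" % (side, pos.x, pos.y, pos.z))
    time.sleep(0.05)                # poll roughly 20 times per second
```

Hand data like this is the raw material for every interaction described below, from dragging windows to gesture-based toggles.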

In a video made for an office hackathon, Leap software engineer Raffi Bedikian shows off the new system by managing app windows in the air in front of him, scrolling through lists, and even lowering the brightness of his surroundings. He imagines uses for switching between VR and AR that reach far beyond the office.

“You could toggle the passthrough on when the flight attendant is handing you your drink, then toggle it off again to resume your immersive 3-D movie experience,” Bedikian told Wired.
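The mechanics behind such a toggle are simple in principle. In a hedged sketch, again assuming the v2 Python bindings, the app requests or releases the controller’s raw-image policy and uses that flag to decide what to draw each frame; `render_passthrough` and `render_vr_scene` are hypothetical placeholders for the application’s own rendering code.

```python
# Sketch of a VR/AR passthrough toggle, assuming Leap SDK v2 Python bindings.
import Leap

class PassthroughToggle:
    def __init__(self, controller):
        self.controller = controller
        self.passthrough_on = False

    def toggle(self):
        """Flip between the camera feed and the virtual scene."""
        self.passthrough_on = not self.passthrough_on
        if self.passthrough_on:
            # Request access to the controller's raw infrared camera images.
            self.controller.set_policy(Leap.Controller.POLICY_IMAGES)
        else:
            self.controller.clear_policy(Leap.Controller.POLICY_IMAGES)

    def draw(self, frame):
        if self.passthrough_on and not frame.images.is_empty:
            render_passthrough(frame.images)  # hypothetical: draw the camera feed
        else:
            render_vr_scene()                 # hypothetical: draw the virtual world
```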

The passthrough view could also help dispel concerns that Oculus and other burgeoning VR platforms cut users off from the real world too completely. It would let users quickly step out of their virtual space and get their physical bearings; losing them is a particular danger when working with motion capture.

But Leap developers aren’t the only ones exploring the controller’s use for Oculus. For example, graduate students at France’s École Nationale Supérieure d’Ingénieurs Sud Alsace recently took on a six-week project to create an interactive 3D space using Leap Motion and Oculus, coding their environment with the Unity game engine.
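Interaction in an environment like theirs typically builds on a few gesture primitives. As an illustrative sketch only, not the students’ actual Unity code, and again assuming the v2 Python bindings, a pinch-to-grab check might use the API’s built-in `pinch_strength` measure; `find_object_near` and the object’s `position` attribute are hypothetical stand-ins for the application’s own scene logic.

```python
# Illustrative pinch-to-grab logic, assuming Leap SDK v2 Python bindings.
import Leap

GRAB_THRESHOLD = 0.8  # pinch_strength runs from 0.0 (open hand) to 1.0 (full pinch)

def update_grab(frame, held_object):
    """Return the object held after this frame, or None if nothing is pinched."""
    for hand in frame.hands:
        if hand.pinch_strength > GRAB_THRESHOLD:
            if held_object is None:
                # Hypothetical scene query: pick whatever sits near the palm.
                held_object = find_object_near(hand.palm_position)
            if held_object is not None:
                held_object.position = hand.palm_position  # drag with the palm
            return held_object
    return None  # no hand is pinching, so release any held object
```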

Last year, Popular Science wrote that “medical students might take nanoscale walking tours of the brain,” and that’s pretty close to what these enterprising students, Paul Bourgeois and Cyprien Beaucreux, are trying to do. After building a virtual operating room, they’re turning toward specialized medical training, such as introductory courses for surgeons.

“With VR headset and Leap, these beginners could train in a world created for them, with the same tools as in the real world, a patient with the same diagnosis and a body with real characteristics,” wrote Bourgeois in an email. “It surely will not change the feeling of a true tool in his hand but he can train or learn the right movements for a specific surgery.”

For their next project, Bourgeois and Beaucreux are working on a “Room of Errors,” according to ENSISA professor Germain Forestier, who led the initial project. Forestier says a medical student would be placed in the virtual room, which would contain deliberately planted hazards that could endanger a patient. The student would then have to identify what was wrong based on observation alone, a richer learning experience than simply answering questions on a test.

Microsoft has also been exploring medical training with HoloLens, its augmented reality headset. The company has partnered with Case Western Reserve University to put HoloLens in students’ hands; a video on Microsoft’s YouTube channel shows them marveling at the technology, and, most notably, reveals the device’s limited field of view.

All of these independent experiments point to a rapidly developing new landscape of computing, one that frees apps from the screens of our phones, PCs, and other devices and instead brings them into the physical world alongside us.