VR has a hard time showing you things up close, but Oculus might have a fix

The company is focusing on focus.

Even though VR headsets are small enough to strap onto your face, they can make objects in the virtual scene seem far off in the distance. The headsets accomplish this immersive visual trick with two key optical parts: screens inside that display the images, and magnifying glass-like lenses between your eyes and those screens. It’s those lenses that allow a virtual dinosaur to look as if it’s in the scene in front of you, and not just on small screens inches from your eyes.

But this configuration presents a problem for virtual-reality makers to solve. How do you keep different elements of the virtual world—mountains in the distance, flowers in the foreground, everything in between—in focus?

Virtual-reality headsets generally tackle this problem by setting the focus distance in VR land to the equivalent of about 6.5 feet from your eyes, approximating the gap between your couch and your television. So, if you’re watching a movie in virtual reality, the virtual screen may look like it’s hovering in front of you roughly that far away.

“That means, based on vision science literature, that anything beyond arm’s reach is going to look really good, and is going to be very comfortable to look at,” says Douglas Lanman, a research scientist who focuses on computational imaging at Oculus, which makes the Oculus Rift headset. A view of the Grand Canyon? A circus performer nearby? Those spectacular scenes are going to be good, in terms of focus, Lanman says, since they’re a nice distance away.

A tougher nut to crack is what happens when things are up close—like if someone wants to pick up a book in virtual reality, bring it to her face, and read it. Presented with that virtual book, “your eye will focus correctly,” Lanman says, “but that will throw the whole screen out of focus.” It’s historically been a tough problem to solve.

But Oculus Research thinks it may have a fix, and it involves adding another piece of equipment into the headset between the lenses (which are near your eyes) and the screens (which display the images). That extra hardware is called a phase spatial light modulator.

“They’re used to change the focus of light,” says Lanman, who points out that the technology can be used to, say, sharpen the optics on a telescope. Putting the phase spatial light modulator inside a VR headset would be like adding “a programmable lens.”

The result would let them “vary that focus across the scene,” Lanman says, so ideally, in those tough VR situations like a book held close, the focus could be adjusted dynamically and everything would be sharp, whether it was very close, very far away, or somewhere in between.

“There’s promise here, but [the tech is] brand-new,” he says. Lanman and two other researchers will publish an article about it in the journal ACM Transactions on Graphics.

Another possible bonus to the tech? Making it easier for your eyes to focus on everything could reduce issues like eye strain and headaches that people sometimes experience in virtual reality.

This research may never make it into an Oculus headset, or a headset made by another company, like HTC or Sony, which produce the Vive and PlayStation VR, respectively. Still, it’s something Lanman thinks is new and exciting. “It’s an emerging display technology,” he says.