This VR accessory is designed to make your mouth feel stuff

Stuff being spiders and raindrops—not kisses.
VR headset with mouth haptics add-on. Future Interfaces Group


If virtual reality is ultimately supposed to mimic our current reality, then touch is an important sense to address. Haptics research, which covers technology that can recreate the feeling of touch, has been a hot topic among virtual- and augmented-reality-focused companies like Meta.

Now, researchers from the Future Interfaces Group at Carnegie Mellon University have developed an add-on mouth haptics device that attaches to the bottom of a VR headset. Their paper on the tech is being presented at the CHI Conference on Human Factors in Computing Systems in New Orleans this week. The system lets users wearing VR goggles experience what it might feel like to drink water from a fountain, or to have a spider run across their face.

This is how it works: A grid of tiny ultrasound speakers is lined up on a board attached to the bottom of a VR headset. These speakers, called transducers, produce sound, or pressure waves, at a frequency that human ears can’t hear. The speakers can be fired in specific ways so that all of their waves focus together, like light through a magnifying lens, explains Chris Harrison, the director of the Future Interfaces Group. “It creates this sensation of pressure. Pressure over time creates a vibration,” he says. “If you put those together in interesting ways and you couple them with good sound design and good visual design, it actually is quite immersive.”
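In broad strokes, this focusing trick works like acoustic beamforming. Here is a minimal Python sketch of the idea, not the team’s actual firmware: each transducer is driven with a phase offset chosen so that every pressure wave arrives at the focal point at the same moment. The 40 kHz frequency, the grid geometry, and the focal point below are illustrative assumptions.

```python
import math

SPEED_OF_SOUND = 343.0  # meters per second, in air at room temperature
FREQUENCY = 40_000.0    # hertz; a common choice for airborne ultrasound arrays

def phase_delays(transducer_positions, focal_point):
    """Per-transducer phase offsets (radians) chosen so that every
    wave arrives at focal_point in phase, creating a pressure spot."""
    wavelength = SPEED_OF_SOUND / FREQUENCY  # roughly 8.6 mm at 40 kHz
    k = 2 * math.pi / wavelength             # wavenumber
    # Negate the phase each wave accumulates over its travel distance,
    # so all waves line up (constructively interfere) at the focus.
    return [(-k * math.dist(p, focal_point)) % (2 * math.pi)
            for p in transducer_positions]

# Example: a 4 x 4 grid of transducers spaced 10 mm apart, focusing on a
# point 50 mm in front of the grid's center (units are meters).
grid = [(ix * 0.01, iy * 0.01, 0.0) for ix in range(4) for iy in range(4)]
print(phase_delays(grid, (0.015, 0.015, 0.05)))
```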

[Related: Meta gave a sneak peek of one of its first VR wearables]

Vivian Shen, a PhD candidate at Carnegie Mellon University and first author on the paper, says the team wanted to work with ultrasound haptics because it gives a more localized effect and is very “expressive.” The mouth is a good place to test this because it is a particularly sensitive area of the human body. The tech offers many parameters to play with, including an effect’s strength, location, and duration, its pattern over time, and whether it is stable or modulated.

They tried out combinations of these parameters to animate basic effects, and used the strongest and most compelling ones to build an animation library of basic haptic commands for different motions. These included actions like swipes in the x, y, and z directions, “because any sort of movement on the lips is a very interesting effect that can go very well with a bunch of types of VR animation,” says Shen. “And of course, ultrasound being a very localized node, it’s very easy to do taps and persistent vibration. We can change the timing, spacing and modulation frequency.”
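To make that concrete, here is a hypothetical Python sketch of what one entry in such an animation library might look like. The field names mirror the parameters Shen describes, strength, location, duration, pattern over time, and modulation, but the schema itself is an assumption, not the team’s actual code.

```python
from dataclasses import dataclass

@dataclass
class HapticEffect:
    name: str          # e.g. "swipe_x", "tap", "persistent_vibration"
    strength: float    # intensity of the focal point, normalized 0.0-1.0
    position: tuple    # focal point relative to the lips, in meters
    duration_ms: int   # how long this step of the effect plays
    modulated: bool    # steady pressure vs. amplitude-modulated vibration
    modulation_hz: float = 0.0  # vibration frequency when modulated

# A swipe along the x axis: a sequence of briefly held focal points
# stepping across the lips from left to right.
swipe_x = [
    HapticEffect("swipe_x", 0.8, (x, 0.0, 0.05), 40, modulated=False)
    for x in (-0.010, -0.005, 0.0, 0.005, 0.010)
]
```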

The device prototypes drew from some open-source designs and used a custom printed circuit board. The first version of the device looked like a typewriter keyboard, with mini spotlights pointed at the user’s mouth. The researchers also gave a sneak peek at a newer, still-in-progress version featuring thinner panels built around a more advanced version of the transducers.

Before the device debuted, the team demoed it on a small group of participants. Users were asked to go through a haunted forest scene, a real-world object simulator, and a racing game, based on popular categories of VR applications. As they moved through the virtual worlds, they could feel spiders crawling, raindrops pattering, mud splashing, and water fountains bubbling across their lips. The feel of the scenes was constructed with code (soon to be open-sourced).

“We want it to be drag-and-drop haptics. How it works in [user interface design] right now is you can drag and drop color on objects, drag and drop materials and textures and change the scene through very simple UI commands,” says Shen, referring to how programmers could alter the look and feel of interactive software applications. “We made an animation library that’s a drag-and-drop haptic node, so you can literally drag this haptic node onto things in scenes, like a water fountain stream or a bug that jumps onto your face.” After dragging and dropping, programmers can then tweak the code parameters.
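As an illustration of that drag-and-drop idea, here is a hypothetical Python sketch of a haptic node being attached to a scene object and then tweaked. The SceneObject and HapticNode classes are invented for this example; they are not the team’s soon-to-be-open-sourced library or any real game engine API.

```python
class HapticNode:
    """A reusable haptic effect that can be attached to scene objects."""
    def __init__(self, effect="tap", strength=0.5, modulation_hz=0.0):
        self.effect = effect
        self.strength = strength
        self.modulation_hz = modulation_hz

class SceneObject:
    """A named object in a VR scene that components can be attached to."""
    def __init__(self, name):
        self.name = name
        self.components = []

    def attach(self, component):
        self.components.append(component)
        return component

# "Dragging" a haptic node onto a water fountain, then tweaking its
# parameters the same way a designer would tweak a color or texture.
fountain = SceneObject("water_fountain_stream")
node = fountain.attach(HapticNode(effect="persistent_vibration"))
node.strength = 0.7
node.modulation_hz = 120.0
```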

After running through the different scenarios, “the eight users in the user experience study said they preferred it over not having any haptics at all,” Shen says. During the study, some participants even involuntarily slapped at their face when they “felt” the spider crawling across their mouth. 

Despite these impressive results, the mouth haptics system did not always work perfectly. “The main thing we struggled with was calibration,” Shen notes. Because everyone’s facial geometry is different, it can be hard to position the transducers so that the animations land correctly on the mouth. A few users were also unable to feel any effects from the device at all. Harrison notes that these issues present open questions the team aims to explore in future research.

[Related: This wearable fabric microphone can listen to the world—and your body]

Shen, Harrison, and their colleagues are certainly not the first researchers to explore mouth haptics. FaceHaptics, developed by a German group in 2020, used a robotic arm attached to a headset to hold a spray nozzle, a heating wire, a fan, and a rubber tip, creating a range of sensations on the face. The multi-sensory VR mask FeelReal incorporates temperature, smell, vibration, and mist into its system. But Harrison notes that devices that pile on features tend to evolve into clunky helmets that can feel heavy over long periods of wear. For their prototype, the researchers wanted to concentrate on something small and lightweight but still “visceral,” something that could feasibly be incorporated into existing VR headsets. It’s a concept they could potentially develop further with product design partners.

“What’s interesting is a lot of news articles so far have focused on the possibility of kissing, and I’d actually like to say that that’s not possible because right now our software just renders one node. It’s good for small effects like having something crawling on your mouth, or if you have rain, so it’s very small points hitting your mouth,” Shen says. “We can’t do larger objects because the way ultrasound focuses, the wavelength is very small, the peaks are very small in space.” 
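For a rough sense of scale: assuming the array uses 40-kilohertz transducers, which are common in airborne ultrasound haptics, the wavelength works out to the speed of sound divided by the frequency, about 343 meters per second over 40,000 hertz, or roughly 8.6 millimeters. Each focal point is on that order of size, closer to a raindrop than to a pair of lips.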

To generate a kissing effect, they would need multiple nodes controlled by a massive array, which in all likelihood would not be compatible with a VR headset. But even with those limits on what mouth haptics can accomplish, the prototype shows that adding touch sensations to VR can be done fairly practically.

See the technology in action in the researchers’ demo video.
