In New York City’s ONX Studio, bits and pieces of the universe, as seen through the eyes of the James Webb Space Telescope (JWST), are on display. The new exhibit, “Unfolding the Universe: A NASA Webb VR Experience,” opened last week and is a collaboration between Mozilla Hubs, artist Ashley Zelinskie, and NASA. It was created to commemorate the launch of the space telescope last December.
Dispersed throughout the exhibition space are rooms with projected movies, desktop computers for users to try the online experience, silk prints, fake fog and laser lights (emulating the birth of stars), and conceptual sculptures inspired by interstellar travel.
At the center of the exhibit’s main room is a spot reserved for the virtual reality aspects of the experience—a digital gallery modeled after the images of galaxies and other celestial bodies from JWST.
Last Wednesday night, former astronaut Mike Massimino, decked out in a VR headset, headphones, and hand controllers, ambled around an area whose virtual and physical boundaries are marked out in the gallery with an outline of white masking tape. (Viewers at home can also join this part of the exhibition from browsers on their phones, laptops, or desktops here.)
“I’m an astronaut, but I’m not a young person who does a lot of virtual reality gaming. I don’t know if I controlled it as well as it could be controlled,” Massimino tells PopSci. Massimino, who flew spacewalking missions to repair and upgrade the Hubble Space Telescope in 2002 and 2009, has a special appreciation for the engineering it takes to collect the information needed to make science discoveries in space. “I worked on Hubble. I can appreciate the images. What [Zelinskie] has been able to do is apply an artistic interpretation of that wonder and discovery to it,” he says.
The virtual experience runs a bit like an online game. Viewers can navigate a series of corridors in outer space and visit animated artworks or interactive avatars of the scientists Zelinskie interviewed during the project.
“She kept a lot of the details. What she made here is true to the science behind it and the way that the telescope works,” Massimino adds. “What I like in general about all of this stuff is that it’s taking very technical scientific discovery and it shows the beauty of images, and the beauty of the science behind it, but in a very artistic way so you can engage with it at a different level.”
The James Webb Space Telescope in VR
Zelinskie’s collaboration with NASA and the JWST team started around seven years ago. Since the start of the COVID-19 pandemic, they had been brainstorming creative ways to engage the public, and landed on the idea of creating a VR experience. They enlisted London-based virtual architects Metaxu Studios and Mozilla Hubs to develop the concept they had in mind.
“We were able to host a viewing party of the James Webb telescope launch on Christmas with a bunch of scientists and the public and we watched NASA Live TV in our Hubs space. We had each of the scientists in VR as avatars, and we streamed it to YouTube,” Zelinskie, a conceptual and mixed media artist, tells PopSci.
When the JWST images were released by NASA in July, she wanted to incorporate some of the updated visual elements into an exhibit.
She added a window of aurora borealis based on the spectroscopy graphs and data from JWST’s first images of exoplanets. There’s also a recurring motif of hexagons that appears in multiple installations, both in person and online. “The reason that they’re hexagons is because they had to fold up into the space capsule. That’s why the show is called ‘Unfolding the Universe,’ because the telescope had to unfold,” Zelinskie explains. “The cool thing about the hexagonal shape of mirrors is it makes this six-pointed star. You’re going to know it’s a Webb image because the stars in that image are going to have the same shape. It’s kind of like an artist signing their work.”
Zelinskie also conducted interviews with several scientists and engineers, asking them about their career journeys, and their experiences working with JWST.
“I wanted to house different portraits of the scientists; we did all the sound mapping so when you walk up to them, you can hear the sound of the interview, but then when you walk away, you’re not hearing it,” Zelinskie says. There’s a soundscape running across the virtual gallery that changes depending on where you are in the space. “That’s what [Mozilla] Hubs is really good at—sound tracking.”
Building out the virtual space
John Shaughnessy, Mozilla Hubs’ senior ecosystem and engineering manager, attests that enabling this kind of spatial audio in a device-agnostic browser setting is challenging work.
There are lots of features to consider, like distance-based fall-off of sound, so conversations close to users are loud, and those farther away are quieter. There are also considerations around how sound propagates in the real world: sounds are different in a room with curtains on the walls versus one with solid metal surfaces. “In fact, we’ve had blind users in Mozilla Hubs who have built add-ons for themselves, customizing the code so they can send audio pings out into the world and listen to how sound bounces off of virtual surfaces to navigate the 3D space without the use of eyesight,” Shaughnessy says. Plus, the team has to account for the varying quality of users’ microphones, and for noise from things like keyboard typing.
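The distance-based fall-off Shaughnessy describes can be captured in a few lines. The sketch below is only an illustration, modeled on the Web Audio API’s “inverse” distance curve that browsers expose for spatial audio; the function name and default parameters are hypothetical, not Hubs’ actual code:

```python
def inverse_distance_gain(distance, ref_distance=1.0, rolloff=1.0):
    """Volume multiplier for a sound source at a given distance.

    Mirrors the Web Audio API's 'inverse' distance model: full volume
    at or inside ref_distance, then smoothly quieter as the listener
    moves away. rolloff controls how fast the sound fades.
    """
    d = max(distance, ref_distance)  # never louder than at ref_distance
    return ref_distance / (ref_distance + rolloff * (d - ref_distance))

# An avatar standing next to you speaks at full volume...
print(inverse_distance_gain(0.5))   # 1.0
# ...while one across the virtual gallery is much quieter.
print(inverse_distance_gain(10.0))  # 0.1
```

This is the same idea behind the walk-up interview portraits: multiply each source’s audio by a gain that shrinks with distance, and conversations fade naturally as you wander the gallery.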
But it’s part of a larger effort to build the tech backbone that will one day power all types of immersive virtual and metaverse interactions. And these are problems that all metaverse and virtual reality platforms face.
“I think groups of people are going to want to meet in virtual spaces with one another, and we’re going to take that for granted. What we’re trying to do is build the bare bones, basic necessities so that it happens in an open and decentralized way,” Shaughnessy says. “For that we need two things. We need people to have a shared spatial awareness. The second one is a shared sense of presence.”
To this end, Shaughnessy says that they have been borrowing 3D graphics tricks used in game rendering to give the illusion of realism. For example, they use baked lighting to calculate shadows and reflections for fixed objects in the scene ahead of time, so that math doesn’t need to be done in real-time. They also use “level of detail” to keep objects close to the user high-definition while conserving overall memory.
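The “level of detail” trick amounts to swapping in a simpler mesh the farther an object sits from the camera, so distant sculptures cost almost nothing to draw. A minimal sketch of that selection logic, with hypothetical distance thresholds rather than Hubs’ real values:

```python
def pick_lod(distance, thresholds=(5.0, 15.0, 40.0)):
    """Choose a detail level by camera distance.

    Level 0 is the full-resolution mesh; each higher level is a
    progressively simpler stand-in, and anything beyond the last
    threshold can use the coarsest mesh or be skipped entirely.
    """
    for level, limit in enumerate(thresholds):
        if distance < limit:
            return level
    return len(thresholds)  # far beyond the scene: lowest detail

print(pick_lod(2.0))   # 0 -- close to the viewer, render the full mesh
print(pick_lod(20.0))  # 2 -- mid-distance, a simplified mesh will do
```

Combined with baked lighting, which precomputes shadows and reflections for objects that never move, this keeps the virtual gallery responsive even in a browser on a phone.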
In this project specifically, Shaughnessy and Mozilla Hubs built the technology that renders the 3D scene of the meeting space and virtual gallery that Zelinskie and the JWST team came up with. “We gave them a tool where they can customize the look, the avatars that are in there, and how they can present this experience. We don’t control who comes and goes. We don’t monitor what you’re doing in that space,” says Shaughnessy.
Sound cannot travel through the vacuum of outer space. “Inside your space suit, when you’re space walking, it’s really quiet. You can bang with a hammer, and they’ll hear it inside the spaceship because the sound can travel within the structure, but you can’t hear anything,” Massimino notes. “You can hear yourself breathing inside. You can hear people talking to you in your headset. But what you always hear in the background is the whirring of a fan, which tells you your space suit is working, that air is being circulated, that you have power.”
While the soundscape broadcast inside his VR headset takes a bit of artistic license, he could still pick out the faint yet familiar whirring of equipment in the background during his virtual spacewalk. “It’s a comforting sound.”