I was first exposed to XR (virtual reality, augmented reality, and the like) when I started working at the Reese Innovation Lab last summer. After a staff meeting one afternoon, my manager Alexis handed me a VR headset and let me play a National Geographic experience on the Oculus Quest that transported me to Antarctica. The first thing you do, after you get transported, is hop into a boat. I felt like I was going to fall into the water. Then you start paddling. As you paddle, you see all sorts of animals around you. Eventually, I reached the whales and I felt so afraid. The ability XR gives you to feel real emotion fascinates me. That ability is why I believe it is one of the most engaging forms of technology.
We have found a way to experience things we might not ever experience in real life, like standing next to a tiger in our kitchen.
Google recently released a new augmented reality feature for its search function. To use it, all you have to do is grab a smartphone, type an animal into the search bar, tap “view in 3D,” aim your camera toward an open space in your house, and wait for the animal to appear. Once it does, you’ll be able to walk around the animal and view it, life-size, as if it were actually there in your house. Pretty cool, right? The animal Google chose to promote this feature was the tiger, so that’s the one I tried first. It was awesome.
Tigers are my favorite land animal (manatees are my favorite marine animal, but unfortunately Google didn’t give them an AR button), and being able to see one up close and in relation to the furniture in my house was incredible. I even made my dog stand next to it so I could take pictures of them together.

The next animal I chose to see was a bear. The bear was huge. And… it moved and stood up on its hind legs. Now, I know that the bear wasn’t actually standing in front of me. And I know that it’s just a model of a bear. But I’m not going to lie, my stomach had some butterflies.
Playing around with Google’s new feature reminded me of a project that the Lab worked on a few months ago called “TIME Immersive.”
This is an AR app that lets users see what the Amazon looks like after it’s been affected by climate change and deforestation. Similar to Google’s AR feature, to immerse yourself in the Amazon rainforest you open the app, select a scene, point your camera at a clear, flat surface, and wait for the Amazon to appear.
The app is fun to use because you’re not only seeing the forest, you also get to learn about it and interact with it. There are hotspots laid across each scene. When you tap one, an image or video hovers over the forest and Jane Goodall tells you about what you’re seeing. One of the last scenes in the Amazon experience tells the story of how the Karipuna tribe lost their land. For me, since more of my senses are engaged in a virtual experience, hearing the story in the app, rather than reading it, allowed me to feel a stronger sense of empathy.
As I mentioned earlier, there is emotion attached to each of these virtual experiences. So my question is, if a picture is worth 1000 words, how many words is virtual reality worth?