When we think about the technological future, we so often envision a dystopian reality. It doesn’t have to be that way. Tactile sensations, audio and hearables, and digital smell and taste have the potential to help the blind “see”. They can bring us closer together, letting us send the smell of roses to a loved one.
No matter what we can or can’t do with augmented reality, it’s clear that AR is more than the digital laid over reality. AR, at its best, disappears around us. It is three dimensional in how we interact with it. Augmented reality becomes something we sense. We interact with it without thinking.
The future of AR depends on breaking “the glass”
Before we get to AR seamlessly embedded in our reality, we first have to move away from experiencing AR through “the glass” — that is, your iPad or smartphone. Papagiannis quotes Bret Victor, author of A Brief Rant on the Future of Interaction Design, who says iPads and smartphones “sacrifice all the tactile richness of working with our hands” (Papagiannis 25).
The future of interaction is “a dynamic medium that we can see, feel, and manipulate” (Papagiannis 25). What Victor means is that we could feel the grass in the movies we watch or feel the tiny explosions in the mobile games we play. Whatever we’re watching or playing becomes more real because we can both see and feel it. Not only could we feel through the screen, we could also use our hands to “manipulate and pull virtual objects and data directly out of a 2-D display and into the 3-D world” (Papagiannis 27).
I’m glad people are developing ways to work and interact with the digital in 3-D. At VR Day Atlanta, Dr. Grace Ahn talked about how kids, when placed in virtual reality, innately draw all around themselves. For instance, when asked to draw an igloo, they drew themselves inside a 3-D igloo. Adults, on the other hand, tried to draw a traditional 2-D igloo in 3-D VR. Being able to mold digital 3-D objects, scenes, and data will train us to work, feel, and experience in 3-D.
Augmented smell. I can already “smell” trolls sending fart scents to each other on the internet. Virtual smell is nothing new: the technology behind Smell-O-Vision debuted at the 1939 New York World’s Fair. In 1960, the first Smell-O-Vision film, A Scent of Mystery, was released in theaters. Special piping was attached to the theater seats to emit certain smells at the right moments.
“AR could also be in danger of becoming a gimmick if the focus remains on the technology rather than on delivering an impactful and compelling experience.” – Helen Papagiannis
Smell-O-Vision had its ups and downs. Now, scientists and technologists think it has a better chance if it’s personalized, such as using smell as “a nonverbal method of communication” (Papagiannis 59). Instead of a sound notification, you would smell a notification: a Facebook “like” would be a different smell than a new email. I’m not sure I want a bunch of different smells in my house, and I can’t see myself wearing a personal scent device up my nose either.
“Digital Smellscapes”, as Papagiannis calls them, are an interesting augmentation idea. I found it fascinating to read how one smell device, eScent, is used as a diagnostic tool. Doctors can use eScent for “neurodegeneration purposes by preprogramming a personalized timed-release scent system for a patient” (Papagiannis 63). This can help doctors test patients for early Alzheimer’s or Parkinson’s disease. “Smell technology…can have a real impact on the healthcare industry”, which I think is pretty cool (Papagiannis 63).
This post is part of the VR Book Club Series. This month we’re reading Augmented Human by Helen Papagiannis.
*Disclaimer: Lilyotron is a participant in the Amazon Services LLC Associates program. Each of your purchases via our Amazon affiliation links supports this site at no additional cost to you.