
Is Augmented Reality the Next Disruptive Technology?

By Alexander MacDougall | December 17, 2021

The acronyms AR, XR, and MR — that’s augmented reality, extended reality, and mixed reality, to the uninitiated — are all being tossed around in the tech sector. But fundamentally, all three terms refer to superimposing layers of digital information and imagery onto what we used to call the “real world.”

A new book from entrepreneur and former Warby Parker VP David Rose tries to make sense of this new landscape. We’ve already seen applications of AR in games like the popular “Pokémon Go” mobile app and in the Snapchat filters people use to play around with friends. But how else might it be applied to other facets of everyday life?

David Rose’s new book, “SuperSight: What Augmented Reality Means for Our Lives, Our Work, and the Way We Imagine the Future.”

Rose tackles that question in “SuperSight,” which looks at potential uses everywhere from schools to hair salons to architecture firms. He also explores some of the ethical dilemmas that may emerge, such as how to protect privacy rights when everyone has a camera attached to their glasses.

Rose sat down with InnoLead to share a few insights from his new book, including how large organizations should be approaching this technology. He’ll be among the featured speakers in January at our Impact Reconnected online conference.

With all this talk of the metaverse now, it would seem like your book is more relevant than ever.

I would draw a distinction between the metaverse [and augmented reality]. I think most people conceive of the metaverse as a shared VR environment, and what I’ve written about is using computer vision in order to understand the world around us and also superimpose virtual layers on the real world.

My conception, and what I really focus on, is augmented reality or mixed reality. And because the nomenclature is so confused, like AR, XR, mixed reality, there’s always this scramble for any new category. I decided it would just be easier to say “SuperSight,” because I feel like that is the human benefit or value that we’re getting from this technology. The cameras are getting smaller and cheaper, and able to be pasted into doorbells and ovens and everywhere else. And that has profound implications for work and life, but also these cameras are getting pasted onto the temples of our glasses in order to see what we see. [That enables] other people [to] see what you see, coach you, and algorithms can also see what you see and see if you’re paying attention.

Outside of games like Pokémon Go, what are some current examples we’re seeing of this SuperSight being implemented?

David Rose, Author, “SuperSight”; former VP of Vision Technology, Warby Parker

For me, the most important thing that this technology should help us do is, it should help us with this kind of lack of imagination problem that a lot of us have. The reason that we hire an interior designer, a landscape designer, a city planner, an architect, is because it’s hard for us to see the vision that these people see. We’d love to have that skill — to be able to look at an empty parking lot in Southie and be like, “I can see this would be an amazing pocket park,” or, “This would be amazing event space.”

There’s a company in New York called In Situ that is helping architects with this problem of how you show people the plans for a building or a park that’s going up, and then gather community input. Traditionally, that’s just been done in a town meeting, where very few people show up. It’s not a representative set of people, and mostly they look at these kinds of optimistic sketches that aren’t really in context and give feedback. It’s a really important step in the approval process, but it usually goes badly, and they don’t get enough voices. Instead, if you can just hold up your phone and, instead of seeing a Pokémon character, you see the new building, and then you can create a quick selfie video about your reactions to it, that’s a good way to get community input.


How relevant is this new technology to larger businesses and organizations?

Not to be too prescient, but I really believe that what the web was in the 90s and what mobile was in the aughts, this is what AR will be in the next decade. In the last 10 years, voice interfaces have been the new way to interact, whether that’s on mobile or in your car, in your kitchen. I think AR is inevitably the next platform for computing, which means it’s a new way of interacting with customers. If you’re selling any physical products, what companies like Wayfair and IKEA have discovered, and many others like Nike and Reebok, is that if you can show people your products, not just in 2D but in 3D and in the context in which they’re going to be used, it cuts down on returns and increases conversion rates. You can size the couch for your living room, you can do the visual matching.

What are some of the big ethical issues regarding the new technology?

I call them the hazards of SuperSight, and there are six of them in the book. I try to outline what I think the problem is, and also how we might do something about it. The first I call social insulation: if everyone sees their own personal privatized view of the world, then it’s “filter bubble” problems all over again, except even larger. I think there’s actually an opportunity for design there, where you should be able to see that someone is seeing something different, or see that somebody is recording something, and then there should be a way to synchronize what you’re seeing with what they’re seeing.

Featured photo by Patrick Schneider from Unsplash
