Book Excerpt: What Problems Can the Metaverse Actually Solve?

August 30, 2023

Remember when Bill Gates went on “Late Night” in 1995, and tried to convince David Letterman that the Internet was gonna be big?

Nokia exec Leslie Shannon can sometimes feel that she’s playing the Gates role when she makes the case for the Metaverse’s potential — especially at a moment when Meta (the company formerly known as Facebook) has been dialing back its resources, and Apple’s much-hyped Vision Pro headset hasn’t hit the market.

But in Shannon’s new book, Interconnected Realities: How the Metaverse Will Transform Our Relationship with Technology Forever, she makes the case that we’re already seeing immersive digital worlds attract significant audiences, and a growing number of companies finding success by blending physical and digital experiences. Shannon is Head of Trend and Innovation Scouting for Nokia, the Finnish telecommunications and IT service company.

In this exclusive excerpt, she writes about what makes new technologies successful, and lays out a case for why the Metaverse will get there.

Leslie Shannon, author of Interconnected Realities, and Head of Trend and Innovation Scouting at Nokia.

• • •

What I’ve learned in the multiple decades that I’ve been working in this space is that new technology is only successful if it solves a problem without costing too much. Let’s spend some time unpacking this phrase’s implications with some examples before we connect this thought to the Metaverse.

My poster child for this statement is 3D TV. It might be fun to see “more realistic” broadcasts, but . . . is it really a problem to watch 2D images on your TV screen at home? Especially when producing 3D TV content is likely to be fairly expensive, and limited at first? The resounding answer from the global market (at least for now) has been no, watching 2D content on our TVs is not a problem that needs to be solved. 3D television has not become successful, because, in large part, it does not solve a problem.

Back in the 1980s, though, the mass market was interested in solving the problem of not being able to watch a TV show unless you were present at the one and only time it was broadcast over network TV. Enter the video recorder – except which format should you buy, the higher-quality Betamax or the lower-quality VHS? As we all know, VHS is the format that won out. But it wasn’t simply because it cost less than Betamax. A more substantial driver behind VHS’s success was that it initially offered two-hour tapes, while Betamax’s first tapes were only one hour long. The problem that the TV watchers of the world had wasn’t just that they wanted to tape one-hour shows from live TV; it was that they wanted to record TV movies – which were usually two hours long. VHS owes its eventual victory in the VCR Format Wars not to its cheaper price, but to being a better solution to the problem.

So What Problem Does the Metaverse Solve?

“Cost” can be monetary, as in the VCR example above – it certainly didn’t hurt the VHS standard that it was cheaper than the Betamax equivalent. Or in the case of my family’s revelatory experience with the iPad, we only had that experience because the iPad itself wasn’t prohibitively expensive, and we were able to buy one even without having a very firm idea about what we were going to use it for.

Cost can come in other forms, though, including time, inconvenience, frustration factor, and so on. Back in the late 1990s and early 2000s, for example, you could read your email on your mobile phone, but only if you could specify your phone’s IP address and POP server during setup. What, you don’t have that information easily to hand? No mobile email for you, then. The time and engagement cost for mobile email in those early days was just too high for it to catch on widely, which created the perfect market opening for the BlackBerry.

This now brings us to the gazillion-dollar question: What problem does the Metaverse solve?


Frankly, any conversation that we have with ourselves or others about this question is, at this particular moment of Metaverse development, very likely to sound like Bill Gates talking with David Letterman in 1995. Let’s run through a few scenarios:

Metaverse Skeptic: What is the Metaverse, exactly?

Metaverse Fan: An immersive world where you can build a digital mansion and have all your friends’ avatars come over.

Metaverse Skeptic: Does “Zoom rooms with a palace background” ring a bell?

Metaverse Fan: Yeah, but you don’t have to look like yourself. You can be a giant banana if you want!

Metaverse Skeptic: Does “Snap Camera with a giant banana filter” ring a bell?

And so on.

The reason that we’re here, the reason that I’m writing a book about the Metaverse and you’re reading it, is that somewhere in that Metaverse concept there must be some kind of a solution to some kind of problem, even if we can’t quite articulate it yet.

And there is one problem that the Metaverse directly addresses, although it’s not a problem that we talk about or even acknowledge to ourselves very often. This problem is that most of our computing is currently locked behind 2D screens. To access it, we have to engage with a computer or tablet or phone screen using both our vision and our cognitive attention, thus withdrawing our looking and thinking from the people, places, and things that surround us in the physical world.

We make jokes about restaurant outings in which every person at the table is looking at their phone screen rather than talking to each other, but at the same time it’s well-understood that distracted driving kills thousands of people annually in the United States alone [1], and a majority of US parents are concerned that their teenagers are spending too much time on screens for both social media and gaming [2]. You can probably name an incident in your own life within just the past week in which looking at a screen, rather than being present in your immediate surroundings, created a situation that caught you out socially, or made you neglect someone, or was even potentially dangerous. Yeah, I can, too. We’re all complicit in this one.

The problem is that smartphones and computers have done too well at solving the problem of delivering information and entertainment to us, exactly when and where we want it. To get this spectacular convenience, we’re prepared to pay a surprisingly high cost in terms of our connection to the people, places, and things physically around us, and it’s a cost that we’re paying quite thoughtlessly today. But make no mistake, we’re paying it.

A Union of the Digital and the Physical

Let’s now rethink what a relevant, problem-solving Metaverse could be. If one of the problems it’s solving is our overcommitment to screens, then how is an immersive digital world the answer? Well, it’s not. But imagine a spectrum of experience in which the far left-hand end is 100% physical and the far right-hand end is 100% digital. Between them lies a middle point that is 50% physical and 50% digital, with sliding proportions of digital/physical mixes on either side of it. It’s these digital/physical mixes that deserve our attention – the interconnected realities.

Interconnected Realities: Digital/Physical Fusion

Now we’re getting somewhere. This concept of the Metaverse is a world in which we can have the compelling, fascinating, relevant content that we currently access on screens, but integrated visually into our physical world in a way that enhances our lives rather than removing us from them. It imagines the digital and the physical being blended on a constantly sliding scale: sometimes we are fully immersed in a digital world, when that serves the purpose of the moment, and at other times we spend significant stretches immersed only in the physical world. But the main Metaverse action takes place in a mix of the two – activated on an as-needed basis and controlled by the end user, with the focus on digitally enhancing our experience while we stay mostly aware of, and present in, our physical surroundings and the people and things that are there with us.

This Metaverse of interconnected realities will be a place where we combine digital information (or entertainment) from the world of the Internet with our physical surroundings so that we can be more efficient, more informed, more delighted, and more aware than we are today. A simple example of this enhanced future might be a sensor in my oven that connects with my augmented reality (AR) glasses and, when the oven is on, displays its current temperature in a visual digital overlay when my gaze lingers on my oven for more than one or two seconds – useful when I’m on the other side of the kitchen. (If that example isn’t very compelling, hang on – later in the book I’m going to tell you about some really exciting mixed reality concepts that will blow your socks off.)
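For readers who like to see how such a rule might actually behave, here is a minimal sketch of the oven-overlay logic described above: show the temperature only while the oven is on, and only after the wearer’s gaze has lingered on it for a second or two. The sensor, glasses, and every name in this snippet are hypothetical – invented purely to illustrate the gaze-dwell idea, not taken from any real AR platform.

```python
# Hypothetical sketch of the gaze-dwell overlay rule from the oven example.
# All names and thresholds here are invented for illustration.
from dataclasses import dataclass

DWELL_THRESHOLD_S = 1.5  # gaze must linger ~1-2 seconds before the overlay appears


@dataclass
class OvenState:
    is_on: bool
    temperature_c: float


class GazeDwellOverlay:
    """Returns a temperature label only after the user's gaze has rested
    on the oven long enough, and only while the oven is actually on."""

    def __init__(self, threshold_s: float = DWELL_THRESHOLD_S):
        self.threshold_s = threshold_s
        self.dwell_s = 0.0

    def update(self, gaze_on_oven: bool, oven: OvenState, dt_s: float):
        # Accumulate dwell time while the gaze stays on the oven; reset otherwise.
        self.dwell_s = self.dwell_s + dt_s if gaze_on_oven else 0.0
        if oven.is_on and self.dwell_s >= self.threshold_s:
            return f"Oven: {oven.temperature_c:.0f} C"
        return None  # no overlay: oven off, or glance was too brief


overlay = GazeDwellOverlay()
oven = OvenState(is_on=True, temperature_c=180.0)
label = None
for _ in range(20):  # simulate 2 seconds of steady gaze at 10 Hz
    label = overlay.update(True, oven, 0.1)
print(label)  # prints "Oven: 180 C" once the dwell threshold is crossed
```

The point of the dwell threshold is exactly the philosophy of this chapter: the digital layer stays out of the way during an incidental glance, and appears only when the user’s attention signals that it is wanted.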

This, then, is my definition of the Metaverse:

    The Metaverse is a partly- or fully-digital experience that brings together people, places, and/or information in real time in a way that transcends that which is possible in the physical world alone, in order to solve a problem.

This definition covers the “two people meeting as avatars in a virtual reality space” concept that most people think of when they think of the Metaverse, as well as the very simple oven example that I just gave. In the case of the oven, it’s a partly-digital experience that connects me with the (invisible) current temperature of my oven, from any distance away. The problem that it solves is me wondering how much longer it’s going to take before my oven reaches the temperature I need to cook dinner, without having to walk over to it and peer at its small, poorly-illuminated readout. A minor problem, to be sure, but sometimes it’s the solutions to the little irksome things in life that bring unexpected delight.

Noted Metaverse thinker and author Matthew Ball has said, “If there’s any aspect of the Metaverse on which everyone . . . can agree, it’s that it is based on virtual worlds.” [3]

Well, no, I don’t agree – which just goes to show how fluid the definition and understanding of the Metaverse are in these early days. Purists from Ball’s camp will object that the inclusion of “partly-digital experiences” deviates from the “immersive virtual world” Metaverse concept suggested by early fictional visions like Snow Crash and Ready Player One, and they’re right. And that’s okay [4].

If we’re going to build the “next iteration of the Internet” and have it be an actual successful technology, it does us no harm whatsoever to thoughtfully expand our definition so that it includes the ways in which adding digitalized content to our lives will transform our experience at every point of the day, not just those moments when we’re sitting in front of a computer or wearing a virtual reality (VR) headset. The Metaverse is a new paradigm for how humans and computers will interact, not a specific digital world.




[3] Matthew Ball, The Metaverse: And How It Will Revolutionize Everything (New York: Liveright Publishing, 2022), 29.

[4] This was recognized by Merriam-Webster when they added the word “Metaverse” to their dictionary in September 2022. Their definition of the Metaverse is of a “persistent virtual environment that allows access to and interoperability of multiple individual virtual realities.” No immersion required.

Featured image by James Yarema on Unsplash.