Everything You Wanted to Know About the Metaverse (But Were Afraid to Ask)

Episode ID S1E08
June 8, 2022

The metaverse is the next frontier in technology: personalized experience in digitally enhanced physical spaces with augmented and mixed reality. In other words, the world becomes your desktop or your home screen. What does this mean to enterprises, consumers, and communication providers? CoBank’s Jeff Johnston breaks it down with Qualcomm’s Brian Vogelsang, the company’s metaverse product management leader. 

Transcript

Jeff Johnston: Hello there and welcome to the All Day Digital podcast, where we talk to industry executives and thought leaders to get their perspective on a wide range of factors shaping the communications industry. This podcast is brought to you by CoBank’s Knowledge Exchange group and I am your host, Jeff Johnston.

On today’s episode, we hear from Brian Vogelsang, senior director of product management at Qualcomm, who helps us understand the metaverse and what it means to enterprises, consumers, and communication providers. Some of the largest technology companies in the world are investing billions of dollars into creating the metaverse. For example, social media giant Facebook recently changed its name to Meta and pledged to invest at least $10 billion per year into building the technology.

Qualcomm is playing a key role in the metaverse, and given that Brian is leading the company’s product development efforts in this area, we thought he’d be a great person to help us understand the technology. And spoiler alert, he didn’t disappoint.

So, without any further ado, pitter patter, let’s hear what Brian has to say…

Jeff: Brian, welcome to the podcast. Thanks for being here.

Brian Vogelsang: Thanks so much, Jeff. Great to be here.

Jeff: Great. Let's talk metaverse here. I think when we think about the metaverse it's a really broad category with a lot of different applications. I think, for today, Brian, what I'd love to do is focus on the metaverse from an augmented reality and virtual reality standpoint. Maybe help the listeners understand, what is the metaverse from that perspective, and why should we care?

Brian: Fantastic. We at Qualcomm truly believe in the potential of the metaverse. We've actually been investing in the underlying technology to enable the metaverse for over a decade, and we'll continue to do so to help our partners build it and realize its full potential. Our technology spans machine learning, connectivity, computer vision, and graphics processing, as well as augmented and virtual reality technologies, which I think are really important for bringing the metaverse to life.

When we think about it, it's really a persistent spatial internet: personalized, social, digital experiences that are going to span both our physical and virtual worlds. Today, with technologies like virtual reality, you can really immerse yourself in those virtual spaces and visit them as an avatar, but ultimately it's going to evolve to digitally enhanced physical spaces as well, with augmented and mixed reality. This is not new to Qualcomm; we've been investing in it for many years, and we're just really excited to see it now coming to the forefront of the discussion.

Jeff: Let's drill down on that a little bit more. Is there going to come a day when I can wear a pair of glasses and, instead of watching the Super Bowl on the 65-inch flat-panel display in my living room, actually watch TV through my glasses? Is that the kind of thing we can expect people to be doing in the metaverse?

Brian: For sure. If we think about how digital information has been consumed over the past few hundred years, it's been primarily in 2D, from pen and paper and paintings through to computing. Computer interfaces have evolved: we've gone from mice and keyboards, to touch interfaces on our smartphones, to now using our voice to communicate with computers. But the way information is displayed has primarily stayed the same. It's two dimensions on a rectilinear screen, whether that's a television, a monitor, a laptop, or a smartphone.

One of the things that I think is really key with the metaverse is that we'll be evolving to spatial computing. This is basically the screen disappearing, and the world becomes your desktop or your home screen. When it comes to computing in the real-world metaverse, we're definitely going to see use cases like you described, where you can put a virtual screen anywhere.

That might be a 2D screen, or, even more important, I think the way content is created is going to change. Instead of being built for two-dimensional viewing, it's going to be immersive 3D content that lives in the real world and that you interact with in 3D.

Jeff: Wow, that's really interesting. Could we eventually live in a world where we don't use smartphones anymore, and instead are consuming all of our content and communicating through headsets? 

Brian: I think it's going to take some time for this technology to become pervasive. In terms of evolutions of computing, this is going to be an extremely large one. Look at PC computing: we went to the desktop, then we went to the cloud, then we went to mobile computing, and spatial computing is the next evolution of how we'll compute.

I think that once you experience these things in 3D, in the real world or in a simulation like virtual reality, there are so many inherent benefits that, for many experiences, that will become the way we want to use them.

Perhaps the kids born today, by the time they're teenagers, won't want to use a social media product like, let's say, TikTok if it's not a 3D immersive application. If it's on a 2D smartphone screen, that's not going to feel like the right way to experience those things. Is the smartphone going to go away? I'm not sure it is. But I think the kinds of applications people want to use are increasingly going to become immersive, 3D, and live in the real world.

That means we'll perhaps be using those phones less over time. This is going to take decades, I think, to happen. Look at smartphones versus PCs: we still use both. We compute today with our smartphones and we compute with our PCs; our PCs aren't gone. We still use them for certain use cases, but more of our time is spent on the smartphone. I think we'll see a similar pattern repeat itself with these augmented reality glasses.

Jeff: I think that's a really good comparison. And while a lot of this sounds so futuristic, you've got heavyweights like Qualcomm investing the money they are in the metaverse, you've got Facebook, who very publicly, of course, changed their name to Meta and is spending $10 billion-plus a year on the metaverse, and I'm sure Apple and Google and others are making similar investments.

When that kind of money is being thrown at a new technology, we've certainly got to pay attention, because there's obviously something there.

Brian: For sure. I think it's not a question of if, it's just a question of when. When will these technologies be in the mass market and in consumers' hands? It's already starting to happen in enterprise and in consumer virtual reality, but I think, with time, it's an inevitability.

Jeff: Actually, just real quick on that, you mentioned enterprise. When you think about the adoption of these technologies, should we expect to see this adopted primarily in the enterprise first and then make its way to consumers, or do you see parallel tracks here?

Brian: It's a great question. If we look at VR and AR, we have to look at enterprise and consumer separately in each of those categories. Of course, consumer VR is available today. People are using it for gaming and entertainment, for social, for fitness, for viewing concerts, and other things. On the enterprise side of VR, this is an area of the market we're really excited about. We see training as an area where the industry has achieved great product-market fit with virtual reality technology.

You can imagine, if you're trying to learn a hard skill or a soft skill, being able to immerse yourself in a simulation and learn in an embodied way; it just triggers the learning process in a different way.

It feels like you're experiencing it in real life, whether that's something in a corporate setting, like diversity and inclusion training, or learning a hard skill, like how to assemble or operate a piece of machinery or equipment that might be too expensive to shut down to train you on. In a simulation, the costs are lower, and people retain the information and learn as if they'd been working with the real thing.

It's really interesting to see the commercial use cases of VR start to flourish. On the commercial AR side, we see devices like the HoloLens 2, and we see what we call assisted-reality devices. That's more of an AR experience where you're wearing a camera and a display and you're doing things like getting guided work instruction, where it walks you through step-by-step instructions for a specific task, or remote mentorship.

If you're in the field and you need help with something, you can call back to headquarters, they can see what you're seeing through the eyes of that camera on your head, and then they can walk you through it or annotate on the real world, that sort of thing. That's where we're seeing commercial AR uptake. Then on the consumer AR side, this is very nascent. That's actually the part of the market Qualcomm is working to help accelerate: how do we get to a point where one day we can all have access to consumer AR glasses?

Jeff: Wow. Based on what you just said, it's amazing how many different applications and use cases there will eventually be. That's really exciting. I want to talk a little bit about what Qualcomm is doing, because Qualcomm has really been at the center of mobile, a key technology enabler, since the mid-90s. Pretty much every phone out there has some Qualcomm technology in it. No doubt Qualcomm is playing a critical role in the development of this new technology.

Maybe you can help listeners understand a little bit about what Qualcomm is doing to help make all of this reality.

Brian: One of the things we're best known for are the chips, the systems-on-a-chip, that we build, and these really power the devices at the platform level. We deliver the hardware: the graphics processing, the AI hardware blocks, the computer vision blocks that allow these devices to be built. Think about the Meta Quest or devices like the HoloLens 2; those are built on Qualcomm's chipsets.

Then we also provide the persistent connectivity to the metaverse. If you're connecting one of these devices to a digital twin, or you need to be connected to an augmented reality map of the world you're interacting with, that's done through wireless, and Qualcomm is a leader, of course, in 5G and Wi-Fi. But then we're also enabling the core technologies in augmented and virtual reality.

These are graphics and spatial audio, and the perception technologies. When you put these headsets or glasses on, they need to be able to perceive the world around you and understand it, whether that's, let's say, tracking your hands, or tracking your body and its physical location in a room, or understanding the physical geometry of the room, so that if you're placing augmented content into it, it's oriented in the room in a way that feels real.

We're investing heavily in these kinds of perception technologies, enabling the foundational building blocks that others build upon to create these devices, and then also in Snapdragon Spaces, our developer platform. Between the devices, the chips, the connectivity, and the core technologies, we see Qualcomm as the ticket to the metaverse, because we're enabling all the foundational technology and the end devices that allow people to experience it.

Jeff: Brian, I'd like to transition a little bit here and talk about the networks that need to enable all of this to come to life. We've talked about the chipsets and the devices and some of the applications, but we need wireless networks, and we need wireline networks, fiber, and so forth to bring all of this connectivity together. That connectivity obviously exists today, but I'm curious to get your thoughts.

As we think a few years down the road, it seems like these applications are going to be pretty bandwidth-intensive and require fairly low latency. I guess, A, is that true? Then, B, help us think through what all this means for wireless and wireline operators. Are they ready for this? Do they have the right architecture in place to support what sound like bandwidth-intensive, low-latency applications? I'd love to get your thoughts on that.

Brian: Yes, it's absolutely the case that this technology will leverage the networks and the capabilities we get from 5G in terms of latency and throughput. Think about what we're trying to do with these augmented and virtual reality headsets: make them smaller. Today, virtual reality headsets are 300 to 400 grams, and augmented reality headsets are maybe 130 to 300 grams. A normal pair of glasses is 30 grams.

We want to reduce the size of these things, and in order to do that, we need to be very power-efficient and deal with the heat that all the components, like the displays, generate. To get to the smaller form factors, we need to be thinking about where we should do the processing. One of the areas we have been working in is distributing the processing between the glasses and the smartphone, and between the smartphone and the edge of the network.

I think that operators will play a critical role in enabling the edge computing resources, particularly around graphics and rendering to allow these devices to become smaller and offload some of that processing to the network. We're doing this today, we're seeing this in VR, and I think in augmented reality we're also going to see it.

It's really about doing the right processing on the right part of the system, and that may be partially in the glasses, partially in the phone, and then at the edge of the network as well.
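To make that distribution concrete, here is a minimal back-of-envelope sketch in Python of how the per-frame latency budget of such a split pipeline might be tallied. The stage names, placements, and millisecond figures are illustrative assumptions, not Qualcomm's actual architecture.

```python
# A minimal sketch of budgeting per-frame latency across a split
# AR pipeline. Stage names, placements, and millisecond figures are
# illustrative assumptions, not Qualcomm's actual architecture.

PIPELINE = [
    # (stage, where it runs, rough per-frame budget in milliseconds)
    ("head/hand tracking (perception)", "glasses",       2.0),
    ("scene understanding",             "glasses",       3.0),
    ("application logic",               "phone",         2.0),
    ("3D rendering",                    "phone or edge", 6.0),
    ("encode + wireless transport",     "link",          4.0),
    ("decode + display correction",     "glasses",       3.0),
]

def total_motion_to_photon_ms(pipeline):
    """Sum the per-stage budgets to check the end-to-end target."""
    return sum(budget for _, _, budget in pipeline)

if __name__ == "__main__":
    total = total_motion_to_photon_ms(PIPELINE)
    # A common rule of thumb for comfortable XR is ~20 ms motion-to-photon.
    print(f"budgeted motion-to-photon: {total:.1f} ms (target: ~20 ms)")
```

The point of a table like this is that moving the rendering stage off the glasses only works if the added encode-and-transport cost still fits inside the same overall budget.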

Jeff: I get the edge computing part of it. Do you think this also means that wireless operators need to start thinking about making their networks denser, I guess, from an access point standpoint? Do we need to think about higher bands of spectrum with small cells and more access points to address some of the latency requirements? How do you think about all that from a broader architecture standpoint for these operators?

Brian: Yes, we've done some experimentation where we deployed proofs of concept using VR headsets on private networks, where the antennas are closer to the device, using unlicensed spectrum and, in some cases, licensed spectrum. Latency is really the key thing to maintain. Throughput is important, but I would say latency is even more critical for this kind of thing.

Think about what we're doing if we want to render off the device. These devices track the user's head position and orientation, potentially their eyes as well, and, if they're holding a controller or using their hands, we're tracking those too. All that information needs to be sent up to an edge graphics processor, rendered, encoded, and streamed back.

In that low-latency loop, when the frame gets back to the headset, we need to adjust for any changes the body, the eyes, or the controller or hands have made during that round trip. We do some things with prediction technologies to account for that and make the latency seem lower, but at the end of the day, you can't have low enough latency for this kind of use case. If you do have latency, then what you visually see isn't going to be correct, and that can translate to feeling nauseous, or the experience just isn't very good.

I think edge computing is really important, and being able to have lower-latency connections, and the topology of the network, how things get deployed, is definitely something we need to be thinking about.
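As an illustration of the prediction Brian mentions, here is a minimal Python sketch that extrapolates the head pose forward by the measured round-trip time before it is sent to the remote renderer. Production systems also reproject the returned frame (for example, timewarp); the constant-velocity model and every number below are illustrative assumptions.

```python
# A minimal sketch of the pose prediction Brian describes: extrapolate
# the head pose forward by the measured round-trip time, so the frame
# rendered at the edge matches where the head will be when it arrives.
# Real systems also reproject the returned frame; the constant-velocity
# model and all numbers here are illustrative assumptions.

import numpy as np

def predict_pose(position, velocity, yaw_deg, yaw_rate_deg_s, round_trip_s):
    """Extrapolate position and yaw forward by the round-trip time."""
    predicted_position = position + velocity * round_trip_s
    predicted_yaw = yaw_deg + yaw_rate_deg_s * round_trip_s
    return predicted_position, predicted_yaw

# Example: 40 ms round trip, head turning at 120 deg/s while walking.
position = np.array([0.0, 1.7, 0.0])   # meters
velocity = np.array([0.5, 0.0, 0.0])   # meters/second
pred_pos, pred_yaw = predict_pose(position, velocity,
                                  yaw_deg=30.0, yaw_rate_deg_s=120.0,
                                  round_trip_s=0.040)

# The pose sent up to the edge renderer. At 120 deg/s, a 40 ms round
# trip is 4.8 degrees of head rotation, clearly visible if uncorrected.
print(pred_pos, pred_yaw)
```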

Jeff: I guess I would think that from an enterprise standpoint, the path might be a little bit easier, because they can build private 5G networks and really customize their network to make sure the latency needs are met and people aren't getting vertigo and things like that, right?

Brian: Yes. Those are the kinds of trials we've done, in more of a controlled, private-network-type environment. As we get to wide-area augmented reality, where, let's say, we're using augmentations on the real world and we're out and about living our daily lives, then we're going to need wide-area, low-latency connectivity as well if we want to offload more of that intense graphics processing to the network.

Jeff: Got you. Hey, let's jump back quickly to the device side of things for a second. I'd love to get your thoughts on batteries. Hear me out here. When I think about these AR/VR glasses or headsets, it sounds like the processing capabilities in these devices are going to be pretty powerful from a graphics-rendering standpoint and so forth. Obviously, Snapdragon processors are quite powerful.

Rumor is that Apple is going to put their M1 chip, which is currently in the Mac, inside whatever version of AR glasses they do. That seems like a pretty powerful chip to go inside a pair of glasses. Brian, my question is: do we have the battery density today to do two things? One, allow manufacturers of these goggles or glasses to make stylish-looking things that aren't big and bulky. Can we get a battery small enough to fit inside to make that happen? And two, is that battery able to support what I think are going to be pretty intensive graphics-processing applications, which tend to be, I think, a bit of a power hog?

Can we meet those two requirements without users having to reach for the power cord every couple of hours to recharge their glasses? Because it feels to me like that's a pretty big challenge, but maybe I'm not thinking about it the right way.

Brian: No, it's a huge challenge. There's dealing with the thermal limitations as we get to smaller and smaller glasses: how do we dissipate that heat load? And the display and optics technology require power; they're power-hungry. One of the things we're spending a lot of time thinking about right now is how the smartphone and the glasses work together to deliver an experience that is distributed across those two devices.

If we take a pair of AR glasses today, we'll put a processor inside the glasses with wireless connectivity, in the future, to the smartphone, and this will allow the smartphone to handle some of that rendering workload with its larger battery, so that you're not as reliant on the GPU in the glasses themselves for the rendering pipeline. The processor in the glasses can handle the perception-type workloads: understanding the environment around you, tracking the hands or the eyes, that sort of thing.

The actual graphics-processing lift would happen either on the phone or at the mobile edge. We're spending a lot of time now thinking about how we do that remote rendering, and then it's all about power; it's all about how to balance that system. It's also about wireless. We mentioned previously that we've done these proofs of concept with low-latency wireless connections for VR headsets, and it's a question of what data you transmit between the glasses and the phone, or the glasses and the phone and the network, and how you optimize that system so that you're not transmitting everything. You want to transmit as little as possible and still get the experience you need.

I think that it comes down to systems design. It comes down to thinking about power in every design decision that's being made.  

It's about distributing the processing and moving more of the workloads into hardware. That may get us to smaller glasses with longer battery life. Now, the battery technology needs to improve as well; we can't stop there. But then it's always a trade-off: where do you put the battery? If it's a VR headset, you have more flexibility because it's a little bit bigger, but if it's AR, you typically have to put the batteries in the sides of the glasses, and that makes them thicker.
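To see why transmitting as little as possible matters, here is a quick back-of-envelope calculation in Python comparing raw eye buffers, a compressed video stream, and the small pose packets a headset sends upstream. Every number here (resolution, refresh rate, compression ratio, packet size) is an illustrative assumption, not a measurement.

```python
# A back-of-envelope look at why "transmit as little as possible"
# matters for remote rendering. All numbers (resolution, refresh rate,
# compression ratio, pose packet size) are illustrative assumptions.

width, height = 1920, 1920    # per-eye resolution (assumed)
fps = 72                      # refresh rate (assumed)
bytes_per_pixel = 4           # uncompressed RGBA

# Streaming raw eye buffers for both eyes, in bits per second:
raw_bps = width * height * bytes_per_pixel * fps * 2 * 8
print(f"raw eye buffers: {raw_bps / 1e9:.1f} Gbit/s")   # ~17 Gbit/s

# A hardware video encoder might compress that on the order of 150x:
encoded_bps = raw_bps / 150
print(f"encoded stream:  {encoded_bps / 1e6:.0f} Mbit/s")

# Upstream, the headset mostly sends small, frequent pose updates:
pose_bytes, pose_hz = 64, 500   # assumed packet size and send rate
pose_bps = pose_bytes * pose_hz * 8
print(f"pose upstream:   {pose_bps / 1e3:.0f} kbit/s")
```

The asymmetry is the design point: a compressed downstream video link plus a tiny upstream pose channel is what makes offloading rendering to the phone or the edge plausible at all.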

Jeff: Yes, they've got to look nice.

Brian: Or then you end up requiring more wires.

Jeff: Yes, wow. Geez. A lot of moving parts between the device and the network. That's really something.

Brian: That's right.

Jeff: Well, Brian, we've covered a lot today, so this has been just fantastic. Is there anything that I didn't ask? I'm sure we could go on and on forever, but are there any other salient points you'd like to make, or do you think we covered the main things?

Brian: I think we did. I think that we're just really excited with the momentum that we're seeing in the market now around augmented and virtual reality. We're really excited about the role Qualcomm plays in enabling the ecosystem, with technologies like Snapdragon Spaces and by working with developers.

Jeff: Yes. Well, it feels like the old adage of it takes a village applies here, and you guys are certainly leading the charge, so that's great. Can't wait to see how this all plays out.

Brian: Awesome.

Jeff: Great. Well, Brian, thanks so much for being here. This is fascinating. I'm a technology geek, so I'm really excited to see how this all evolves and we'll be watching closely. Thank you so much for being with us here today.

Brian: Thank you, Jeff. I appreciate it.

Jeff: I know it’s difficult to think about watching the Super Bowl through virtual reality glasses instead of on your 75-inch 8K TV, or beaming a hologram of grandma and grandpa into your living room. But as Brian mentioned, this isn’t a question of if, it’s a question of when. And when you have tens of billions of dollars being thrown at this by some of the largest and most influential tech companies in the world, well, history would suggest we shouldn’t be too dismissive. And of course, all of this promises to have a profound impact on communication networks, given the bandwidth-intensive nature of these applications.

Hey, thanks for joining us today and watch out for our next episode of the All Day Digital podcast.

Disclaimer: The information provided in this podcast is not intended to be investment, tax, or legal advice and should not be relied upon by listeners for such purposes. The information contained in this podcast has been compiled from what CoBank regards as reliable sources. However, CoBank does not make any representation or warranty regarding the content, and disclaims any responsibility for the information, materials, third-party opinions, and data included in this podcast. In no event will CoBank be liable for any decision made or actions taken by any person or persons relying on the information contained in this podcast.
