The Extended Reality Spectrum Delivers a Variety of Metaverse Experiences

On its face, the term metaverse implies a single place, which might suggest that everyone must experience it in the same way. Yet the reality is that users can and should experience the metaverse in different ways.

And when one closely examines Extended Reality (XR), the core technology that most people use to access the metaverse, it quickly becomes apparent that there is a spectrum of XR technologies, each of which delivers a different metaverse experience.

This XR spectrum includes:

  • Virtual reality (VR), which attempts, as much as possible today, to plunge users into an all-digital, fully immersive metaverse experience.
  • Augmented reality (AR) and mixed reality (MR), whereby the metaverse augments or is mixed with the user’s experience of physical reality.
  • Assisted reality, whereby small, bite-sized amounts of digital content provide the user with an intentionally limited metaverse experience designed to assist them in the completion of real-world tasks without interfering with their situational awareness. 

This spectrum of XR technologies allows digital transformation leaders to develop and deploy a wide variety of metaverse applications, and ensure that each one delivers its user – be they a CEO in an office, business line director in a meeting room, or frontline worker in the field – a metaverse experience that engages, empowers, and elevates them, without compromising workplace safety.

The Extended Reality Technology Spectrum

Information Density and Environmental Intensity

In examining which XR technology is best suited for a particular use case, two terms – information density and environmental intensity – can help those developing and deploying metaverse applications determine which XR technology will best allow them to achieve their business objectives. 

Information density refers to the richness and detail of content delivered by an XR experience to the user. For example, VR technologies use binocular streaming of immersive video that fully occupies a user’s field of vision to deliver high information density. AR and MR technologies that mix digital content with a user’s view of their physical reality have a lower level of information density. Assisted reality technologies that allow users to see or glance at digital content while otherwise viewing all of their physical environment have an even lower level of information density.

 XR technologies with high information densities provide users a more immersive experience, enabling these users to gain a fuller and more detailed understanding of complex digital data and other information. However, at the same time, XR technologies with high information densities demand higher levels of attention and information processing from the user than those with lower information densities. In fact, technologies at the very far end of the XR technology spectrum are so informationally dense that they can practically block out information from physical reality, greatly limiting the user’s situational awareness, which can compromise workplace safety.

Environmental intensity refers to the risk that the user’s physical environment:

  • Could damage either the user or their XR device;
  • Might require the user to operate their XR device for an extended period of time far from a power source; or
  • Presents health and safety risks that require the user to wear certain types of personal protective equipment (PPE) or possess a certain ongoing level of situational awareness.

Examples of places with low environmental intensities include an office in a commercial building, a conference room in a hotel, or a trailer outside a construction site. Meanwhile, places with high environmental intensities include an operating room in a hospital, a garage in an automobile repair shop, the assembly line in a factory, an active construction site, or the top of a high voltage transmission tower or wind turbine.

As we examine the spectrum of XR technologies, we will generally see that metaverse use cases that require high information density are best used in locations with low environmental intensity. Meanwhile, metaverse use cases that take place in areas with high environmental intensity generally require technologies with a lower information density.
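The matching rule above can be sketched as a simple decision heuristic. This is illustrative only: the article does not define a formal model, so the function name, the 0-to-1 scores, and the thresholds below are all hypothetical.

```python
# Hypothetical sketch of the article's matching rule: high information
# density pairs with low environmental intensity, and vice versa.
# Scores and thresholds are invented for illustration.

def recommend_xr(information_density: float, environmental_intensity: float) -> str:
    """Map a use case (both inputs scored 0.0-1.0) to a region of the XR spectrum.

    information_density: how rich and immersive the digital content must be.
    environmental_intensity: how hazardous or demanding the physical setting is.
    """
    if environmental_intensity >= 0.7:
        # Dirty, dangerous, loud settings: keep the field of vision clear.
        return "assisted reality"
    if environmental_intensity >= 0.3 or information_density < 0.7:
        # Moderate surroundings, or content meant to blend with the real world.
        return "AR/MR"
    # Safe, controlled spaces where full immersion adds the most value.
    return "VR"

# Example: remote expert guidance on an active factory floor.
print(recommend_xr(information_density=0.4, environmental_intensity=0.9))
```

Under this sketch, a fully immersive training simulation run in a classroom scores high on information density and low on environmental intensity, landing it in VR territory, while the factory-floor example above resolves to assisted reality.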

Virtual Reality — High Immersion, Low Situational Awareness

At one end of the XR technology spectrum lies VR, which delivers highly immersive metaverse experiences in which the user primarily sees and hears what is in the digital world. VR technologies, like the Oculus Quest (or the newer, rebranded Meta Quest 2), have high information densities that allow them to immerse the user in a virtual world, so that everything they see (and most of what they hear) are images or sounds generated by the VR technology. The ultimate virtual reality is exemplified in the Ready Player One movie, in which the users live their lives almost completely in a virtual world (until they are liberated from its all-consuming control).

Current technologies might not be able to mimic the virtual reality depicted in TV shows or movies, but they can provide us with visual and audio experiences that allow us to pretend we are loading containers onto a ship as a crane operator, test-driving a new vehicle design, or virtually repairing a complex piece of industrial machinery.

VR’s high information density makes it ideal for gaming, entertainment, education, training, and simulation use cases in which any level of awareness of one’s actual physical environment lessens the value of the experience. However, at the same time the fully immersive experiences delivered by high information density VR technologies make them a poor choice for use cases where there is any environmental intensity at all. In fact, as those who have seen people using VR headsets know, tables, walls, and other objects that are usually safe can suddenly become a danger when someone loses most or all of their situational awareness in a virtual world.

This is the main reason why the use of VR in places with even moderate levels of environmental intensity – a city street, a bustling kitchen, a busy warehouse – should be prohibited or aggressively avoided. Moreover, use cases in these places often call for the user to augment or mix their experience of physical reality with digital information, or simply to use digital information to assist with a task. Yet because VR occupies most or all of the user’s attention and information-processing capacity, it leaves the user with little to none to devote to the physical world.

Augmented and Mixed Reality – Some Immersion, With Some Situational Awareness

In the middle of the XR technology spectrum lie AR and MR, in which the user augments or mixes their experience of the physical world with video, images, audio, and other digital information. AR and MR technologies, like Microsoft’s HoloLens 2, do not provide high enough information densities for a user to totally immerse themselves in the metaverse. Rather, they tightly integrate the metaverse with the physical world, delivering the user a hybrid physical and virtual experience.

There are many use cases in which the integration of the physical world with the metaverse can be valuable – a gaming application that allows children to discover and interact with digital pets in their neighborhood, an entertainment application that shows a tourist in Rome what an ancient event in the Coliseum would have looked like from where they currently stand, a training application that overlays repair steps and tips on top of the physical object that the student is learning to fix, or an e-Commerce application that allows someone to see how new furniture would look in their home.

However, while AR and MR are well suited for gaming, entertainment, training, and e-Commerce use cases like these, there are still many use cases where high levels of environmental intensity demand that users have a full field of vision and a high level of situational awareness. Though they are less immersive than VR, the information density of AR and MR can still obscure aspects of one’s physical environment and draw attention away from environmental dangers. In addition, since AR and MR technologies, like VR technologies, generally cover most of the user’s field of vision, it can be difficult to use them with hardhats, safety glasses, respirators, and the other types of PPE required in industrial settings and other places with high environmental intensities.

Assisted Reality – Low Immersion, with Full Situational Awareness

Assisted reality, a newer and less-known XR technology, lies at the opposite end of the XR spectrum from VR. Unlike VR, assisted reality technologies, like my company’s RealWear Navigator 500, provide the user with nearly complete situational awareness, adding in digital content and experiences designed only to “assist” the user with their physical task. The user can consume information and collaborate with others through a digital metaverse experience, but these experiences are clearly separated from the user’s physical reality. One way to think of assisted reality is that it provides a small window into the metaverse that the user can quickly look or step away from whenever they need to.

Assisted reality is best suited for remote expert guidance, digital workflow, field service, audit and inspection, and other use cases where a reality-first, virtual-second experience is required. Examples of such use cases include a frontline worker on a factory floor receiving guidance on how to repair a machine from a remote expert, a refinery worker calibrating a new measurement device with the device’s vendor, or a field-service technician using a workflow application to set up a new piece of equipment at a worksite.

Assisted reality technologies are typically appropriate for workers who must keep their field of vision free to prioritize situational awareness. This is typical of frontline workers across many industries who work alongside or with machines – from manufacturing systems on the plant floor, to material-moving machines in the warehouse, to complex, dangerous systems located in the field like drilling equipment on an offshore oil platform.

For the frontline worker in industrial environments like these, the need for situational awareness translates directly into safety needs, where the introduction of an XR technology must never compromise safety. For industrial frontline workers maintaining and working with active machinery, safety is not just an imperative, it is a workplace culture that is often regulated and monitored by government agencies, and measured with KPIs that hold workers, managers, and company executives to account for safety violations. 

Safety concerns like these mean that assisted reality must allow industrial frontline workers to keep both of their hands free and maintain situational awareness in places that are dirty, dangerous, and loud. As such, assisted reality solutions need to provide users with hands-free control of their metaverse experience, through voice command capabilities designed specifically for high-noise environments. In addition, with safety a baseline imperative, assisted reality technology must integrate seamlessly with PPE (especially hard hats with integrated hearing and eye protection). At the same time, these assisted reality solutions must be comfortable enough, and have sufficient battery life, to sustain all-shift use.

A Spectrum of XR Technologies for Different Use Cases

The VR, AR, MR, and assisted reality technologies described above all vary greatly in form factor and where they lie on the XR technology spectrum. Yet, despite these differences, they are all being used today to connect workers to the metaverse in a range of markets and industries — from employee classrooms to factory shop floors, warehouses to offshore oil rigs. By finding the right balance of immersion, situational awareness, and other capabilities, companies can use these XR technologies to offer their workers metaverse experiences that translate into increased efficiency, precision, and safety.

Rama Oruganti


Chief Product Officer at RealWear

Rama Oruganti is the Chief Product Officer at RealWear, where he is responsible for the product portfolio that includes hardware, software, and services.
