Defining XR Terms for CEOs and Business Leaders
As a business leader or owner, you absolutely need to know about the multi-trillion-dollar XR market that’s about to burst wide open.
The problem is… there’s not just a mountain of information out there; it’s more like a chaotic landslide.
Buzzwords and trends will keep flying at you, especially from the tech-nuts on your team, and you need to know what is actually “need-to-know.”
I boiled down the most important buzzwords into quick and clear explanations for you — not just so you understand what they are, but so you understand what they can do for you and your business right now.
XR can make your company more money right now, not just in some idyllic future; I’m here to help you figure out how to get started.
XR: Extended Reality
XR is an umbrella term that covers VR, AR, and MR. It's essentially every immersive digital experience that engages your senses, from fully virtual worlds to digital layers over the real one. Right now, it's the most appropriate general term for immersive tech.
What you need to know:
I say it all the time, but that's because it bears repeating: XR can make or save you money right now. Don't wait until it "hits the mainstream." The tech is there, the use cases are plentiful, and the majority of the market is up for grabs. Some aspects of XR can (and should) be part of every business's game plan right now.
VR: Virtual Reality
VR is a 3-dimensional simulation that can either imitate the real world or be something completely different. VR environments are created by artists using 360° video or 3D models, which are then generally assembled in a game engine such as Unity or Unreal.
What you need to know:
VR headset sales currently outpace AR and MR headset sales by a factor of 10 or so. If you're looking for the widest possible adoption of a headset-based XR experience, VR is the way to go right now.
VR headsets are currently cheaper than MR/AR headsets; they're less complex because they don't have to incorporate and interact with real-world elements.
VR has a large consumer market, thanks largely to the Oculus headsets from Meta.
However, you need a plan to migrate to AR/MR; those headsets are on track to become the most widely adopted within the next 3 years.
AR: Augmented Reality
AR refers to overlaying computer-generated elements and information on top of a real environment, sort of like a "heads-up display" (HUD). AR is normally delivered through either a live camera view on a smartphone screen or AR glasses.
What you need to know:
The AR mobile revolution has been underway for a while now. Snapchat filters are a perfect example of AR in action. Mobile phones now ship with AR libraries preinstalled (Google's ARCore and Apple's ARKit), which makes mobile AR one of the easiest areas to implement and scale.
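If your developers want a concrete starting point, here's a minimal sketch, in TypeScript against the standard WebXR browser API, of how a web page can check whether the phone it's running on supports AR. (On Android, Chrome's WebXR support is backed by ARCore under the hood; iOS browser support varies.)

```typescript
// Minimal sketch: can this phone's browser run AR via the standard WebXR API?
const xr = (navigator as any).xr; // WebXR typings may require @types/webxr

async function checkMobileARSupport(): Promise<boolean> {
  if (!xr) return false; // browser has no WebXR at all
  // "immersive-ar" asks: can this device blend virtual content
  // into the live camera view?
  return xr.isSessionSupported("immersive-ar");
}

checkMobileARSupport().then((supported) => {
  console.log(supported ? "AR ready" : "Fall back to a flat 2D experience");
});
```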
MR: Mixed Reality
MR lives in the space between VR and AR, where reality and computer-generated virtuality are blended together in different, fluctuating proportions.
What you need to know:
In many ways, MR headsets are the most sophisticated headsets on the market; that also makes them the most expensive. For the next 18 months, MR headsets should remain strictly at the enterprise level. In 2024, be prepared to see MR migrating to the consumer market (while also continuing at the enterprise level, of course).
Immersion
Immersion refers to how "real" a digital environment/experience feels — most often referencing a virtual reality space. Immersion is based on all 5 of our senses: Sight, Sound, Smell, Touch, and Taste (believe it or not, there are even taste machines you can use for XR!).
What you need to know:
Return on Investment (ROI) is an important filter for evaluating XR experiences, just like any other aspect of your business. The rule of thumb for most XR experiences is that 80% of the immersion can come from Sight and Sound alone (both can be delivered by the application on the headset or mobile device). Will that missing 20% undercut the effectiveness of your product/service's XR experience? In most cases, the answer is NO. There are some edge cases where the immersion is killed without tactile feedback, but that is the exception, not the rule.
VR Controllers and Hand Tracking
VR controllers are handheld devices that are meant to track and translate hand placement and actions in a virtual environment.
Hand tracking is the ability of an XR device to track a user’s hand position and movement and incorporate them into the virtual experience. The goal of hand tracking is usually to replace the need for a VR controller.
What you need to know:
Current VR controllers track hand position with extreme accuracy. On the flip side, many experiences don't lend themselves well to using controllers.
Most VR headsets are starting to include hand tracking, but hand tracking is nowhere near as precise on position. It's also an extremely resource-intensive process. You can assume that essentially all headsets in the future will have this technology; for now, you have to decide whether spending your limited resources on hand tracking is more or less important than spending them on graphics, intense gameplay, etc.
Here's the question you have to consider: does my experience require a more natural extension of the body (normal hand usage), or is my experience better with the extreme accuracy of precise position data (a VR controller)?
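For the technically curious (or to forward to your dev team), here's a minimal sketch of what hand tracking looks like in code, using the standard WebXR Hand Input module in TypeScript. The joint names come from that spec; rendering and error handling are omitted.

```typescript
// Minimal sketch: read the index fingertip position each frame via the
// WebXR Hand Input module (requested as an optional session feature).
const xr = (navigator as any).xr; // WebXR typings may require @types/webxr

async function startHandTracking(): Promise<void> {
  const session = await xr.requestSession("immersive-vr", {
    optionalFeatures: ["hand-tracking"], // falls back to controllers if absent
  });
  const refSpace = await session.requestReferenceSpace("local");

  session.requestAnimationFrame(function onFrame(time: number, frame: any) {
    for (const source of session.inputSources) {
      if (!source.hand) continue; // this input is a controller, not a hand
      // "index-finger-tip" is one of the joint names defined by the spec.
      const joint = source.hand.get("index-finger-tip");
      const pose = frame.getJointPose(joint, refSpace);
      if (pose) {
        // pose.transform.position is the fingertip location in meters.
        console.log(source.handedness, pose.transform.position);
      }
    }
    session.requestAnimationFrame(onFrame);
  });
}
```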
Eye Tracking
Eye tracking is technology that tracks a user's gaze position and movement. It is usually used to analyze how users visually navigate virtual spaces, including what their attention is drawn to first and/or most.
What you need to know:
Don't worry about how this tech works. Just ask this question: "If I could dig into the data of how users look at an XR experience, how could that help me?"
Imagine being able to know whether a customer is looking directly at your 3D model, which part of the model their eyes spend the most time on, and so on. It's a GOLD mine.
Also, consider this: eye tracking will start to be used in interactive experiences to increase the realism of emotions, so that you can see when someone blinks, looks inquisitive, looks sad, etc. That will be a massive leap forward in immersion and realism.
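To make that "gold mine" concrete, here's a tiny hypothetical sketch of turning raw gaze data into "dwell time per object." The GazeSample shape and the object IDs are invented for illustration; every real eye-tracking SDK exposes its own format.

```typescript
// Hypothetical gaze analytics: total milliseconds spent looking at each object.
// The sample format below is invented for illustration.
interface GazeSample {
  timestampMs: number;      // when the sample was taken
  objectId: string | null;  // what the gaze was resting on, if anything
}

function dwellTimePerObject(samples: GazeSample[]): Map<string, number> {
  const totals = new Map<string, number>();
  for (let i = 1; i < samples.length; i++) {
    const prev = samples[i - 1];
    if (prev.objectId === null) continue; // gaze wasn't on a tracked object
    const delta = samples[i].timestampMs - prev.timestampMs;
    totals.set(prev.objectId, (totals.get(prev.objectId) ?? 0) + delta);
  }
  return totals;
}

// e.g. dwellTimePerObject(samples) -> Map { "product-hero-model" => 8450, ... }
```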
Metaverse
The philosophical Metaverse will be made up of persistent, interactive 3D virtual and augmented worlds, interconnected by standard internet APIs so that user information (such as avatars) can be used seamlessly across each and every world.
Within this philosophical end-all-be-all Metaverse, users will be the main source of content creation. Furthermore, users own their own data and decide if/when it is shared. For example, a user would be able to own a special avatar and choose to use it in both a game created by Disney and a game created by Epic. The game creators no longer define the walls of a user's experience.
The generic idea of a metaverse that most folks talk about is simply some sort of persistent 3D "world" that users can join and interact within. Roblox, Minecraft, etc. are all examples of "today's metaverses."
What you need to know:
The philosophical Metaverse is a continuum. There are parts of the "Metaverse" that are being done well today, but many other aspects won't be completed for many years to come.
Here's what you need to FOCUS on now:
"How do I leverage the current, mainstream XR tech that's available today to create new revenue streams right now?" That kind of thinking will also help you prepare your company as the other parts of the tech become mainstream.
Digital Twin
A digital twin is the virtual replica of a real-life object. This generally means a 3D object with both the mesh (structure) and the texture (color) of the original. As technology has advanced, "digital twin" has come to refer to both the object's look and the way it works or interacts. For example, a digital twin of a city could be both the buildings and the flow of traffic.
What you need to know:
Digital twins are generally used for familiarization. They can be used for...
► a car mechanic to become familiar with a V6 engine, from each piston down to the size of each bolt
► a group of colleagues meeting in a familiar office setting so they can easily connect and brainstorm together
► a group of scientists studying how they can build a better city-wide evacuation route based on how traffic flows
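If you want a mental model of what your team would actually build, here's a minimal hypothetical sketch of a digital twin as a data structure: a look (mesh and texture) plus live behavior (state updated from real-world sensors). All names here are invented for illustration.

```typescript
// Hypothetical digital-twin shape: "look" plus "behavior".
interface DigitalTwin {
  // The look: how the object is drawn.
  meshUrl: string;    // 3D structure, e.g. "engine-v6.glb"
  textureUrl: string; // surface color/detail

  // The behavior: live state mirrored from the real object.
  state: Record<string, number>; // e.g. { rpm: 2200, oilTempC: 94 }
}

// Keep the twin in sync as real-world sensor readings arrive.
function applySensorReading(twin: DigitalTwin, sensor: string, value: number): void {
  twin.state[sensor] = value;
}

const engineTwin: DigitalTwin = {
  meshUrl: "engine-v6.glb",
  textureUrl: "engine-v6-diffuse.png",
  state: { rpm: 0, oilTempC: 20 },
};
applySensorReading(engineTwin, "rpm", 2200);
```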
FoV
Field of view is the angular extent, measured in degrees, of what a user can see in a VR headset. A higher number means a wider view for the user. A human's natural FOV is about 135° vertically and just over 180° horizontally. Naturally, a VR experience that closely matches that natural FOV will feel more realistic.
What you need to know:
The FOVs that exist today are more than adequate for teaching, training, gaming, assisting, etc. Nothing much to worry about here.
AI
Artificial intelligence refers to the ability of machines to imitate human intelligence, for example, searching and analyzing data and developing solutions from it without being explicitly coded for the task.
What you need to know:
Think of AI as automation on steroids. AI can deliver fantastic results on repetitive tasks that are more complex than just A+B=C. A great example is how AI is being used to remove unwanted distractions in the background of pictures. A good photoshopper can do this very tedious task, or you can utilize AI to do it in a matter of seconds with great results!
Mocap
Motion capture is the process of recording physical movement with sensors and transforming it into digital form. It's essential for long, realistic animations, especially in 3D virtual environments.
What you need to know:
There are differing levels of mocap and, unfortunately, for most of them you get what you pay for. The more expensive systems are generally much more accurate and easier to work with than the cheaper ones. There are now a few usable systems under $100k.
There are two basic routes you can take:
1. You can "rent out" time at a mocap center.
You'll want to walk in with a full plan, almost like a film shoot with a script, with all of the choreography done ahead of time. If you only need a few bits of animation per year, just rent the studio time.
2. You can invest in building a full studio.
If you're a school teaching students how to do the full pipeline of work, or if you're a software studio that will use these animations in your everyday work, this will be worth the investment.
Volcap
Volumetric capture is the process of capturing the exact 3-dimensional form of an object, along with its movement, and transforming it into digital form. Think of it as a 3D model that can be played as a video. You can watch the video from any and every angle possible.
What you need to know:
The high end of volcap can be as hardcore as a full Hollywood movie production, and the low end can be used to simply stream somebody giving a webinar. Again, you usually get what you pay for, so you'll have to pay for what you need.
360-Camera
A 360-camera can be used to take 360° images or videos.
What you need to know:
This may seem like a simple and straightforward shortcut to more immersive experiences. And sure, it has been that for plenty of companies. But I've got to be honest:
I don't like 360 videos that much. Sure, folks can make cheap training material with them, but I think we'll quickly see less and less of this content coming from large enterprises. It's not interactive, nor can you move around in the scene. True immersion should be what we're aiming for in most use cases, not just looking around from one fixed spot. And the consumer market is growing in this area: just look at GoPro's 360 cameras. If consumers can create it themselves, it won't be very impressive if that's all you're doing, too.
3DOF: 3 Degrees of Freedom
In Augmented Reality, 3DOF describes a floating 3D model that is movable and scalable, but is locked to the user, not the environment. Under the hood, the device tracks only rotation (where you're looking), not position (where you are). You can move your head or move toward the model, but it stays the same virtual distance from you. Imagine a floating menu or display that always stays in the corner of your vision.
What you need to know:
There are great use cases for 3DOF, but they are generally less immersive experiences (a runner tracking their speed, a swimmer watching their time, or a pilot checking wind speed and elevation). However, most training is terrible with just 3DOF.
Ask yourself: are you just trying to get information to your user? 3DOF is great. Or are you trying to immerse them in an experience? 3DOF will fall short every time and undermine your goals. (The code sketch after the 6DOF entry below shows the difference.)
6DOF: 6 Degrees of Freedom
In Augmented Reality, 6DOF describes a floating 3D model that is movable and scalable, and is locked to the environment — not the user. When you move your head or body, the object stays where it appears in space, just as you would naturally expect a real object to.
What you need to know:
Six degrees of freedom in an XR headset means it tracks rotation (the three axes a 3DOF headset tracks) plus movement along three more: up-down, left-right, and forward-backward. Thus, the user can move their body freely in all directions. 6DOF is the standard for immersive training; do not purchase 3DOF headsets for immersive training.
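Here's what the 3DOF/6DOF difference looks like in practice, as a minimal TypeScript sketch using the standard WebXR API: a "viewer" reference space stays locked to the user's head (the 3DOF-style HUD), while a "local-floor" reference space stays anchored to the environment (true 6DOF). Session setup and rendering are omitted.

```typescript
// Minimal sketch of the 3DOF-vs-6DOF distinction via WebXR reference spaces.
async function setupReferenceSpaces(session: any): Promise<void> {
  // "viewer": coordinates follow the user's head. Content placed here
  // behaves like a 3DOF HUD - it stays in the corner of your vision.
  const viewerSpace = await session.requestReferenceSpace("viewer");

  // "local-floor": coordinates are anchored to the environment. Content
  // placed here behaves like a real object you can walk around (6DOF).
  const worldSpace = await session.requestReferenceSpace("local-floor");

  session.requestAnimationFrame((time: number, frame: any) => {
    // The head's pose expressed in room coordinates changes as you walk:
    const headInWorld = frame.getViewerPose(worldSpace);
    console.log("head position in the room:", headInWorld?.transform.position);
    // In viewerSpace, the head is by definition always at the origin -
    // which is exactly why HUD-style content "sticks" to the user.
  });
}
```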
Avatar
Avatar is a broad and nuanced term, but overall it refers to the visual appearance of a character or profile. You may also hear terms like skins or wearables; for now, you can consider them all basically interchangeable. For 3D games and environments, avatars are of course 3-dimensional, whereas for 2D applications they are simply 2D images.
What you need to know:
Avatars are a BIG deal to users. Think fashion industry big… games like Fortnite have made billions of dollars by selling avatars that do nothing for the game except change the appearance of the game character. As XR and metaverse adoption grow, this will grow into an even more popular method of self-expression for users - and an opportunity for your business.
Game Engine
Game engines are visual software suites that bundle the key components of game development (rendering, physics, animation, artificial intelligence, sound, etc.). The two most popular commercial game engines are Unity and Unreal Engine.
What you need to know:
For XR headset applications, you will need Unity or Unreal Engine developers. For less immersive applications that can be built on platforms like 8th Wall, you won't need dedicated Unity or Unreal Engine developers.
UI
A user interface is the set of controls and programs through which a human interacts with a machine or computer. A GUI (graphical user interface) presents those controls visually.
What you need to know:
For decades, UI has been imagined and implemented in 2D space. As XR gains ground, UI will evolve to be 3D and spatial. If you can afford it, hire a 3D UI expert to stay at the forefront of this innovation. But if funds are limited, find yourself a solid 2D UI designer; they should be able to make the leap as time goes on.
WebAR, WebVR, and WebXR
WebAR, WebVR, and WebXR all refer to running XR experiences through a regular web browser. WebXR is the current standard (it superseded the earlier WebVR spec), and "WebAR" informally describes AR delivered this way. Some developers code against the API by hand, while others use a layer on top (think a website builder like Wix or Squarespace, but for XR web experiences). All of this makes XR web experiences more accessible and easier to implement.
What you need to know:
You need to decide your XR experience's desired learning outcomes, the scale you need (dozens, thousands, or millions of users; headsets allowed or just mobile phones; etc.), and how much you can spend. Those answers will dictate whether you should deliver the experience on the web or build it in Unity or Unreal.
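To show how low the barrier is, here's a minimal sketch of entering VR from a plain web page using the standard WebXR API in TypeScript. Rendering is omitted; in a real project you'd hand the session to a library like Three.js or Babylon.js.

```typescript
// Minimal sketch: enter an immersive VR session from an ordinary web page.
const xr = (navigator as any).xr; // WebXR typings may require @types/webxr

async function enterVR(): Promise<void> {
  if (!xr || !(await xr.isSessionSupported("immersive-vr"))) {
    console.log("No VR here - serve the flat 2D version instead.");
    return;
  }
  // Browsers require this to be called from a user gesture (e.g. a click).
  const session = await xr.requestSession("immersive-vr");
  session.addEventListener("end", () => console.log("User exited VR"));
  // ...hand `session` to your renderer's XR integration here.
}

// e.g. document.querySelector("button")?.addEventListener("click", enterVR);
```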
3D-Modeling
3D-modeling is the process of creating a three-dimensional object within 3D-modeling software. There are dozens of 3D-modeling programs out there such as Blender, Maya, SolidWorks, etc.
There are also plenty of 3D modelers in the world these days. Each modeler is an artist with their own style. Some are better at organic items such as food, some are better at creating game characters, and some are better at creating scenes. There are even AI programs that create 3D models… but we will discuss that at a later time.
What you need to know:
For your specific XR experience, don't worry about which 3D-modeling software was used to create the 3D model; just pay attention to your rights! Will you own the 3D model? Can you use it for multiple applications or just one? Furthermore, find out whether you can source the 3D model from websites such as SketchFab.com or TurboSquid.com, or whether you need to have it built from scratch.
AR Filters
An augmented reality filter is generally talked about in two ways:
1) A scenario or experience that plays out for the user when the filter is turned on (Snapchat filters are the classic example).
2) An effect applied to everything the user looks at. For example, if you turn on a "bunny ears" filter, every human you look at will automatically have bunny ears.
What you need to know:
AR filters are going to be just like YouTube: there will be individual creators who design incredible filters and gain huge followings. They'll find paths to monetize their filters, including paid partnerships and sponsorships. But at times there will also be companies creating their own experiences.
So you'll have to decide: will your company create your own or sponsor/partner?
Do you need further guidance on how to navigate XR for your business specifically? DM me on LinkedIn and we can chat!