Virtual reality is hot, and growing fast. The current market is estimated as high as $6 billion, and Markets and Markets predicts it will hit nearly $34 billion within five years. Tech giants like Facebook are all in, with the company saying it could spend $3 billion in the next decade to improve virtual reality (VR).
We’re watching VR closely at Equinix, because it doesn’t work without high-performance, low-latency interconnection, and that’s our thing. We figured a look at some key VR terms would make for a timely entry in our “How to Speak Like a Data Center Geek” series, which aims to bring clarity to trends and common terms in our business. We’ll start by defining different kinds of reality found in a digital world.
VR, AR, MR
We’re focusing this post on virtual reality (VR), which completely immerses viewers in manufactured 3D surroundings. But it’s worth distinguishing VR from some related technologies. Augmented reality (AR) is not as immersive, but it’s actually more “real,” in that it enhances viewers’ actual surroundings by overlaying digital information on top of them. There’s also mixed reality (MR), which enables real and virtual objects to interact in real time. For instance, mixed reality goggles by Microsoft, an Equinix customer, bring holograms into real-world surroundings and allow users to interact with them while moving about their physical environment.
The Six Degrees of Freedom
The Six Degrees of Freedom (6DoF) are the six ways an object can move within a three-dimensional space. These are critical to a VR system because its head tracking element – which keeps the displayed images aligned as the user’s head moves – needs to account for the 6DoF for the experience to be truly immersive.
The 6DoF are separated into three “rotational” movements (pitch, yaw, roll) and three “translational” movements (left/right, forward/backward, up/down). See the diagram to the right:
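The three rotational and three translational movements above can be sketched as a simple data structure. This is an illustrative example only – the names, units and `Pose6DoF` type are assumptions, not taken from any particular VR SDK:

```python
# A minimal sketch of a 6DoF head pose. Names and units are
# illustrative assumptions, not from a real VR SDK.
from dataclasses import dataclass

@dataclass
class Pose6DoF:
    # Rotational degrees of freedom (degrees)
    pitch: float = 0.0  # tilt up/down around the side-to-side axis
    yaw: float = 0.0    # turn left/right around the vertical axis
    roll: float = 0.0   # tilt side-to-side around the front-to-back axis
    # Translational degrees of freedom (meters)
    x: float = 0.0      # left/right
    y: float = 0.0      # up/down
    z: float = 0.0      # forward/backward

# Example: the user turns their head 30 degrees and leans 0.1 m forward.
pose = Pose6DoF(yaw=30.0, z=0.1)
```

A real headset updates a pose like this hundreds of times per second and uses it to re-render the scene from the user’s new viewpoint.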
Various sensors, external cameras and internal headset components can be used for tracking the user’s head through the 6DoF in a VR system. For instance, smartphone-based headsets can use the phone’s gyroscope (measures user’s rotation or twist), accelerometer (measures directional movement) and magnetometer (measures direction in relation to magnetic north).
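One common way those sensor readings are combined is a complementary filter: the gyroscope tracks fast rotation but drifts over time, while the accelerometer’s gravity reading is noisy but drift-free, so the two are blended. The sketch below is a simplified, assumed implementation for a single axis (pitch), not the algorithm of any specific headset:

```python
import math

def complementary_filter(prev_pitch, gyro_rate, accel_y, accel_z, dt, alpha=0.98):
    """Blend gyroscope integration (fast but drifts) with an
    accelerometer gravity estimate (noisy but drift-free).
    Angles in degrees, gyro_rate in degrees/second.
    Simplified single-axis sketch; real trackers fuse all three axes."""
    gyro_pitch = prev_pitch + gyro_rate * dt                   # integrate angular rate
    accel_pitch = math.degrees(math.atan2(accel_y, accel_z))   # pitch implied by gravity
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch
```

The weight `alpha` is a tuning choice: close to 1 trusts the responsive gyroscope from frame to frame, while the small accelerometer contribution slowly corrects the accumulated drift.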
Stereoscopic Display
Stereoscopic display enables our brains to do the same work in VR that they do to perceive depth in the real world. Here’s how it works: In the real world, the separation between our left and right eyes gives each eye a slightly different view of whatever object we’re looking at. Our brains fuse these two views, and this creates a sense of depth. We also get a sense of how far away an object is by how different it appears from the right eye to the left eye. Stereoscopic display mimics this in VR by presenting two slightly different images of the same virtual object to each eye, adjusting the differences for how far away it’s supposed to appear. Our brains fuse the images, and the virtual world appears as three-dimensional as the real world.
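The geometry behind this is simple enough to show in a few lines: for a point straight ahead, the angle between the two eyes’ lines of sight grows as the point gets closer. The function below is an illustrative sketch (the name and the 63 mm interpupillary distance are assumptions for the example):

```python
import math

def angular_disparity_deg(ipd_m, distance_m):
    """Angle between the left- and right-eye lines of sight to a point
    straight ahead. Nearer points subtend a larger angle, which the
    brain reads as 'closer'. Illustrative sketch, not a rendering API."""
    return math.degrees(2 * math.atan((ipd_m / 2) / distance_m))

# A typical adult interpupillary distance is about 63 mm (assumed here).
for d in (0.5, 2.0, 10.0):
    print(f"{d:5.1f} m -> {angular_disparity_deg(0.063, d):.2f} deg")
```

A stereoscopic renderer effectively inverts this relationship: by drawing each virtual object with the horizontal offset that matches its intended distance, it reproduces the disparity our eyes would see in the real world.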
Frames Per Second, Refresh Rate
Frames per second (FPS) measures how many images a computer’s graphics processing unit (GPU) can put out per second. For example, at 90 FPS the GPU is outputting 90 images per second. The refresh rate of a monitor is how many images it can display per second. The FPS and refresh rate in a VR system need to be synced up, or tearing can occur. This is when the images are broken into pieces and displayed on different parts of the monitor. This not only compromises the “reality” part of virtual reality, it can make the user nauseous. Vertical sync addresses this by limiting the FPS to the monitor’s refresh rate.
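The effect of vertical sync can be sketched as a frame-pacing loop: after rendering each frame, the loop waits until the display’s next refresh interval before starting the next one, so the effective FPS never exceeds the refresh rate. This is a simplified, assumed illustration – real vsync is handled by the GPU driver and display hardware, not application code:

```python
import time

def render_loop(refresh_hz=90, frames=5, render=lambda: None):
    """Pace rendering to the display's refresh rate, as vertical sync
    does: each frame waits for the next refresh deadline, capping the
    effective FPS at refresh_hz. Simplified sketch, not real vsync."""
    interval = 1.0 / refresh_hz          # time between display refreshes
    next_deadline = time.monotonic()
    for _ in range(frames):
        render()                         # draw the frame (placeholder)
        next_deadline += interval
        sleep_for = next_deadline - time.monotonic()
        if sleep_for > 0:                # wait out the rest of the interval
            time.sleep(sleep_for)
```

Because every frame is held until a refresh boundary, the display never switches images mid-scan – which is exactly the tearing that unsynced output produces.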
In actual reality, Equinix is helping our customers design the flexible IT architectures needed for technologies like VR using an Interconnection Oriented Architecture™ (IOA™) strategy. IOA is all about integrating the physical and virtual worlds where they meet.
Download Equinix’s IOA Playbook to learn how you can leverage an IOA strategy for your digital applications.