Posts Tagged ‘top-story’

Time for Visionary VR Tech to Focus on Vision

Monday, September 24th, 2018

Virtual Reality has immersed us in captivating alternate worlds and intense gaming experiences. However, VR still has serious questions to answer, including VR-induced cybersickness, why children should not use VR headsets, and the conflicting demands that VR displays place on our eyes.


The global market for Virtual Reality (VR) and Augmented Reality (AR) is expected to reach $94.4 billion by 2023, according to ReportLinker, a technology company that provides industry data on the topic. Of the two, VR held 60% of the market share in 2017.[i] VR creates an immersive experience that can fascinate users by transporting them to another world. Time spent in a VR headset is an issue for youngsters, however. Because VR is a new technology, the effects of long-term use on eyesight are unknown. VR is not recommended for children age 12 and younger, as VR headsets are made to fit the pupillary distance of an adult, not a child. Excessive use of VR may also affect the way the eyes grow and lead to myopia, or near-sightedness. Another side effect the industry is working to overcome is “cybersickness,” in which VR headsets induce nausea akin to motion sickness, although not all headsets, and not all users, produce or experience it with the same intensity. What challenges does VR face, and how can the industry overcome them?

Figure 1:  Commuters on a Chiltern Railways train to London experience a taste of Western Australia in fully immersive virtual reality (VR) headsets. The 360-degree interactive format gives passengers the chance to meet the wildlife of Western Australia.  (Image: Joe Pepler/PinPep)

The Effect of VR Headsets on Eyesight
The Canadian Association of Optometrists states “Most VR headset manufacturers have put in place warnings for children. This is important because a child’s visual system continues to develop throughout childhood, and VR systems are set to the pupillary distance of the average adult, not child. Extended exposure to the awkward visual posture created by VR headsets can alter the development of focusing, tracking, and depth perception.”[ii]  An increase in the incidence of myopia in the general population has been linked to the advent of the extensive use of smartphones, but no studies have definitively shown that using VR increases the propensity to develop myopia or any other permanent damage.

Nevertheless, makers of VR headsets include warnings of risk to eyesight. Myopia has long been associated with near-work tasks that prevent one’s eyes from naturally focusing at different distances in natural light. Microscopes resemble VR headsets in that both involve close-up viewing through binocular lenses, and one study associated the occupational use of microscopes with myopia in adults.[iii] VR headsets, however, are rarely used occupationally, a setting that implies near-constant use of 40 hours a week or more. AR headsets, by contrast, allow natural vision but add an overlay of virtual objects or information, much like a heads-up display in a cockpit.

Presently, the known optical issues specific to VR headsets include eye strain and dry eyes, since people caught up in an active or tense situation in a game do not blink as often as they should. Dry eyes can lead to significant pain. During prolonged VR immersion, many users forget to take breaks to rest their eyes and avoid eye strain. VR is fun, engaging, and immersive, so playing for a couple of hours without stopping is the norm, not the exception. But VR creates a focusing situation that rarely occurs in the real world, mainly because a VR headset fixes the display close to the eyes. No one knows yet whether this is an actual health problem, but it’s important to be at least aware of what’s happening.

Throughout human history, our eyes have naturally converged to focus on objects located at different distances. The visual presentation in a VR headset, however, forces an unnatural action on the eyes. In a VR headset, a separate image for each eye is projected on fixed, bright displays physically located close to the eyes. VR displays induce a sense of depth and a 3D quality by introducing a lateral offset between the objects in the two images; overlapping the two images on one display would look like slight double vision. The greater the lateral disparity between the images, the nearer the object seems. In the real world, our eyes converge and focus on the same point; in VR, objects that appear to be moving around at a distance are actually just a few centimeters from the eyes, which disrupts the natural convergence action.
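The geometry behind this disparity trick can be sketched with simple triangulation. The Python sketch below uses illustrative numbers (a 63 mm inter-pupillary distance, a 2 m virtual image plane) and is not how any particular headset renders; it merely computes where the two lines of sight cross for a given on-screen disparity:

```python
def apparent_depth(ipd_mm, screen_dist_m, disparity_mm):
    """Distance at which the left- and right-eye lines of sight cross.

    ipd_mm:        inter-pupillary distance (~63 mm for a typical adult)
    screen_dist_m: distance to the display's virtual image plane
    disparity_mm:  crossed disparity between the two eye images (0 = on-plane)
    """
    # Similar triangles: sight lines from each eye through its image point
    # intersect at screen_dist * ipd / (ipd + disparity).
    return screen_dist_m * ipd_mm / (ipd_mm + disparity_mm)
```

With zero disparity the object appears on the image plane; a crossed disparity equal to the full IPD halves the apparent distance, which illustrates how small image offsets make scenery seem near or far even though the display itself never moves.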

Figure 2: Samsung VR demonstration. (Image: Maurizio Pesce, CC2.0)

Vergence and Accommodation Coupling
Eyesight is complicated. To perceive an object, you first point your eyes in the direction of that object (convergence), and then your eyes’ lenses focus on it (accommodation). In industry lingo, vergence and accommodation coupling work together to create a meaningful image for you. In a VR headset, your eyes adjust their convergence as objects move around, but unlike in natural vision, they never change focus: they remain accommodated at a fixed distance even while converging on objects that seem much nearer or farther away. As a result, VR headsets force users to make exaggerated convergent eye movements when they look at virtual “near” objects. The eyes are pushed into a mode of operation that is not normal in the real world, which can result in discomfort lasting minutes or hours. The long-term effects of VR headset use, especially on children, are not known.
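The mismatch can be quantified the way vision scientists do, in diopters (the reciprocal of distance in meters). Here is a minimal sketch; the 2 m headset focal distance is an illustrative assumption, since actual optics vary by product:

```python
def demand_d(distance_m):
    """Optical demand in diopters: reciprocal of distance in meters."""
    return 1.0 / distance_m

def vergence_accommodation_conflict(object_m, focal_m):
    """Mismatch between where the eyes converge and where they focus.

    In natural viewing both demands track the object, so the conflict is
    zero. In an HMD, accommodation stays pinned at the optics' focal
    distance while vergence follows the rendered object.
    """
    return abs(demand_d(object_m) - demand_d(focal_m))
```

An object rendered 25 cm away on optics focused at 2 m yields a 3.5-diopter conflict, while an object rendered at the focal distance yields none, which is why virtual objects that loom close are the most taxing.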

In a paper published in Scientific Reports, Ocular effects of virtual reality headset wear in young adults, Turnbull and Phillips describe how the focal distance in VR is different from that in natural environments, which affects vergence eye movements. “To prevent double vision as gaze shifts between objects, users make both version and vergence eye movements, which minimise retinal disparity of the object between eyes, and permit the object of interest to be perceived binocularly. In real-world viewing of objects, vergence eye movements are associated with changes in accommodation to focus the eyes at the depth of the object. However, in VR, the focal distance of all objects on the screen is constant, and the eyes must converge without changing accommodation to maintain a clear retinal image. Thus, wearing a VR HMD creates a dissociation between convergence and accommodative demands, which may contribute to visual discomfort.”[iv] Consider an analogy: twirling in a swivel chair induces dizziness and an off-kilter sense of balance that becomes uncomfortable if you spin too fast or for too long. No studies tell us whether spinning in a swivel chair for hours at a time is detrimental to long-term health, but such sustained activity is certainly not one humans evolved for.

Users who experience discomfort while using VR headsets have also experienced “cybersickness,” a feeling of nausea much like motion sickness. Cybersickness occurs when visual information doesn’t match the user’s body position, sending conflicting signals to the senses, such as when the headset immerses the user in a moving landscape while the user is actually sitting still. A VR experience full of sharp turns, rapid acceleration, and falling would be enough to induce cybersickness in most people. A time lag between physical movement and perceived movement can also create sickness. Cybersickness results in short-term (a few minutes) to long-term (a few hours) illness characterized by dizziness, vertigo, disorientation, headache, and nausea. Cybersickness that lasts for hours can become a real-life problem, for instance by impairing balance or driving. Game developers have found that adding a virtual stationary object to the scene can help reduce cybersickness; such objects can be as simple as a front-seat bar on a roller coaster or a virtual nose.[v] VR games that do not require a lot of head movement also contribute less to cybersickness. Displays whose imagery moves relative to the wearer the way the natural world does are an improvement; headsets that move the entire visual field with every head movement can induce sickness, because the experience is little more than strapping a television to your face. Light-field technology can reduce cybersickness by creating imagery that stays fixed in the virtual world while the wearer looks and moves around within it. Light-field technology imparts imagery that is more true-to-life in terms of what influences depth perception, shadows, and motion parallax (i.e., objects closer to you move faster across your view than those farther away).

VR is still in its technological infancy, and any technology that makes people feel sick needs improvement. VR makers are well aware of the problem, although no one knows the exact mechanism behind why some people are more prone to cybersickness than others. We can mitigate cybersickness with high-performance hardware and with VR games that avoid sending false or conflicting sensory signals about physical movement to the brain. Eliminating the headset is not yet a possibility, although a real-life “holodeck,” as imagined by the creators of Star Trek: The Next Generation, would solve the vergence and accommodation coupling issue discussed above. Displays that are less exhausting for the eyes than backlit LCDs might also improve the VR experience. Without information on the effects of long-term use of VR headsets, the best practice today is to limit VR to 30 minutes per session, take regular breaks, and avoid any VR use by children under the age of twelve unless the VR is specifically designed and intended for children.

[i] Global Industry Analysts. “Market Research.” ReportLinker Insight, June 2018.

[ii] “Are Virtual Reality Headsets Dangerous for Our Eyes?” The Canadian Association of Optometrists, 23 May 2017.

[iii] McBrien, N. A., and D. W. Adams. “A Longitudinal Investigation of Adult-Onset and Adult-Progression of Myopia in an Occupational Group: Refractive and Biometric Findings.” Investigative Ophthalmology & Visual Science, Feb. 1997.

[iv] Turnbull, Philip R. K., and John R. Phillips. “Ocular Effects of Virtual Reality Headset Wear in Young Adults.” Scientific Reports, 23 Nov. 2017.

[v] Kanarbik, Kevin, and Al William Tammsaar. “Best Ways of Producing Cybersickness in VR.” Introduction to Computational Neuroscience, University of Tartu, Estonia, 2015. Interview with Madis Vasser.

Lynnette Reese is Editor-in-Chief, Embedded Intel Solutions and Embedded Systems Engineering, and has been working in various roles as an electrical engineer for over two decades. She is interested in open source software and hardware, the maker movement, and in increasing the number of women working in STEM so she has a greater chance of talking about something other than football at the water cooler.

VR/AR (XAR) Makes its Way through a Fragmented Landscape

Monday, June 4th, 2018

VR/AR developers continue to strive for a fully immersive experience and additional and innovative content. But why is consumer VR/AR (XAR) adoption so slow?

“XAR” is a term that industry experts use to refer to some mixture of Augmented Reality (AR) and Virtual Reality (VR). VR sits at one end of the spectrum and AR at the other; everything in between is a mixture of the virtual and the augmented, which is XAR. Today’s VR is a stepping stone: eventually there will be more capable devices than what we have now. XAR devices address three main areas: visuals, audio, and interaction with the virtual environment. First, developers want to create an immersive 360° experience equivalent to reality. To do that, the visuals must have a very high pixel density so that whatever the user sees in the virtual world matches what they would see in real life. The VR industry is not there yet. Second, the user needs immersive, realistic sound, so that the experience is intuitive and natural and emulates a situational environment in every sense. Again, the industry is not there yet. The third area for any XAR business is interaction that is intuitive and natural. Many children experiencing VR for the first time want to touch or push what they are seeing; that natural response to a virtual entity is confirmation that the virtual environment is doing something right. Ideally, the most immersive experience would combine all three areas: visuals, spatial audio, and interaction through touch. The industry is seeing products like UltraHaptics for interaction, but touching virtual objects in mid-air using ultrasonic vibration is nascent technology.

Figure 1: Ultrahaptics creates the sensation of touch in mid-air. Combined with VR, one can pick and place virtual objects in VR. (Image:

XAR can bridge gaps and connect people to experiences in places that are otherwise inaccessible. With XAR, people can immerse themselves in a new environment that they couldn’t otherwise experience. The future of XAR is an immersive medium in which people experience something directly through their senses, rather than having that experience interrupted by a headset that blocks the view. How will XAR technology evolve? Visually, the industry is making progress toward the desired pixel density. The goal is a resolution that equals that of the human eye while also delivering a seamless sensation at a high number of frames per second (fps), although many would say that 1K × 1K pixels per eye is good enough. Processors that can deliver a high frame rate, a high pixel density, and very low latency are the holy grail for VR; frame rates below 90 fps tend to cause simulation sickness in most people. Eliminating the headset would require the same performance delivered to wall-sized displays. Rendering data that meet such demanding requirements provides a quality immersive experience but implies the need for potent processors.
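These performance demands are easy to put numbers on. A back-of-the-envelope sketch follows; the per-eye panel resolution is an illustrative assumption, roughly the 1080 × 1200-per-eye class of early consumer headsets:

```python
def frame_budget_ms(fps):
    """Time the renderer has to produce each frame, in milliseconds."""
    return 1000.0 / fps

def pixel_throughput(width, height, fps, eyes=2):
    """Pixels the GPU must fill every second for a stereo display."""
    return width * height * eyes * fps
```

At the 90 fps commonly cited to avoid simulation sickness, the renderer has only about 11 ms per frame and must fill over 230 million pixels per second even at this modest resolution; pushing pixel density toward eye-equivalent quality, or toward wall-sized displays, scales that burden directly.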


Overcoming Platform Fragmentation
VR gaming, unlike traditional gaming platforms, is taking a long time to make its way into the mainstream. One difficulty in attracting interest is that it’s difficult to portray the richness of VR in a two-dimensional video when advertising to the uninitiated. Another challenge facing VR is that technology is changing fast in a very fragmented VR platform market. For example, in VR gaming there are no standards for a User Interface (UI). Each platform creates a different way for users to navigate. Without standards, a fragmented realm of VR gaming platforms means that VR games and other content must be ported to several different platforms. Yet much of what drives the gaming industry is the content, not the platform.

Developers want to maximize the content experience on every form factor, yet they have a lower performance margin to work with when adapting games and user experiences for VR, because they cannot risk dropping the frame rate. It’s difficult for VR game developers to ensure maximum saturation, that is, the greatest amount of immersive and detailed content, while also accounting for all of the user devices their games or applications inhabit.

Augmented Reality (AR) on mobile devices like smartphones and tablets is experiencing uptake, ever since Pokémon Go was released in mid-2016. For example, with an AR app on a smartphone, a customer can hold their phone up to a shelf and see a product information overlay. However, people do not want to hold their phone at arm’s length for a long time.

Why Is VR So Slow to Be Taken Up?
VR has not hit the tipping point yet. One analogy for consumer VR adoption is the PC market of the ’80s and ’90s: at first, PCs were too expensive and had too little memory, storage, and processor speed to support anything like today’s gaming market, and simple platforms like the Atari were the limit of the gaming experience. Today we expect fast-moving, immersive gaming experiences that can also carry multi-player communities with a social aspect.

Figure 2: Walmart partnered with STRIVR to deploy an immersive training program for Oculus Rift in 200 Walmart academies across the U.S. (Image:

XAR does open up new experiences. One can watch video on a personal screen without disturbing others, because the headset is immersive and confines the screen to the user’s eyes only. For training purposes, XAR cannot be beat, since trainees are not as easily distracted as they are when watching a typical 2D training video: the headset keeps the wearer in the training environment, and surreptitiously glancing at one’s phone is impossible without removing it. Future consumer-oriented VR devices could include cameras that let families take stereoscopic videos of children or pets and share them with relatives on another continent. Such a family VR video would be an immersive captured memory of an event, such as the kids opening Christmas presents, shared with a parent stationed overseas in the military.

As a solitary experience, VR for the moment is still, for the most part, socially isolating. Currently, the price for a decent VR system is as high as a laptop. A genuinely good system for a top-of-the-line VR experience costs in the range of a thousand dollars or more, which is out of reach of the mainstream consumer. Three areas need to improve before VR enters the mainstream consumer market: an affordable price for quality VR, the ability for people to share or socialize with the VR experience, and additional and innovative content.

Figure 3: Farmers Insurance uses Oculus Rift to train employees for real-world scenarios before they go out into the field, cutting travel costs to send new hires to training facilities. (Image:

The social aspect of VR is improving. Parents often buy a gaming platform on the premise that kids or the whole family will play it together, but unlike bowling on a Wii console at a party, VR has been slow to create socially engaging experiences; presently, VR is experienced only by the individual wearing the headset. When PCs first enabled chat on the internet, the social aspect of engaging with real people made services like AOL take off. VR has a capability similar to chat in that an immersive conversation, albeit through an avatar (one’s physical representation to others in VR), seems more real than text flowing across a screen. Supported on Oculus Rift and HTC Vive, Facebook Spaces (still in beta) offers a VR experience much like a chat room but with the physical experience of seeing the other VR users in the room gesturing and talking. Facebook recently updated the capability for users’ avatars to look more like the users themselves, although they still resemble cartoon characters.

Figure 4: Bigscreen offers a VR space where remote teams can collaborate together in virtual offices. Other use cases of Bigscreen include a virtual living room to watch movies, play video games, browse the web, and hang out with friends. Bigscreen beta launch was in March 2016.

Bigscreen is a VR room (or “lounge”) for sharing experiences with others around the world. According to the Bigscreen site, “Use cases of Bigscreen include both entertainment and productivity. It’s used as a virtual living room to watch movies, play video games, browse the web, and hang out with friends. It’s also used for productivity as a tool for remote teams to collaborate together in virtual offices.” This use case for XAR makes collaboration with physical movement possible, such as arranging dance choreography or rehearsing a theater act. Bigscreen requires a quality VR headset as well as a PC with a minimum of 16 GB of RAM, a powerful graphics card, and an Intel Core i5/i7 or AMD Ryzen (or better) processor for comfortable performance. Another such VR lounge is the Oculus Rec Room, where users can play games together with virtual others. Many users report that in a VR meeting space, they immediately feel as if the person they’re talking to is right next to them. The audio experience correlates with the visual experience: on a quality system like the Oculus, users can hear one colleague on the left and another on the right in these virtual meeting rooms, which makes all the difference versus a flat-screen video conference.

XAR is a growing yet challenging industry. However, rapidly improving technology is enabling XAR to incrementally improve quality for a more immersive experience.



The Future of VR Depends on Lessons from Its Past

Monday, January 8th, 2018

Why we need to reset our expectations of what technology can deliver today if we want VR to be successful tomorrow.

Virtual reality (VR) stands at a critical juncture. Down one path are consumers clamoring for powerful, transformative devices that will open up a new age of virtual immersion. On the other, developers, designers, and engineers continue to grapple with a long list of technology limitations that frustrate the ideal wearable headset design.

What We Need to Make VR Successful
VR requires a highly complex blueprint of features and functions that mimic the human brain, the most complicated of which is spatiotemporal orientation. VR must persuade our minds in multiple ways (visually, aurally, with scale and context) to believe that the digital is reality, or at least a very good simulation of it.

To be clear, VR will transform our world. According to Orbis Research, spending on VR technology (independent from augmented reality) is expected to surpass $40 billion by 2020. Research firm IDC also reports that spending on VR systems is forecast to be greater than AR-related spending in 2017 and 2018. VR will transform how we learn, play, create, build, manage, market and interact. Even how we compete.

A Candid Assessment of Virtual Reality Today
We can trace the modern concept of consumer VR technology to the 1990s, when the Sega VR-1 motion simulator was released. It was (by today’s standards) a crude mash-up of visor, stereo headphones and sensors that roughly tracked and responded to the wearer’s head movements.

Fast forward to 2012, when the first personal virtual reality headset prototype, the Oculus Rift, emerged on Kickstarter. It featured a breakthrough 90-degree field of vision (FOV), and its maker was later purchased by Facebook, setting off an avalanche of VR investment and development by competing technology companies. This came with projections that the ultimate consumer VR experience was mere months away.

So, what keeps us from delivering a mass consumer, high-end standalone VR experience? Three key issues:

  • Tethered headset and latency. A robust and immersive VR system demands a powerful computer with a fast graphics card, which today is only possible via a physical connection to a PC. But our relationship with mobile phones, tablets, laptops, and more has produced a consumer market that considers stationary technology archaic. Additionally, wearing a headset while tethered to anything, as you try to move within your virtual environment, is annoying at best and an immersion-killer at worst. In addition, latency, the image lag that follows a head motion, is a real cause of flawed VR experiences and of the oft-mentioned (and never popular) issue of VR motion sickness.
  • Form factor. Never underestimate the importance of comfort, fit, and style, particularly in a product worn on the face. Lenses need to align with every set of eyes, headphones need to fit comfortably in the ear, and weight distribution, comfort, and overall size all need to be taken into account. Right now, 360-degree, fully occlusive VR headsets are very heavy. We are essentially trying to package a high-powered computer with rapid processing speed, high-resolution graphics, positional audio, motion tracking, and reasonable battery life into a cool-looking pair of glasses.
  • Price. No VR system comes cheap. Facebook’s Oculus Rift headset is currently $400, not counting the added cost of the computer needed to power its virtual reality experiences and games—that’s expensive, especially for the casual VR user (although the Oculus Go costs $200). The highly touted HTC Vive runs about $600, and the console-based Sony PlayStation VR about $400. The most widely used mobile option (for those who already own a new Samsung phone) is the Samsung Gear VR at about $130.

Then there are the accessories. For about $299 (pre-order), TPCast’s wireless adapter for the HTC Vive establishes a wireless connection capable of transmitting 2K resolution between the Vive’s hardware and the host PC with less than two milliseconds of latency. Also gathering steam (and crowdfunding on Kickstarter) is the $800 Pimax 8K VR headset from Shanghai-based Pimax, which features two 4K screens and a wireless transmission add-on similar to the TPCast wireless upgrade kit.

Solving the Issues: Lessons from Technologies Past
As always, past technology evolutions and milestones may influence the future of mass consumer VR adoption.

First, consider the impact of overcomplicated design. Consumers assume that a completely immersive experience can be crammed into a sleek pair of sunglasses. Not true—yet. The reality is that features and functionality come at the expense of size and weight. Looking at technology from a practical perspective, the military is a prime example of delivering highly functional yet stripped-down devices designed to serve specific needs. Similarly, by scaling back the bells and whistles and delivering disciplined products, manufacturers can ease consumers into VR.

And by compromising some features, VR can still deliver an adequately immersive experience. This is not to say that VR doesn’t have essential requirements, but some functionality is more ‘luxurious’ than others.

One example is the emphasis on wide FOV, which makes the user feel more present in the experience. But wide FOV requires designers to use bigger displays and bigger optics, making the headset very bulky. In addition, magnifying display images with insufficient resolution in pursuit of wide FOV aggravates the “screen door effect” (where individual pixels become so amplified as to be distracting to the experience).
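This trade-off can be expressed as angular pixel density. In the sketch below, the resolutions and FOVs are illustrative, and the 60 pixels-per-degree constant is the commonly cited approximation of human foveal acuity rather than a property of any product:

```python
EYE_LIMIT_PPD = 60  # commonly cited approximation of human foveal acuity

def pixels_per_degree(pixels_across, fov_degrees):
    """Angular pixel density: how many display pixels span one degree of view."""
    return pixels_across / fov_degrees
```

Stretching a 1080-pixel-wide eye image across a 110-degree FOV yields under 10 pixels per degree, roughly a sixth of the eye’s limit, so individual pixels (and the dark gaps between them) become visible as the screen-door effect; narrowing the FOV or raising the pixel count both raise the density.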

At Kopin, we offer smaller size, but higher resolution (2048 x 2048) OLED displays with greater pixel density (3000 pixels per inch) to mitigate the dreaded “screen door effect.” Images are magnified and exaggerated using stronger—but much thinner—lenses to allow a very compact headset.

And while weight, comfort, and style will make or break VR adoption and public acceptance—don’t forget price. For those of us old enough to remember the Motorola DynaTAC (“brick”) cell phone, you’ll also recall the cringe factor associated with a device so large it was obvious and obnoxious. But it was the price—$3,995 ($9,600 in 2016 dollars)—that kept it from being a mass-market product. In 1996, Motorola unveiled the clamshell StarTAC at a cost of $1,000, and widespread consumer adoption of the cell phone was born.

The Future of VR
So, how does the industry extend the appeal of existing VR technology to the masses while encouraging innovation?

First, consumer onboarding to VR must be made as easy and affordable as possible. While the most immersive and hyper-realistic experiences are still the domain of gamers, securing a sophisticated VR system will set them back $1,000 to $1,500. However, since gamers are the most likely to already possess the core equipment, they are usually the first to adopt VR technology. Luckily, most technologies go through natural price adjustments as computer and device specs evolve to accommodate desired features.

Another lesson from the past is that, eventually, dominant platforms emerge, streamlining both hardware and software development over time.

Critical to VR’s future success is form. Knowing that today’s consumer expects their communication and entertainment devices to be portable and universally accessible likely means standalone headsets will win. But at the same time, wireless headsets will need to be comfortable on many levels, so weight, size and style will factor significantly in whether a device becomes a novelty or an integral part of everyday life.

So, although the tech limitations of today are clear, VR is on the path to eventual mass adoption. And while challenges like price, ergonomics, low resolution, latency, and even a shortage of content have slowed VR’s integration into the mainstream, acknowledging these obstacles assures us that fixes will surface. When the stakes are this high, winners will emerge in the race to transform how people interact with the digital and physical worlds.

Dr. John C.C. Fan is the CEO and co-founder of  Kopin Corporation. For more information, please visit Kopin’s website at




Extreme Sensor Accuracy Benefits Virtual Reality, Retail, and Navigation

Tuesday, December 5th, 2017

What minimizes lag that leads to VR “motion sickness,” explains why your store coupon app requires use of your smartphone’s accelerometer, and keeps fitness trackers and cars on track even when GPS fails?

Good Virtual Reality (VR) is an immersive experience, a simulated world that gives barely a hint of its boundaries; excellent VR is closer to the real thing. VR technology has to have a very high-density display with enough pixels to ensure that VR can emulate real-life detail. Spatial stereo audio is also part of that immersive experience; that is, audio should sound like it’s emanating from the same place as the associated visual. Audio reception in a perfect VR experience would include the Doppler effect and other physical vagaries of sound. Lastly, VR immersion should include the ability for the user to interact with the system intuitively, as in real life. However, VR has not yet reached perfection in any of these areas: a high level of visual detail, experientially accurate sound, or the ability to interact naturally in a virtual world. Alas, VR is still in its early days, and “VR sickness” makes many users nauseated, a sickness that’s mainly due to a time lag of more than 20 ms as the differences in sensory inputs conflict with each other.[i] The latest VR products have a lag of 6 to 10 ms, however, enabling lengthier and more enjoyable VR experiences.
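The 20 ms figure works as a simple budget across the motion-to-photon pipeline. The sketch below uses hypothetical stage names and timings for illustration; they are not measurements of any product:

```python
SICKNESS_THRESHOLD_MS = 20.0  # lag beyond this is widely reported to provoke VR sickness

# Hypothetical per-stage budget for one head movement reaching the display.
pipeline_ms = {
    "imu_sampling": 1.0,
    "sensor_fusion": 0.5,
    "render_frame": 5.5,
    "display_scanout": 2.0,
}

def motion_to_photon_ms(stages):
    """Total lag from physical head motion to updated pixels on the display."""
    return sum(stages.values())
```

Summing these stages lands at 9 ms, inside the 6-to-10 ms range of the latest products and comfortably under the threshold; if any single stage balloons (an overloaded GPU missing its render deadline, say), the total is pushed toward sickness territory.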

Figure 1: Excellent VR is an immersion experience. AR shares design challenges with VR such as latency and precise motion tracking. (Source: Qualcomm)

VR replaces the user’s actual surroundings entirely, whereas Augmented Reality (AR) supplements the real-world view, much like a heads-up display with an overlay of information superimposed onto the user’s view of actual surroundings. For developers, there’s a “spectrum of immersion” in VR, depending upon the technology that’s put into play. VR can be as simple as sliding a smartphone into a Google Cardboard device that looks a bit like a View-Master (a vintage stereoscopic viewing toy), or it can be quite immersive, with a 360° headset, in-sync spatial audio, and controllers and sensors for both hands and feet. People in the industry increasingly use the term “XR” to refer to AR/VR. A more detailed look at the XR spectrum starts with VR and extends to AR at the other end, with Mixed Reality (MR) existing as a less confusing name for AR.

Micro-Electro-Mechanical Systems (MEMS), a semiconductor technology used to create tiny sensors such as accelerometers, gyroscopes, and magnetometers, is in wide use in the XR market. A significant player in the MEMS/sensor industry is InvenSense, now a part of TDK, with a sizeable $368 million (USD) share of the 2016 MEMS market. David Almoslino, Sr. Director of Corporate Marketing at TDK InvenSense, has an excellent handle on the XR industry, since sensors play a critical role in the outcome of the VR experience. InvenSense sensors do much more than sensing, however, and are found in a majority of XR headsets, controllers, and related peripherals. Sensors work in concert to gather and synthesize data in what's known as sensor fusion. As Almoslino states, "HTC Vive has incorporated InvenSense technology for one-to-one tracking for how the head is moving. At the same time, HTC controllers have our tracking ability with InvenSense Inertial Measurement Units in each controller. All these sensors track and communicate so that when you are physically moving in a game, the Inertial Measurement Unit (IMU) recognizes the inputs and keeps them all together so that you can truly be immersed in a game." Gaming is just one use for XR, however.

Augmented Reality (AR) is similar to VR but carries the additional design burden of a heads-up display and potentially more sensors that feed data directly to the viewer; many design challenges are shared. Improving the level of visual detail in XR to perfectly emulate reality may require a display that nears the resolution of the human eye: a high pixel density (≥2160 x 1080) and a frame rate of at least 60 frames per second (fps). Field of View (FoV) should be at least 110°. High-performance computing is required to render data at that pixel density and frame rate without adding lag, as the data-processing burden is enormous. It is crucial that data from the motion sensors (also known as IMUs) in the VR headset and hand controllers line up with the corresponding visual display on the headset; if not, lag ensues.
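The display targets above imply a substantial raw data rate. A quick sketch, using the article's 2160 x 1080 and 60 fps figures (the 3-bytes-per-pixel assumption for uncompressed 24-bit RGB is ours, for illustration):

```python
# Rough pixel-throughput estimate for the display targets cited above:
# 2160 x 1080 pixels refreshed at 60 frames per second.
width, height, fps = 2160, 1080, 60

pixels_per_frame = width * height           # 2,332,800 pixels per frame
pixels_per_second = pixels_per_frame * fps  # ~140 million pixels/s

# Assuming 3 bytes per pixel (24-bit RGB), the raw uncompressed bandwidth:
bytes_per_second = pixels_per_second * 3
print(f"{pixels_per_second / 1e6:.0f} Mpixel/s, "
      f"{bytes_per_second / 1e6:.0f} MB/s uncompressed")
```

Roughly 420 MB/s of pixels, every second, rendered fresh from head and hand pose, before any headroom for higher resolutions or refresh rates. That is the "enormous" processing burden in concrete terms.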

Reducing Lag
High-performance computing aside, much of the work in lowering lag resides in the sensors themselves. InvenSense is known for very accurate sensors. Real-world sensing produces analog input that requires filtering, digitization, and additional processing, so these sophisticated sensors integrate microprocessors to process and format data before sending it to the main CPU.

Lars Johnsson, InvenSense’s Sr. Director of Product Marketing, explains how InvenSense IMUs reduce lag and ease the developer experience. “The sensors have integrated filtering with adjustable parameters that include bandwidth and noise. When taking the signal from analog to digital, there’s something called a Digital Motion Processor (DMP) that performs post-processing for sensor fusion, which we offer at certain data and sampling rates. Sensing followed by rapid conversion and post-processing happens locally in our sensor so that when it reaches the rendering engine, it is preprocessed. For VR, all developers have to do is say, ‘If the user looks 1° to the left and 10° up, here is what he should see,’ and the correct spot just gets presented to the screen.” In other words, calculating vectors for relational placement of the display in concert with the physical placement of the headset is done for you.
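The DMP's internals are proprietary, but the kind of sensor fusion Johnsson describes can be illustrated with a classic complementary filter: integrate the fast-but-drifty gyroscope for responsiveness, then gently pull the estimate toward the accelerometer's gravity-based angle to cancel long-term drift. This is a generic textbook sketch, not InvenSense's algorithm; all values are illustrative.

```python
import math

def complementary_filter(pitch_deg, gyro_rate_dps, accel, dt, alpha=0.98):
    """One step of a complementary filter: integrate the gyro rate for
    responsiveness, then blend in the accelerometer's gravity-derived
    pitch estimate to cancel long-term gyro drift."""
    ax, ay, az = accel
    accel_pitch = math.degrees(math.atan2(ax, math.hypot(ay, az)))
    gyro_pitch = pitch_deg + gyro_rate_dps * dt
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch

# Head held level (gravity on the z-axis), but the gyro reports a
# constant 0.5 deg/s bias. Sample at 100 Hz for one second:
pitch = 0.0
for _ in range(100):
    pitch = complementary_filter(pitch, 0.5, (0.0, 0.0, 9.81), dt=0.01)

# Raw gyro integration alone would have drifted a full 0.5 deg by now;
# the accelerometer correction keeps the error well below that.
print(f"pitch estimate after 1 s: {pitch:.2f} deg")
```

The same principle, scaled up to full 3D orientation and run on the sensor's own processor, is what lets the rendering engine receive a clean "1° left, 10° up" answer instead of raw noisy samples.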

The parts of XR add up to a very complicated but exquisitely coordinated high-performance sensing and compute platform. Smart sensors with practical built-in algorithms spare developers the minutiae. Algorithms vary for sensors in different locations. As Nicolas Sauvage, InvenSense's Sr. Director of Ecosystem, points out, "Sensors in the headset are less likely to experience the kind of speed and acceleration that hand controllers present. The performance of an IMU in the hand controllers has different performance tradeoffs than an XR headset." InvenSense is tuned in to the finer details of VR design. Sauvage goes on to explain, "We fine-tune the performance of our chips to take advantage of these different performance trade-offs. Since your head with the headset will never be as fast as your hands, we can smartly adjust trade-offs in the acceleration of the head. Latency for motion sickness is significant here, but may not be as important elsewhere."

Figure 2: Six Degrees of Freedom (6 DOF) offers more than orientation (3 DOF); it also tracks your location as you physically move around. Latency (lag) adds up as the XR system collects and processes huge amounts of data. (Source: TDK InvenSense)

Sensors Combat Motion Sickness
Sensor accuracy plays a very large part in avoiding motion sickness due to lag. There must be perfect alignment between where the user is looking and where the VR rendering engine thinks the user is looking. Add the rapid movement of two separate hand controllers, plus the action that's integrated into the picture within the VR game, and you have a recipe for disaster without good sensors. Johnsson goes on to say, "With respect to having very low noise and very high-temperature stability, as the electronics quickly warm up, you don't want signals to drift as they react to a temperature change. We compensate for these types of things, as they affect accuracy and can create lag." InvenSense sensors are in the Oculus Rift, Microsoft HoloLens, HTC Vive, and numerous other XR products.
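How visible is a small tracking error? A back-of-the-envelope conversion, using the display targets cited earlier (2160 horizontal pixels spread across a 110° FoV) and a simple linear approximation, shows why sub-degree accuracy matters:

```python
# Convert an orientation-tracking error into on-screen pixel displacement,
# assuming 2160 horizontal pixels spread evenly across a 110-degree FoV
# (a linear approximation; real lens optics are not perfectly uniform).
horizontal_pixels = 2160
fov_deg = 110
pixels_per_degree = horizontal_pixels / fov_deg  # ~19.6 px per degree

for error_deg in (0.1, 0.5, 1.0):
    shift_px = error_deg * pixels_per_degree
    print(f"{error_deg:>4} deg tracking error -> {shift_px:.1f} px misalignment")
```

Even a half-degree disagreement between head and rendering engine shifts the whole scene by roughly ten pixels, which is exactly the kind of visual/vestibular mismatch that breeds motion sickness.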

Augmented Reality
Augmented Reality requires an informative overlay on a display, whether projected inside a Head-Mounted Display (HMD) visor or onto a heads-up display in a car. A well-known example of AR/MR is Pokémon Go, played on a smartphone: Pokémon characters are superimposed on the camera's view of the scene at various GPS locations. Other uses for AR include training and productivity enhancement. One lesser-known benefit of VR is that users are somewhat forced to focus on the content that's strapped to their heads. Unlike with a TV, VR makes it difficult for users to look at their smartphones during advertising. Training employees with VR as a medium ensures that they cannot do something else while in the training session, for instance. Boeing found that AR, tested in a manufacturing setting against a control group, increased productivity by 25%. According to the Harvard Business Review, AR improved productivity significantly in a warehouse: "At GE Healthcare a warehouse worker receiving a new picklist order through AR completed the task 46% faster than when using the standard process, which relies on a paper list and item searches on a workstation (view GE's AR assisted productivity video here). Additional cases from GE and several other firms show an average productivity improvement of 32%."

A three-axis accelerometer measures movement in three dimensions. Adding other sensors adds axes, and with more data comes more acuity. In the industry it's common to count a pressure sensor as an additional axis, for instance, because it measures height from air pressure rather than motion. Fusing the inputs yields more accurate data. A nine-axis sensor combines three degrees of freedom each from an accelerometer, a gyroscope, and a magnetometer. Software algorithms complement sensor fusion.

One prominent example in navigational mapping uses GPS along with six-, seven-, or nine-axis IMUs that continuously measure orientation changes and speed. These IMUs keep travelers on track when GPS fades. For example, a highly accurate IMU will track a car's progress through a tunnel without GPS, since error accumulates only at a minuscule level when accuracy is high. The InvenSense Positioning Library (IPL) algorithms can implement tracking to complement navigation when GPS goes missing in an urban canyon. Other use cases include wearables that incorporate power-hungry GPS only intermittently, preserving battery power while keeping true to course.

Have you ever wondered why a smartphone application for in-store coupons would need permission to access your gyroscope and accelerometer? Retail use cases include smartphone applications that accurately track and monitor a person's travel inside a store using a highly accurate six-axis IMU. Tracking can easily include how long a person stays in a particular location. With a standard store layout and accurate position tracking triggered by a single Bluetooth beacon as users enter the store and open their coupon apps, data can reveal information on shoppers. A shopper might get a pop-up offer for a discount on Snuggies after lingering a little too long at a Snuggies display. This accurate tracking is done without expensive video cameras. It's easy to see that extremely accurate sensors are affecting more than VR.
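Dead reckoning of this kind works by integrating IMU readings into velocity and position, which is also why accuracy matters so much: errors compound. A toy sketch (illustrative numbers only; real IMU navigation fuses gyro and magnetometer data and corrects for bias, none of which is shown here):

```python
# Toy dead-reckoning sketch: integrate acceleration twice to track
# position while GPS is unavailable (e.g., inside a tunnel).
def dead_reckon(accels, dt, v0=0.0, x0=0.0):
    """Return position after each sample, integrating accel -> velocity -> position."""
    v, x = v0, x0
    positions = []
    for a in accels:
        v += a * dt
        x += v * dt
        positions.append(x)
    return positions

# Car cruising at constant speed (true acceleration = 0), but the
# accelerometer carries a tiny 0.01 m/s^2 bias. Sample at 100 Hz for 60 s:
dt, bias = 0.01, 0.01
track = dead_reckon([bias] * 6000, dt)
print(f"position error after 60 s: {track[-1]:.1f} m")  # ~18 m of drift
```

A bias of just 0.01 m/s² grows into roughly 18 meters of position error after a minute, because the error accumulates quadratically through the double integration. That is the gap a "highly accurate" IMU closes, keeping a car on track through a tunnel.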

Where is VR Headed?
The global VR Head-Mounted Display market is projected to reach around 90 million units per year by 2021. VR faces challenges with platform fragmentation for developers. Creating VR content is difficult across a fragmented landscape of platforms with varying numbers of controllers and no unifying standard, at least none that's been widely adopted yet, the way USB solved the problem for connectors. VR systems can come with up to two controllers, and each choice affects gameplay. Pricing puts the best VR systems out of reach for much of the existing gaming market. A lack of good content comes with the territory for a market fragmented by many platforms, making it that much more difficult for developers to create content that sells in volume. These challenges are being solved, as many see XR as a wondrous experience and a productivity boon for a manufacturing sector with rising job openings and falling hiring rates.

Figure 3: The global VR Head-Mounted Display market is projected to increase to around 90 million units per year by 2021. (Source: ABI Research)

The highly accurate sensors used in XR translate well to several other segments. As for InvenSense, the TDK purchase was a good thing. Almoslino’s perspective is seasoned by years of experience in sensors, where InvenSense excels. “InvenSense has had sensor success in the consumer products area and TDK in industrial, and together they complement each other. The automotive sector is going to be our next big growth scene.”

AR is already making significant headway in increasing the productivity of warehouse workers in "pick and pack" activities. Transportation industries, including trains, buses, and automobiles, will benefit from previously unaffordable augmented heads-up displays (HUDs) that a decade ago were available only in sectors with big budgets and critical importance, such as military cockpits. XR will without doubt have a significant impact on economies worldwide by increasing productivity and decreasing accidents.

Lynnette Reese is Editor-in-Chief, Embedded Intel Solutions and Embedded Systems Engineering, and has been working in various roles as an electrical engineer for over two decades. She is interested in open source software and hardware, the maker movement, and in increasing the number of women working in STEM so she has a greater chance of talking about something other than football at the water cooler.


[i] Mason, Betsy. "Virtual Reality Has a Motion Sickness Problem." Science News, 8 Mar. 2017.
