What is Reality Anyway?



Definitions do exist for Augmented Reality and Virtual Reality; so do challenges, as these technologies step up to disrupt entertainment and enterprise.

Virtual Reality (VR) is the projection of a 3D image into which the wearer of a VR headset can immerse themselves. The underlying image manipulation is well established, building on Computer Aided Design (CAD) software that can show a 360° view of objects. VR is used primarily for gaming, but it is also being explored for sales, for example to showcase a home in a real estate listing, or to let travel companies' customers 'try out' a resort or experience. A popular example of VR is the Oculus Rift, which was bought by Facebook in 2014.


Augmented Reality (AR), sometimes called Mixed Reality, presents layers of images and information through glasses that also allow the wearer to view the real world. This means users remain aware of a wall or a table in the real world while they are in the AR world. For example, the wearer can drink a cup of coffee while wearing an AR headset without groping for the cup or missing the surface when setting it down. It also means the end of YouTube videos in which some hapless gamer falls over a chair, or the family dog, while saving the universe in a VR world.

Figure 1: The Microsoft HoloLens uses holographic technology and specialized components for Mixed Reality (MR).


An example of AR is the Microsoft HoloLens, with commercial and developer editions available today. The company describes it as a self-contained holographic computer, with specialized components, including multiple sensors, optics and a chipset that uses 24 Tensilica DSPs. It uses holographic technology to project virtual images that look like they are in the real world, while still letting the wearer see through to the real world (See Figure 1).

Figure 2: Augmented Reality with HoloLens turns a room into a space battleground.


One of the limitations of any wearable technology is power. The HoloLens's play time is dictated by its battery life. As the unit is worn on the head, the battery cannot add too much extra weight. Equally, consideration must be given to heat dissipation, as large heat sinks also add to the bulk and weight of the design.
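
To see why every watt matters, consider a rough power budget. The sketch below uses purely illustrative numbers (they are not published HoloLens specifications) to show how display, compute and sensor loads divide a head-worn battery's watt-hours into run time, and why every watt drawn must also be shed as heat.

```c
#include <stdio.h>

/* Illustrative head-mounted-display power budget. All figures are
 * hypothetical, not published HoloLens specifications. */
int main(void)
{
    double battery_wh = 16.0; /* assumed battery capacity, watt-hours */
    double display_w  = 2.0;  /* assumed display/optics draw, watts   */
    double compute_w  = 3.5;  /* assumed SoC + DSP draw, watts        */
    double sensors_w  = 1.0;  /* assumed cameras/IMU draw, watts      */

    double total_w = display_w + compute_w + sensors_w;
    printf("Total draw: %.1f W\n", total_w);
    printf("Run time:   %.1f h\n", battery_wh / total_w);
    /* Every watt drawn is also a watt of heat to dissipate, which is
     * why cutting sensor-processing power matters so much in a headset. */
    return 0;
}
```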

I put the HoloLens through its paces at the Cadence booth at CES (Figure 2). While I was getting to grips with how to zap mutants that were coming at me, Neil Robinson, Director of Segment Marketing, IP Group at Cadence, explained that the Digital Signal Processors (DSPs) provide sensor fusion along with real-time processing of depth, camera, orientation and temperature sensor inputs.
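
Sensor fusion can take many forms; one of the simplest and most common for orientation is a complementary filter, which blends a drifting-but-smooth gyroscope integral with a noisy-but-absolute accelerometer estimate. The sketch below is a generic illustration of that technique, not the HoloLens algorithm.

```c
#include <math.h>

/* Minimal complementary filter: a common sensor-fusion technique for
 * estimating orientation from an IMU. Generic sketch only. */
typedef struct {
    float pitch;  /* fused pitch estimate, radians          */
    float alpha;  /* blend factor: trust in the gyro, 0..1  */
} fusion_state;

/* gyro_rate: pitch rate from the gyroscope (rad/s)
 * ax, ay, az: accelerometer axes (any consistent unit)
 * dt:         time step in seconds                         */
void fuse_pitch(fusion_state *s, float gyro_rate,
                float ax, float ay, float az, float dt)
{
    /* Integrate the gyro: smooth and responsive, but drifts over time. */
    float gyro_pitch = s->pitch + gyro_rate * dt;

    /* Derive pitch from gravity: noisy, but drift-free. */
    float accel_pitch = atan2f(-ax, sqrtf(ay * ay + az * az));

    /* Blend: the gyro dominates short-term motion, the accelerometer
     * corrects long-term drift. */
    s->pitch = s->alpha * gyro_pitch + (1.0f - s->alpha) * accel_pitch;
}
```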

“CPUs and GPUs were unable to deliver low enough power while also delivering real-time performance,” Robinson comments. He adds, “CPUs are general-purpose, so they are not efficient at doing the sensor processing. GPUs are specialized at pixel/polygon rendering and so are also not efficient at the sensor processing, but are better than CPUs.”

Something Special
“In the end, [Microsoft] needed something special,” says Robinson. “That typically involves a huge development task to create a new processor from scratch—taking decades of man years of effort and lots of risk.” The company selected Tensilica processor IP. It’s a choice that allows customers to customize the processor—taking far less time and risk, explains Robinson. “Customers start from a working processor and modify it using a simple language that maps into a fully automated process that creates the processor and all the advanced tools needed to write code on it.”
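
To give a feel for what such customization buys, the hypothetical sketch below contrasts a plain C multiply-accumulate loop with the same loop rewritten around an invented intrinsic standing in for a custom instruction. Neither the intrinsic nor the instruction is real Tensilica output; the point is only that a fused custom operation can retire several multiply-accumulates per issue.

```c
#include <stdint.h>

/* Baseline: scale-and-accumulate over a sensor window, one MAC per
 * iteration on a general-purpose core. */
int32_t window_mac_c(const int16_t *x, const int16_t *w, int n)
{
    int32_t acc = 0;
    for (int i = 0; i < n; i++)
        acc += (int32_t)x[i] * w[i];
    return acc;
}

#ifdef CUSTOM_DSP
/* Invented intrinsic: 4 fused MACs per issue. Declared here only so the
 * sketch is self-contained; a real flow would generate it automatically. */
extern int32_t __hypothetical_mac4(int32_t acc,
                                   const int16_t *x, const int16_t *w);

int32_t window_mac_custom(const int16_t *x, const int16_t *w, int n)
{
    int32_t acc = 0;
    for (int i = 0; i < n; i += 4)  /* assumes n is a multiple of 4 */
        acc = __hypothetical_mac4(acc, &x[i], &w[i]);
    return acc;
}
#endif
```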

I also tried shooting a quiver of arrows at a target in an idyllic garden setting at the Lattice Semiconductor booth. The Vive underscores the role that Field Programmable Gate Arrays (FPGAs) play in mixing the data streams from a VR system's headset sensors, gyroscopes and accelerometers. In this version (Figure 3) there are 32 Infra-Red (IR) sensors in the headset and 24 in each controller. A VR system requires a lot of I/O and processing: the FPGAs collect sensor data and provide an interface to the processor. For real-time performance, the data can be time-stamped, and the Serial Peripheral Interface (SPI) can be adapted as required. The processor may not have a native sensor interface, so the FPGA can convert the data to a format suited to the processor of choice.
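
As a concrete illustration of that arrangement, the sketch below shows how a host processor might pull one timestamped frame of IR readings from the FPGA over SPI. The frame layout, the opcode and the spi_transfer() routine are assumptions for illustration, not a Lattice-defined interface.

```c
#include <stdint.h>

/* Hypothetical frame as aggregated by the FPGA: a capture-time stamp
 * followed by one reading per IR sensor on the headset. */
typedef struct {
    uint32_t timestamp_us;  /* timestamp applied by the FPGA at capture */
    int16_t  ir_level[32];  /* e.g. 32 IR sensor readings               */
} sensor_frame;

/* Platform-provided full-duplex SPI transfer (assumed to exist). */
extern void spi_transfer(const uint8_t *tx, uint8_t *rx, int len);

/* Read one timestamped frame. Because the FPGA stamps each frame at
 * capture time, the host can reorder or fuse late-arriving data without
 * losing real-time accuracy. */
void read_sensor_frame(sensor_frame *f)
{
    uint8_t cmd[sizeof(sensor_frame)] = { 0xA5 /* hypothetical READ opcode */ };
    spi_transfer(cmd, (uint8_t *)f, sizeof(sensor_frame));
}
```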

Figure 3: Two Lattice Semiconductor iCE40 FPGAs are used in the Vive headset and one in the controller.


Ying Jen Chen, Senior Business Development Manager, Lattice Semiconductor, explains some of the functions needed for a typical VR system. “VR is about low latency, with high bandwidth displays and accurate spatial tracking,” he says. “Much of the processing, bridging and interfacing with sensor arrays is better suited to FPGAs. Other low-cost programmable devices, such as an MCU, often do not have the performance needed to process video, nor the I/O and parallel architecture to deal with a sensor array.”

The parallel architecture of the FPGA and its high I/O count make the device suited to concurrent data capture and processing, asserts Chen. He cites the LatticeECP3 and ECP5, which have the logic capacity for real-time, low-latency embedded video processing. The company's CrossLink can provide high bandwidth camera and display bridging, he says, while the low power iCE40 family is optimized for sensor array interfacing and processing, and the iCE40 UltraPlus FPGA adds more embedded memory and DSP blocks, “which can greatly help with data buffering, sensor fusion and processing,” he adds.
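
That buffering role is easy to picture with a ping-pong (double) buffer: the capture engine fills one buffer while software processes the other, so acquisition and processing overlap. The generic sketch below illustrates the idea; it is not Lattice firmware, and start_capture() and process_frame() are assumed platform hooks.

```c
#include <stdint.h>

#define FRAME_SAMPLES 64

extern void start_capture(int16_t *dst);              /* assumed HW hook  */
extern void process_frame(const int16_t *src, int n); /* assumed DSP step */

static int16_t buf[2][FRAME_SAMPLES]; /* ping-pong buffers in embedded RAM */
static volatile int capture_idx = 0;  /* buffer currently filled by HW     */

/* Called when the capture engine signals a full buffer: flip buffers so
 * hardware keeps capturing while software fuses/filters the finished frame. */
void on_frame_complete(void)
{
    int done = capture_idx;
    capture_idx ^= 1;                        /* HW now fills the other buffer */
    start_capture(buf[capture_idx]);
    process_frame(buf[done], FRAME_SAMPLES); /* overlaps with next capture */
}
```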

The Vive is a tethered VR system, but TPCAST has announced a wireless upgrade kit that uses Lattice's WirelessHD technology and a suite of FPGA and ASSP products to upgrade Head-Mounted Display (HMD) VR systems (Figure 4).

The TPCAST 2.0 protocol supports wireless transmission of HD display and feedback control for smart devices and computers at up to 4K resolution at 120Hz. It provides, says the company, near-zero latency transmission for both the display and controllers.

The upgrade kit includes Lattice's MOD6320-T/MOD6321-R WirelessHD modules, offering near-zero latency and non-line-of-sight (NLOS) performance. It also features the SiI9396 600MHz HDMI bridge IC and the SERDES-based LatticeECP3 FPGA, as well as the TPCAST 2.0 protocol and algorithm. The ensemble supports wireless transmission of a 2160 x 1200 VR display at 90Hz.
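
A back-of-the-envelope calculation shows why a multi-gigabit radio is needed. Assuming uncompressed 24-bit RGB (real links compress, so treat these as upper bounds), the raw video rates for the two resolutions quoted above work out as follows:

```c
#include <stdio.h>

/* Raw video bandwidth, assuming uncompressed 24-bit RGB. These are
 * upper bounds; real wireless links apply compression. */
static double gbps(int w, int h, int hz, int bpp)
{
    return (double)w * h * hz * bpp / 1e9;
}

int main(void)
{
    printf("Vive panel, 2160 x 1200 @ 90 Hz: %.1f Gbps\n",
           gbps(2160, 1200, 90, 24));   /* ~5.6 Gbps  */
    printf("4K, 3840 x 2160 @ 120 Hz:       %.1f Gbps\n",
           gbps(3840, 2160, 120, 24));  /* ~23.9 Gbps */
    return 0;
}
```

At roughly 5.6 Gbps for the headset panel and nearly 24 Gbps for 4K at 120Hz uncompressed, both a multi-gigabit radio and compression are needed to hit the advertised rates.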

Figure 4: TPCAST Technologies’ 2.0 protocol supports up to 4K resolution for wireless VR HMDs.



Caroline Hayes has been a journalist covering the electronics sector for more than 20 years. She has worked on several European titles, reporting on a variety of industries, including communications, broadcast and automotive.
