WYSIWYG for the Automotive AR Era: Q&A with Mike Firth, Texas Instruments
Why HUD system imagery and drivers stand to gain
Editor’s Note: Automotive Head-Up Displays (HUDs) enable drivers to see critical information in front of their eyes so they no longer need to glance down. Augmented Reality (AR) is coming to automotive HUDs using technology that’s faced challenges in the harsh environment of the automobile. Thermal management has been an especially big challenge, since electronics are expected to start up and function properly from the instant the car is started. Texas Instruments (TI) has made improvements in HUD technology that overcome thermal management issues, double the Field of View (FOV), and provide drivers with image depth and vitality where AR can flourish.
Mike Firth, TI marketing manager for automotive DLP Products, took a moment to answer some questions for Embedded Systems Engineering on what makes a much-improved HUD experience possible.
Lynnette Reese, Embedded Systems Engineering (LR): Can you tell us a bit about the technology that Texas Instruments provides?
Mike Firth, Texas Instruments: The technology powering these HUD interior automotive projection systems is the same basic technology found in corporate and educational projectors as well as digital cinema theatres, with additional hardening to withstand the automotive environment. The DLP3030-Q1 has more than 400,000 individual micromirrors that switch on and off at extremely high rates. This fast switching is what enables the clear and bright imagery, high color saturation, and faithful color representation. Fundamentally, three features enable the DLP3030-Q1 HUD chipset to address augmented reality (AR) in the automotive market: the quality of the image the chipset can produce, image brightness, and the chipset's ability to withstand the thermal challenges that heavy solar loads pose in a harsh automotive setting. The digital micromirror device (DMD) has an automotive operating temperature range of -40°C to +105°C, and DLP performance does not deteriorate across that range. In short, the DLP3030-Q1 chipset supports AR with amazing imagery, color, and very bright displays.
Digital imagery (Figure 1) is positioned in the driver's field of view (FOV). You can see that it is not just a display. Instead, we are interacting with the driver's FOV, the environment, and the objects in the environment, marking the distance between the driver and the next car. The red barrier on the right indicates potential danger. The system overlays digital information onto the windshield, interacting in real time with the world as the driver sees it.
What makes this next-generation DLP3030-Q1 chipset special is its solar load performance, accurate color reproduction, and brightness. And not to be overlooked is the decreased package size, which enables smaller HUDs while increasing overall performance. A new ceramic pin grid array (CPGA) package has reduced the overall footprint by 65 percent versus the previous generation.
LR: Are you saying that there’s no compromise in display quality in an automotive environment at all? What if the driver puts on polarized sunglasses? Won’t that make the HUD disappear?
Mike Firth, Texas Instruments: You see the same color, brightness, and contrast across that whole temperature range, enabling clear imagery in all types of conditions. And yes, in a typical HUD, the image disappears when the driver puts on polarized sunglasses, because creating the image depends on polarized light. With the DLP system, images remain visible even when the driver wears polarized sunglasses.
LR: What makes an AR HUD any better than any other automotive HUD?
Mike Firth, Texas Instruments: To this point, a HUD has been primarily a display. It has not necessarily had the means to float far out over the road in the driver's field of view, and both the colors and the information it presents have remained basic.
However, as we transition to AR HUDs, as seen in Figure 2, digital information is overlaid completely within the driver's field of view, at varied distances from the driver. The distance markers and the red warning bars appear quite far out over the road.
The virtual image distance, which is the measurement of how far from the driver's eyes the images appear to be floating or resting, is typically somewhere in the two- to twenty-meter range. Today it is probably in the two- to two-and-a-half-meter range, and the HUD essentially acts just as a secondary display. As the move to augmented reality takes place, you'll start seeing 7.5-, 10-, 15-, and 20-meter virtual image distances, allowing those images to be projected farther.
So in this case, colors, brightness, and the field of view all increase. While it's great to have a wider FOV, you need more brightness to power a larger FOV. You need a technology that is very bright and very efficient at providing the light and accurate colors to enable that larger FOV. Part of the answer lies in the true augmented reality functionality that DLP employs.
LR: Can you explain what you mean by “true augmented reality functionality”?
Mike Firth, Texas Instruments: Sure. It indicates how interactions occur and where that digital imagery can be projected (Figure 3). If you start with a 2.5-meter virtual image distance and, say, a five-degree field of view, shown in red in Figure 3, you see that the image floats just over the hood of the car. You don't have much field of view, so the images are not very large. You can't interact with much of the driver's environment, but as you start to increase the virtual image distance and the field of view with this technology, you can start to interact with the cars ahead, the sides of the street, turn indications, and so forth.
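The geometry behind this point is straightforward: the apparent width of the virtual image grows with both the virtual image distance and the field of view, roughly as w = 2·d·tan(θ/2). The sketch below is an idealized illustration of that relationship using the figures mentioned in the interview, not TI specifications:

```python
import math

def virtual_image_width(distance_m: float, fov_deg: float) -> float:
    """Approximate width (meters) of a virtual image that spans a
    horizontal field of view `fov_deg` at virtual image distance `distance_m`."""
    return 2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0)

# Today's typical HUD: ~2.5 m virtual image distance, ~5 degree FOV
print(round(virtual_image_width(2.5, 5.0), 2))    # → 0.22 (a small image over the hood)

# An AR HUD: ~15 m virtual image distance, ~10 degree FOV
print(round(virtual_image_width(15.0, 10.0), 2))  # → 2.62 (wide enough to overlay lanes and cars)
```

Going from 2.5 m/5° to 15 m/10° enlarges the usable image area by more than an order of magnitude, which is why AR HUDs can mark lanes and vehicles rather than just show icons.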
LR: No pun intended, but have there been any significant roadblocks in designing AR for automotive?
Mike Firth, Texas Instruments: One of the primary challenges in designing HUDs is the solar load (Figure 4), which is magnified by the HUD's optics. That effect puts a great deal of thermal energy on a very small area, creating considerable thermal management challenges. If not managed properly, the energy focused onto that small area will cause significant damage to the imager.
LR: It sounds like part of the challenge includes optics. Could you tell us why, and go into more detail on why thermal management is a problem?
Mike Firth, Texas Instruments: Thermal load management is already challenging for today's HUDs, and that's with virtual image distances of just 2 to 2.5 meters. When you move to augmented reality and start getting to 7.5 meters, the challenge increases, because you have to increase the magnification to support that longer virtual image distance, which moves the imaging plane. Whatever is generating the image moves closer to the focal point of the optical system, which results in a higher concentration of energy. In an augmented reality HUD, that imaging plane (the diffuser panel in the case of the DLP system) moves farther back, resulting in a higher concentration of energy, not necessarily more incoming energy. Although with an AR HUD, because you have a wider field of view, you do let in more sunlight to begin with than a traditional HUD does today.
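The concentration effect described above can be sketched with a simple first-order optics model: the sun's image at the focal plane has a minimum size set by the sun's angular diameter (about 0.53°), and moving the imaging plane closer to that focus shrinks the spot and raises the irradiance. The aperture and focal length below are hypothetical round numbers, not measurements of any TI system:

```python
import math

SUN_ANGULAR_DIAMETER = math.radians(0.53)  # apparent angular size of the sun

def solar_concentration(aperture_m: float, focal_len_m: float, defocus_m: float) -> float:
    """Rough irradiance concentration on the imaging plane, relative to direct
    sunlight, for an idealized single-element optic.
    aperture_m:  clear aperture admitting sunlight
    focal_len_m: effective focal length of the optical path
    defocus_m:   distance of the imaging plane from the solar focus
    """
    # Spot diameter = sun's image at focus plus the geometric defocus blur
    spot = focal_len_m * SUN_ANGULAR_DIAMETER + aperture_m * defocus_m / focal_len_m
    return (aperture_m / spot) ** 2

# Hypothetical 50 mm aperture, 200 mm focal length.
# Imaging plane 20 mm from the solar focus vs. only 2 mm away:
print(solar_concentration(0.05, 0.2, 0.020))  # modest concentration
print(solar_concentration(0.05, 0.2, 0.002))  # nearly an order of magnitude higher
```

Under this toy model, moving the imaging plane from 20 mm to 2 mm of the focus multiplies the solar irradiance on it by roughly 8x, which is why longer virtual image distances make the thermal problem so much harder.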
LR: I can imagine that the DLP is what makes the difference. Can you connect the dots for us as to how the DLP technology creates an advantage over other HUD imaging solutions for harsh environments like automotive?
Mike Firth, Texas Instruments: DLP is a projection-based system that projects the image onto a diffuser, which is what receives the focused sunlight. The main advantage of the DLP architecture is that it absorbs very little of that sunlight, which eases the solar load problem; heat therefore does not build to the levels that competing technologies introduce and then must manage.
LR: What impact do you expect to see AR have in the automotive sector?
Mike Firth, Texas Instruments: Trends are moving toward making it easier to implement augmented reality displays in the automobile. The trend in augmented reality itself is to increase the virtual image distance to at least 7.5 meters, if not greater, with a field of view of at least 10 degrees, compared to the current six to eight degrees. The longer the virtual image distance and the wider the FOV, the better the driver's experience is going to be.
LR: Does Texas Instruments see any other trends in automotive?
Mike Firth, Texas Instruments: Three trends are gaining momentum, and they are Advanced Driver Assistance Systems (ADAS), electric cars, and autonomous cars. In 2024, lane departure and distance collision warning are forecast to be in over 50 percent of the automobiles produced worldwide.
The shift to electric cars is very strong, with several studies now showing that in Europe, up to 30 percent of the cars produced in 2025 could be electric, whether plug-in hybrids or fully electric. Worldwide, it's forecast that 14 percent of automobiles will be electric in 2025. And in an electric car, you no longer have a firewall separating occupants from a combustion engine compartment, so you have more space to install an augmented reality HUD. Electric cars can be designed from the ground up with AR in mind.
There is also a lot of interest in how AR can play a role in keeping drivers properly engaged in vehicles that are at less than Level 5 on the self-driving scale—especially at that transition point where the self-driving car is leaving fully autonomous mode, handing off control to the driver.
LR: Do you have any development tools for those interested in evaluating or developing an AR HUD for a harsh environment like the automotive space?
Mike Firth, Texas Instruments: There’s the DLP3030-Q1 EVM evaluation module for the electronics side of things, as well as the DLP3030PGUQ1EVM for evaluating a picture generation unit. A third EVM, the DLP3030CHUDQ1EVM, is a tabletop combiner HUD demonstrator. It shows the image on a piece of glass and is a portable way to evaluate DLP technology and its performance. It’s a complete HUD system that you can drive with different test patterns and video while assessing the overall performance of a DLP-based system and what it can offer.
Lynnette Reese is Editor-in-Chief, Embedded Intel Solutions and Embedded Systems Engineering, and has been working in various roles as an electrical engineer for over two decades. She is interested in open source software and hardware, the maker movement, and in increasing the number of women working in STEM so she has a greater chance of talking about something other than football at the water cooler.
Source: Strategy Analytics, “Advanced Driver Assistance Systems Demand Forecast 2015–2024,” August 2017.