Q&A with Richard York: ARM Safety Packages and Automotive



Early in 2015 ARM announced that the ARM® Cortex®-R5 processor would be targeting real-time, safety-critical applications like avionics, power plant control and so on. It turns out that ARM also plans to target its Cortex-A processor family at safety-critical applications and systems, particularly automotive and machine vision.

I caught up with ARM’s VP of Embedded Richard York to discuss the growing trend in safety-critical systems applied to automotive. Called “ADAS”—Advanced Driver Assistance Systems—these systems are already on high-end luxury vehicles and are quickly migrating downward into mid- and economy-grade automobiles. ARM sees big opportunities for its Cortex-A processors.

For the reader’s convenience, I’ve included some of ARM’s slides to supplement Richard York’s narrative.

– Chris A. Ciufo, Editor-in-Chief, Embedded Systems Engineering magazine

Chris A. Ciufo (“C2”): Richard, what exactly is ARM announcing?

ARM VP of Embedded Richard York

Richard York: The key announcement is the availability of the safety packages for the Cortex-A53, Cortex-A57 and Cortex-A72 processors; the Cortex-A72 is the new high-performance 64-bit processor we announced in Q2. We're asserting there's going to be a 100x increase in the amount of data to process in a vehicle in the coming years as more sensors are added and existing ones are upgraded. Processing all of that data across the many embedded CPUs is very demanding, and in many cases it needs to be done with functional safety in mind. Safety packages for these three processors will contribute to that processing story.

This announcement is the result of more than three years of investment in internal [ARM] processes, training, verification, design flows, requirements tracking and safety management. It’s a very big deal. And this is on top of last year’s announcement at ARM TechCon: the ARM Compiler qualification and certification story. So now we have technical data packages, certified tools and processors all geared for use in safety-critical applications and systems.
C2: What systems is ARM targeting?

Richard: We have a strong focus on Advanced Driver Assistance Systems (ADAS) for the automotive market, and we’re supporting today’s advanced cars as well as the cars of the future.
C2: ARM announced the safety package for the Cortex-R5 processor earlier this year. How does that apply to these “A” processor architectures?

Richard: The Cortex-R5 drove a lot of our internal work, got us to change the way we develop our processors and allowed us to establish the necessary internal processes, so that we can show we comply with the spirit and intent of various safety standards.

Once we had done that, the natural next step was to apply that expertise to the way we develop our application processors as well. That fits very nicely with the move towards ADAS and, longer term, highly automated driving. In these systems most of that functionality is being built on our application processors because of the sheer amount of processing power required.
C2: Tell me more about the actual processing and applications that make up “ADAS.”

Richard: Machine vision is, to quite a large extent, the heart of ADAS. I've got a slide that shows the technology in today's car, including the vehicle safety features [Figure 1]. The really demanding ones are in the areas of vision, whether looking outside to see what is going on or looking inside to monitor the driver's actions [Figure 2]. This establishes situational awareness, watching for things like driver drowsiness. For the environment outside the vehicle, are you looking forwards, to the side, to the rear? Can the vehicle fuse that information with radar and other data? These machine vision tasks, all part of ADAS, rapidly become incredibly computationally demanding.

Figure 1: Technology in the modern car. ARM fits in all of these places and many of the new vehicle safety features require heavier processing.

Figure 2: A more detailed look at ARM’s Cortex processors in the modern car.

We’ve tried to estimate the amount of data to process, and it’s amazing just how much it will increase. There will be ten to a hundred times more data in the next generation of vehicles than in the current generation. Once you add up the number of sensors, cameras, radars and so forth, it’s clear that serious processing muscle is required.
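
[Editor's note: To put that growth in perspective, here is a back-of-the-envelope sketch in C. The camera count, resolution, frame rate and pixel depth are illustrative assumptions of mine, not figures from ARM; the point is only how quickly raw sensor bandwidth adds up.]

    #include <stdio.h>

    /* Illustrative only: the camera count, resolution, frame rate and
     * pixel depth below are assumptions for this sketch, not ARM data. */
    int main(void) {
        const double cameras      = 6.0;     /* surround, forward, in-cabin */
        const double width_px     = 1280.0;
        const double height_px    = 720.0;
        const double bytes_per_px = 2.0;     /* e.g., 16-bit raw or YUV422  */
        const double fps          = 30.0;

        double bytes_per_sec = cameras * width_px * height_px
                             * bytes_per_px * fps;
        printf("Raw camera data: %.1f MB/s\n", bytes_per_sec / 1e6);
        /* Prints roughly 331.8 MB/s, before radar, ultrasonic and other
         * sensor streams, and before any of the vision processing itself. */
        return 0;
    }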
C2: You mentioned cameras inside the vehicle; what are they for?

Richard: There’s a need to monitor the drivers themselves as a key component of the overall vehicle safety equation. ADAS should monitor drowsiness [e.g., head nodding], distraction [e.g., eating] and the frequent problem of drivers spending far too much time fiddling with their phones or entertainment systems rather than concentrating on driving!

The good news is it’s actually very practical for the vision system to observe the driver and make sure he’s concentrating on the road ahead. We can easily foresee these vision systems getting smarter by determining where drivers are looking. Suppose a pedestrian on the left-hand side is close to the edge of the road, but the driver is looking to the right? You might want to sound an alarm. There’s a lot of interesting stuff you can do.
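
[Editor's note: As a concrete illustration of the cross-check Richard describes, here is a minimal sketch in C. The structure, field names and the 30 m threshold are hypothetical, invented for this example; they do not come from any ARM or production ADAS interface.]

    #include <stdbool.h>

    /* Hypothetical fused-sensor state; all names are illustrative. */
    typedef enum { GAZE_LEFT, GAZE_CENTER, GAZE_RIGHT } gaze_dir_t;

    typedef struct {
        bool       pedestrian_left;  /* exterior vision/radar fusion  */
        double     distance_m;       /* estimated range to the hazard */
        gaze_dir_t driver_gaze;      /* from the in-cabin camera      */
    } adas_state_t;

    /* Alert only when the hazard is close and the driver's attention
     * is pointed somewhere other than the hazard's direction.        */
    static bool should_alert(const adas_state_t *s) {
        const double alert_range_m = 30.0;   /* illustrative threshold */
        return s->pedestrian_left
            && s->distance_m < alert_range_m
            && s->driver_gaze != GAZE_LEFT;
    }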
C2: Are there other examples of outside-the-car machine vision?

Richard: Absolutely. Check out [Figure 3] to see just how much computational complexity there is for a car to interpret its environment.

Cameras can help drivers in many ways, often with things we take for granted: the distance between us and the car in front, how rapidly that distance is changing, where we are in the lane, the posted road warning signs, the blind spots. All of these are things we want the computers to be able to look at. Having to do that in a variety of different conditions, at night, in heavy rain and in thick fog, makes this very challenging. The consistent feedback from automotive engineers is that they’ll use as much computational performance as we can give them.

Figure 3: Fusing camera, radar and other sensors to monitor the overall driver/vehicle situational awareness environment. These tasks are computational challenges, requiring high performance and safety-critical software.

C2: How does this 3D ADAS challenge compare with, say, a fighter jet’s sensors?

Richard: This may well be more complex for the simple reason that radar gives you certain types of information in one or two dimensions. [ADAS] is giving you much more dimensionally complicated data. And it changes faster, particularly as you increase the speed of travel. And there’s less room around a vehicle than there is around a fighter jet.

And in a car, there are more variables to contend with: pedestrians, parked cars and lane markings.

This is a really good fit for what ARM is doing. We’re focused on cost effectiveness and low power, attributes clearly of interest to automotive; it all builds on our long-running story of high-performance, low-power compute.

It also fits nicely with what our SoC partners, such as Freescale, Renesas, Texas Instruments and others, are building—very capable and dedicated SoCs for these markets.
C2: The FPGA guys, like Xilinx, have been talking about ADAS for the last couple of years. How does an FPGA compare to a safety-critical ARM-based SoC?

Richard: FPGAs have a wonderful amount of flexibility, and plenty of Tier 1s (the primary automotive suppliers) appreciate that a great deal because they have their own proprietary vision engines, vision hardware and vision IP.

Quite a few Tier 1s have gone down the Altera and Xilinx route. But others haven’t; they’ve been happy to adopt the image engines in parts like the latest R-Car SoCs from Renesas, or the S32V family that Freescale announced back in January, with lots of dedicated vision hardware on-board.

So the choice is between a dedicated SoC and an FPGA, and designers have lots of flexibility. Either way, with ARM cores embedded inside the FPGAs too, you see our technology providing some of the common building blocks in both approaches, around which a vibrant ecosystem is developing.
C2: It’s clear to me why performance and safety converge with ARM’s processors. Are there any other markets where this is the case?

Richard: There are several adjacent markets pertaining to “vision,” and the investment in automotive vision is feeding directly into them. Markets such as robotics are benefiting from that spike in investment. A few years ago computer vision was quite niche, focused almost entirely on automation for things like manufacturing quality control. ADAS has changed that completely.
C2: What about standards organizations? Does ARM participate with the Embedded Vision Alliance or with GENIVI, for instance?

Richard: Yes, and we held a workshop at one of the most recent Embedded Vision Alliance meetings to explore the market with our partners. And “yes” to your question about GENIVI; that’s an important automotive infotainment effort. We spend a great deal of time participating in organizations pertaining to vision and automotive.

In addition, ARM is contributing to the work towards the second edition of the ISO 26262 functional safety standard, which will include a new part specifically for semiconductors. This part will be a welcome addition to the standard, since there is a lot of interest in supporting functional safety all the way from IP suppliers onwards. By participating in this work, ARM has good visibility into future requirements.
C2: But are you going to do safety versions of these other Cortex-A processors, with features such as a lockstep capability and ECC like you have in the Cortex-R5? Or are we just talking about a technical data package?

Richard: We are talking about a middle ground somewhere between those two points: safety packages with full documentation and hardware fault-reporting features, and perhaps ECC protection, which is very important because it checks all of your memory and makes sure there are no transient or permanent anomalies.

But we probably won’t go as far as doing things like lock-stepping [as on the Cortex-R5] with the Cortex-A application processors; typically, the cost of that is just too great. Ultimately, though, we are market-led and we’ll respond to what the market demands.

As well as the packages, we’ll be partnering with others, such as YogiTech, to support things like software self-test libraries that go along with the processors and allow you to detect faults with little or no dedicated hardware.

If you don’t know YogiTech, it’s an Italian company that builds what it calls software test libraries (STLs). There are lots of different ways of monitoring and managing faults on those application processors. You need the formality of asking the questions: “How do I build this and verify it accurately? Then, when I build it into a real system, how can I manage detecting and dealing with faults?”
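
[Editor's note: For readers new to the idea, here is a deliberately simplified sketch in C of what a software self-test can look like: a march-style pattern check over a RAM region. Real STLs such as YogiTech's are far more sophisticated and also exercise the CPU logic itself; this only illustrates the general principle of detecting faults in software, without dedicated hardware.]

    #include <stdint.h>
    #include <stdbool.h>
    #include <stddef.h>

    /* Minimal march-style memory self-test: write a pattern, verify it,
     * then repeat with the complement to expose stuck-at bits.          */
    static bool ram_self_test(volatile uint32_t *buf, size_t words) {
        const uint32_t patterns[2] = { 0xAAAAAAAAu, 0x55555555u };
        for (int p = 0; p < 2; p++) {
            for (size_t i = 0; i < words; i++)
                buf[i] = patterns[p];
            for (size_t i = 0; i < words; i++)
                if (buf[i] != patterns[p])
                    return false;  /* fault detected: report it and
                                      enter a safe state */
        }
        return true;
    }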

You can see [in Figure 4] the spectrum of ARM’s activity and where our various partners fit. And there is some overlap. Ultimately, the “R” processors are focused on decision making and the “A” processors are focused on the processing environment.

Figure 4: An overview of ARM’s automotive activity.
