Road Sense Becomes Connected Sensor Technology



Sensors in vehicles are increasing in number and importance, reports Caroline Hayes. Without them, the era of autonomous driving would have stalled.

Trials of autonomous vehicles continue to roll out, heralding the era of the driverless car. In North America, four states have taken advantage of government legislation that allows testing of automated vehicles. In Europe, Volvo, the Swedish Transport Administration, the Swedish Transport Agency, Lindholmen Science Park and the City of Gothenburg have come together in the “Drive Me” project, which will put 100 self-driving cars, performing everyday commutes using Autopilot technology, on the streets of Gothenburg by 2017. The test cars have so far been able to follow lanes and adapt to speeds and merging traffic; in the next stage, the cars will drive the whole route in autonomous mode. Other countries are watching the growing interest in autonomous driving. Earlier this month in Germany, arguably the heart of Europe’s automotive manufacturing, Chancellor Angela Merkel invited carmakers to draw up a wish list, and a timetable, so that a legal basis for self-driving testing could be proposed.

Figure 1: The city of Gothenburg, Sweden, is hosting the Drive Me project, in which autonomous vehicles drive commuter routes (Courtesy Volvo).

Vehicles today have between 60 and 100 sensors. These are used in Advanced Driver Assistance Systems (ADAS) to warn a driver of hazards or obstacles on the road while driving, for example when unintended lane shifting is detected, as well as when parking. Collision warning sensors detect when a vehicle is too close to the one in front; Rear Cross Traffic Alert (RCTA) sensors detect vehicles approaching from behind when reversing; other sensors alert the driver to blind spots and, when coupled with a 360-degree viewing monitor, detect moving objects within the vehicle’s perimeter.
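
To make the idea concrete, here is a minimal sketch, in C, of how a forward collision warning might use the gap and closing speed reported by such a sensor. The time-to-collision threshold and function names are illustrative assumptions, not taken from any production ADAS.

```c
/* Minimal sketch of a forward collision warning check, assuming a range
 * sensor that reports the gap to the vehicle ahead and the closing speed.
 * The 2.5 s threshold is an illustrative assumption. */
#include <stdbool.h>
#include <stdio.h>

#define TTC_WARN_SECONDS 2.5  /* warn if impact is predicted sooner than this */

/* True when the predicted time-to-collision falls below the threshold. */
bool forward_collision_warning(double gap_m, double closing_speed_mps)
{
    if (closing_speed_mps <= 0.0)
        return false;                       /* gap is steady or opening */
    return (gap_m / closing_speed_mps) < TTC_WARN_SECONDS;
}

int main(void)
{
    /* a 30 m gap closing at 15 m/s gives a 2 s time-to-collision */
    if (forward_collision_warning(30.0, 15.0))
        printf("Collision warning: reduce speed\n");
    return 0;
}
```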

ADAS can also be connected to the cloud and other telematics services to deliver weather or traffic alerts via a Human Machine Interface (HMI) on the driver’s dashboard.

As more data is collected, the number of sensors is increasing and is projected to reach 200 per vehicle, or approximately 22 billion sensors by 2020.

Real-Time Response

Autonomous vehicles rely on even larger volumes of data than ADAS. A self-driving car must respond to changes in its environment or driving conditions in real time, which requires large amounts of data, from inside and outside the vehicle, to be processed, analyzed and displayed without delay.
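
As a rough illustration of that constraint, the sketch below shows a fixed-period processing loop that flags missed deadlines, using standard POSIX timing calls. The 10 ms period and the empty processing stub are assumptions for illustration only.

```c
/* Sketch of a fixed-period processing loop that detects missed deadlines,
 * using POSIX absolute-time sleeps. The 10 ms period and the empty
 * processing stub are assumptions for illustration. */
#define _POSIX_C_SOURCE 200112L
#include <stdio.h>
#include <time.h>

#define PERIOD_NS (10 * 1000 * 1000L)   /* 10 ms cycle */

static void read_process_display(void)
{
    /* sensor acquisition, fusion and display updates would run here */
}

int main(void)
{
    struct timespec next, now;
    clock_gettime(CLOCK_MONOTONIC, &next);
    for (int cycle = 0; cycle < 1000; ++cycle) {
        read_process_display();
        /* advance the absolute wake-up time by one period */
        next.tv_nsec += PERIOD_NS;
        if (next.tv_nsec >= 1000000000L) {
            next.tv_sec  += 1;
            next.tv_nsec -= 1000000000L;
        }
        clock_gettime(CLOCK_MONOTONIC, &now);
        if (now.tv_sec > next.tv_sec ||
            (now.tv_sec == next.tv_sec && now.tv_nsec > next.tv_nsec))
            fprintf(stderr, "deadline miss in cycle %d\n", cycle);
        clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);
    }
    return 0;
}
```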

Figure 2: Volvo, with Chalmers University of Technology in Gothenburg, uses sensors to develop in-vehicle driver safety systems. (Courtesy Volvo)

In addition to cameras relaying image data, other safety features rely on sensors. For example, Volvo and Chalmers University of Technology in Gothenburg have been researching driver behavior (Figure 2). Sensors on the dashboard monitor where the driver is looking, head position and angle, and how open the eyes are, to gauge the driver’s state of awareness and attention. If the driver allows the vehicle to get too close to the car in front, or to drift out of its lane, an audible alert is activated.
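
A hedged sketch of how such eye-openness readings might be turned into an attention alert follows: it tracks the fraction of recent frames in which the eyes were closed, in the spirit of PERCLOS-style drowsiness measures. The thresholds, window size and function names are assumptions, not Volvo’s actual algorithm.

```c
/* Sketch of a PERCLOS-style drowsiness check: a camera pipeline (not shown)
 * reports per-frame eye openness, and the monitor tracks the fraction of
 * recent frames with closed eyes. All thresholds are assumptions. */
#include <stdbool.h>

#define WINDOW_FRAMES   300    /* ~10 s of history at 30 frames/s */
#define CLOSED_BELOW    0.2    /* openness below this counts as closed */
#define DROWSY_FRACTION 0.15   /* alert if eyes closed >15% of the window */

static bool closed_history[WINDOW_FRAMES];
static int  head, filled, closed_count;

/* Feed one frame's eye-openness estimate (0.0 = shut, 1.0 = wide open);
 * returns true when the driver appears drowsy. */
bool update_attention(double eye_openness)
{
    bool closed = eye_openness < CLOSED_BELOW;
    if (filled == WINDOW_FRAMES)
        closed_count -= closed_history[head];  /* drop the oldest sample */
    else
        filled++;
    closed_history[head] = closed;
    closed_count += closed;
    head = (head + 1) % WINDOW_FRAMES;
    return filled == WINDOW_FRAMES &&
           (double)closed_count / WINDOW_FRAMES > DROWSY_FRACTION;
}
```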

Nissan is also using sensors to monitor driver alertness. Earlier this month, it introduced the 2016 Maxima sedan, which it calls a four-door sports car, at the New York International Auto Show. The front-wheel drive vehicle is equipped with a 3.5-liter V6 engine and Driver Attention Alert (DAA) technology to detect drowsiness or inattention. DAA uses steering angle sensors to monitor the driver’s steering patterns; once a baseline is established, it continuously compares subsequent driving against it. Its logic accounts for curves in the road, lane changing, braking and weather conditions, but any other deviation from the baseline triggers an audible chime, a coffee cup icon and the message “Take a break?” on the information display (Figure 3).
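
Nissan’s algorithm is proprietary, but the baseline-then-deviation idea can be sketched as follows: learn the typical magnitude of the driver’s steering corrections, then flag a sustained change in that pattern. All names and constants here are illustrative assumptions.

```c
/* Sketch of baseline-then-deviation logic: learn the driver's typical
 * steering-correction magnitude, then flag a sustained change. Constants
 * and names are illustrative; Nissan's actual algorithm is proprietary. */
#include <stdbool.h>

#define BASELINE_SAMPLES 1000   /* samples used to learn the baseline */
#define DEVIATION_FACTOR 2.0    /* alert when recent activity doubles */

static double baseline, recent;
static int    n;

/* Feed the absolute steering-angle change per sample, in degrees. */
bool steering_deviation_alert(double abs_delta_deg)
{
    if (n < BASELINE_SAMPLES) {             /* learning phase */
        baseline += (abs_delta_deg - baseline) / ++n;
        recent = baseline;                  /* seed the moving average */
        return false;
    }
    /* exponential moving average of recent steering activity */
    recent = 0.99 * recent + 0.01 * abs_delta_deg;
    return recent > DEVIATION_FACTOR * baseline;
}
```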

Figure 3: Nissan’s sensor-based Driver Attention Alert (DAA) technology advises a break if it senses the driver is drowsy. (Courtesy Nissan)

This amount of data processing and display takes its toll on compute resources. To access and process data from so many sources, automotive manufacturers are looking for high-performance yet compact, lightweight semiconductors to design into the automotive embedded system.

The consensus is that multicore architectures are the most silicon-efficient solution. Renesas has announced its third generation of SoCs for automotive computing, the R-Car H3. It is built around 64-bit ARM® Cortex®-A57 and Cortex-A53 CPU cores. To meet the processing power required in today’s vehicles, it achieves a processing performance of 40,000 Dhrystone Million Instructions Per Second (DMIPS) and also integrates Imagination Technologies’ PowerVR GX6650 3D graphics engine to display the information.

Radar Monitoring

The R-Car H3 processes the data from sensors around the vehicle in real time and allows multiple applications, such as detection, prediction and avoidance, to run. It conforms to the ISO 26262 (ASIL-B) functional safety standard for automotive use. The SoC is supported by Green Hills Software’s INTEGRITY Real-Time Operating System (RTOS) and INTEGRITY Multivisor virtualization platform. The 64-bit secure virtualization platform was released last year and was developed, says the company, with the specific capabilities of the R-Car H3 in mind. The platform meets ISO 26262 functional safety requirements and can also be adapted to applications such as a reconfigurable digital instrument cluster, or to provide compute and sensor capabilities for ADAS. The AUTOSAR-compliant application framework means that existing software components can be seamlessly integrated and reused.

Figure 4: Renesas has announced the third generation of its automotive computing platform, the R-Car H3. (Courtesy Renesas)

Another semiconductor company, Infineon, addresses the market with its multicore AURIX microcontroller, based on up to three 32-bit TriCore™ Central Processing Units (CPUs). The microcontroller can be used to control body, safety, ADAS and powertrain applications.

In Munich last month, Infineon presented a 24GHz radar-based driver assistance system for trucks, built on a version of the microcontroller customized for radar applications.

Radar systems monitor blind spots in large trucks and construction vehicles. They also allow several trucks to drive in convoy with short following distances, even at night or in rain, fog or snow.

The 24GHz radar system pairs a 24GHz radar chip, the BGT24ATR12, with the AURIX TC264DA microcontroller, which has been customized for radar systems. The radar chip transmits and receives the high-frequency signals, capturing data from around the vehicle, and passes them to the microcontroller in the radar electronic control unit; the processed data is then delivered to the driver assistance system.
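
For a sense of the arithmetic involved, the sketch below shows how target range falls out of the beat frequency in an FMCW radar, the technique commonly used with 24GHz front ends of this kind. The chirp bandwidth and duration are assumed values, not the BGT24ATR12’s actual parameters.

```c
/* Back-of-the-envelope FMCW radar range calculation: the beat frequency
 * between the transmitted and received chirp is proportional to target
 * distance. Chirp bandwidth and duration are assumed values. */
#include <stdio.h>

#define C_MPS       3.0e8     /* speed of light, m/s */
#define CHIRP_BW_HZ 200.0e6   /* assumed frequency sweep bandwidth */
#define CHIRP_T_S   1.0e-3    /* assumed sweep duration */

/* Range from beat frequency: R = c * f_beat * T / (2 * B). */
double range_from_beat(double beat_hz)
{
    return C_MPS * beat_hz * CHIRP_T_S / (2.0 * CHIRP_BW_HZ);
}

int main(void)
{
    /* with these chirp parameters, a 40 kHz beat means a target 30 m away */
    printf("range: %.1f m\n", range_from_beat(40.0e3));
    return 0;
}
```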

Weight and space are valuable commodities in vehicle design. According to Infineon, AURIX in a radar system can eliminate the need for Digital Signal Processors (DSPs), memory chips and Analog to Digital Converters (ADCs).

The company’s AURIX TC297T microcontrollers were specified by TTTech for Audi’s zFAS central control unit, which integrates ADAS functionalities. The control unit uses TTTech’s Deterministic Ethernet to move traffic data and TTIntegration middleware to run the applications on top of networked microcontrollers, supporting piloted driving on congested roads and piloted parking.

Deterministic Ethernet and the AURIX TC297T have also been used to develop an evaluation platform for driver assistance system Electronic Control Units (ECUs), TTA Drive. Different application modules can be integrated at the development stage using TTIntegration software.


Caroline Hayes has been a journalist covering the electronics sector for over 20 years. She has edited UK and pan-European titles, covering design and technology for established and emerging applications.
