Future Development of Autonomous and Connected Driving
SAE L5 will be a reality, but that reality will vary from city to city and from country to country.
Autonomous driving is here, at least on a limited scale. In fact, you may already experience autonomous driving (AD) without thinking about it: at the San Francisco Airport, the automated AirTrain that shuttles you between terminals is an example. Though it is supervised remotely, there is no human driver on board. We cannot yet walk into a car dealership and buy a fully autonomous car, but that is about to change. With the California Department of Motor Vehicles (DMV) announcement that, starting April 2, 2018, automakers can test a fully autonomous vehicle with no human onboard, we marked a historic milestone. The manufacturer must, however, obtain a driverless testing permit from the DMV and meet the permit requirements. Technologies that aid partially autonomous driving already exist today. So, what does fully autonomous driving mean?
When describing autonomous driving, most people (and the DMV) use SAE International's standard J3016, which defines six levels of driving automation, from Level 0 (no automation) through Level 5. The following briefly describes Levels 1 through 5:
- Driver Assistance (Level 1) – the vehicle handles a single function, such as steering or acceleration/deceleration, while the driver is responsible for the rest of the driving task.
- Partial Automation (Level 2) – the vehicle controls both steering and acceleration/deceleration (including auto-braking) simultaneously, using sensors that detect the driving environment, while the driver handles the remaining tasks.
- Conditional Automation (Level 3) – the vehicle can drive itself, but a driver must monitor the situation and be ready to take over at any time.
- High Automation (Level 4) – the vehicle handles all aspects of driving, but only within a defined operational domain, such as mapped surface streets.
- Full Automation (Level 5) – the highest level of automation: the vehicle handles the entire driving task under all conditions. This is the goal all manufacturers are aiming for.
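One practical distinction the levels encode is whether a human must remain available as a fallback. A minimal sketch of that distinction (the enumeration and helper below are illustrative, not part of the J3016 text):

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving-automation levels (Level 0, no automation, omitted)."""
    DRIVER_ASSISTANCE = 1       # single function: steering OR acceleration
    PARTIAL_AUTOMATION = 2      # steering plus acceleration/braking together
    CONDITIONAL_AUTOMATION = 3  # self-drives; driver must be ready to take over
    HIGH_AUTOMATION = 4         # no driver needed within a defined domain
    FULL_AUTOMATION = 5         # no driver needed anywhere

def human_fallback_required(level: SAELevel) -> bool:
    """At Level 3 and below, a human must be available to resume driving."""
    return level <= SAELevel.CONDITIONAL_AUTOMATION

print(human_fallback_required(SAELevel.CONDITIONAL_AUTOMATION))  # True
print(human_fallback_required(SAELevel.HIGH_AUTOMATION))         # False
```

This is why the jump from L3 to L4 is the hard one for regulators: it is the point where the human leaves the loop entirely within the operational domain.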
Connected Driving with Vehicle-to-Vehicle (V2V) Technology
In tandem with the autonomous driving fever among automakers and manufacturers, the National Highway Traffic Safety Administration (NHTSA), the government agency responsible for motor vehicle safety, is working with the Department of Transportation (DOT), the automotive industry, and academic institutions to research Vehicle-to-Vehicle (V2V) communications. NHTSA will be proposing regulations to require vehicles to implement V2V, with the ultimate goal of eliminating traffic accidents.
V2V uses dedicated short-range communications (DSRC) to exchange information with other vehicles, such as a vehicle's speed, heading, and braking status. A vehicle equipped with V2V will be able, for example, to detect that a vehicle ahead has applied its brakes and is slowing even before its own driver can see what is happening. DSRC is a secured, two-way wireless link that allows vehicles to interact within approximately 300 meters, using 75 MHz of spectrum in the 5.9 GHz band that the FCC has allocated for Intelligent Transportation Systems (ITS). NHTSA estimates that two V2V applications alone, "intersection movement assist" (IMA) and "left turn assist" (LTA), can potentially reduce accidents and related injuries by 50%.
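The "car ahead is braking" scenario can be sketched in a few lines. The real over-the-air format is the SAE J2735 Basic Safety Message; the dataclass and field names below are a simplified, hypothetical stand-in for the kind of state each vehicle broadcasts:

```python
from dataclasses import dataclass

DSRC_RANGE_M = 300  # approximate DSRC reach cited for V2V

@dataclass
class BasicSafetyMessage:
    # Simplified stand-in for fields a V2V vehicle broadcasts over DSRC
    vehicle_id: str
    speed_mps: float    # current speed, meters/second
    heading_deg: float  # compass heading
    braking: bool       # brake pedal applied
    distance_m: float   # distance from our vehicle (derived from position in practice)

def hard_brake_warning(messages: list[BasicSafetyMessage]) -> list[str]:
    """Flag any in-range vehicle that is braking: the 'car ahead slowing' case."""
    return [m.vehicle_id for m in messages
            if m.braking and m.distance_m <= DSRC_RANGE_M]

fleet = [
    BasicSafetyMessage("car-A", 29.0, 90.0, braking=True,  distance_m=120.0),
    BasicSafetyMessage("car-B", 31.0, 92.0, braking=False, distance_m=80.0),
    BasicSafetyMessage("car-C", 27.0, 91.0, braking=True,  distance_m=450.0),  # out of range
]
print(hard_brake_warning(fleet))  # ['car-A']
```

The point of V2V is exactly this: the warning fires on broadcast state, not on line of sight, so a braking car two vehicles ahead is still detectable.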
The Race to Autonomous Driving
In recent years, many top manufacturers have rushed to join the race to offer autonomous driving. They include the alliance of Intel, Mobileye (now part of Intel), and BMW. Separately, top automakers such as Toyota and General Motors are joining the race.
Intel, one of the largest silicon manufacturers, has developed advanced driver assistance systems (ADAS) that use many of its processors to support cameras, sensors, graphics, and audio, providing speed status, steering information, and graphic and audio warnings. The goal of the Intel/BMW alliance is to offer fully autonomous (SAE L5) vehicles in the coming years.
In 2015, Toyota Research Institute (TRI), a wholly owned subsidiary of Toyota Motor North America, began to focus on advanced technology for autonomous driving. At the recent Consumer Electronics Show (CES 2018), TRI demonstrated its next-generation automated driving research vehicle, built on a Lexus LS 600hL. The new Platform 3.0 achieves 360-degree perimeter detection using four high-resolution light detection and ranging (LIDAR) scanning heads. (More on Luminar LIDAR below.)
Another major automaker in the autonomous driving business is General Motors. It recently filed a petition with the DOT for an SAE L4 Cruise AV that will be able to operate safely on the road without a driver, pedals, or steering wheel by 2019. As early as 2015, GM started testing AD on the streets of San Francisco, which is much more challenging than doing AD in a rural area (Figure 1).
New Innovations Propel Autonomous Driving
Improvements in critical technologies have helped propel autonomous driving in ways not possible before. These technologies include deep learning/artificial intelligence (AI), sensors such as light detection and ranging (LIDAR) and radar, new cameras, and new silicon solutions.
Deep learning/artificial intelligence (AI) technology is fueling autonomous driving. NVIDIA, a leader in graphics processing units (GPUs), provides super-fast GPUs that help cars learn how to drive. The technology can learn much like a human driver does and can improve its driving skill over time. Many automakers are using it to pursue autonomous driving, including VW, Volvo, Audi, Tesla, and Mercedes-Benz. A competing technology offered by Mobileye (acquired by Intel for $15.3 billion in 2017) includes additional software for autonomous driving. Mobileye (Intel) is working with multiple automakers, including BMW, Audi, and Honda, to produce SAE L3 autonomous cars (AC) starting in 2019, with the goal of launching an SAE L4 AC in 2020. Computer giant IBM is applying AI to AD as well, using Watson, its AI engine, to propel an electric bus.
One breakthrough in LIDAR technology comes from Luminar. Most other LIDAR or equivalent technologies cannot distinguish between objects with similarly dark or bright colors. The Luminar LIDAR sensor, however, can detect a 10%-reflectivity object, like a black car or a tire on the road, from a distance of 200 meters, providing roughly seven seconds of reaction time at 75 MPH. It operates at a 1,550 nm wavelength and, compared with other LIDAR solutions, offers 50 times greater resolution and 10 times longer range. Toyota has incorporated the Luminar technology into its future autonomous driving solution.
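A back-of-envelope check shows what that detection range buys: available reaction time is simply detection range divided by closing speed. (At exactly 200 m and 75 mph this works out to about six seconds; the quoted seven-second figure presumably allows for detection somewhat beyond 200 m.)

```python
MPH_TO_MPS = 0.44704  # exact miles-per-hour to meters-per-second conversion

def reaction_time_s(detection_range_m: float, speed_mph: float) -> float:
    """Seconds available before reaching an object first seen at detection_range_m,
    assuming constant speed and a stationary object."""
    return detection_range_m / (speed_mph * MPH_TO_MPS)

# Detecting a 10%-reflectivity object at 200 m while traveling 75 mph:
print(round(reaction_time_s(200, 75), 1))  # 6.0 seconds
```

The same formula makes clear why range matters more than resolution for highway speeds: halving the detection range halves the time left to brake or steer.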
Mentor Graphics, a Siemens business, provides automotive networking solutions to many carmakers, with hardware and design tools in the automotive areas of connectivity, electrification, autonomous driving, and vehicle architecture. For autonomous driving, signals from various sensors, including radar, LIDAR, vision, and others, are fed to a central processing system. Mentor's DRS360 platform captures, fuses, and utilizes raw data in real time for autonomous driving and is ISO 26262 ASIL D-compliant.
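Centralized raw-data fusion, as opposed to fusing each sensor's pre-processed object lists, amounts to time-aligning raw samples from all sensors before interpreting them. A toy illustration of that idea (not Mentor's DRS360 pipeline; all names and the averaging step are invented for clarity):

```python
from dataclasses import dataclass

@dataclass
class RawSample:
    sensor: str         # "radar", "lidar", or "camera"
    timestamp_ms: int
    bearing_deg: float  # direction of a detected return; toy single-value payload

def fuse_window(samples: list[RawSample], t_ms: int, window_ms: int = 50) -> dict:
    """Collect raw samples from all sensors within one time window around t_ms,
    then combine them (here: a simple per-sensor average of bearings)."""
    in_window = [s for s in samples if abs(s.timestamp_ms - t_ms) <= window_ms]
    grouped: dict[str, list[float]] = {}
    for s in in_window:
        grouped.setdefault(s.sensor, []).append(s.bearing_deg)
    return {sensor: sum(v) / len(v) for sensor, v in grouped.items()}

samples = [
    RawSample("radar", 1000, 10.0),
    RawSample("lidar", 1010, 12.0),
    RawSample("radar", 1020, 11.0),
    RawSample("lidar", 1400, 40.0),  # outside the 50 ms window, ignored
]
print(fuse_window(samples, t_ms=1000))  # {'radar': 10.5, 'lidar': 12.0}
```

Fusing at the raw-sample level like this preserves information that per-sensor object detection would have discarded, which is the argument for the centralized approach.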
NXP has been providing semiconductor and software solutions to the automotive segment for many years. (Qualcomm has offered to acquire NXP.) Its solutions cover Driver Replacement and Connectivity, In-vehicle Experience, Body and Comfort, and Powertrain and Vehicle Dynamics. Its S32 platform is a scalable automotive computing architecture to support autonomous driving; the platform includes S32 MCUs and MPUs, a design studio, and an automotive-grade software development kit (SDK). Recently it introduced the NXP BlueBox, a development platform that makes it easier for developers to achieve SAE L3 autonomous driving. The kit includes vision and sensor fusion processors, embedded compute processors, and a radar microcontroller.
SAE Level 2 (and beyond) autonomous driving requires many sensors, including radar. A conventional implementation would require multiple silicon chips: a transceiver plus an external MCU or DSP to process the radar data. Texas Instruments, a provider of advanced driver assistance systems (ADAS) to the automotive segment for many years, offers the AWR1642, a single-chip Frequency-Modulated Continuous-Wave (FMCW) radar sensor operating in the 76- to 81-GHz band with 4 GHz of available bandwidth. Based on CMOS, it integrates the transceiver with an on-chip MCU and DSP and an Arm Cortex-R4F-based radio control system, eliminating the external processor. This single-chip device provides a compact solution for future autonomous driving.
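That 4 GHz of sweep bandwidth is what gives the sensor its fine range resolution. For any FMCW radar, the standard relation is delta_R = c / (2B), where B is the chirp bandwidth, so more bandwidth directly means finer resolution:

```python
C = 299_792_458.0  # speed of light, m/s

def range_resolution_m(bandwidth_hz: float) -> float:
    """Standard FMCW radar range resolution: delta_R = c / (2 * B)."""
    return C / (2 * bandwidth_hz)

# With the full 4 GHz of sweep bandwidth available in the 76-81 GHz band:
print(round(range_resolution_m(4e9) * 100, 2), "cm")  # 3.75 cm
```

A resolution of under 4 cm is enough to separate, say, a pedestrian from an adjacent parked car, which narrower 24-GHz-band radars (with far less bandwidth available) cannot do.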
In autonomous driving, besides platforms and silicon, non-volatile memories are needed for applications in SAE L3 systems, ECUs, cameras, radar, and LIDAR. Cypress, a provider of silicon solutions to the automotive segment, offers the Excelon Auto Ferroelectric RAM (F-RAM™) devices, which are AEC-Q100 qualified and support Functional Safety ASIL-B compliance. The Excelon device, with a 108-MHz low-pin-count QSPI interface, provides non-volatile data capture, so data is safe in the event of power loss. Additionally, the 2-Mbit memory provides practically unlimited read/write cycles and meets the Automotive Grade 1 requirement (-40 °C to +125 °C).
What Will the Future Look Like?
We witnessed a historic milestone when the California Department of Motor Vehicles announced that starting April 2, 2018, automakers can test drive a fully autonomous vehicle with no human onboard. With the National Highway Traffic Safety Administration (NHTSA) drafting regulations to require the automotive industry to implement V2V, we are one step closer to real, safe autonomous driving. With breakthrough technologies like deep learning and LIDAR, and with companies like Cypress, Mentor, NXP, and Texas Instruments offering advanced autonomous driving solutions, the industry is moving forward fast. With major corporations such as Intel, Toyota, and GM investing heavily and claiming autonomous driving production models are coming in the next few years, what will the future of autonomous driving look like?
Don’t expect that tomorrow you will hear an announcement from an automaker that model AD-X is now available, so you can just relax and leave the driving to the computer. Instead, you may hear that the city of Petaluma, Calif., will now run a driverless bus from 9am to 5pm, and that residents are welcome to try it. Or farming areas of central California may announce that, effective May 1, 2020, all fields will be tilled by machines without human interaction. Fully autonomous driving will come, and the progress to date is remarkable. But you will need to be patient for a little while longer.
John Koon’s current roles include embedded technology research and content creation. Prior to this, he was Editor-in-Chief of RTC Magazine and COTS Journal. As a researcher, he presents findings on technology trends, including aviation, AI, autonomous driving, robotics and automation, medical innovations, fog computing (beyond cloud), IoT, software, and COTS advancements, at conferences and seminars.
Koon has held various management roles, including Director, Product Line Manager, and Associate VP. Focusing mainly on technology, he has worked for HP, Western Digital, and distributors of Microsoft, and has managed teams of up to 11. Other experience includes publishing technology reports and many technical articles over the past 20 years. He holds a BS in engineering (California State Polytechnic University, Pomona) and an MBA (San Diego State University).