
Car-to-Cloud: Level 5 Autonomy Depends on 5G

Car, cloud, and connectivity are what it takes to advance past Level 4 autonomous driving. Level 5 requires pervasive, low-latency, high-bandwidth wireless connectivity. Is fully autonomous Level 5 driving realistic?

Americans spend about 75 billion hours a year driving[1]. Safety in automotive transportation has improved steadily over recent decades as both technological advances and safety regulations (e.g., mandatory seat belt use and air bags) have taken effect. However, the gains from safety devices now appear to be offset by an increase in driver distractions (cell phones, for example) as well as an increase in the total number of Americans driving (due to lower gas prices, population growth, and other factors). This may account for the fact that 2015 saw the largest percentage increase in road fatalities in the U.S. in over 50 years, with more than 35,000 deaths[2]. The year 2016 was even worse, with a 6% increase in automotive-related fatalities over 2015. An estimated 94 percent of vehicle accidents can be traced to human error or poor choices, so it stands to reason that driver assistance in cars will reduce deaths caused by human error.

Figure 1: The success of fully autonomous cars, on a large scale, depends greatly on the promise of pervasive 5G wireless communication. (Source: Intel Corporation)

Improved Traffic Management and More

Although many companies are entering the autonomous vehicle market in one way or another, three areas must see technological advances before we will see a world of self-driving cars: 1) power-efficient, high-performance computing; 2) pervasive, reliable, high-speed, high-volume wireless connectivity with low latency; and 3) large cloud-based data centers where higher-level, big-data processing will occur. Of the three, wireless connectivity faces the greatest technical challenges but has promise in 5G, whose specification is due to be released in 2020 by the 3rd Generation Partnership Project (3GPP). Widespread vehicular communication will result in high volumes of streaming data, on top of the data people already generate: by 2020, each person could be generating 1.5 GB of data or more every day through personal communication alone. According to ARM TechCon 2016 keynote speaker and SoftBank CEO Masayoshi Son, we will see 1.8 billion PCs, 8.6 billion mobile devices, and 15.7 billion Internet-connected devices by 2021. By 2036 we might have more than a trillion Internet of Things (IoT) devices connected to the Internet[3].
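As a back-of-the-envelope illustration of those volumes (a sketch with an assumed figure: roughly four billion connected people by 2020, which is my own round number, multiplied by the 1.5 GB per person per day cited above), the aggregate reaches exabyte scale every day before a single byte of vehicle traffic is counted:

# Illustrative arithmetic only. The connected-population count is an assumption;
# the 1.5 GB per day per person comes from the article above.
CONNECTED_PEOPLE = 4e9
GB_PER_PERSON_PER_DAY = 1.5

daily_total_gb = CONNECTED_PEOPLE * GB_PER_PERSON_PER_DAY
daily_total_eb = daily_total_gb / 1e9   # 1 exabyte = 1e9 gigabytes
print(f"~{daily_total_eb:.0f} exabytes of personal data per day")  # prints ~6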

Figure 2: Infrastructure such as this light-pole-as-information-node demonstrated at the 2017 Mobile World Congress will communicate with vehicles and to the cloud. The poles are equipped to gather information such as how many cars go past and at what speeds. Many nodes like this would reveal information for better city traffic management. (Source: Intel Corporation.)

The Society of Automotive Engineers (SAE) defines six levels of driving automation (Levels 0 through 5), as outlined in Table 1. Level 1 involves simple driver assistance such as cruise control. Level 2 is partial automation, with advanced features such as lane-keeping assistance, assisted acceleration and merging, and collision avoidance; Tesla's Autopilot qualifies as Level 2, for example. In Levels 0 through 2, the human driver is responsible for monitoring the driving environment; in Levels 3 through 5, the automated driving system monitors it. Level 3, conditional automation, includes all the features possible at Level 2 but also requires the car to make decisions on its own, such as passing a stalled vehicle in the lane ahead rather than simply braking to a stop. For Level 3 vehicles, humans still need to remain aware of their surroundings and take over if prompted by the system, such as when poor weather or road conditions compromise the sensors. Level 4 is high automation: the system may prompt a human driver to intervene, but it does not require the human to respond; if the prompt is ignored, the car can still pull over and come to a stop safely. Level 5 is the most ambitious, with the autonomous vehicle in total control of driving at all times.

Table 1: The U.S. Department of Transportation's National Highway Traffic Safety Administration has adopted SAE International definitions for levels of automated driving. (Source: SAE International, J3016, "Taxonomy and Definitions for Terms Related to On-Road Motor Vehicle Automated Driving Systems"[4])
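As a minimal sketch of the taxonomy in Table 1 (my own illustration, not code from SAE or any vendor), the levels map naturally onto an enumeration in which the key distinction is who monitors the driving environment:

from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving-automation levels, as summarized in Table 1."""
    NO_AUTOMATION = 0           # human does all the driving
    DRIVER_ASSISTANCE = 1       # e.g., cruise control
    PARTIAL_AUTOMATION = 2      # e.g., lane keeping plus adaptive cruise (Tesla Autopilot)
    CONDITIONAL_AUTOMATION = 3  # system drives; human must respond when prompted
    HIGH_AUTOMATION = 4         # system drives; reaches a safe stop if the human ignores prompts
    FULL_AUTOMATION = 5         # system drives everywhere, all the time

def system_monitors_environment(level: SAELevel) -> bool:
    """At Levels 0-2 the human monitors the road; at Levels 3-5 the automated system does."""
    return level >= SAELevel.CONDITIONAL_AUTOMATION

def human_must_respond_to_takeover(level: SAELevel) -> bool:
    """Level 3 is the only level that requires the human to respond to a takeover request."""
    return level == SAELevel.CONDITIONAL_AUTOMATION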

Vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) communications will happen regardless of whether we reach a society full of Level 5 vehicles. Vehicle-to-everything (V2X) communication lets drivers receive real-time warnings about traffic conditions, obtain information about the cars around them that further enables collision avoidance (among other things), and receive geographically relevant advertising (audible billboards, so to speak). Cars already communicate wirelessly for over-the-air software and firmware updates, receive high-definition content for back-seat entertainment, and download maps for navigation systems. But as cars advance in capability, 5G networks will be needed to accommodate the vast data requirements.
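To give a flavor of what V2V traffic might carry, the sketch below defines a simplified, hypothetical status-broadcast message; the field names are illustrative and are not taken from any standard (real deployments use standardized message sets such as SAE's Basic Safety Message):

from dataclasses import dataclass, asdict
import json
import time

@dataclass
class V2VStatus:
    """Hypothetical, simplified V2V status broadcast (illustrative fields only)."""
    vehicle_id: str      # anonymized, periodically rotated identifier
    timestamp_ms: int    # time the sample was taken
    latitude: float
    longitude: float
    speed_mps: float     # meters per second
    heading_deg: float   # 0-360 degrees, clockwise from north
    brake_active: bool   # lets following cars anticipate hard braking

    def encode(self) -> bytes:
        """Serialize for broadcast; a real system would use a compact binary encoding."""
        return json.dumps(asdict(self)).encode("utf-8")

# Example: a car broadcasting its state several times per second.
msg = V2VStatus("anon-4f2c", int(time.time() * 1000),
                37.3875, -122.0575, 24.6, 92.0, False)
payload = msg.encode()

Messages like this are broadcast many times per second over dedicated short-range or cellular V2X links, which is part of why 5G's latency and connection-density targets matter so much.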

5G and Deep Learning GO Together

The Intel® GO™ automotive 5G platform is the first automotive platform ready for 5G, allowing early development and testing. Intel GO covers cloud, car, and connectivity via multiple development kits that scale from Intel® Atom™ processor-based platforms up to higher-performance Intel® Xeon® processors. Both platforms support developing automated driving functions in three primary areas. First, they gather and process car sensor data (e.g., radar, lidar, ultrasound, optical sensors, and more) so that the car interprets its surroundings accurately, with perception that goes beyond ordinary human vision. Second, the Intel GO platforms correlate and "fuse" large amounts of input data, a process called sensor fusion.

Sensor fusion can be as simple as averaging or weighting inputs, but it may involve more sophisticated techniques. Fusing information from multiple sources lets the car perceive its surroundings with everything from video and object recognition to radar that constantly pings around the vehicle to detect objects, obstacles, and potential collisions. Finally, decision-making is carried out in a split second using high-performance processors capable of deep learning. The connectivity aspect of Intel GO includes the Intel 5G Modem, which supports both mmWave (the band of spectrum between 30 GHz and 300 GHz) and sub-6 GHz, in line with what those working on 5G today (e.g., Verizon) are pursuing. The current vision for 5G is a collection of solutions that together will offer 20 Gbps download speeds, 10 Gbps upload speeds, 1 ms latency, and a million connections per square kilometer. Thus, Intel GO lets developers work on these problems ahead of the 5G specification rollout in 2020.
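To make the "weighting inputs" idea concrete, here is a minimal sketch of inverse-variance weighting, one of the simplest fusion techniques. It illustrates the general principle only; it is not Intel GO code, and the sensor numbers are made up.

def fuse_range_estimates(measurements):
    """Fuse independent range estimates (in meters) via inverse-variance weighting.

    measurements: list of (value, variance) pairs, one per sensor.
    Sensors with smaller variance (more confidence) pull the result toward themselves.
    Returns (fused_value, fused_variance).
    """
    weights = [1.0 / var for _, var in measurements]
    fused_value = sum(w * v for (v, _), w in zip(measurements, weights)) / sum(weights)
    fused_variance = 1.0 / sum(weights)
    return fused_value, fused_variance

# Example: radar reports an obstacle at 42.0 m with high noise; lidar reports
# 40.5 m with low noise. The fused estimate lands close to the lidar reading.
radar = (42.0, 4.0)    # variance 4 m^2 (std. dev. ~2 m)
lidar = (40.5, 0.25)   # variance 0.25 m^2 (std. dev. 0.5 m)
distance, variance = fuse_range_estimates([radar, lidar])
print(f"Fused distance: {distance:.2f} m (variance {variance:.3f} m^2)")

A production stack would use probabilistic filters (Kalman filters and their variants) and learned models over many more inputs, but the underlying idea of weighting each sensor by how much it can be trusted is the same.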

City Lab Rats?

Even though autonomous cars will not get distracted, drunk, or show off, there will be cyberattacks that attempt to cause harm by way of car, cloud, or connectivity. Some believe that autonomous cars will necessarily be restricted to city streets and low speeds for the foreseeable future while we work out the actual safety of this technology and create regulations that make sense without stifling growth. Autonomous vehicles need numerous pilot studies and millions of miles traveled before meaningful statistical comparisons to human driving can be made. However, the rewards will be huge: Morgan Stanley believes that AVs have the potential to deliver around $507 billion in annual productivity gains, and Intel estimates that the systems, data, and services market surrounding the technology will reach as much as $70 billion by 2030.

Level 5 vehicles are the final challenge and may take the longest to achieve. That's not only because the technology faces growing pains in reaching 5G first, but also because the statistical evidence regulators may need to be convinced that AVs will save lives doesn't exist yet. "The most autonomous miles any developer has logged are about 1.3 million, and that took several years. This is important data, but it does not come close to the level of driving that is needed to calculate safety rates," said Susan M. Paddock, a senior statistician at RAND Corporation. "Even if autonomous vehicle fleets are driven 10 million miles, one still would not be able to draw statistical conclusions about safety and reliability[5]." Still, autonomous vehicles have the same aura of mystique that putting a man on the moon generated in the 1960s: an achievement that seems far-fetched yet technically within reach.
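As a rough illustration of the statistics behind Paddock's point (a sketch under stated assumptions, not a reproduction of RAND's analysis): take the U.S. human-driver benchmark of roughly 1.09 fatalities per 100 million vehicle-miles, assume a test fleet that drives with zero fatalities, and model fatalities as a Poisson process. The failure-free mileage needed to claim, with 95% confidence, that the autonomous fleet is at least as safe as human drivers works out to hundreds of millions of miles:

import math

# Assumptions (illustrative only): human-driver fatality rate of ~1.09 per
# 100 million vehicle-miles, and an autonomous fleet that logs zero fatalities.
# Under a Poisson model, zero events in m miles rules out rate r at
# confidence C when exp(-r * m) <= 1 - C, i.e., m >= -ln(1 - C) / r.
human_rate = 1.09 / 100e6   # fatalities per mile
confidence = 0.95

miles_needed = -math.log(1 - confidence) / human_rate
print(f"~{miles_needed / 1e6:.0f} million failure-free miles")  # prints ~275

That is more than two orders of magnitude beyond the roughly 1.3 million autonomous miles cited above, which is exactly the gap the RAND report highlights.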


Lynnette Reese is Editor-in-Chief, Embedded Intel Solutions and Embedded Systems Engineering, and has been working in various roles as an electrical engineer for over two decades. She is interested in open source software and hardware, the maker movement, and in increasing the number of women working in STEM so she has a greater chance of talking about something other than football at the water cooler.


[1] “Autonomous Cars: The Future is Now.” Morgan Stanley. Morgan Stanley, 25 Jan. 2015. Web. 29 Mar. 2017. <https://www.morganstanley.com/articles/autonomous-cars-the-future-is-now>.

[2] The Statistics Dept. “NSC Motor Vehicle Fatality Estimates.” News Documents. National Safety Council, 2017. Web. 29 Mar. 2017.

[3] ARM. "SoftBank Group CEO Masayoshi Son – The Journey to 1 Trillion IoT Chips – ARM TechCon 2016 Keynote." [Video file.] YouTube. ARM, 26 Oct. 2016. Web. 29 Mar. 2017.

[4] SAE. “Taxonomy and Definitions for Terms Related to On-Road Motor Vehicle Automated Driving Systems.” SAE International. Society of Automotive Engineers, 4 Oct. 2014. Web. 29 Mar. 2017. <https://www.sae.org/misc/pdfs/automated_driving.pdf>

[5] Kalra, Nidhi and Susan Paddock. Driving to Safety: How Many Miles of Driving Would It Take to Demonstrate Autonomous Vehicle Reliability?, Santa Monica, CA: RAND Corporation, 2016. http://www.rand.org/pubs/research_reports/RR1478.html.
