The Network is the Car



How data centricity connects with Artificial Intelligence in Autonomous Vehicles.

An autonomous car combines vision, radar, LIDAR, proximity sensors, GPS, mapping, navigation, planning, and control. These components must come together into a reliable, safe, secure system that can analyze complex environments in real time and react quickly enough to negotiate chaotic traffic. Autonomy is thus a supreme technical challenge. An autonomous car is more a robot on wheels than it is a car. Unlike today’s cars, autonomous vehicles (or “carbots”) must be Artificial Intelligence (AI)-capable computers.

Figure 1: Database vs Databus. In a data-centric architecture, applications communicate only with the data infrastructure, not with each other. A database implements data-centric storage that finds the right old data through search. A databus implements data-centric sharing that finds the right future data through filtering. Both technologies make system integration much easier, supporting larger scale, better reliability, and application interoperability.

Of course, there are millions of lines of software in today’s cars. But most of it is embedded in Electronic Control Units (ECUs). ECUs may perform complex functions, but their interactions are minimal. That design—and indeed the entire supply-chain model—doesn’t extend to autonomous drive. Autonomy requires much richer connectivity between components, and retrofitting that connectivity onto existing designs is impractical without an entirely new architecture.

The new generation of autonomous vehicles also requires distributed computing. Even with extremely fast central processors, distributed computing has fundamental advantages over centralized designs. Most importantly, distributed systems are more modular. That helps optimize designs; for instance, pairing computing with sensors matches the software to the hardware and eases tuning. Distributed systems also more easily support redundancy, increasing reliability. These advantages become critical at higher levels of autonomy.

To address these challenges, data-centric architectures are becoming the new standard for highly autonomous vehicles. Data centricity enables and controls complex data flow. It greatly simplifies component interaction. It reduces lines of code in the ECUs. And it directly supports AI modules.

Data Centricity
Data centricity is an architecture formed from participants that communicate only with the data infrastructure. Data-centric communication systems contrast with object-oriented systems (where objects communicate), message-oriented systems (where participants send messages to each other), and service-oriented architectures (where participants connect to services). Participants in data-centric systems are decoupled from all other participants in time, space, and flow. Data-centric connectivity is also called a “databus,” defined by the Object Management Group (OMG) Data Distribution Service (DDS) standard.
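As an informal sketch of what this looks like in practice, a data-centric participant written against the OMG DDS ISO C++ API might resemble the following. The SensorReading type, the topic name, and the generated header are assumptions chosen for illustration; in a real system the type would come from an IDL definition run through a DDS code generator.

```cpp
// Informal sketch: a participant that talks only to the databus.
// Assumes the OMG DDS ISO C++ API and a hypothetical SensorReading type
// generated from an IDL definition.
#include <dds/dds.hpp>
#include "SensorReading.hpp"  // hypothetical IDL-generated header

int main()
{
    // Join the databus (DDS domain 0). There is no broker or server;
    // discovery of matching readers is peer-to-peer and automatic.
    dds::domain::DomainParticipant participant(0);

    // Declare the data this application produces. The topic name and data
    // type are the only coupling to the rest of the system.
    dds::topic::Topic<SensorReading> topic(participant, "FrontRadarReadings");
    dds::pub::DataWriter<SensorReading> writer(
        dds::pub::Publisher(participant), topic);

    // Publish into the global data space; the application never addresses
    // another application directly.
    SensorReading reading;
    writer.write(reading);
    return 0;
}
```

Note that nothing in this code names, locates, or connects to any other application; the decoupling in time, space, and flow is handled entirely by the infrastructure.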

Databases are also data centric. However, a database implements data-centric storage, while a databus implements data-centric communication. The key difference: a database searches old information by relating properties of stored data. A databus finds future information by filtering properties of the incoming data.
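To make that contrast concrete, here is a hedged sketch of “finding future data by filtering,” again using the DDS ISO C++ API. The ObstacleTrack type, its range_m field, and the topic names are illustrative assumptions.

```cpp
// Informal sketch: filtering *future* data with a ContentFilteredTopic.
// ObstacleTrack, its range_m field, and the topic names are hypothetical.
#include <dds/dds.hpp>
#include "ObstacleTrack.hpp"  // hypothetical IDL-generated header

void watch_nearby_obstacles(dds::domain::DomainParticipant& participant)
{
    dds::topic::Topic<ObstacleTrack> topic(participant, "ObstacleTracks");

    // Unlike a database query over stored rows, this expression is applied
    // to samples that have not been written yet: only future updates with
    // range_m under 30 meters are delivered to this reader.
    dds::topic::ContentFilteredTopic<ObstacleTrack> nearby(
        topic, "NearbyObstacles", dds::topic::Filter("range_m < 30"));

    dds::sub::DataReader<ObstacleTrack> reader(
        dds::sub::Subscriber(participant), nearby);

    // Poll for matching samples (a WaitSet or listener works as well).
    int nearby_count = 0;
    for (const auto& sample : reader.take()) {
        if (sample.info().valid()) {
            ++nearby_count;  // a real module would fuse sample.data() here
        }
    }
}
```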

DDS implements the virtual abstraction of a “global data space”—all data appears to be available everywhere. In reality, of course, that is not possible. But, by asking every application to specify what it has or needs, how much data that is, and how often it can produce or consume it, the databus can make it appear so. Thus, every application simply publishes the data it knows into the space and gets the data it wants from the space. It’s naturally parallel; all the data is logically everywhere whenever anything needs it.

The databus is elegant and powerful. Applications automatically discover any data, along with metadata like timestamps, types, and units. Any application can join, leave, add data, or remove data at any time. The databus guarantees data delivery rates and maximum allowed delays. Applications can request notification of changes in specified timeframes. History is available on request. All communications are peer-to-peer, allowing operation at full wire speeds. It does not require servers, so there are no servers to locate, configure, provision, reboot, choke, or fail. It scales well; adding new flows does not disturb current flows. And, it incorporates fine-grained security; only applications with the right permissions for specific dataflows can participate. Just as the database enables complex enterprise applications, the databus enables complex intelligent systems. Both do that, fundamentally, by simplifying applications through data management.
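The delivery guarantees, history, and deadline notifications described above are expressed as Quality of Service (QoS) policies on each reader or writer. The following sketch shows one plausible way a consuming module could state its contract through the DDS ISO C++ API; the VehicleState type and the specific values are assumptions for illustration, not recommendations.

```cpp
// Informal sketch: a consumer states its data contract through QoS.
// VehicleState and the specific values are illustrative assumptions.
#include <dds/dds.hpp>
#include "VehicleState.hpp"  // hypothetical IDL-generated header

dds::sub::DataReader<VehicleState> make_state_reader(
    dds::domain::DomainParticipant& participant)
{
    dds::topic::Topic<VehicleState> topic(participant, "VehicleState");

    dds::sub::qos::DataReaderQos qos;
    qos << dds::core::policy::Reliability::Reliable()       // no lost updates
        << dds::core::policy::Deadline(                     // flag updates that
               dds::core::Duration::from_millisecs(50))     // arrive late (>50 ms)
        << dds::core::policy::Durability::TransientLocal()  // request recent history
        << dds::core::policy::History::KeepLast(20);        // when joining late

    // The infrastructure enforces the contract: writers that cannot satisfy
    // it simply do not match, and a missed deadline raises a status the
    // application can monitor through a listener or condition.
    return dds::sub::DataReader<VehicleState>(
        dds::sub::Subscriber(participant), topic, qos);
}
```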

Table 1: Autonomy Challenges. Autonomous systems pose unique demands on system architecture. The databus concept directly targets this application, and thus has unique approaches for each demand.

Data Centricity and AI
The DDS “all data everywhere” abstraction simplifies AI integration. In fact, the databus concept started in autonomous systems at the Stanford Aerospace Robotics Lab. It has unique properties that map well to the challenge. Table 1 summarizes the challenges and approach.

Perhaps most importantly, a databus supports many dataflows with one abstraction. Most earlier designs require a separate technology for every flow. For instance, older systems would send the extreme throughput of video over a streaming protocol but use a specialized real-time bus for small but frequent control signals. Beyond the extra hardware, this also forces a change in abstraction, which makes it hard to write software that can fuse the sensor sets.
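With a databus, both kinds of flow use the same publish/subscribe abstraction and differ only in their QoS. The sketch below illustrates the idea with two hypothetical types, VideoFrame and SteeringCommand; the QoS values are illustrative.

```cpp
// Informal sketch: two very different flows, one abstraction.
// VideoFrame, SteeringCommand, and the QoS values are illustrative.
#include <dds/dds.hpp>
#include "VideoFrame.hpp"       // hypothetical IDL-generated header
#include "SteeringCommand.hpp"  // hypothetical IDL-generated header

void create_flows(dds::domain::DomainParticipant& participant)
{
    dds::pub::Publisher publisher(participant);

    // High-throughput camera flow: keep only the latest frame, never block.
    dds::pub::qos::DataWriterQos video_qos;
    video_qos << dds::core::policy::Reliability::BestEffort()
              << dds::core::policy::History::KeepLast(1);
    dds::topic::Topic<VideoFrame> video_topic(participant, "FrontCamera");
    dds::pub::DataWriter<VideoFrame> video_writer(
        publisher, video_topic, video_qos);

    // Small, frequent control flow: every command delivered, and late
    // delivery reported as a missed deadline.
    dds::pub::qos::DataWriterQos control_qos;
    control_qos << dds::core::policy::Reliability::Reliable()
                << dds::core::policy::Deadline(
                       dds::core::Duration::from_millisecs(10));
    dds::topic::Topic<SteeringCommand> control_topic(
        participant, "SteeringCommands");
    dds::pub::DataWriter<SteeringCommand> control_writer(
        publisher, control_topic, control_qos);

    // Both writers are used the same way; only the QoS contract differs.
    video_writer.write(VideoFrame());
    control_writer.write(SteeringCommand());
}
```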

Figure 2: Complex Dataflow. A complex system must support many data types and sources. Some are very high volume; others are fast. DDS supports 22 “Quality of Service” (QoS) settings for each flow. Each module can specify exactly what data relationship it requires for operation. QoS includes update rates, reliability, data availability guarantees, and more. Infrastructure that sends exactly the right information to exactly the right places at the right time makes system development much easier.

Highly autonomous systems make this much more challenging. Carbots combine many direct and derived sensors, perception modules, intelligence, feedback control, and off-vehicle communications. Combining all these data through traditional designs is messy.

DDS excels at dataflow control. Distributed architectures are much easier with an abstraction that delivers the right data to each module.

At lower levels of autonomy, designs can use a centralized design that sends all data to a single central computer. Consider, for instance, the perception system. A key need is to “fuse” data from many sensors into a single common understanding of the situation. The easy way to access all that data is to get it all to the same place for processing. Thus, if the vehicle has an array of cameras and proximity sensors, each would send the raw video and data streams to the central processor. This simplifies computing. Of course, the processor becomes a choke point and single-point-of-failure. Worse, it requires a lot of data transfers and dedicated video wiring for each camera.

Figure 3: Sensor Fusion. Perception modules must fuse multiple sensors to better model the world. Level 3 autonomous systems can send all data to a powerful central computer. However, this design breaks down for the complex sensor architectures required for more demanding autonomy at levels 4 and 5. These systems benefit from a distributed design that processes data closer to the sensors, thus spreading the workload, reducing wiring, and allowing redundancy.

For higher levels of autonomy, this design breaks down. Rather than send all the raw data to one central box, it’s more effective to pre-process the data closer to the sensor. This is often called “early” or “hybrid” fusion. The databus is a good fit to this design; its data-centric virtual “global data space” abstraction is a powerful substitute for actually sending all data to a central node.
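A rough sketch of such an early-fusion node is shown below: it subscribes to a raw, high-rate sensor topic next to the sensor and publishes only a compact derived topic into the global data space. The types, topic names, and the detect_obstacle() step are hypothetical placeholders.

```cpp
// Informal sketch: an "early fusion" node running next to a sensor.
// RawProximity, ObstacleTrack, the topic names, and detect_obstacle()
// are hypothetical.
#include <dds/dds.hpp>
#include "RawProximity.hpp"   // hypothetical IDL-generated header
#include "ObstacleTrack.hpp"  // hypothetical IDL-generated header

// Hypothetical local processing step: reduce a raw reading to a track.
ObstacleTrack detect_obstacle(const RawProximity& raw);

void run_edge_fusion(dds::domain::DomainParticipant& participant)
{
    dds::topic::Topic<RawProximity> raw_topic(participant, "RearProximityRaw");
    dds::topic::Topic<ObstacleTrack> track_topic(participant, "ObstacleTracks");

    dds::sub::DataReader<RawProximity> reader(
        dds::sub::Subscriber(participant), raw_topic);
    dds::pub::DataWriter<ObstacleTrack> writer(
        dds::pub::Publisher(participant), track_topic);

    // Process the high-rate raw data locally; publish only the compact
    // result into the global data space, where any fusion or planning
    // module can subscribe to it.
    for (const auto& sample : reader.take()) {
        if (sample.info().valid()) {
            writer.write(detect_obstacle(sample.data()));
        }
    }
}
```

Only the derived tracks cross the vehicle network, which spreads the workload, reduces wiring, and leaves room for redundant sensor-side processors.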

The Future of Autonomous Systems
Autonomy requires complex software integration in the vehicle. That’s difficult to do by combining ECUs. Thus, OEMs are increasingly looking to in-house software teams. DDS provides a powerful standard to enable that effort. Most carbot designs use DDS directly. However, it also underlies other architectures, such as ROS2 and, soon, AUTOSAR.

Figure 4: Autonomous Car System Architecture. DDS integrates all the components in a typical autonomous car design. The data-centric interface controls all module interactions, schema, rates, reliability, and system health. Powerful system integration support lets teams work independently with the assurance that the infrastructure will directly support the data interaction needed for system-wide operation.

Of course, DDS is far more than an in-vehicle technology for automotive systems. RTI Connext DDS, for instance, has over 1000 designs in the Industrial IoT, including medical, oil & gas, naval, avionics, air-traffic control, hyperloop, metro transit, and robotics applications. DDS is therefore also a good fit for the off-board needs of autonomous drive systems. Its increasing traction across the broad sphere of smart-machine applications makes it a good decision for the future.

 


Dr. Stan Schneider is CEO of Real-Time Innovations (RTI), the Industrial Internet of Things connectivity company. Schneider serves as Vice Chair of the Industrial Internet Consortium (IIC) Steering Committee. He also serves on the advisory board for IoT Solutions World Congress and chairs OpenFog’s Fog World keynote committee. He received Embedded Computing Design’s Top Embedded Innovator Award in 2015 and was named one of IoTOne’s top 10 most influential people in the IIoT in 2017. He holds a PhD in EE/CS from Stanford.

 

 

Bob Leigh is director of market development, autonomous vehicles at RTI. He brings over 15 years of experience developing new markets and building technology companies to his role. Leigh graduated from Queen’s University with a degree in Mathematics and Engineering and has used his education in control and communication systems to engineer embedded solutions for a variety of industries including energy, manufacturing, and transportation.
