“…brain inspired”: Q&A with Paul Washkewicz and Chet Jewan, Eta Compute
Why microcontrollers architected in a “brain inspired” way could have a growing role in smart metering and other point-to-point communication applications, such as those relying on sub-GHz next-generation networks.
Editor’s Note: “Low power solar operation means more deployments with low operational cost,” Eta Compute co-founder and VP marketing Paul Washkewicz tells EECatalog. One potential use case, according to the company’s Chet Jewan, who serves as VP sales and business development: oil pipeline monitoring. Earlier this year Eta Compute Inc. and ROHM Semiconductor announced the two have partnered to develop sensor nodes compatible with the Wi-SUN sub-GHz communication protocol—news followed by a successful Sensors Expo demonstration of ROHM’s sensor technology and Eta Compute’s low power MCUs. Edited excerpts of our conversation follow.
EECatalog: What did you want to accomplish by introducing your Sensors Expo attendees to an energy-harvesting sensor evaluation board and how did it go?
Chet Jewan (CJ): To convey that it is possible to capture data—including lighting, temperature, and pressure—and transmit it from the sensor node to a Wi-SUN receiver in ROHM’s booth. The demonstration showed an energy-harvesting solution capturing and transmitting data in real time, without being plugged into the wall, while operating from the lighting on the exhibit floor—a real-life application.
Paul Washkewicz (PW): There was a steady stream of sensor customers coming by both the Eta Compute booth and the ROHM booth asking about details of the design, especially in relation to the issue of: “How do you effectively deploy systems that have excellent coverage but don’t require a steady stream of management, which raises operating expenses?”
EECatalog: Why pool resources with ROHM?
PW: Pooling resources makes it possible for customers to get the sensor nodes required for deploying next-generation networks. Individually, each of our companies has products that theoretically could work seamlessly together. However, while engineers hear those words “work seamlessly” often, they can find actual implementation more elusive.
For example, a key part of this joint solution is the power management for the energy-harvesting section of the design. By pooling resources, we put together a design that includes our low-power controller and ROHM sensors, but also includes the design of the power management. Anything to streamline an application’s design and prove out the application in hardware and software is a better starting point than a collection of datasheets.
CJ: Customers can use our reference design as an example. Whether they want to change the communication protocol, change the sensors, or take the existing reference design and deploy it in an existing factory as is, they can do that.
EECatalog: For a running start to deployment as quickly as possible?
CJ: Exactly, so they can take the sensor node as is and deploy it on a trial basis, and from there they can enhance it, modify it, whatever the case may be.
EECatalog: What’s the significance of Wireless Smart Ubiquitous Network (Wi-SUN) compatibility?
PW: Wi-SUN is an open standard based on a low-power mesh network targeting smart cities’ and smart utilities’ networks. With hundreds of millions of possible deployments, making these as near “maintenance free” as possible will help the proliferation of such systems. Wi-SUN is one good example of a low-power sub-GHz RF standard for these applications.
EECatalog: What features of this solution do you anticipate catching folks’ attention?
CJ: One aspect people will be interested in is how often they can capture and transmit data from the sensors, with the key variable being, for example, the lighting in their environment. If the node is inside a factory, the question customers are asking is, “Okay, if it’s only 100 to 300 lux of lighting, how often can I poll the sensors for data, and then how often can I transmit that data?”
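As a rough illustration of that lux-to-duty-cycle trade-off, the sketch below estimates how many sense-and-transmit cycles per hour an indoor light harvester could sustain. Every figure here (cell area, conversion efficiency, energy per cycle, and the lux-to-power rule of thumb) is a hypothetical assumption for illustration, not an Eta Compute or ROHM specification:

```python
# Illustrative energy budget for a light-harvesting sensor node.
# All numbers below are hypothetical, not vendor specifications.

def max_sample_rate(lux, cell_area_cm2=4.0, efficiency=0.10,
                    energy_per_cycle_uj=50.0):
    """Estimate sustainable sense-and-transmit cycles per hour.

    Assumes roughly 1 lux ~ 1.6 uW/cm^2 of incident power for typical
    indoor light spectra (a common rule of thumb, not a measured value).
    """
    incident_uw = lux * 1.6 * cell_area_cm2   # incident optical power, uW
    harvested_uw = incident_uw * efficiency   # electrical power after PV losses
    budget_uj_per_hour = harvested_uw * 3600  # 1 uW sustained = 3600 uJ/hour
    return budget_uj_per_hour / energy_per_cycle_uj

# Office lighting at 100 lux: ~4,600 cycles/hour under these assumptions.
cycles_per_hour = max_sample_rate(100)
```

Because harvested power scales linearly with illuminance in this simple model, moving from 100 to 300 lux triples the available polling budget.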
EECatalog: But we’re not talking just inside factories?
CJ: Absolutely not. Take cold chain applications, for instance. The sensor node could be inside a truck, and you could transmit to the cabin to alert the driver—say he’s transporting lettuce—that the temperature is going up.
EECatalog: How does sensor fusion fit in?
CJ: Say along with the temperature sensor that indicates the truckload of lettuce is getting too warm, you also have a GPS unit that is tracking location. For a truck that is moving inventory which is supposed to be refrigerated, you want to be able to look at location, temperature, and pressure and say, “Okay, what’s really happening here?” Maybe action has to be taken because the truck has traveled from a cooler, higher-altitude location like Flagstaff, Arizona, to a warmer one, such as Phoenix or Tucson. You are taking data from multiple sensors and producing meaningful information for the user. Looking ahead, artificial intelligence integrated into our MCU will make possible a rapid, low-cost way of doing sensor fusion at the edge, resulting in relevant information for the user.
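A minimal sketch of the rule-based fusion described above, assuming a made-up `fuse` helper and hypothetical thresholds (nothing here comes from Eta Compute's actual firmware): combine the cargo temperature trend with ambient conditions along the route before deciding whether to alert.

```python
# Hypothetical sensor fusion for a refrigerated load: temperature
# readings plus route context (ambient temperature from GPS location).
# Thresholds are illustrative, not from the interview.

def fuse(readings_c, ambient_c, threshold_c=4.0):
    """Classify the cargo state from a short window of readings.

    readings_c: recent cargo temperatures, oldest first (deg C)
    ambient_c:  outside temperature at the current GPS position (deg C)
    """
    latest = readings_c[-1]
    rising = len(readings_c) >= 2 and readings_c[-1] > readings_c[0]
    if latest <= threshold_c:
        return "ok"
    if rising and ambient_c > 30.0:
        # A hot ambient (e.g., descending from Flagstaff toward Phoenix)
        # partly explains the rise; check cooling before escalating.
        return "check-cooling"
    return "alert"
```

A real deployment would fuse more channels (pressure, door state) and likely replace the hand-written rules with a learned model, but the decision structure is the same.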
EECatalog: What is capturing Eta Compute’s attention with regard to Artificial Intelligence?
CJ: With the AI market, which is still in its early stage, a lot of the work being done is to transmit the data to the cloud in order to get it processed and then provide data back down. We, on the other hand, are working to provide AI at the edge. So, for any of these applications that are IoT sensor-driven, we want to be able to take the data from the sensor, use artificial intelligence to do sensor fusion, and act [based upon] that data without going to the cloud for processing.
We want to make an edge device that is very low power, with the performance to run AI and process locally, instead of having to transmit the data to the cloud. You want to avoid latency and connectivity issues, where, for example, you’ve got this truck and you’re sending data to the cloud. And by the time you get the data back, you’ve got a refrigeration issue, and it may be too late.
We are developing efficient neural nets, and we’ve got the low-power hardware. We are putting it all together so that AI is possible at the edge; especially for sensor-fusion and IoT types of applications, we can sense, infer, and act locally, without cloud connectivity.
EECatalog: That’s in the context of some applications needing the kind of computing power the cloud can offer?
CJ: Yes, it’s application dependent. For some heavy-lifting performance, yes, you are going to need the cloud, and that is not the market we are going after.
In terms of sensors and IoT things, not everything is connected to the cloud, and to go ahead and connect that is going to cost money—you’ve got to get some kind of subscription service, whether it is LoRaWAN or NB-IoT, or whatever the case is.
Let’s take an agriculture application, where you’ve got these sensors out in the field checking for moisture, checking for light—those types of things. It is going to be difficult to have all of these devices connected to the cloud, but you may want them connected locally so you can send a message. Wi-SUN and LoRaWAN typically transmit a kilometer or more, and you may have a base station within the farm. Now you are sending signals when you are seeing too much moisture or too little moisture; there may be an issue with lighting; there may be an issue with a soil sensor checking different pH levels, whatever the case may be. But you want to be able to handle that locally and not necessarily send all that data up to the cloud to get processed.
If it is a large field you may have 100 sensors there, and you would not have each one connected to the cloud, you might be pulling them together to a local place to get processed and then act, depending on what that sensor is telling you (Figure 1).
Among those 50 billion connected devices that are supposed to be here by 2020, we are trying to connect the ones that need very low power and are optimal for energy harvesting. There’s a large market where a plug isn’t readily available, and where you don’t necessarily want to rely on batteries that require changing. Locations that have energy-harvesting sources available (vibration, light, or thermal, for instance) are typically those where there would be enough power for our low-power MCU and sensors to operate.
EECatalog: Unlike a situation where there’s the need to check on and change batteries that could be in difficult-to-access locations, it’s more “set it and forget it”?
CJ: Yes, to cite another example, oil pipelines. These pipelines run for hundreds of miles across the country, and they are out in the open, so sunlight is available as an energy source, making it possible to monitor for flow, leaks, and other parameters of concern.
The sensor node can be mounted on a decent-sized magnet that can be positioned on the pipeline. Oil flowing through the pipeline creates a constant vibration, and with sensor fusion and AI you know what a typical flow would feel like. If something stops or changes, you can pick up on that. And all of this can be done without batteries and without grid electricity. Replacing batteries would be very expensive, whereas you can run on solar for indefinite periods of time and transmit to a localized base station, or even use an NB-IoT type of application, employing a cellular network or satellite, and transmit that data a few times a day, saying, “Everything is okay, there are no issues”; or if there is an issue, you can transmit that data right away. Compare that to sending people out to check the pipeline or using helicopters to fly along it to make sure there are no issues, which gets very expensive over time.
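One way such flow monitoring could be sketched is a baseline-deviation check: learn the vibration level of normal flow, then flag windows that depart from it. This is a hypothetical illustration of the idea, not the company's actual algorithm, and the tolerance value is invented:

```python
# Illustrative pipeline-flow check: compare each window of vibration
# samples against a learned baseline RMS. Threshold is hypothetical.
import math

def rms(window):
    """Root-mean-square amplitude of a window of vibration samples."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def flow_anomaly(window, baseline_rms, tolerance=0.5):
    """Flag the window if its RMS deviates from the normal-flow
    baseline by more than `tolerance` (fractional deviation)."""
    deviation = abs(rms(window) - baseline_rms) / baseline_rms
    return deviation > tolerance
```

Run locally on the node, a check like this lets the device transmit only a short "okay" report a few times a day, with an immediate alert on deviation, rather than streaming raw vibration data to the cloud.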
EECatalog: Anything to add before we wrap up?
PW: Our delay-insensitive asynchronous logic, or DIAL, MCU is a good fit for supporting machine learning and machine intelligence in portable devices, mainly due to its extremely low power of operation. With operating currents on the order of microamps, even coin cells last for years; alternatively, we can run on small solar cells.
CJ: Because our DIAL MCU is asynchronous, it consumes very little power. Putting our asynchronous MCU together with a brain-inspired neural net, which also runs asynchronously, produces a solution that, for the amount of performance we offer at this low power, we do not believe can be found anyplace else.