The Edge Finds its Place in the Industrial Internet



Why the Industrial Internet Consortium makes the case for defining the edge by business challenge

Editor’s Note: “Everybody wants to unlock the edge, but it is key to unlock it in a cost-effective, secure, managed, orchestrated way,” Lalit Canaran, VP, SAP, tells EECatalog. Canaran and Todd Edmunds, director for industrial IoT strategy and solution architecture at Dell Technologies, are co-authors of the recently published IIC Introduction to Edge Computing in IIoT. They commented on some of the white paper’s surprising conclusions and how it can help designers understand a paradigm undergoing change. Edited excerpts of our conversation follow.

EECatalog: What did you want to accomplish with the IIC Introduction to Edge Computing in IIoT white paper?

Todd Edmunds, Dell Technologies (TE): Identify some of the trends, the architectures, and the standards around why you would do edge computing, and what those architectures can do once they are enabled.

Lalit Canaran, VP, SAP (LC): And we wanted to give people who are going to be tasked with implementing a strategy or a solution practical guidance as to how to do that. But the white paper is just the start; we anticipate one or more technical reports to follow.

EECatalog: What did you discover about edge computing in the process of developing the white paper?

TE: Edge computing is changing the way we view and use the Cloud, and that means there are still some things the Cloud will do, despite a tendency by some to think that edge computing is going to eat the Cloud. We think the paradigm will flip: the Cloud will still do some orchestration and aggregation, and it will have a place in big data storage. However, many current use cases in machine learning, analytics, and the like are going to be applied more at the edge, especially as we start building in things like orchestration, which becomes key when you have all these distributed, highly functional compute units running whatever application. You need to make sure that they all operate in harmony.

EECatalog: What ideas that you held before completing the white paper morphed into something different?

TE: We ended up not using the definition of “edge” that we came up with in the vocabulary task group because we found that edge computing is not defined by where it is, but by what business challenge you are trying to solve. Edge computing and the things that go along with it move along a continuum based on the requirements of the problem at hand.

LC: The Edge is a continuum where your goal is to put the right compute at the right location to provide the right outcome  at the right time.

There is a very synergistic quality to the edge and the Cloud. For example, most people would agree that a central viewpoint on orchestration, management, and security is best provided from the Cloud, but how you actually implement that, and where, is certainly at the edge.

Similarly, we understand the Cloud is where the data is and will be the best place to train algorithms. In many cases, though, we will gain critical performance improvements by executing those algorithms at the edge.

The Edge is also a great place to pre-process IoT data for the Cloud. For example, adding business context, filtering, aggregating, and anonymizing data at the edge will lead to cleaner, more secure data and ultimately provide the foundation for a data lake rather than a data swamp.
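For illustration, here is a minimal Python sketch of that kind of filter/aggregate/anonymize step at the edge; the field names, thresholds, site label, and device ID are hypothetical, not taken from the white paper:

    import hashlib
    import statistics

    def preprocess(readings, device_id, site):
        """Hypothetical edge pre-processing: filter, aggregate, anonymize."""
        # Filter: drop obviously bad samples before they ever leave the plant.
        valid = [r for r in readings if 0.0 <= r["temp_c"] <= 200.0]
        if not valid:
            return None  # nothing worth forwarding this cycle

        # Aggregate: forward one summary record instead of every raw sample.
        summary = {
            "site": site,  # business context added at the edge
            "avg_temp_c": statistics.mean(r["temp_c"] for r in valid),
            "max_temp_c": max(r["temp_c"] for r in valid),
            "sample_count": len(valid),
        }

        # Anonymize: the Cloud sees a one-way hash, never the raw device ID.
        summary["device"] = hashlib.sha256(device_id.encode()).hexdigest()[:12]
        return summary

    # Example: summarize a small batch of readings before sending it upstream.
    batch = [{"temp_c": 71.2}, {"temp_c": 73.8}, {"temp_c": 999.0}]
    print(preprocess(batch, device_id="press-17", site="plant-A"))

Only the summarized, hashed record travels upstream; raw samples and raw device identifiers stay at the edge, which is what keeps the central store a data lake rather than a swamp.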

The use case or business case that you are trying to create is going to drive your version of the edge for your implementation.

LC: Making that mental leap, that the edge layer is not a physical layer but a logical layer, was quite pivotal for us.

EECatalog: What are the “what if” questions around a Cloud-edge paradigm shift?

LC: The easy thing about the Cloud is you’ve got this elastic, scalable computing, and as a developer you typically don’t think about scalability and elasticity; whereas at the edge quite often you are dealing with, “Hey, it’s a ruggedized quad-core or dual-core box that is sitting in a hard-to-reach location, where they might not even have a lot of IT staff.” So now you are thinking about the hardware again. And you are thinking about how your application will perform on the hardware: “If I deploy that next microservice to the edge, does that saturate its compute capabilities so that the things that were already working there might not work as well?”
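A simple pre-deployment headroom check along those lines might look like the following Python sketch. It assumes the psutil library is available on the edge box, and the CPU/memory ceilings and the 256 MB estimate are illustrative values, not guidance from the white paper:

    import psutil  # assumed to be installed on the edge box

    CPU_CEILING_PCT = 80.0   # leave headroom for workloads already running
    MEM_CEILING_PCT = 75.0

    def has_headroom_for(extra_mem_mb):
        """Rough check before placing one more microservice on this node."""
        cpu_pct = psutil.cpu_percent(interval=1.0)   # sample current CPU load
        mem = psutil.virtual_memory()
        projected_pct = 100.0 * (mem.used + extra_mem_mb * 1024 * 1024) / mem.total
        return cpu_pct < CPU_CEILING_PCT and projected_pct < MEM_CEILING_PCT

    if __name__ == "__main__":
        ok = has_headroom_for(extra_mem_mb=256)
        print("deploy" if ok else "skip: box is close to saturation")

In practice an orchestrator would make this placement decision from its central view, but the constraint being checked is the same.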

In a highly distributed model, there is more work in managing, securing, and orchestrating all of that, whereas the Cloud is a central spot where it’s much easier to do all of the above.

Everybody wants to unlock the edge, but it is key to unlock it in a cost-effective, secure, managed, orchestrated way.

It’s interesting that edge is this wonderful new topic, yet many of the things we learned in distributed computing are still important and still valid.

EECatalog: You are making the point that edge is function, not place.

TE: We detail that in the white paper: If you are trying to control the temperature of a device, you’ve got thermocouples at the edge sensing that. And then you’ve got a temperature controller, which is the edge compute at that device, controlling that temperature.
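In that example the whole sense-decide-actuate loop runs locally on the controller. The following Python sketch simulates such a loop with a made-up setpoint, deadband, and thermal model; it illustrates the pattern and is not code from the white paper:

    import random
    import time

    SETPOINT_C = 180.0   # hypothetical target temperature
    DEADBAND_C = 2.0     # hysteresis so the heater does not chatter

    def read_thermocouple(current_c, heater_on):
        """Stand-in for the real sensor driver: simulate heating or cooling."""
        drift = 1.5 if heater_on else -1.0
        return current_c + drift + random.uniform(-0.2, 0.2)

    def control_loop(cycles=200):
        temp, heater_on = 25.0, False
        for _ in range(cycles):
            temp = read_thermocouple(temp, heater_on)
            # The decision is made right here at the edge; no Cloud round trip
            # sits in the control path.
            if temp < SETPOINT_C - DEADBAND_C:
                heater_on = True
            elif temp > SETPOINT_C + DEADBAND_C:
                heater_on = False
            print(f"temp={temp:.1f}C heater={'on' if heater_on else 'off'}")
            time.sleep(0.05)

    if __name__ == "__main__":
        control_loop()

The Cloud might later receive an aggregated temperature history from many such controllers, but the millisecond-scale control decision never leaves the device.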

Now consider you are responsible for production across 100 factories spread out globally. You want to know how each one of those factories is performing, and you have some sort of centralized Cloud capability that can pull data from each one of those factories. For you, the edge is going to be the factory, and the edge compute is going to be wherever the compute resources are in the Cloud.

We wanted to make sure we took that into account as we developed what these architectures look like, what the terminology and the standards and approaches are for each one of those use cases. They can share a lot of the same things. There are different use cases, but from the right point of view they can look very similar.

EECatalog: How does the white paper help readers understand this idea that the edge is not a place?

LC: Use cases included in the white paper help readers understand that the edge is use-case specific. We have a crew safety management use case; we have a fleet tracking/platooning one, which of course brings into the discussion things like, “If I am going to have an anti-collision system in the car, I really don’t want that algorithm to be in the Cloud, because if there are any network hiccups, that is unacceptable. I need all of that information to be consumed in the car, which is the edge, and have an instantaneous response, because safety is the priority.”

Whereas the routing information, GPS data, and algorithms could be great things that come from the Cloud. The white paper also looks at predictive maintenance for a connected elevator and at product traceability. The use cases are a good spectrum of the kinds of things seen across industries, how people solve them, and approaches that might be taken.

That’s one of the values of the Industrial Internet Consortium: you will see a lot of use cases, and you will also see test beds that address very specific problems.

 
