- By Jack Smith
- May 10, 2023
ZEDEDA’s Transform Edge Computing conference highlighted why there’s no better place to compute than the cloud and how OT systems can benefit.
New ways to approach operational technology (OT) digital transformation were at the heart of ZEDEDA’s third Transform Edge Computing conference in March. Topics covered at Transform 2023 included new approaches to digital transformation, ways to enhance security for data and applications at the edge, and dealing with legacy software challenges.
Founded in 2017 to enable edge computing, ZEDEDA used this virtual event to educate its customers, partners and prospects on the Internet of Things (IoT) and cloud and edge computing.
Founder and CEO Said Ouissal discussed the state of edge computing, saying that industrial enterprises have been heavily adopting IoT to extract data from their assets, machines, and vehicles. “Any application that can run in the cloud will run in the cloud, because it's simpler and easier to build, develop, and run apps there than anywhere else,” he predicted.
With all the data coming online from IoT devices, it would be impossible to upload it all to the cloud. But the data still needs to be analyzed. “So, instead of trying to get the data to the cloud, the cloud is being pushed all the way to the edge of the network,” Ouissal said. “This is what we define as edge computing.”
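Ouissal’s point can be illustrated with a minimal sketch: rather than shipping every raw reading upstream, an edge node aggregates locally and forwards only a compact summary plus any alarms. The function and data below are hypothetical illustrations, not part of ZEDEDA’s product.

```python
from statistics import mean

def summarize_window(readings, threshold=90.0):
    """Aggregate a window of raw sensor readings at the edge.

    Only this compact summary (and any over-threshold alarms) is sent
    to the cloud, instead of every raw sample.
    """
    return {
        "count": len(readings),
        "mean": mean(readings),
        "max": max(readings),
        "alarms": [r for r in readings if r > threshold],
    }

# A window of temperature samples from a machine sensor (made-up data).
window = [72.1, 73.0, 95.5, 71.8]
print(summarize_window(window))
```

For a sensor emitting samples every few milliseconds, this kind of local reduction is what makes the “cloud pushed to the edge” model bandwidth-feasible.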
Ouissal said as the cloud experience is pushed to the edge, it goes from highly centralized clouds and cloud-native data centers to regional and local data centers. “While the location of computing is different, the environment of computing that is happening is still pretty much the same.” ZEDEDA’s solution was purpose-built to be distributed using an open architecture, he added.
Dr. Mahadev Satyanarayanan of Carnegie Mellon University said, “The cloud offers compute elasticity. You ask for more compute and you get it on demand. You may have to pay for it, but it's basically an infinite source of computing. That's the abstraction. It's also the safest place to keep your data. If you want to archive data that you can retrieve 100 years from now, you put it in the cloud. The cloud represents extreme consolidation. The economies of scale are exploited to their maximum in the cloud. There's no other place anywhere in the system that offers computing so cheaply, so the lowest cost per compute is in the cloud.”
“Apps become a lot more specific to people, and in particular the things they are serving,” Ouissal said. “The number of locations or nodes could be tens of thousands or even millions. Imagine machines, vehicles, production lines, well sites, and substations, each enabled with edge computing. We call this type of edge ‘the distributed edge.’ This is the edge that we think is now the fastest growing in the market and entering mass deployment with many enterprises. Using all these ecosystems requires a way to orchestrate and manage them. This is what we do. We provide edge orchestration and management to anyone looking to deploy distributed edge computing” (Figure 1).
Rob Tiffany, principal at Digital Insights, moderated a discussion panel focusing on digital twins. “I didn't do the digital twin thing until 2016 with Lumada, but it was clear to me that digital twins probably should be at the heart of an IoT system. IoT is just plumbing that feeds data into that digital thing,” Tiffany said.
Matt Mohajer, digital program manager at SLB, added, “Obviously, we want to make sure we have a virtual or digital representation of a physical thing, whether that thing is a space shuttle, a valve, one piece of equipment, or an entire facility. There are several key elements that make it more of a real twin. There is the element of real-time data communication. Two-way communication is also important, but you want to [first] ensure you have the ability to get the data. [Then you] do your analysis, or the machine does the analysis, then converts that to insight and, based on that insight, you take some sort of action. For me, those are the key elements that make it a digital twin.”
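Mohajer’s three elements — real-time data in, analysis turned into insight, and action back to the asset — can be sketched as a minimal, hypothetical twin for the valve example he mentions. Every name and threshold here is illustrative, not from any vendor’s API.

```python
class ValveTwin:
    """Minimal digital-twin sketch for a valve (illustrative only)."""

    def __init__(self, max_pressure=100.0):
        self.max_pressure = max_pressure
        self.state = {}      # latest mirrored sensor values
        self.commands = []   # actions sent back to the physical device

    def ingest(self, telemetry):
        """Element 1: real-time data flowing in from the physical valve."""
        self.state.update(telemetry)

    def analyze(self):
        """Element 2: turn mirrored data into an insight."""
        if self.state.get("pressure", 0.0) > self.max_pressure:
            return "overpressure"
        return "nominal"

    def act(self):
        """Element 3: two-way link; the insight drives an action."""
        if self.analyze() == "overpressure":
            self.commands.append("close_valve")
        return self.commands

twin = ValveTwin()
twin.ingest({"pressure": 120.0, "temperature": 65.0})
print(twin.act())
```

A twin without the `act` step is just a dashboard; closing the loop back to the device is what, in Mohajer’s terms, makes it “a real twin.”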
James Teal, director of system architecture cloud and edge at Rockwell Automation, added, “As we push more of our offerings to the cloud, the team here at Rockwell is seeking answers to, ‘how do you effectively get data that's in these OT networks that tends to be tightly controlled?’ It's not just the view of the plant with a line, conveyors moving, and pumps moving bottles around. It's really the process twin. There are multiple types of twins that become part of this. We're trying to find a way to curate those device-level assets so they can be consumed by a whole bunch of other services and other twins in a very effective way, including how our cloud products interact with things like devices and controllers.”
Automating the edge with APIs
Padraig Stapleton, vice president of engineering at ZEDEDA, hosted a discussion with panelists from BOBST, IBM, and Emerson who have extensive experience deploying large solutions at the edge and making application programming interfaces (APIs) core to the deployment and scaling of solutions.
“APIs are the engine of automation,” said Stapleton. “Every software solution today requires them. They are replacing what we used to do with UIs [user interfaces] in the past. The API is becoming the key user agent for people as they deploy solutions at the edge. They allow us to monetize data, to forge partnerships with both our partners and our customers. It’s a cost reduction as well.”
Murali Gandluru, vice president of strategy, product and GTM at IBM said, “We believe the edge is going to play a critical role in how we build out the IT [information technology] distributed hybrid multi-cloud enterprise. That's a key part of IBM strategy—hybrid cloud and AI [artificial intelligence]. One of the things we noticed in the infrastructure space is there's been a constant pendulum move between centralized solutions versus distributed solutions. Distributed keeps coming back. You have to drive a standards-based approach, and that's very important to IBM—an open, standards-based approach. To push a declarative versus imperative model of managing those, the best approach is an API-driven model that we embed and use in all our products.”
Neil Wang, product marketing manager at Emerson, said, “We collect a lot of data from the cloud for manufacturing our devices. We reflect that data onto the edge so our users can use the data freely and make the data generate as many insights as they need to. We are talking to the devices to do the execution of valves and to modulate motors. We collect data from them and from a huge number of sensors all over the plant, and we use the signals to run controls. But the signals could be used for other purposes, such as reliability, safety, production, and planning. That means we need to share the data, and eventually the data would be used by people who are not traditional operators.”
“The control system is a piece of a critical infrastructure, dealing directly with the manufacturing process. We have to separate the data provision from running the controls,” added Wang.
Does edge require the cloud?
Leonard Lee, founder and executive analyst at neXt Curve, moderated a panel discussion focusing on the relationship between the edge and the cloud. Lee said we hear the term “edge” all the time. “We've heard it for more than a decade, but it still seems that it means a lot of different things to a lot of different people.”
“The promise of cloud was endless compute and storage, and when we get to the edge, it's a different story,” said Bryan Ashley, WW Field CTO at Aviatrix. “The other component is bandwidth, depending on what sort of edge computing location we're talking about. That can be a real challenge. As we push the edge capabilities and story down the road a few years, there's a lot of decision making that has to go into it.”
“We want to help provide the platform,” said Marilyn Basanta, senior director, product management, edge computing at VMware. “As you get the sensor data, you can feed it into our edge compute platform and take advantage of VMware. For example, we have RabbitMQ, which can now do messaging and streaming so we can take that data. You can process it with your new edge data apps, taking advantage of our data. In manufacturing now with our offerings, we're getting to new parts of the shop floor.”
“We have this big announcement with Audi that will help revolutionize how they manufacture their cars,” added Basanta. “They have a lot of IPCs. We will help them consolidate workloads that can do proper real-time [computing], depending on the latency and the tolerance of failure.” Manufacturing is really about predictive maintenance, quality control, and improving business efficiency, because every minute of a down production line is thousands of dollars, she added.
The role of open source in edge computing
“Edge is a community, and it has defined two types of edges: a user edge and a service provider edge,” said Arpit Joshipura, general manager, networking, edge and IoT at Linux Foundation, and moderator of the “Role of open source in edge computing” panel discussion.
A user edge is operated by the user, Joshipura said, and there are three implementations: an extremely constrained device edge, such as embedded microcontrollers; the smart device edge, which is more like IoT gateways; and the on-premises data center edge, which is still within the control of the user.
“Edge compute is irrelevant if your responsiveness is not less than 20 milliseconds in terms of latency,” said Joshipura. “People clump IoT and edge in the same sentence. A sensor that wakes up every week and dumps data into a cloud is not an edge application. It's an IoT application. Not all IoT is edge.”
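Joshipura’s distinction reduces to a simple triage rule, sketched below. The 20-millisecond budget is his rule of thumb from the panel; the function name is illustrative.

```python
EDGE_LATENCY_BUDGET_MS = 20  # Joshipura's responsiveness threshold

def classify_workload(required_latency_ms):
    """Rough triage per the panel's rule: sub-20 ms responsiveness
    calls for edge compute; looser budgets are plain IoT-to-cloud."""
    return "edge" if required_latency_ms < EDGE_LATENCY_BUDGET_MS else "iot"

# A machine-vision safety stop needs ~5 ms; a weekly-reporting sensor
# can tolerate minutes or more.
print(classify_workload(5), classify_workload(500))
```

By this rule, the weekly-wakeup sensor he describes is an IoT application, not an edge one, no matter where its gateway sits.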
Joshipura added, “If you remember one thing, use the terminology that the community is using. This is one of the powers of open source and open community, in that it's not driven by one vendor or one analyst. It is adopted by the community of participants. With that I'm introducing the open-source portion of the edge, which we call ‘LF Edge.’”
LF Edge is an umbrella organization, created by the Linux Foundation, that aims to establish an open, interoperable framework for edge computing independent of hardware, silicon, cloud, or operating system. By bringing together industry leaders, LF Edge will create a common framework for hardware and software standards and best practices critical to sustaining current and future generations of IoT and edge devices.
Erik Nordmark, founder and CTO of ZEDEDA, asked, “To what extent is the edge different than using open source in the cloud and the data center? Because one of the key aspects of open source is that you can actually address a lot more security concerns at various levels because of the transparency you get with open source. It is very powerful for introducing, moving quickly, and deploying different use cases—rapid innovation, using building blocks that you have access to.”
Larry Morris, director of product management at SUSE, added, “I recently read that more than 56 million developers have contributed to open [source]. Over the last several years, we've seen most of the very large technology companies, whether it's Microsoft, Google, IBM, Amazon, Facebook, or Intel, starting to use and contribute to open-source projects. This really is at the heart of why you see the innovation or the diversity. There are so many different open-source projects. Whatever your use case, whatever your particular need, there probably is an open-source alternative to proprietary software that you can look to, with many people contributing.”
Image courtesy of ZEDEDA