The “Edge” is a hot space right now, although sometimes I’m not sure everyone agrees on what the “Edge” is as they develop products and solutions. First thing this morning I saw a tweet from Tom Bradicich of Hewlett Packard Enterprise (@HPE) referring to an article on ComputerWeekly.com that mentions him. I’ve written about HPE at the edge and with IoT before. Looks like something’s up.
Tweet from @TomBradicichPhD Not only computing at the #edge, but also a new product category of “converging IT with #OT” systems (such as controls, DAQ, industrial protocols). Watch this space, my team’s next innovation is all this as-a-service. #aaS
Here is the rationale from the Computer Weekly article. “The benefits of edge computing have the potential to help businesses dramatically speed up their data analysis time while cutting down costs. @HPE’s Mark Potter and @TomBradicichPhD share how we can make this possible.”
In the past, all data processing was run locally on the industrial control system. But while there is industry consensus that real-time data processing for decision-making, such as the data processing needed in an industrial control system, should be run at the edge and not in the public cloud, there are many benefits in using the public cloud or an on-premise datacentre to assimilate data across installations of internet of things (IoT)-connected machines. Such data aggregation can be used to improve machine learning algorithms.
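That split — latency-sensitive decisions made locally at the edge, with only compact summaries shipped off for fleet-wide aggregation and machine-learning use — can be sketched in a few lines. This is purely an illustration of the pattern; all names, thresholds, and data are hypothetical and no specific HPE or cloud API is implied:

```python
# Hypothetical sketch of the edge/cloud split described above.
# The edge node decides on each sensor reading in real time (no cloud
# round trip); only a summarized window leaves the machine, and the
# cloud aggregates summaries across many installations.

from statistics import mean

ALARM_THRESHOLD = 85.0  # illustrative limit, e.g. a temperature ceiling

def edge_decide(reading: float) -> str:
    """Real-time decision made locally on the edge device."""
    return "shutdown" if reading > ALARM_THRESHOLD else "ok"

def edge_summarize(readings: list[float]) -> dict:
    """Compress a window of raw readings into a small upload payload."""
    return {"count": len(readings), "mean": mean(readings), "max": max(readings)}

def cloud_aggregate(summaries: list[dict]) -> dict:
    """Cloud-side aggregation across machines, e.g. to feed model training."""
    total = sum(s["count"] for s in summaries)
    fleet_mean = sum(s["mean"] * s["count"] for s in summaries) / total
    return {"machines": len(summaries), "samples": total, "fleet_mean": fleet_mean}

# One machine's window of readings
window = [72.1, 74.3, 90.2, 71.8]
decisions = [edge_decide(r) for r in window]   # per-reading, low latency
summary = edge_summarize(window)               # the only data that leaves the edge
fleet = cloud_aggregate([summary, {"count": 4, "mean": 70.0, "max": 75.0}])
print(decisions, fleet)
```

The point of the sketch is the asymmetry: `edge_decide` runs on every sample with no network dependency, while the cloud only ever sees `edge_summarize` output, which is where cross-installation learning happens.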
It is fascinating to see our environment described by an enterprise IT writer. The truth is that following the Purdue Model, suppliers tried to make PLCs and DCSs part of the information infrastructure in parallel to supervising or executing control functions. That proved too unwieldy for control engineers to manage within the programming tools used. It was also too slow and not really optimized for the task.
Along came IT companies. I have followed a few over the past five years. They have had trouble with figuring out how to make a business out of edge compute, gateways, networking, and the like.
In the past, data acquisition and control systems were considered operational technology, and so were outside the remit of enterprise IT. But, as Tom Bradicich, global head of the edge and IoT labs at HPE, explains, IT has a role to play in edge computing.
Bradicich’s argument is that edge computing can provide a converged system, removing the need for standalone devices that were previously managed by those people in the organisation responsible for operational technology (OT). According to Bradicich, convergence is a good thing for the industry because it is convenient, makes it easy to buy devices, lowers cost, improves reliability, and offers better power consumption because all the disparate functions required by an industrial system are integrated in one device.
Bradicich believes convergence in IoT will be as big as the convergence of camera and music players into a device like the iPhone, which made Apple the biggest music and camera company in the world. For Bradicich, convergence at the edge will lead to industry disruption, similar to what happened when smartphones integrated several bits of functionality that were previously only available as separate devices. “The reason Uber exists is because there is a convergence of GPS, the phone and the maps,” he says. “This disrupts the whole industry.”
I get this analogy to converging technologies into a device such as the iPhone. I don’t know if we want to cede control over to an HPE compute platform (although it has plenty of horsepower), but the idea is tempting. And it would be thoroughly disruptive.
Forrester has forecast that the edge cloud service market will grow by at least 50%. Its Predictions 2020 report notes that public cloud providers such as Amazon Web Services (AWS) and Microsoft; telecommunication companies such as AT&T, Telstra and Vodafone Group; platform software providers such as Red Hat and VMware; content delivery networks including Akamai Technologies; and datacentre colocation providers such as Digital Realty are all developing basic infrastructure-as-a-service (IaaS) and advanced cloud-native programming services on distributed edge computing infrastructure.
HPE has also invested in a new company called Pensando, which recently emerged from stealth mode. Pensando was founded and staffed by former Cisco technologists, with former Cisco CEO John Chambers installed as chairman. The bet is that new categories of devices aimed at edge computing will come to market, perhaps a plethora of devices performing data acquisition and real-time data processing.
Mark Potter recently wrote in a blog post: By becoming the first solutions providers to deliver software-defined compute, networking, storage and security services to where data is generated, HPE and Pensando will enable our customers to dramatically accelerate the analysis and time-to-insight of their data in a way that is completely air-gapped from the core system.
These are critically important requirements in our hyper-connected, edge-centric, cloud-enabled and data-driven world – where billions of people and trillions of things interact.
This convergence is generating unimaginable amounts of data from which enterprises seek to unearth industry-shaping insights. And as emerging technologies like edge computing, AI and 5G become even more mainstream, enterprises have an ever-growing need to harness the power of that data. But moving data from its point of generation to a central data center for processing presents major challenges — from substantial delays in analysis to security, governance and compliance risks.
That’s where Pensando and HPE are making an industry-defining difference. By moving the traditionally data center-bound network, storage and security services to the server processing the data, we will eliminate the need for round-trip data transfer to centralized network and security appliances – and at a lower cost, with more efficiency and higher performance.
Here are benefits that Potter listed:
- Lower latency than competitive solutions, as operations will be carried out at 100Gbps network line-rate speed;
- A controller management framework to scale across thousands of nodes, with a federation of controllers allowing scale to 1M+ endpoints; and
- Security, governance and compliance policies that are consistently applied at the edge.