Industrial Internet of Things (IIoT) or Manufacturing
Execution System (MES), that is the question. Actually, I didn’t know that
it was a question, or an either/or. I’m thinking false dichotomy.
As you might imagine, I receive a ton of emails from a
variety of marketing and PR people promoting one thing or another. Recently, a message arrived from Brock
Solutions highlighting just that question. Since I know several people at
Brock, including the man whose name is on the building, so to speak, and since
I respect the organization, I bit.
Here is the pitch: Many of our manufacturing customers
have multiple plants around the world. And it’s safe to say that each of
those plants is different. Some plants come from acquisitions. Some have been
around forever. And some plants are brand new.
Inevitably, this mix of sites also means a mix of
technology at the operational layer. As you may know, there is no easy answer
for managing this. Watch our video to see how
we help manufacturers evaluate MES vs. IoT.
The email then sent me to a vlog (video blog). It is, of course, a little promotional at the end. But it raises some good issues about evaluating your business, your technical needs, and your application needs, and fitting a solution to the need. It’s worth a watch. Only 5 minutes.
Alan Johnston caught up with me yesterday to update me on the progress MIMOSA has made toward the update and adoption of its asset information and data flow models, described by the Open Industrial Interoperability Ecosystem (OIIE). I had been working with them a few years ago, but it was too early for the promotional work I could help them with.
[Note: This is an old slide I had in my database. I don’t think Fiatech and POSC Caesar are still involved, but I cannot edit the slide. The ISA 95 committee is still involved.]
I did write an Executive Summary White Paper that has been downloaded many times over the years. The paper is four years old, but I think it still describes the ideas of interoperability, using standards, and the handover from engineering to the operations and maintenance of process plants.
Many operations and maintenance managers have expressed frustration with handover and startup events. When I’ve described this system, they’ve all been receptive.
On the other hand, neither the large integration companies nor the large automation and control companies are thrilled with it, out of concern about greatly reduced revenue generated by lock-in.
I could reference the work of the Open Process Automation group, which is also attempting a “standard of standards” approach to decoupling software from hardware for improved upgradability. Schneider Electric (Foxboro) and Yokogawa have seen the possibility of competitive advantage with this approach, especially with ExxonMobil. But the view is not generally held.
Back to Alan. He has been making progress on the standards adoption front and getting some buy-ins. I’ve always seen the potential for improved operations and maintenance from the model. But the amount of work to get there has been staggering.
Looks like they are getting there.
Gasp! Signs of common sense begin to pervade the discussion of digitalization and its cousins–connected (everything?), digital twin, cyber-physical, and so forth. Meaning that it’s all about leadership.
Suppliers constantly develop or enhance technologies within products. But I’m betting that just about all of you already have more digital data than you know what to do with. I’m betting most of you already have some products and connectivity–and have had for 15 years or longer.
What is always lacking is the will, the ingenuity, the, yes, leadership, to use all of this to its most beneficial effect.
Leadership doesn’t just appoint someone to head an exploratory team. It sets vision and expectations about how a new business model can send the company on a growth and success trajectory.
Leadership sees data as an asset and asks how it can be used to further goals of profitability, process uptime, improved quality, faster time to market, better/faster customer service, supply chain smoothing, and more.
Leadership organizes and motivates people to forge new paths into the economy.
Simply compiling digital data is a waste of time and resources. Leadership treats it as a foundation for success.
I’ve received a couple of news items about something called the Open Manufacturing Platform (OMP). I have searched in vain for a website–maybe a GitHub or Linux Foundation or something. This is sponsored by Microsoft, so no surprise that it is built on Microsoft Azure. I guess the open part is open connectivity to Azure.
I had a brief chat in Hannover a couple of weeks ago and picked up this press release. The companies putting this together have added members. Just a few right now. As always, I’m adopting a “wait-and-see” attitude as this develops.
- Anheuser-Busch InBev, BMW Group, Bosch Group, Microsoft, ZF Friedrichshafen AG named OMP steering committee members
- OMP was established in 2019 as an independent initiative under the umbrella of the Joint Development Foundation
- First working groups created: IoT Connectivity, Semantic Data Model, Industrial IoT Reference Architecture and Core Services for Autonomous Transport Systems
- The first appearance of the Open Manufacturing Platform
The Open Manufacturing Platform (OMP) has expanded, with new steering committee members and new working groups established. OMP is an alliance founded in 2019 to help manufacturing companies accelerate innovation at scale through cross-industry collaboration, knowledge and data sharing as well as access to new technologies. The OMP was founded under the umbrella of the Joint Development Foundation, which is part of the Linux Foundation.
Original members The BMW Group and Microsoft welcome Anheuser-Busch InBev (AbInBev), Bosch Group and ZF Friedrichshafen AG as steering committee members. The OMP steering committee has approved a number of working groups to focus on core areas important to the industry, including IoT Connectivity, semantic data models, Industrial IoT reference architecture, and core services for ATS (autonomous transport systems).
Common approach to industry challenges
The expansion of intelligent manufacturing is driving new efficiencies and increased productivity, as well as revealing new challenges. Within the industry, legacy and proprietary systems have resulted in data silos, making operation-wide insight and transformation daunting. These challenges are common across the industry, and addressing them often requires a high degree of investment for modest returns within any one organization. The OMP has been developed to address this: manufacturers and their value chains come together to identify and develop solutions to these non-differentiating problems. It brings together experts across the manufacturing sector, including discrete and process manufacturing, transportation and consumer goods, industrial equipment, and more.
“Our goal is to drive manufacturing innovation at scale, accelerate time-to-value and drive production efficiencies by jointly solving mutual challenges, based on an open community approach. The OMP helps manufacturing companies unlock the potential of their data, implement industrial solutions faster and more securely, and benefit from industrial contributions while preserving their intellectual property (IP) and competitive advantages, mitigating operational risks and reducing financial investments,” said Jürgen Maidl, Senior Vice President Production Network and Supply Chain Management at the BMW Group.
Scale innovation through common data models and open technology standards
The OMP operates under the umbrella of the Joint Development Foundation (JDF). The JDF is part of the Linux Foundation and provides the OMP with infrastructure and an organizational framework to create technical specifications and support open industry standards. The OMP supports other alliances, including the OPC Foundation and Plattform Industrie 4.0, and leverages existing industry standards, open source reference architectures and common data models.
“Through the open collaboration approach that is the cornerstone of OMP, manufacturing companies will be able to bring offerings to market faster, with increased scale and greater efficiency,” said Scott Guthrie, Executive Vice President Cloud & AI at Microsoft. “Solutions will be published and shared across the community, regardless of technology, solution provider or cloud platform.”
The heart of OMP: working groups to address common manufacturing challenges
“Comprised of members from across the manufacturing industry, the collaboration framework and heart of the OMP are its working groups. We are very excited to join in a moment where our manufacturing facilities are becoming increasingly connected, and we are looking for innovative ways to make use of the treasure trove of data that is being generated,” said Tassilo Festetics, Global Vice President of Solutions at AB InBev. The OMP’s initial working groups will focus on topics such as IoT Connectivity, the Semantic Data Model, the IIoT Reference Architecture, and Core Services for ATS (autonomous transport systems). Initial focus areas include:
The OMP steering committee will support industry efforts to connect IoT devices and machines to the cloud. It is one of the first steps to digitize production lines and leverage cloud-connected Industrial IoT applications.
“Today, it is all about analytics and predictions but without data no analytics and without connectivity no data. Modern devices can easily be connected via the OPC Unified Architecture (OPC UA). Connecting machines and applications to the cloud that have been in production for decades comes with bigger interoperability challenges as various standards and interfaces must be addressed to interconnect these historically developed legacy systems (‘brownfield approach’). The working group IoT Connectivity will focus on providing industrial-grade edge and cloud functionalities for the integration and management of OPC UA devices in brownfield environments,” said Werner Balandat, Head of Production Management, ZF Friedrichshafen AG.
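The brownfield problem Balandat describes usually boils down to mapping device-specific legacy tags onto a uniform, OPC UA-style address space at the edge before anything reaches the cloud. A minimal sketch of that idea (the device names, register addresses, node identifiers, and scale factors are all hypothetical, not part of any OMP specification):

```python
# Sketch: normalize reads from heterogeneous legacy devices into a
# uniform, OPC UA-style representation before forwarding to the cloud.
# All device names, registers, node IDs, and scale factors are hypothetical.

from dataclasses import dataclass

@dataclass
class NodeValue:
    node_id: str   # OPC UA-style node identifier, e.g. "ns=2;s=Press1.Temperature"
    value: float   # value scaled into engineering units
    unit: str

# Per-device mapping from a raw legacy register to (node_id, unit, scale).
TAG_MAP = {
    ("plc_legacy_1", "40001"): ("ns=2;s=Press1.Temperature", "degC", 0.1),
    ("plc_legacy_1", "40002"): ("ns=2;s=Press1.Pressure", "bar", 0.01),
}

def normalize(device: str, register: str, raw: int) -> NodeValue:
    """Translate a raw register read into a scaled, unit-tagged node value."""
    node_id, unit, scale = TAG_MAP[(device, register)]
    return NodeValue(node_id=node_id, value=raw * scale, unit=unit)

# A raw integer of 753 in register 40001 scales to roughly 75.3 degC.
reading = normalize("plc_legacy_1", "40001", 753)
```

In a real deployment the right-hand side of such a map would come from an OPC UA information model rather than a hard-coded dictionary, but the translation step is the same.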
Another OMP working group focuses on semantic data modeling: Machine and manufacturing data are crucial for industrial companies to optimize production with artificial intelligence (AI). However, managing data in a common format across multiple sources with constantly evolving semantics is a real challenge.
“Data is the raw material for Industry 4.0 and a prerequisite for optimizing production with the help of artificial intelligence. At OMP, we are developing a semantic model that makes data understandable and illustrates its relations and dependencies. Users no longer receive cryptic, incomprehensible numbers and characters, but production-relevant information including their context. This semantic data structure ensures improvements along the entire value chain and makes AI-based business models possible on a large scale,” said Dr.-Ing. Michael Bolle, Member of the Board of Management, Robert Bosch GmbH.
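Bolle’s point about replacing “cryptic, incomprehensible numbers” with production-relevant context amounts to joining raw samples with a semantic description. A toy illustration (the vocabulary and field names here are invented for the example, not OMP’s actual semantic model):

```python
# Sketch: wrap a raw machine value with semantic context so downstream
# consumers see production-relevant information, not bare numbers.
# The sensor IDs, property names, and asset names are illustrative only.

raw = {"id": "S17", "v": 42.7}  # cryptic: which machine? what unit?

SEMANTICS = {
    "S17": {
        "property": "spindle_temperature",
        "unit": "degC",
        "asset": "milling_machine_3",
        "line": "powertrain_line_A",
    }
}

def enrich(sample: dict) -> dict:
    """Join a raw sample with its semantic description."""
    meta = SEMANTICS[sample["id"]]
    return {**meta, "value": sample["v"]}

record = enrich(raw)
```

The value of a shared semantic model is that every plant fills in the `SEMANTICS` side the same way, so an AI model trained at one site can read data from another.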
One of the companies I met with a couple of weeks ago at the Hannover Messe Preview in Germany was SALT Solutions AG and its subsidiary SALT Software GmbH. The company is a systems integrator that has also developed a platform with microservices to bring together its service offering.
SALT will be partnering with SAP at its stand and also with Hewlett Packard Enterprise (HPE) at its stand. Demonstrations at each stand will show how companies can quickly and easily digitize their Supply Chain by means of digital processes, Digital Twins, and modern technologies such as Artificial Intelligence, Robotic Process Automation, and Cloud Computing. Visitors will experience how they can control a complete production process independently thanks to intuitive technologies and a high degree of automation.
Partnership With SAP
One feature is a live demonstration called “E-Bike: Make-to-Operate” at the SAP Stand (Hall 17, Stand A42). This demo shows all the process steps in the manufacture and operation of an E-bike, from production and control (“Make”) to monitoring and control of operation by processing and interpreting sensor data (“Operate”).
Visitors can intuitively control all process steps themselves in the live demonstration: graphic detailed planning, production, and handover of the finished E-bike as a Digital Twin to the SAP Asset Intelligence Network (AIN). The SAP AIN, part of the SAP Digital Manufacturing Strategy, is a virtual space in which ride and status data of the E-bike are collected and analyzed, and which allows all parties involved (the manufacturer, the operator, and service providers) to access all data relevant to them in a central location.
In practice, this scenario makes it possible, for example, to recommend nearby E-charging stations to the E-bike rider if data analysis shows that the battery charge is no longer sufficient to reach the destination.
Transferred to a production scenario, predictive maintenance can be planned automatically: a specific maintenance requirement is derived from sensor data. To determine the ideal time for the next maintenance order, the system includes all relevant data, such as the current production plan, and independently sets the date to production-free time, e.g. on the weekend, to minimize production disruptions.
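The scheduling logic described above can be sketched in a few lines: derive a maintenance need from sensor data, then push the order to the next production-free slot. The wear threshold and the “weekends are production-free” calendar here are assumptions for illustration, not SALT’s or SAP’s actual algorithm:

```python
# Sketch: sensor-derived maintenance need, scheduled into production-free
# time (here assumed to be the weekend). Threshold and wear model are
# hypothetical, for illustration only.

from datetime import date, timedelta

WEAR_LIMIT = 0.8  # fraction of tolerable wear that triggers maintenance

def needs_maintenance(wear_index: float) -> bool:
    return wear_index >= WEAR_LIMIT

def next_production_free_day(today: date) -> date:
    """Return the next Saturday (Monday=0 ... Saturday=5)."""
    days_ahead = (5 - today.weekday()) % 7
    return today + timedelta(days=days_ahead)

# A wear index of 0.85 exceeds the limit, so schedule the order for the
# coming Saturday rather than interrupting the current production plan.
if needs_maintenance(0.85):
    slot = next_production_free_day(date(2020, 4, 22))  # a Wednesday
```

A production system would of course consult the actual production plan rather than a fixed weekend rule; the point is that the date selection is automatic once the requirement is derived.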
In the live demonstration, special focus is placed on the process orchestrator, SAP Shop Floor Designer (SFD). In production, the focus is on the end-to-end process with its numerous individual steps, which companies will in the future design completely in advance. The entire process runs digitally, and the design decides which functionalities are to be carried out at each point in the production process. While manufacturers used to be tied to the functionality of one or more systems, SAP SFD makes tools from all systems available, such as ERP, MES, BDE, and others.
Digital Supply Chain Demo With HPE
The second live demonstration, “D4S: The complete digitization of a complex Supply Chain, from the supplier to your own company to the customer and beyond existing system boundaries,” can be seen at the Hewlett Packard Enterprise stand (HPE, Hall 15, Stand E64). This demo shows how the digital Supply Chain runs on the SALT business and IoT platform Data for Services (D4S). In the live demonstration, processes, objects, and data are synchronized and orchestrated via D4S in a model factory of independently interacting, networked Digital Twins.
In D4S, the central role of process design is taken on by the Process Engine. This ensures that the complete set of functions from all the different systems, such as ERP, MES, or BDE, is available at all times in the production process; the systems of customers and suppliers, including their logistics and production facilities, are also included.
D4S enables the exchange of signals among all the systems involved, as well as the Digital Twins, and therefore meaningful, interrelated action and reaction across the entire system. Ultimately, the entire process runs automatically and efficiently after just one input.
In practice, SALT Software GmbH determines, based on the customer’s business model, which Digital Twins and signals should be created and exchanged to enable optimum process execution.
SALT Solutions AG employs more than 600 experts in Dresden, Munich, Stuttgart and Würzburg.
SALT Software GmbH, a subsidiary of SALT Solutions AG, was founded on January 1, 2020 and has 40 employees.
The “Edge” is a hot space right now, although sometimes I’m not sure that everyone agrees what “Edge” is as they develop products and solutions. However, first thing this morning I saw this tweet from Tom Bradicich of Hewlett Packard Enterprise (@HPE) referring to an article that mentions him on ComputerWeekly.com. I’ve written about HPE at the edge and with IoT before. Looks like something’s up.
Tweet from @TomBradicichPhD Not only computing at the #edge, but also a new product category of “converging IT with #OT” systems (such as controls, DAQ, industrial protocols). Watch this space, my team’s next innovation is all this as-a-service. #aaS
Here is the rationale from the Computer Weekly article. “The benefits of edge computing have the potential to help businesses dramatically speed up their data analysis time while cutting down costs. @HPE’s Mark Potter and @TomBradicichPhD share how we can make this possible.”
In the past, all data processing was run locally on the industrial control system. But while there is industry consensus that real-time data processing for decision-making, such as the data processing needed in an industrial control system, should be run at the edge and not in the public cloud, there are many benefits in using the public cloud or an on-premise datacentre to assimilate data across installations of internet of things (IoT)-connected machines. Such data aggregation can be used to improve machine learning algorithms.
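The split described above — real-time decisions staying local while only aggregates travel to the cloud for cross-site machine learning — can be sketched simply. The threshold, window, and summary payload below are invented for illustration, not any particular vendor’s design:

```python
# Sketch: keep the fast control decision at the edge; forward only compact
# summaries to the cloud for fleet-wide aggregation and ML training.
# The temperature limit and payload fields are hypothetical.

from statistics import mean

def edge_decide(temp_c: float, limit_c: float = 90.0) -> str:
    """Real-time decision stays at the edge: no cloud round trip."""
    return "shutdown" if temp_c > limit_c else "run"

def edge_summarize(window: list[float]) -> dict:
    """Only an aggregate leaves the site, e.g. to improve ML models."""
    return {"n": len(window), "mean": mean(window), "max": max(window)}

decision = edge_decide(95.2)                    # handled locally, immediately
summary = edge_summarize([71.0, 74.5, 73.0])    # shipped to the cloud later
```

The control loop never waits on the network; the cloud sees enough to compare installations and retrain models without ingesting every raw sample.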
It is fascinating to see our environment described by an enterprise IT writer. The truth is that, following the Purdue Model, suppliers tried to make PLCs and DCSs part of the information infrastructure in parallel with supervising or executing control functions. That proved too unwieldy for control engineers to manage within the programming tools used. It was also too slow and not really optimized for the task.
Along came IT companies. I have followed a few over the past five years. They have had trouble with figuring out how to make a business out of edge compute, gateways, networking, and the like.
In the past, data acquisition and control systems were considered operational technology, and so were outside the remit of enterprise IT. But, as Tom Bradicich, global head of the edge and IoT labs at HPE explains, IT has a role to play in edge computing.
Bradicich’s argument is that edge computing can provide a converged system, removing the need for standalone devices that were previously managed by those people in the organisation responsible for operational technology (OT). According to Bradicich, convergence is a good thing for the industry because it is convenient, makes it easy to buy devices, lowers cost, improves reliability, and offers better power consumption because all the disparate functions required by an industrial system are integrated in one device.
Bradicich believes convergence in IoT will be as big as the convergence of camera and music players into a device like the iPhone, which made Apple the biggest music and camera company in the world. For Bradicich, convergence at the edge will lead to industry disruption, similar to what happened when smartphones integrated several bits of functionality that were previously only available as separate devices. “The reason Uber exists is because there is a convergence of GPS, the phone and the maps,” he says. “This disrupts the whole industry.”
I get this analogy to converging technologies into a device such as the iPhone. I don’t know if we want to cede control over to an HPE compute platform (although it has plenty of horsepower), but the idea is tempting. And it would be thoroughly disruptive.
Forrester has forecast that the edge cloud service market will grow by at least 50%. Its Predictions 2020 report notes that public cloud providers such as Amazon Web Services (AWS) and Microsoft; telecommunication companies such as AT&T, Telstra and Vodafone Group; platform software providers such as Red Hat and VMware; content delivery networks including Akamai Technologies; and datacentre colocation providers such as Digital Realty are all developing basic infrastructure-as-a-service (IaaS) and advanced cloud-native programming services on distributed edge computing infrastructure.
HPE has invested in a new company called Pensando, which recently emerged from stealth mode and is founded and staffed by former Cisco technologists, with former Cisco CEO John Chambers installed as Chairman. The belief is that new categories of device aimed at edge computing will come to market, perhaps to perform data acquisition and real-time data processing.
Mark Potter recently wrote in a blog post: “By becoming the first solutions providers to deliver software-defined compute, networking, storage and security services to where data is generated, HPE and Pensando will enable our customers to dramatically accelerate the analysis and time-to-insight of their data in a way that is completely air-gapped from the core system.
These are critically-important requirements in our hyper-connected, edge-centric, cloud-enabled and data-driven world – where billions of people and trillions of things interact.
This convergence is generating unimaginable amounts of data from which enterprises seek to unearth industry-shaping insights. And as emerging technologies like edge computing, AI and 5G become even more mainstream, enterprises have an ever-growing need to harness the power of that data. But moving data from its point of generation to a central data center for processing presents major challenges — from substantial delays in analysis to security, governance and compliance risks.
That’s where Pensando and HPE are making an industry-defining difference. By moving the traditionally data center-bound network, storage and security services to the server processing the data, we will eliminate the need for round-trip data transfer to centralized network and security appliances – and at a lower cost, with more efficiency and higher performance.
Here are benefits that Potter listed:
- Lower latency than competitive solutions, as operations will be carried out at 100Gbps network line-rate speed;
- Controller management framework to scale across thousands of nodes with a federation of controllers allowing scale to 1M+ endpoints; and
- Security, governance and compliance policies that are consistently applied at the edge.