I couldn’t make it to the February forum where the Open Process Automation Forum celebrated its 10th Anniversary. I remember meeting with Don Bartusiak, the instigator, along with Alan Johnston, Tom Burke, Dennis Brandl, and Dave Emerson to discuss standards and interoperability with the new initiative.
OPAF has come a long way. Ten years of developing a consensus standard of standards, specification guides, and certification testing.
Three engineers from The Wood Group talked with me yesterday, bringing me the update I missed last month. Brad Mozisek, Patrick Sloan, and Alex Eaton told me that O-PAS is not a science project but a real thing that users are implementing as we speak.
The organization never tried to engineer a new process automation system. They left that where it belonged—with the vendors. The goals included decoupling software and hardware, defining not only open but interoperable systems, and giving owner/operators flexibility to add best-in-class technologies without being locked into a single-vendor situation.
I missed seeing presentations by ExxonMobil, Shell, and Reliance on their projects.
Celebrating 10 years plus seeking to spread the message beyond the US, the organization has scheduled a World Tour. The first event will be March 25, 2026 at 06:00 AM (CDT) | 11:00 AM (GMT) | 4:30 PM (IST).
Speakers: Aneil Ali, The Open Group; Ravi Jagasia, R. Stahl Inc; Jacco Opmeer, Shell; Dominic de Kerf, Cargill; and Luciano Narcisi, ARC.
Join us on March 25 for this webinar, hosted by The Open Group and featuring members of the Open Process Automation Forum. European end-user organizations will showcase the business and technical milestones of their O-PAS adoption journeys.
Learn how these organizations are implementing the standard to drive innovation and eliminate vendor lock-in. Register here.
Click on the Follow button at the bottom of the page to subscribe to a weekly email update of posts. Click on the mail icon to subscribe to additional email thoughts.
The last time I wrote about CESMII, its output had been several educational initiatives regarding smart manufacturing.
I contacted Jillian Kupchella, marketing director, last month to initiate some conversations so that I could get an update.
For those not familiar with the organization: CESMII – the Smart Manufacturing Institute – has a total current investment commitment of $201M from Department of Energy funding and public/private partnership contributions, with a mandate to create a more competitive manufacturing environment here in the US through advanced sensing, analytics, modeling, control and platforms. CESMII is one of 18 Manufacturing USA institutes on this mission to increase manufacturing productivity, global competitiveness, and reinvestment by increasing energy productivity, improving economic performance and raising workforce capacity. University of California at Los Angeles (UCLA) is the program and administrative home of CESMII.
The CEO is a former colleague from MESA, John Dyck.
The early education initiatives have blossomed over the ensuing few years to a community of nearly 100 Certified Smart Manufacturing Roadmapping Professionals who are equipped to engage manufacturers of all size – small, medium, and large – to assess current states, develop strategic roadmaps, align communications, and establish sustainable funding models. This work is accelerating the development of data-driven cultures and a true Smart Manufacturing mindset across all industries.
They identified manufacturing and systems interoperability as a strategic imperative – marking the end of siloed data and stovepipe architectures and enabling scalable data, application, and integration interoperability. I’ve heard and written about the data silo and stovepipe architecture for perhaps decades. I hope they can move that ball forward (to use an American football analogy given the recently completed Super Bowl).
Hearing from Dyck, I learned that CESMII has identified a couple of new initiatives it considers key to the widespread deployment of Smart Manufacturing.
CESMII’s 3 Smart Manufacturing Architecture Imperatives represent a foundational set of requirements that address this demand for interoperability. We are advocating for open, standards-based information modeling (SM Profiles), interoperable platform requirements, and a common API that can drive scalability, reduce complexity, and unlock real-time value from manufacturing data across systems, applications, and the supply chain. You can learn more about these SM Imperatives here: SM Architecture Imperatives Workshop
We do want to draw your attention to the newest, and arguably most important of these imperatives. CESMII convened an international, open initiative to establish a common, vendor-agnostic API for contextualized manufacturing information. This effort addresses a longstanding challenge faced by manufacturers and application developers alike: the need to build against incompatible, proprietary platform interfaces. Adoption of this API is already underway among several leading manufacturing software and platform providers, with an official launch planned for early 2026.
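To make the idea of "contextualized manufacturing information" concrete: the point of a common API is that an application asks for a value by its place in an asset model rather than by a vendor-specific flat tag name. The sketch below is purely illustrative; CESMII's actual API had not been published at the time of writing, so the function name, the path syntax, and the data shapes are all my own assumptions, not the specification.

```python
# Hypothetical sketch only. CESMII's common API is not yet published, so
# get_contextualized_value(), the slash-delimited asset path, and the
# value/unit payload shape are illustrative assumptions, not the standard.

def get_contextualized_value(asset_model: dict, path: str):
    """Resolve a reading by its place in an asset model, e.g.
    'Plant1/Line3/Pump7/DischargePressure', instead of by a flat,
    vendor-specific tag name like 'PLC04_DB12_R44'."""
    node = asset_model
    for part in path.split("/"):
        node = node[part]  # a KeyError signals a path outside the model
    return node

# A tiny asset model: the reading carries its context (plant, line,
# equipment, unit of measure) along with the raw value.
model = {
    "Plant1": {
        "Line3": {
            "Pump7": {
                "DischargePressure": {"value": 42.7, "unit": "psi"},
            }
        }
    }
}

reading = get_contextualized_value(model, "Plant1/Line3/Pump7/DischargePressure")
print(reading["value"], reading["unit"])  # 42.7 psi
```

Whatever the final API looks like, this is the shift it promises: the consuming application depends on the information model, not on which vendor's platform happens to serve it.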
We are also excited to share that several of our technology provider partners are actively working toward compliance with CESMII’s Smart Manufacturing Imperatives. As a result, we anticipate the addition of several new compliant Smart Manufacturing Interoperability Platforms (SMIPs) in 2026 – further strengthening the ecosystem. Stay tuned for announcements.
Scaling Smart Manufacturing for Impact
Through community engagement, CESMII has identified several strategic innovation and investment areas essential to scaling and deploying Smart Manufacturing, including:
Replicating Smart Manufacturing solutions across factories within an industry
Scaling from unit operations to factory and enterprise levels
Extending Smart Manufacturing solutions across the supply chain, including tier suppliers and small and medium-sized manufacturers
Scaling and deploying will demonstrate industry integration, implementation, and reusability of existing SM solutions, practices, and infrastructure.
The Institute has given itself some ambitious projects. We wish them success.
We met in a conference room at an office in Barrington, IL. A place where sometime later a couple guys thought they’d screw me in a business deal. I came out ahead in the end, but the place has mixed memories.
This meeting involved thinking about the future of asset data and systems interoperability. We had a system diagram. The idea was to solve a huge problem for owner/operators of process manufacturing enterprises—flowing engineering data into other software systems for operations, maintenance, and enterprise. The incumbent system was a morass of paper (or pdf documents which was much the same thing).
We did trademark searches and domain name searches and eventually settled on the Open Industrial Interoperability Ecosystem—OIIE.
I plot this history for context for the conference I attended recently—the 2nd ADIF Workshop at Texas A&M University dubbed Driving Asset Data and Systems Interoperability Toward an Open and Neutral Data Ecosystem.
This workshop brought together owner/operators, EPCs, System Integrators, university researchers, standards organizations, and software vendors. Each group conducted a panel discussion of its needs and successes. I was there for a short presentation and to moderate the standards panel.
Professor David Jeong of Texas A&M, the session leader, previewed the discussions. One of his colleagues later presented research his team has performed on a method for converting P&ID documentation into a standard format usable by other software systems.
The message that came to me from the panel of owner/operators (grossly summarized, as will be all the discussions) included two key words—collaborate and operationalize. They are impatient about solving this data interoperability problem. One panelist quipped, “We know the project is finished when the large van backs into the loading dock and disgorges mountains of paper.”
What blows my mind is that I was moved to a position called Data Manager in 1977 to tackle the (much smaller) mountain of paper our product engineering department provided to operations, accounting, and inventory management. I led a digitalization effort in 1978 to tackle the problem. The problem not only remains, but it is immensely more complicated and critical.
The EPCs basically said that their hands were tied by the owner/operators mandating which design and engineering software to use and the inflexibility of the vendors of said design and engineering software. When owner/operators had requested digital documentation, they had responded with pdfs. Hardly interoperable data.
Our standards panel included the leader of DEXPI, whose organization has developed a method of converting P&ID data into an xlsx (Excel) format. That, of course, is a good start.
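Why is even a flat spreadsheet a good start? Because tabular P&ID data is machine-usable in a way a PDF drawing never is. The sketch below assumes a hypothetical export (column names TagNumber, ComponentClass, ConnectedTo are my invention, not DEXPI's actual schema) saved as CSV, and walks the connectivity column to answer a question no PDF can answer programmatically.

```python
# Illustrative only: the column names below are hypothetical and do not
# reproduce DEXPI's actual export schema. The point is that once P&ID
# data is tabular, connectivity questions become one-liners.
import csv
import io

SAMPLE = """TagNumber,ComponentClass,ConnectedTo
P-101,CentrifugalPump,V-201
V-201,Vessel,E-301
E-301,HeatExchanger,
"""

def downstream_of(rows, tag):
    """Follow the ConnectedTo column to list everything downstream of a tag."""
    nxt = {r["TagNumber"]: r["ConnectedTo"] for r in rows}
    chain = []
    current = nxt.get(tag)
    while current:  # an empty ConnectedTo cell ends the chain
        chain.append(current)
        current = nxt.get(current)
    return chain

rows = list(csv.DictReader(io.StringIO(SAMPLE)))
print(downstream_of(rows, "P-101"))  # ['V-201', 'E-301']
```

Tracing what sits downstream of a pump, for maintenance isolation or for populating an operations system, is exactly the kind of query the owner/operators on the panel want to stop doing by hand against paper.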
An organization called CFIHOS (see-foss) presented their take on standards. I’m afraid I got a bit lost in the slides (note: more research needed). What I gathered was that they were attempting one overriding standard—and that that work was years away. Interesting that I listened to Benedict Evans’ podcast this morning. He is a long-time tech industry analyst. He remarked in another context, “It seems that where there are 10 standards and someone comes along with a standard to encompass them all, you wind up with 11 standards.”
ISA-95 was presented as well. This messaging (and more) standard is incorporated into the OIIE, which was presented next. Dr. Markus Stumptner of the University of South Australia presented his proof-of-concept research on the OIIE.
If we can get enough momentum focusing on this area and find some SIs willing to take the OIIE to an owner/operator, perhaps we can finally prove the business case of asset data and systems interoperability.
Interoperability is a key feature of useful technology. Think of train rail gauges. Or shipping cargo containers. Or much of our industrial technology—much, but far from all. The drive of technology application suppliers toward proprietary lock-in is strong. Many will open up only as much as customers immediately demand.
This news comes from ABB and the Linux Foundation regarding a new interoperability initiative for industrial applications. They call it “Margo” which is Latin for “edge.” Cute, eh? Better than many names I’ve seen over the years.
In brief:
Margo, a new open standard initiative for interoperability, will address key roadblocks to digital transformation
The initiative is hosted by the Linux Foundation and driven by a founding group of industrial automation solution providers, including ABB Process Automation and ABB Machine Automation (B&R)
Margo aims to unlock interoperability at the edge – a key layer of Industrial IoT ecosystems where plant data is transformed into AI-powered insights to drive efficiency and sustainability
At the Hannover Messe on April 23, 2024, founding members ABB (including B&R), Capgemini, Microsoft, Rockwell Automation, Schneider Electric (including AVEVA) and Siemens announced collaboration on a new initiative to deliver interoperability for Industrial IoT ecosystems.
Hosted by the Linux Foundation and open to further interested parties, the Margo initiative draws its name from the Latin word for ‘edge’ and will define mechanisms for interoperability between applications, devices and orchestration software at the edge of industrial ecosystems. In particular, Margo will make it easy to run and combine applications from any ecosystem member on top of the hardware and runtime system of any other member. Margo aims to deliver on its interoperability promise through a modern and agile open-source approach, which will bring industrial companies increased flexibility, simplicity and scalability as they undergo digital transition in complex, multi-vendor environments.
“Mastering efficiency, flexibility and quality faster than competitors is key to success in today’s industrial world,” said Bernhard Eschermann, CTO, ABB Process Automation. “Digitalization can help deliver on these benefits, but digital ecosystems require a robust, secure and interoperable framework at the edge, connecting operations and information technologies. For ABB, a long-standing advocate of open automation systems, driving a forward-thinking collaborative initiative like Margo is key to achieving this goal.”
“The more sources you get data from, the better the decisions you can make,” explained Florian Schneeberger, CTO of ABB’s Machine Automation division (B&R). “Yet, while the benefits of digitalization increase with scale, so do the challenges of navigating heterogeneous industrial ecosystems. That’s why interoperability is so crucial to unlocking the full potential of digitalization. It empowers organizations to adopt and scale Industrial IoT solutions at full speed without large teams of IT specialists.”
In March 2024, ABB became a member of the Linux Foundation. This will enable the company to further enhance its efforts in promoting open community collaboration, helping unlock innovation and enable better products and experiences for customers. It further strengthens ABB’s commitment to open, standards-based systems.
I just released a podcast where I thought about standards, interoperability, and open technologies. This news came my way, speaking of open, that Shell Information Technology International has become a platinum member of The Open Group.
Shell has been a Member of The Open Group since 1997, and has contributed to its numerous Forums which enable collaboration to develop open technology standards and certifications. The company played a critical role in the foundation of The Open Group OSDU Forum that facilitates the development of transformational technology for the world’s changing Energy needs, and donated important intellectual property that formed the basis of the OSDU Data Platform. Shell also contributed to the inception of The Open Group Open Footprint Forum that focuses on creating an environmental footprint data model standard applicable to all industries.
The Open Group is a global consortium that enables the achievement of business objectives through technology standards. Its diverse membership of more than 900 organizations includes customers, systems and solutions suppliers, tool vendors, integrators, academics, and consultants across multiple industries.
Glad to see end user companies taking an active part in openness. Their support is the only way open technologies will grow.
This workshop at Texas A&M Harnessing Digital Transformation Through Asset Data and Systems Interoperability is next week. I’ve only just heard about it. Data and systems interoperability is something I’ve worked on for years. Evidently someone at Texas A&M has been researching. They will be presenting ideas next week. I have asked if I could get an interview to post after the conference.
The conference is October 25-26, 2023 at the Memorial Student Center (MSC), Texas A&M University, College Station, Texas.
“This workshop aims to address the significance of embracing open standards, vendor neutral interoperability for Owners, EPC, and Vendors.”
Needs—Achieve consensus on the significance of open, vendor neutral interoperability.
Actionable Plans—Learn and develop actionable plans and strategies to move towards open, vendor neutral interoperability, ensuring that stakeholders can collectively work towards achieving this goal.
Roadmap for the Future—Establish a roadmap outlining the steps and milestones required to accomplish long term standards based interoperability, setting a clear direction for industry wide progress.
Join us to explore the future of digital transformation and its pivotal role in shaping digital twins for the process industry and critical infrastructure.
Key Highlights:
Introduction to the Asset Data Interoperability Framework (ADIF) initiative.
Navigating the challenges of data and systems interoperability issues.
Overcoming barriers in maximizing the digital potential for growth.
Embracing interoperability as a mindset to enable the adoption of digital twins and other transformative technologies.
About ADIF Working Group:
A dedicated consortium of industry experts and academia, ADIF is committed to fostering open, vendor neutral and standards based solutions, prioritizing digital enhancements in asset lifecycle management.