
Yokogawa Launches OpreX Plant Stewardship

Catching up on some older news items from earlier this month. This news from Yokogawa shows how process automation companies have had to expand their offerings, and the problems they set out to solve, in this era of process automation and control market maturity. This new product is said to deliver holistic lifecycle management and operational excellence.

Yokogawa Electric Corporation announced the launch of OpreX Plant Stewardship, the most comprehensive lifecycle service program in the company’s OpreX Sustainable Maintenance family, to support customers in achieving and sustaining operational excellence.

OpreX Plant Stewardship offers a tailored, performance-based approach that strategically addresses the ever-evolving challenges faced by customers, enabling them to mitigate risks, effectively manage operational challenges, and achieve key performance indicators across all levels of their organization. OpreX Plant Stewardship is available in all regions outside of Japan.

With the acceleration of plant complexity, IT/OT integration, cybersecurity threats, and a shortage of skilled resources, traditional product-centric maintenance methods are no longer sufficient. In response to these issues, Yokogawa has expanded its lifecycle services portfolio with a program that ensures service performance across systems, field instruments, analyzers, software, and applications.

Main Features

1. A customer-centric lifecycle approach

OpreX Plant Stewardship is a service program designed through a strategic framework to drive long-term operational excellence. By proactively and systematically addressing risks and challenges, organizations can align their operational strategies with business objectives. Through this approach, Yokogawa works closely with various stakeholders across different levels of the customer’s organization, helping customers navigate complex demands and ultimately find the economic optimum that balances performance, cost, and sustainability.

2. Comprehensive coverage of five dimensions

Leveraging decades of domain expertise, Yokogawa’s service approach is built to provide coverage on five essential dimensions that drive a well-operated, efficient, and resilient business throughout the plant lifecycle:

  • Safety & security: Ensuring robust operational safety and cybersecurity measures are in place
  • Reliability & availability: Eliminating plant disruptions and improving equipment reliability
  • Regulatory compliance: Ensuring compliance with evolving industry regulations while supporting relevant Sustainable Development Goals
  • Operational efficiency: Enhancing process efficiency and reducing waste
  • Investment efficiency: Optimizing asset investments to maximize long-term value

3. Four-step process for continuous engagement

The four-step engagement model ensures effective collaboration with customers:

  • Identify: Understanding customer challenges, priorities, and operational risks through an assessment model
  • Assess: Evaluating which services and solutions are best tailored to address customer pain points and achieve their operational goals, and crafting a tailor-fitted, long-term collaboration proposal
  • Control: Ensuring seamless and effective global delivery of the services and commitment to high-quality output
  • Review: Continuously supporting customers through long-term engagement and improvements, utilizing a structured feedback loop to ensure ongoing performance alignment and adaptation to evolving operational needs

Major Target Markets

Oil and gas, petrochemicals, chemicals, renewable energy, power, pulp and paper, pharmaceuticals, food, mining, iron and steel, water distribution, and wastewater treatment.

Applications

  • Risk-based performance assessment and improvement
  • Lifecycle management and mitigation strategies
  • Maintenance and reliability enhancement
  • Compliance support and regulatory alignment
  • Operational performance optimization

Optimized Energy Savings From Innovative Standards

While I am on a standards reporting kick, this news reflects the growing collaboration among formerly competitive standards development organizations. I wrote recently about how OPAF is actively taking an end user view into standards collaboration and rationalization. Working together usually brings benefits to users.

From the statement of purpose: Accurate energy consumption data is essential for companies aiming to achieve climate-neutral production. To support this goal, a consortium of organizations has recently published a groundbreaking specification for interoperable and efficient energy management in industrial and process automation.

A key goal of the mechanical and plant engineering industry is to achieve climate-neutral production in the future. This effort is supported by the European Union’s European Green Deal, which aims to make Europe climate-neutral by 2050. In order to achieve this goal and implement many other use cases, accurate data on energy consumption in production is crucial. The consortium, consisting of the organizations ODVA, OPC Foundation, PI and VDMA, has now jointly published version 1.0.0 of their groundbreaking specification for interoperable and efficient energy management in industrial automation and process automation. This group is chaired by the VDMA.

Dietmar Bohn, Managing Director of PNO, explains: “The measurement and analysis of energy consumption in machines and systems is an extremely important topic for the future. We are pleased to make an active contribution to this important initiative to optimize energy consumption and thereby reduce the harmful effects on the environment caused by waste and surplus.”

This specification defines a standardized information model based on OPC UA that enables comprehensive energy management in industrial automation. “This Power Consumption Management collaboration ensures that end users have a highly standardized and interoperable means of achieving their environmental, social and governance (ESG) goals,” explains Dr. Al Beydoun, President and CEO of ODVA.

The introduction of this standard will make energy management in industry considerably easier: companies can now record, analyze and use precise and consistent energy data even more efficiently in order to further increase their energy efficiency. This not only helps to reduce operating costs, but also to reduce the ecological footprint. Standardization makes it possible to implement innovative technologies and best practices faster and more effectively, which contributes to more sustainable and environmentally friendly production in the long term.

The specification essentially comprises two main content fields: Firstly, monitoring, i.e. the display of all types of energy consumption, including electrical energy as well as energy from air, water or coal. Secondly, standby management, which is understood to mean the control and display of various energy-saving modes on machines and components. It is based on the results of the research project “Development of energy management interfaces for IoT technologies (IoTEnRG)”. “The aim of the IoTEnRG research project was to make the results available to industry. We were able to contribute our results directly to the Joint Working Group and thus significantly accelerate the development of the OPC UA Companion Specification,” says Prof. Dr. Niemann from the Institute for Sensor Technology and Automation at the University of Applied Sciences and Arts in Hannover.
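
The announcement doesn’t spell out the information model’s node names, but conceptually the energy data becomes reachable like any other OPC UA data. Below is a minimal sketch of a client read, assuming a hypothetical endpoint, namespace URI, and browse path (the real identifiers come from the published Companion Specification) and using the open-source asyncua Python library:

```python
# Minimal sketch: reading an energy-monitoring value over OPC UA.
# Endpoint, namespace URI, and browse path are hypothetical placeholders;
# the actual node names are defined in the Companion Specification.
import asyncio
from asyncua import Client

ENDPOINT = "opc.tcp://machine.example.com:4840"        # hypothetical server
NAMESPACE = "http://example.com/UA/EnergyManagement/"  # hypothetical namespace

async def read_active_power() -> None:
    async with Client(url=ENDPOINT) as client:
        idx = await client.get_namespace_index(NAMESPACE)
        # Browse from the Objects folder down to a hypothetical monitoring object.
        monitoring = await client.nodes.objects.get_child(
            [f"{idx}:Machine", f"{idx}:EnergyMonitoring"]
        )
        power_node = await monitoring.get_child(f"{idx}:ActivePower")
        value = await power_node.read_value()
        print(f"Active power: {value} W")

if __name__ == "__main__":
    asyncio.run(read_active_power())
```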

“For digitalization, we need an agreement on a common understanding and description of data, including in the energy sector. OPC UA provides exactly that. I am proud that with this joint group, we can also contribute to the energy transition and thus promote optimized energy savings through standardized and efficient monitoring,” says Stefan Hoppe, President of the OPC Foundation.

The VDMA has defined a fundamental standard for the entire mechanical and plant engineering industry, known as “OPC UA for Machinery”. Various functional building blocks are specified in this standard. A new building block for energy management is being developed based on the publication. “The four organizations have been working hard to harmonize and standardize information on energy consumption in manufacturing. This is an excellent first step towards defining an upcoming OPC UA Building Block for mechanical engineering that will bring the machine and plant manufacturing industry a big step closer to the goal of climate-neutral production,” says Andreas Faath, director of the VDMA Machine Information Interoperability department.

Digital Twin Consortium Publishes Spatially Intelligent Digital Twin Capabilities and Characteristics

I have mixed feelings toward standards organizations and consortia. Some engineers use their work to build systems. I’m never sure what the final benefit is. Some have built technology in everyday use—OPC, ODVA, FieldComm (HART, FDT), Profinet. Some publish papers that I have never heard practical outcomes emanating from.

Yet, I still report on some of these. You never know how some engineers may benefit from the work while building their systems.

This news (I’m catching up on news that came my way while traveling and thinking about what I learned there) comes from The Digital Twin Consortium (DTC), a unit of the Object Management Group. My last two trips and several subsequent interviews and press events all worked in the term Digital Twin somewhere in the discussion. So, it’s relevant.

The Digital Twin Consortium (DTC) published a whitepaper titled Spatially Intelligent Digital Twin Capabilities and Characteristics to help business executives, enterprise, business, and solution architects, system designers, and developers understand the base concept of spatial information relative to the capabilities and characteristics used to describe locational intelligence in the context of digital twin capabilities. The concepts described in the whitepaper apply to a broad spectrum of digital twin use cases, industries, and disciplines.

The whitepaper provides organizations with guidance to:

  • Document the capabilities and resulting value streams provided through the ability to visualize, understand, and analyze the geospatial locational characteristics of real-world entities and conditions.
  • Understand the distinction between different forms of locational representations, including geometric (3D models), spatial, and geospatial models (a small sketch of this distinction follows the list).
  • Document the key characteristics of locational representations in a digital twin so organizations can consistently capture locational attributes, enabling digital twin system-to-system integration.
  • Capture the Spatially Intelligent Digital Twin’s locational characteristics in the context of capabilities using the DTC’s Capabilities Periodic Table (CPT).
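
The distinction among geometric, spatial, and geospatial representations is easier to see as a data model. Here is a minimal sketch; the field names are my own illustrations and are not drawn from the whitepaper or the CPT:

```python
# Illustrative only: three locational representations a digital twin entity
# might carry. Field names are invented for this sketch, not taken from the
# DTC whitepaper or the Capabilities Periodic Table.
from dataclasses import dataclass
from typing import Optional

@dataclass
class GeometricModel:
    """Shape described in a local model coordinate system (e.g., a 3D CAD/BIM model)."""
    mesh_uri: str          # reference to the 3D geometry
    units: str = "m"

@dataclass
class SpatialPlacement:
    """Position and orientation relative to a parent frame (site, building, asset)."""
    parent_frame: str
    x: float
    y: float
    z: float
    rotation_deg: float = 0.0

@dataclass
class GeospatialLocation:
    """Absolute position on the Earth in a stated coordinate reference system."""
    latitude: float
    longitude: float
    elevation_m: float
    crs: str = "EPSG:4326"

@dataclass
class TwinEntityLocation:
    """A digital twin entity may carry any combination of the three forms."""
    entity_id: str
    geometric: Optional[GeometricModel] = None
    spatial: Optional[SpatialPlacement] = None
    geospatial: Optional[GeospatialLocation] = None
```

Capturing these attributes consistently is what enables one digital twin system to hand locational data to another without losing meaning.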

By completing the steps outlined in the white paper, organizations can define locational capabilities and data requirements for their digital twins. They can design, develop, and operate digital twins that meet organizational needs and provide business value.

The Digital Twin Consortium Architecture, Engineering, Construction, and Operations (AECO) Working Group prepared the whitepaper, which is available for download from the DTC website.

Foundation Engineering General Intelligence (EGI) Launched

The media relations person somehow found a way to tempt me with this teaser:

  • Converting natural language into code – Transform vague or unstructured instructions into precise, actionable code through AI-driven natural language processing.
  • Enhancing automation, accuracy, and efficiency – Minimize manual input and errors, optimizing every stage of the design-to-production lifecycle with intelligent automation.
  • Seamlessly integrating with existing tools – Leverage AI to integrate effortlessly with existing design and manufacturing software, ensuring full compatibility with established engineering tech stacks.

OK, this is general, full of the latest buzz words. Reading between the lines of this and her emails, I thought I detected the seeds of an answer to problems I had working in engineering data management in the late 70s. So, I consented to an interview (I do few of these anymore) with Foundation EGI co-founder Wojciech Matusik, Professor of Electrical Engineering and Computer Science at the Computer Science and Artificial Intelligence Laboratory at MIT.

This decision was one of my best lately. Matusik laid out a coherent, logical, precise definition of the problem this company (which launched last week) is addressing and the solution it intends to deliver.

These are my words—I see what they are doing as the crucial step between PLM and MES. That’s a bit too crude, but bear with me.

Early in my career, the VP of Product Development at a manufacturing company picked me for the role of data manager. The job built on the foundation of managing the bills of materials and specifications generated by product design and communicating them to other departments such as manufacturing operations, cost accounting, and purchasing. That placed me at the center of cost control and reduction efforts (Hello, Elon, I could have helped you do a better job!).

Matusik began with the premise that engineering is programming; that is, it generates a set of rules in a domain-specific language. Look at assembly, for example: it is a small set of operations from a small domain. From this, one could construct a domain-specific program for any set of tasks. (I call this workflow—something I’ve seen MES developers trying to perfect for more than a decade.)

A coding assistant would be a great help in developing these task programs. AI already helps today’s software engineers become more efficient, so why not use the same tools for this problem? Enter Foundation EGI: natural language programming of these “rules” for building from the engineering design data. The rough recipe:

  1. Construct the domain
  2. Program the compilers
  3. Build the data sets
  4. Then apply LLMs and so forth (see the sketch below)
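
To make the idea concrete, here is a minimal sketch of “natural language in, domain-specific rules out.” The tiny assembly-step schema and the llm_complete() stub are my own inventions for illustration; they are not Foundation EGI’s actual language, model, or API:

```python
# Illustrative only: a toy domain-specific schema for assembly steps plus a
# natural-language-to-DSL translation step. The schema and the llm_complete()
# stub are hypothetical, not Foundation EGI's platform.
import json
from dataclasses import dataclass, field

@dataclass
class AssemblyStep:
    operation: str                 # e.g., "insert", "fasten", "torque"
    part: str                      # part identifier from the BOM
    target: str                    # mating feature or location
    parameters: dict = field(default_factory=dict)  # e.g., {"torque_nm": 12}

SCHEMA_HINT = (
    "Return a JSON list of steps, each with keys: "
    "operation, part, target, parameters."
)

def llm_complete(prompt: str) -> str:
    """Placeholder for a call to a large language model of your choice."""
    raise NotImplementedError("wire up an LLM provider here")

def instructions_to_steps(natural_language: str) -> list[AssemblyStep]:
    """Translate messy work instructions into structured, checkable steps."""
    prompt = f"{SCHEMA_HINT}\n\nInstructions:\n{natural_language}"
    raw = llm_complete(prompt)
    return [AssemblyStep(**step) for step in json.loads(raw)]

# Example input an engineer might type:
# "Insert bolt B-104 into the left bracket and torque to 12 Nm."
```

The point is the pipeline: constrain the domain, define the target language, build the data sets, and only then let the model handle the tedious translation so the result stays machine-checkable.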

They are building a platform with engineering change notices built in. The focus is on the most boring of an engineer’s work, leaving the most creative work to the engineer; engineers hate the add-on administrative work. The platform integrates with a variety of CAD file formats and works with or without a PLM system. It also fits well with robotics via an automated coding assistant.

As I have told a few founders I’ve interviewed: I am enthusiastic about the potential of the platform, and I’m interested in seeing how the technology actually works out.

On Thursday, April 17, Foundation EGI launched, announcing the availability of the first domain-specific, agentic AI platform — engineering general intelligence (EGI) — designed to supercharge automation, accuracy, and efficiency for every stage of product lifecycle management. With EGI, design and manufacturing engineers will be able to build better products faster, driving healthier revenues for the world’s leading industrial brands. Interested customers can sign up for the beta on the company’s website.

Foundation EGI was co-founded by MIT academics Mok Oh, Ph.D., Professor Wojciech Matusik, and Michael Foshey, and has assembled a seasoned team with deep engineering, industrial, startup and AI experience. Backed by an over-subscribed $7.6M seed round from early investors including E14 Fund, Union Lab Ventures, Stata Venture Partners, Samsung Next, GRIDS Capital, and Henry Ford III, Foundation’s EGI platform is already in testing at leading Fortune 500 industrial brands, which are witnessing its transformative and revenue-driving potential.

Unlike other digitally-transformed industries, manufacturing and engineering processes and instructions remain manual and disorganized, causing inefficiencies, production delays and stagnant revenues — to the tune of $8T in economic waste. Using Foundation EGI’s purpose-built large language model (LLM) and EGI agentic AI platform, engineers can now transform natural language inputs, including vague and messy instructions, into codified programming that is accurate and structured, optimizing automation, accuracy and efficiency at every stage of the design to production lifecycle. Foundation EGI’s web-based technology platform seamlessly integrates with the major design and manufacturing software applications and tech stacks already used by engineering teams.

“Engineering is primed for an AI revolution, but generic LLMs won’t cut it: they lack vital domain-specificity and are prone to inaccuracies,” said Foundation EGI co-founder and CEO, Mok Oh. “Our first-of-its-kind technology is purpose-built for engineering and will supercharge every stage of product lifecycle management — starting with documentation. EGI transforms what is traditionally error-prone, manual and inconsistent into structured, sustained and accurate information and processes, so that engineering teams can not only achieve significant cost-savings but also be more nimble, productive and creative.”

Dennis Hodges, CIO at Inteva Products, a global automotive supplier of engineered components and systems, commented: “We have high expectations from Foundation’s EGI platform. It’s clear it will help us eliminate unnecessary costs and automate disorganized processes, bringing observability, auditability, transparency and business continuity to our engineering operations.”

Said Habib Haddad, founding Managing Partner of the E14 Fund, the MIT Media Lab affiliated venture fund: “The timing and market conditions are perfect for a company like Foundation EGI to solve what has long been a large and expensive challenge for America’s industrial manufacturing leaders. The combination of Foundation EGI’s vision, its world-class team, the widespread industry appetite for enterprise AI solutions, plus the uptick in manufacturing demand makes this a rich opportunity.”

Co-founder Wojciech Matusik, Professor of Electrical Engineering and Computer Science at the Computer Science and Artificial Intelligence Laboratory elaborated on EGI’s potential, “Engineering general intelligence transforms natural language prompts into engineering-specific language using real-world atoms, spatial awareness and physics. It will unleash the creative might of a new generation of engineers. Expect leaps and bounds in agility, innovation and problem-solving.”

Foundation EGI’s mission was inspired by research conducted by Professor Matusik, Michael Foshey, and others at MIT and other academic institutions, published in a March 2024 paper titled “Large Language Models for Design and Manufacturing.”

The Open Process Automation Forum Continues Advancement

The Open Group Open Process Automation Forum (OPAF) provides annual updates at a forum in Orlando in February. I missed that meeting, but I recently received an update from Aneil Ali, The Open Group OPAF Director.

OPAF members have worked diligently for years developing a standard of standards in order to break the proprietary grip of specific process automation suppliers—hence the word “Open” in the name. Owner/operators facing needed technology upgrades balked at the price of rip-and-replace automation.

I have seen these efforts a few times in the past. The results have provided benefits, but usually far from the vision of the founders.

This organization continues to move forward. They have released version 2.1 of the standard, launched a product certification program, and have witnessed some products making it through the system.

The headline news is ExxonMobil’s Lighthouse project. They operationalized the OPAF system at a resin finishing plant in Baton Rouge at the tail end of 2024. Engineers beat the deadlines for startup. They have published some good lessons learned from the project. It’s the first deployment of a commercial OPAF system making money for the owner/operator.

One complaint levied over the years concerned the proliferation of standards, many of which are not interoperable. OPAF has addressed standards harmonization by hosting, for the fourth year, harmonization meetings in the Eastern Hemisphere. A recent one, in Germany, brought together FieldComm, OPC UA, Namur, OPAF, and PI. They typically meet for three days, looking for where there is a risk of divergence and potential problems for end users.

Ali noted that OPAF has started a regular cadence of user meetings as an effort to get end users together to air their wishes and desires. These thoughts can then be distilled into assignments for working groups.

Ali concluded, “The Forum is always open to receiving guidance and feedback from end users not in the ecosystem—we’re not a closed club.”

Agentic AI, SaaS, Community—The Aras Community Gathering

The Aras ACE2025 Community Event in Boston closed two weeks ago. It has taken me that long to wrap my head around everything I learned. Normally there are many really important-sounding words that sound so enlightening at the time, yet when I sit to write I find no substance. In this case, there was so much substance that I have trouble filtering to the most important themes.

Let’s say that not only were the expected buzz words in evidence but the underlying concepts were demonstrably in use. Aras is a PLM (product lifecycle management) developer. They are solving problems that I had in the late 70s while working at a manufacturer. Mainly, how to make usable sense from all the engineering data.

The principal phrase of the week was digital thread. They are all about the digital thread. Companies were also using Large Language Model (LLM) technology trained on their own data. Agentic AI rears its head and will become even more important with use. (See my interview with John Harrington of HighByte for more on Agentic AI.)

Customer presentations showcasing actual use cases ground the theory in reality.

I sat in on a presentation by the sensor manufacturer Sick. They have applied AI to unstructured data, turning it into useful structured data. Using Aras PLM, they have realized better speed to market by finding product data via natural language queries. They have instances of development times cut from 3 years to 6 months.

Another customer presentation came from Denso. Engineers find the digital thread from PLM to be a tool for collaboration. The connected flow of data ensures continuity from design to manufacturing to operations. Inconsistent data hurts the business. PLM is the heart of their digital strategy, with the BOM as centerpiece. Once again, an example of someone actually using generative AI trained on their own data to fill in gaps.

The highlight of customer applications came from my half-hour discussion with Tetsuya Kato, Manager of the Technical Management Group from SkyDrive in Japan—the Flying Car company. OK, it’s not the Toyota in your driveway suddenly flying to the store. But it’s close. Check out the goodies on their website.

He was hired to bring order to the product information system. In other words, to develop a better Manufacturing Bill of Materials (MBOM). They were using Teamcenter PLM with a system brought in by a consulting engineering firm. The system had many problems, was taking too long to implement, and forced SkyDrive to change its systems to fit the software.

Kato brought in Aras Connector to bring engineering data from Teamcenter to the Aras PLM platform. He started the project in September, showed results in two months, and moved all the data in eight months. The Aras solution had all the features necessary for their manufacturing data, with the additional benefit of flexibility that allows them to make the system work for them instead of the other way around.

Chief Technology Officer Rob McAveney asks “What if…”

McAveney noted Aras has 25 years of asking what if…

  • 2001: What if PLM could be a flexible, web-native platform?
  • 2005: What if PLM applications were built to work together? Integrated data, now called the digital thread.
  • 2011: What if impact analysis were an interactive experience? A wizard-style digital thread.
  • 2014: What if visual collaboration were available to everyone?
  • 2021: What if a SaaS delivery model came without compromise?
  • 2025: What if we could extend the reach of the digital thread? Take advantage of the Aras Effect (open, reachable), Aras Portals, apps on a product data platform, composable PLM apps, and a low-code environment.

The digital thread + AI = Connected Intelligence.

The three areas of Connected Intelligence include:

  • Discover—conversation about data
  • Enrich—connect more data and people across the business
  • Amplify—maximize impact

Aras is pursuing all three together.

Discover—natural language search, content synthesis, machine learning, text-to-SQL (natural language prompt to query). What if a guided tour showed how to set effectivity conditions to sync multiple changes (context-aware help)? Then phase in changes with confidence and eliminate rework and supply chain disruption. What if you could assess the global supply of a sourced component before submitting a change request? Avoid wasting time on changes. What if you could ask an AI assistant to identify common factors during root cause analysis? Persistent quality issues become a thing of the past.
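
As an illustration of the text-to-SQL piece mentioned above, here is a minimal sketch; the schema, prompt, and llm_complete() stub are hypothetical and do not describe Aras’s implementation:

```python
# Illustrative text-to-SQL sketch. Schema, prompt, and llm_complete() are
# hypothetical stand-ins, not Aras's actual implementation.
SCHEMA = """
parts(part_id, name, revision, status)
change_requests(change_id, part_id, submitted_on, state)
"""

def llm_complete(prompt: str) -> str:
    """Placeholder for a call to a large language model."""
    raise NotImplementedError("wire up an LLM provider here")

def question_to_sql(question: str) -> str:
    """Turn a natural language question into a single SQL SELECT statement."""
    prompt = (
        "Translate the question into one SQL SELECT statement "
        f"against this schema:\n{SCHEMA}\nQuestion: {question}\nSQL:"
    )
    return llm_complete(prompt).strip()

# e.g., question_to_sql("Which released parts have open change requests?")
```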

Enrich—entity recognition, contextual reasoning, topic modeling, deep learning. What if missing or inconsistent links in the digital thread could be easily identified and corrected? Patterns feed downstream analytics; stop wasting effort on redoing work. What if requirements could be automatically identified and ingested from reliable external data sources? Then see next-level requirements traceability with dynamic requirements. What if factory floor data could be linked to quality planning parameters? Planning for a feedback loop.

Amplify—agentic AI, surrogate modeling, generative engineering, reinforcement learning. What if an engineer-to-order business could be transformed by leveraging all your past engineering work to create a common variability model? Engineers shift from individual customer projects to improving the full product line.
