The last time I wrote about CESMII, the focus was on several educational initiatives regarding smart manufacturing.
I contacted Jillian Kupchella, marketing director, last month to initiate some conversations so that I could get an update.
For those not familiar with the organization: CESMII – the Smart Manufacturing Institute – has a total current investment commitment of $201M from Department of Energy funding and public/private partnership contributions, with a mandate to create a more competitive manufacturing environment here in the US through advanced sensing, analytics, modeling, control and platforms. CESMII is one of 18 Manufacturing USA institutes on this mission to increase manufacturing productivity, global competitiveness, and reinvestment by increasing energy productivity, improving economic performance and raising workforce capacity. University of California at Los Angeles (UCLA) is the program and administrative home of CESMII.
The CEO is a former colleague from MESA, John Dyck.
The early education initiatives have blossomed over the ensuing few years into a community of nearly 100 Certified Smart Manufacturing Roadmapping Professionals who are equipped to engage manufacturers of all sizes – small, medium, and large – to assess current states, develop strategic roadmaps, align communications, and establish sustainable funding models. This work is accelerating the development of data-driven cultures and a true Smart Manufacturing mindset across all industries.
They identified manufacturing and systems interoperability as a strategic imperative – marking the end of siloed data and stovepipe architectures and enabling scalable data, application, and integration interoperability. I’ve heard and written about data silos and stovepipe architectures for decades. I hope they can move that ball forward (to use an American football analogy, given the recently completed Super Bowl).
Hearing from Dyck, CESMII has identified a couple of new initiatives it considers key to the widespread deployment of Smart Manufacturing.
CESMII’s three Smart Manufacturing Architecture Imperatives represent a foundational set of requirements that address this demand for interoperability. We are advocating for open, standards-based information modeling (SM Profiles), interoperable platform requirements, and a common API that can drive scalability, reduce complexity, and unlock real-time value from manufacturing data across systems, applications, and the supply chain. You can learn more about these SM Imperatives here: SM Architecture Imperatives Workshop
We do want to draw your attention to the newest, and arguably most important, of these imperatives. CESMII convened an international, open initiative to establish a common, vendor-agnostic API for contextualized manufacturing information. This effort addresses a longstanding challenge faced by manufacturers and application developers alike: the need to build against incompatible, proprietary platform interfaces. Adoption of this API is already underway among several leading manufacturing software and platform providers, with an official launch planned for early 2026.
We are also excited to share that several of our technology provider partners are actively working toward compliance with CESMII’s Smart Manufacturing Imperatives. As a result, we anticipate the addition of several new compliant Smart Manufacturing Interoperability Platforms (SMIPs) in 2026 – further strengthening the ecosystem. Stay tuned for announcements.
Scaling Smart Manufacturing for Impact
Through community engagement, CESMII has identified several strategic innovation and investment areas essential to scaling and deploying Smart Manufacturing, including:
Replicating Smart Manufacturing solutions across factories within an industry
Scaling from unit operations to factory and enterprise levels
Extending Smart Manufacturing solutions across the supply chain, including tier suppliers and small and medium-sized manufacturers
Scaling and deploying will demonstrate industry integration, implementation, and reusability of existing SM solutions, practices, and infrastructure.
The Institute has given itself some ambitious projects. We wish them success.
Click on the Follow button at the bottom of the page to subscribe to a weekly email update of posts. Click on the mail icon to subscribe to additional email thoughts.
I touched on this concept reporting from the Ignition Community Conference last September. It’s where I was sitting beside this excitable “influencer” who was overjoyed at the announcement from Inductive Automation that MCP was coming to Ignition sometime in 2026 and darn near put a big bruise on my thigh hitting me in his excitement.
Our company is working on an MCP Module for Ignition that will be released later in 2026. MCP is a very new technology on the scene, so you shouldn’t feel bad if you’re asking yourself, ‘Cool, but what exactly is MCP?’ In this blog post, we’ll give you a quick overview of what MCP does so you can start thinking of exciting ways to use the new module once it’s released.
As AI continues to evolve, one of the biggest limitations holding it back from widespread real-world adoption is its isolation. Large language models (LLMs) are powerful, but they are typically trained on a fixed dataset and are unable to access or act on real-time information.
The Model Context Protocol (MCP) breaks down that barrier. Introduced by Anthropic in November 2024 as an open standard protocol, MCP creates a standardized two-way communication bridge between AI systems and external tools, applications, and data sources. It extends LLMs with the ability to interact with enterprise resource planning (ERP) systems, customer relationship management (CRM) systems, databases, APIs, and external developer tools. You can think of it as a universal plug that allows LLMs to connect seamlessly with information outside of their training data.
Traditional LLMs are limited in two critical ways: they are static and isolated. This means that once an LLM is trained, its knowledge is frozen in time, and it cannot access external tools or databases unless you build custom integrations. MCP solves both of these problems by turning LLMs into dynamic agents. Through MCP, AI systems can query real-time data, update records, and trigger workflows.
For example, an enterprise assistant built with MCP could answer questions about project timelines, check your Google Calendar, update a ticketing system, query metrics, update internal systems, book events, or send an email within the same conversation. In creative fields, MCP-enabled AIs could write code and deploy it to production environments or generate 3D designs and send them directly to a printer.
Simply put, MCP increases LLM utility and automation by enabling it to perform a wide range of actions that would be impossible without extensive custom engineering.
One of the most important advantages of MCP is that it significantly reduces the hallucinations or inaccuracies that LLMs often generate by allowing models to access authoritative, real-time sources like your databases and APIs. This ensures that your LLMs’ outputs are more grounded in reality rather than relying on probabilistic text generation.
Additionally, unlike proprietary integrations that lock AI applications into a specific tool or vendor ecosystem, MCP is an open standard, which enables developers to share pre-built MCP server frameworks. This allows AI systems to evolve over time, and provides the critical benefit of solving the N x M problem of integration, where N (AI models) and M (tools) require N x M number of custom connectors. MCP provides a consistent grammar and communication protocol, standardizing the interface and allowing a single tool to be shared across models via a plug-and-play architecture. This makes it easier to reuse components, accelerates development, and fosters open collaboration across vendors and platforms without rewriting application logic, positioning MCP as foundational infrastructure rather than a short-lived integration layer.
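The N x M claim above is easy to make concrete with a little arithmetic. The sketch below is purely illustrative (the function names are invented for this example), but it shows why a shared protocol collapses point-to-point integration work from a product of models and tools to a sum:

```python
# Illustrative arithmetic for the N x M integration problem.
# Without a shared protocol, every AI model needs a bespoke
# connector to every tool; with a protocol like MCP, each model
# implements one client and each tool implements one server.

def custom_connectors(n_models: int, m_tools: int) -> int:
    """Point-to-point integrations: one connector per model/tool pair."""
    return n_models * m_tools

def protocol_implementations(n_models: int, m_tools: int) -> int:
    """Shared protocol: one client per model plus one server per tool."""
    return n_models + m_tools

# Five models and twenty tools:
print(custom_connectors(5, 20))         # 100 bespoke integrations
print(protocol_implementations(5, 20))  # 25 protocol implementations
```

The gap widens quickly: doubling both sides quadruples the bespoke-connector count but only doubles the protocol work.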
MCP uses a client-server architecture. The AI application acts as the MCP host, while MCP clients serve as bridges to external systems and tools. These clients handle session management, parsing, reconnection, and translation of user requests into MCP’s structured format. Each MCP client communicates with a unique MCP server, which connects to external databases, APIs, and web services, enabling it to execute tool functions, fetch data, or provide prompts.
MCP servers expose three core primitives: resources, tools, and prompts. Resources provide read-only access to data sources like databases or files; tools perform actions, such as making API calls or triggering workflows; and prompts are reusable templates that set the structure for how the LLM communicates with tools and data. MCP uses these primitives as structured, declarative interfaces rather than allowing the LLM to issue arbitrary API calls. This streamlines the AI by shielding it from low-level system complexity, ensuring that it invokes well-defined actions with clearly scoped inputs and outputs.
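To make the three primitives less abstract, here is a toy sketch in plain Python. This is not the real MCP SDK (actual servers speak JSON-RPC over a transport, and every name below is invented for illustration); it only shows the shape of the idea: read-only resources, callable tools with scoped inputs, and reusable prompt templates, discovered by the client before use.

```python
# Toy illustration of MCP's three server primitives. Not the real
# SDK -- all class, method, resource, and tool names are invented.

class ToyMCPServer:
    def __init__(self):
        self.resources = {}  # read-only data sources, addressed by URI
        self.tools = {}      # actions with well-defined inputs/outputs
        self.prompts = {}    # reusable templates for LLM interactions

    def add_resource(self, uri, reader):
        self.resources[uri] = reader

    def add_tool(self, name, fn):
        self.tools[name] = fn

    def add_prompt(self, name, template):
        self.prompts[name] = template

    def list_tools(self):
        # Clients discover capabilities before invoking anything.
        return sorted(self.tools)

    def call_tool(self, name, **kwargs):
        # The model invokes a declared action, never a raw API call.
        return self.tools[name](**kwargs)

    def read_resource(self, uri):
        return self.resources[uri]()

server = ToyMCPServer()
server.add_resource("db://orders/count", lambda: 42)
server.add_tool("create_ticket", lambda title: f"TICKET: {title}")
server.add_prompt("report", "Summarize {data} for the weekly report.")

print(server.list_tools())                        # ['create_ticket']
print(server.read_resource("db://orders/count"))  # 42
print(server.call_tool("create_ticket", title="Pump 3 alarm"))
```

The point of the structure is the last two calls: the model reads data or triggers an action only through a named, declared interface, which is what shields it from low-level system complexity.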
MCP can be deployed in many ways to align with the needs of different environments and industries:
Local servers for privacy-sensitive and high-speed offline tasks
Remote servers for cloud-based, shared services
Managed servers for scalability and operational simplicity
Self-hosted servers for compliance, control, on-premise, or legacy environments
Using AI with MCP is very simple from the user’s perspective. You prompt your LLM as you normally would, and the MCP-connected system handles the rest. For example, if you ask, “Build me a report,” the AI host initiates a tool discovery process by querying the MCP server. It retrieves a list of available tools, selects the appropriate one, and calls the function with the necessary parameters.
If your system needs a real-time update, such as a tool becoming unavailable, the MCP server can push a notification to the client without waiting for a new prompt. Once the tool completes its task, MCP integrates the results into the AI’s response or uses them to trigger the next action in a multi-step workflow.
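The discover-select-call-integrate loop described above can be sketched in a few lines. This is a deliberately simplified stand-in (the function and tool names are invented, and a trivial keyword match substitutes for the LLM's tool selection), just to show the order of operations a host runs through:

```python
# Hedged sketch of host-side MCP orchestration: discover available
# tools, select one for the request, invoke it, and fold the result
# into the response. A real host delegates step 2 to the LLM.

def handle_prompt(prompt, server_tools):
    # 1. Tool discovery: ask the server what it offers.
    available = list(server_tools)
    # 2. Selection: naive keyword match stands in for the model here.
    chosen = next((t for t in available if t in prompt.lower()), None)
    if chosen is None:
        return "No suitable tool; answering from model knowledge."
    # 3. Invocation: call the tool with the needed parameters.
    result = server_tools[chosen](prompt)
    # 4. Integration: fold the tool output into the reply.
    return f"Used '{chosen}': {result}"

tools = {"report": lambda p: "3 lines produced, 2 alarms open"}
print(handle_prompt("Build me a report", tools))
# Used 'report': 3 lines produced, 2 alarms open
```

In a multi-step workflow, step 4 would feed the result back into the model so it can decide whether another tool call is needed.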
This orchestration model makes MCP ideal for building advanced AI agents capable of reasoning with live data, executing actions across systems, and adapting dynamically as tools and environments change.
MCP represents a foundational shift in how AI connects to systems. It transforms LLMs from static knowledge engines into intelligent, action-capable systems. As adoption grows, MCP is poised to become a core part of modern software infrastructure, powering a new generation of agentic and adaptive AI applications.
Reports of sales “wins” seldom make it into my posts. For many years, that was the only “news” emanating from the process automation suppliers. This one from Rockwell Automation is intriguing because I probed fruitlessly for updates on software during the Automation Fair in November, because manufacturing execution system (MES) applications are rare, and because Rockwell Automation is no longer listed as a premier member of the MES Association (MESA). Whew.
Another unusual aspect in this news lies in the fact that it involves automotive manufacturing in the Middle East.
Rockwell Automation Inc., the world’s largest company dedicated to industrial automation and digital transformation, announced on January 21, 2026, a deepened collaboration with Lucid, maker of the world’s most advanced electric vehicles, to support the automaker’s expanding manufacturing facility in the Kingdom of Saudi Arabia. The facility, located in King Abdullah Economic City (KAEC), marks a historic milestone as the country’s first vehicle manufacturing site.
Lucid will deploy Rockwell Automation’s enterprise software solutions, including its FactoryTalk manufacturing execution system (MES) software, to manage and optimize production operations across all major shops: general assembly, paint, stamping, body, and powertrain. The FactoryTalk MES platform will provide Lucid with real-time visibility, traceability, and control across its operations, helping enable production of the company’s future midsize vehicles.
“Lucid’s adoption of FactoryTalk MES is a strategic move that will deliver measurable outcomes in operational efficiency, quality, and scalability,” said Ahmad Haydar, country leader for Rockwell Automation in Saudi Arabia. “Our software will help Lucid meet its ambitious production goals while ensuring seamless integration with global supply chains and compliance with local standards. This is a proud moment for Rockwell Automation and a testament to our commitment to supporting the Kingdom’s Vision 2030 through advanced manufacturing technologies and workforce development.”
In addition to software, Rockwell’s local team in Saudi Arabia will deliver instructor-led and virtual training programs. By equipping local Saudi talent with cutting-edge EV manufacturing expertise through tailored training, this partnership will cultivate a skilled workforce that will drive sustainable industrial growth and help power the Kingdom’s Vision 2030 objectives.
“Rockwell Automation has been a trusted partner throughout our journey, from our Arizona factory to our expansion in Saudi Arabia,” said Faisal Sultan, president of Middle East at Lucid. “Their software solutions and local expertise will help us scale production while maintaining the highest standards of quality and innovation our customers have come to expect. We’re excited to continue this collaboration as we expand world-class electric vehicle manufacturing in the region.”
Siemens seems to have found a home at CES over the past few years. I don’t know what it costs to give a keynote, but it’s probably well worth it, since no other major automation supplier seems to attend. I did write about a robotic exhibitor in my last post. Oh, and I’m still not likely to travel to Las Vegas for the next CES. I’ll save a ton of money and grief by receiving the news at home.
Siemens has maintained strong collaboration with Microsoft for decades—see all the Copilot news below. Recently, NVIDIA has joined the collaboration dance. Also, see news below. I think Siemens thought they’d gain penetration into the North American market through Chrysler’s acquisition by a German company plus the plants constructed by VW and BMW. That market is not so hot—see the proportion of sales into automotive by competitor Rockwell Automation, for example. Check out the customers featured by Siemens at CES this year: PepsiCo, Commonwealth Fusion Systems, Meta Ray-Ban, and Haddy.
Perhaps the best acquisition, and most successful, that Siemens ever made was with UGS years ago. While rivals have struggled with software (and competitors have nibbled at some of the Siemens applications), Siemens continues to strengthen Xcelerator and Copilot technologies. And check out the launch of Digital Twin Composer. Digital Twin technology and application seems to be finally gaining traction.
In short, Siemens announcements:
Siemens and NVIDIA expand their partnership to build the Industrial AI Operating System, reinventing the entire end-to-end industrial value chain through AI – from design and engineering to manufacturing, production, operations, and into supply chains.
Siemens launches Digital Twin Composer software, available on Siemens Xcelerator Marketplace mid-2026, to power the industrial metaverse at scale
PepsiCo using Siemens Digital Twin Composer to simulate upgrades to its facilities in the U.S. with plans to scale globally
Siemens unveils nine industrial copilots to bring intelligence across the industrial value chain
Siemens highlights new technologies for accelerating drug discovery, autonomous driving and shop floor efficiency
“Industrial AI is no longer a feature; it’s a force that will reshape the next century. Siemens is delivering AI-native capabilities, intelligence embedded end-to-end across design, engineering and operations, to help our customers anticipate issues, accelerate innovation and reduce cost,” said Roland Busch, President and CEO of Siemens AG.
“Just as electricity once revolutionized the world, industry is shifting toward elements where AI powers products, factories, buildings, grids and transportation,” continued Busch. “From the most comprehensive digital twin and AI-powered hardware to copilots on the shop floor, we’re scaling intelligence across the physical world, so businesses realize speed, quality and efficiency all at once. This is how we scale a once-in-a-generation technology shift into measurable outcomes.”
Siemens and NVIDIA are expanding their partnership to build the Industrial AI Operating System – helping customers revolutionize how they design, engineer, and operate physical systems. They will work together to build AI-accelerated industrial solutions across the full lifecycle of products and production, enabling faster innovation, continuous optimization, and more resilient, sustainable manufacturing. The companies also aim to build the world’s first fully AI-driven, adaptive manufacturing sites globally, starting in 2026 with the Siemens Electronics Factory in Erlangen, Germany, as the first blueprint.
To support development, NVIDIA will provide AI infrastructure, simulation libraries, models, frameworks and blueprints, while Siemens will commit hundreds of industrial AI experts and leading hardware and software. The companies have identified impact areas to make this vision a reality: AI-native EDA, AI-native Simulation, AI-driven adaptive manufacturing and supply chain, and AI-factories.
Integration with Siemens software
Siemens also announced that it will be integrating NVIDIA NIM and NVIDIA Nemotron open AI models into its electronic design automation (EDA) software offerings to advance generative and agentic workflows for semiconductor and PCB design. This will both maximize accuracy through domain specialization and significantly lower operational costs by enabling the most efficient model to handle and adapt to every specific need.
Product Launch
Siemens’ primary product launch at CES 2026 is the Digital Twin Composer, available on the Siemens Xcelerator Marketplace mid-2026. This new technology brings together Siemens’ comprehensive digital twin, simulations built using NVIDIA Omniverse libraries, and real-time, real-world engineering data.
With the Digital Twin Composer, companies can create a virtual 3D model of any product, process, or plant; put it in a 3D scene of their choosing; then move back and forth through time, precisely visualizing the effects of everything from weather changes to engineering changes. With Siemens’ software as the data backbone, the Digital Twin Composer builds Industrial Metaverse environments at scale, empowering organizations to apply industrial AI, simulation and real-time physical data to make decisions virtually, at speed and scale. Digital Twin Composer is part of Siemens Xcelerator, an industry proven portfolio of software used by companies worldwide to develop digital twins.
Customer application of digital twins
PepsiCo and Siemens are digitally transforming select U.S. manufacturing and warehouse facilities by converting them into high-fidelity 3D digital twins that simulate plant operations and the end-to-end supply chain to establish a performance baseline. Within weeks, teams optimized and validated new configurations to boost capacity and throughput, giving PepsiCo a unified, real-time view of operations with flexibility to integrate AI-driven capabilities over time.
Leveraging Siemens’ Digital Twin Composer, NVIDIA Omniverse libraries and computer vision, PepsiCo can now recreate every machine, conveyor, pallet route and operator path with physics-level accuracy, enabling AI agents to simulate, test, and refine system changes – identifying up to 90 percent of potential issues before any physical modifications occur. This approach has already delivered a 20 percent increase in throughput on initial deployment and is driving faster design cycles, nearly 100 percent design validation and 10 to 15 percent reductions in capital expenditure (Capex) by uncovering hidden capacity and validating investments in a virtual environment.
New Industrial Copilots Streamline Manufacturing Operations
Siemens also spotlighted its partnership with Microsoft to co-build the industrial copilot.
Siemens also announced that it is expanding its set of AI-powered copilots across the industrial value chain. This will embed intelligence that extends from design and simulation to product lifecycle management, manufacturing, and operations.
Siemens will deploy nine new AI-powered copilots for its software offerings, including Teamcenter, Polarion, and Opcenter. These copilots, respectively, streamline product data navigation, reducing errors and accelerating time to market; automate compliance, helping to ensure faster regulatory approvals and lower risk; and transform manufacturing processes, driving cost savings and operational efficiency.
These copilots, along with the rest of Siemens’ expanding portfolio of industrial AI solutions, are available to companies of every size on the Siemens Xcelerator Marketplace.
AI-Driven Innovations in Life Sciences, Energy and Manufacturing
Siemens acquired Dotmatics, whose Luma platform enables scientists to unify billions of data points generated across instruments and labs, creating a coherent foundation for AI-driven exploration. Combined with Siemens Simcenter simulation and digital twins, teams can rapidly test molecules, identify promising candidates, and virtually scale production to help life-changing therapies reach patients up to 50% faster and at a lower cost.
Bob Mumgaard, CEO and co-founder of Commonwealth Fusion Systems, described how the company uses Siemens’ technologies as it leads the path to commercial fusion. Commonwealth Fusion Systems uses design software and a strong data backbone to help it accelerate the development of fusion machines that promise clean, limitless energy for generations to come.
In manufacturing, Siemens announced a collaboration to bring Industrial AI to Meta Ray-Ban AI Glasses. With hands-free, real-time audio guidance, safety insights, and feedback, shop floor workers will feel empowered to solve problems efficiently and confidently.
Haddy is reshaping manufacturing through AI-powered 3D printing and localized micro factories that deliver sustainable, high-quality products faster and closer to customers. Facing challenges around supply chain disruption, sustainability, and production agility, Haddy partnered with Siemens to streamline design, optimize operations, and scale efficiently.
We met in a conference room at an office in Barrington, IL. A place where sometime later a couple guys thought they’d screw me in a business deal. I came out ahead in the end, but the place has mixed memories.
This meeting involved thinking about the future of asset data and systems interoperability. We had a system diagram. The idea was to solve a huge problem for owner/operators of process manufacturing enterprises—flowing engineering data into other software systems for operations, maintenance, and enterprise. The incumbent system was a morass of paper (or pdf documents which was much the same thing).
We did trademark searches and domain name searches and eventually settled on the Open Industrial Interoperability Ecosystem—OIIE.
I recount this history as context for the conference I attended recently: the 2nd ADIF Workshop at Texas A&M University, dubbed Driving Asset Data and Systems Interoperability Toward an Open and Neutral Data Ecosystem.
This workshop brought together owner/operators, EPCs, System Integrators, university researchers, standards organizations, and software vendors. Each group conducted a panel discussion of its needs and successes. I was there for a short presentation and to moderate the standards panel.
Professor David Jeong of Texas A&M, the session leader, previewed the discussions. One of his colleagues later presented research his team has performed to provide a method for converting P&ID documentation into a standard format usable by other software systems.
The message that came to me from the panel of owner/operators (grossly summarized, as will be all the discussions) included two key words—collaborate and operationalize. They are impatient about solving this data interoperability problem. One panelist quipped, “We know the project is finished when the large van backs into the loading dock and disgorges mountains of paper.”
What blows my mind is that I was moved to a position called Data Manager in 1977 to tackle the (much smaller) mountain of paper our product engineering department provided to operations, accounting, and inventory management. I led a digitalization effort in 1978 to tackle the problem. The problem not only remains, but it is immensely more complicated and critical.
The EPCs basically said that their hands were tied by the owner/operators mandating which design and engineering software to use and by the inflexibility of the vendors of that software. When owner/operators had requested digital documentation, the vendors had responded with PDFs. Hardly interoperable data.
Our standards panel included the leader of DEXPI, whose organization has developed a method of changing P&ID data into an xlsx (Excel) format. That, of course, is a good start.
An organization called CFIHOS (see-foss) presented their take on standards. I’m afraid I got a bit lost in the slides (note: more research needed). What I gathered was that they were attempting one overriding standard—and that that work was years away. Interesting that I listened to Benedict Evans’ podcast this morning. He is a long-time tech industry analyst. He remarked in another context, “It seems that where there are 10 standards and someone comes along with a standard to encompass them all, you wind up with 11 standards.”
The ISA-95 standard was presented. This messaging (and more) standard is incorporated into the OIIE, which was presented next. Dr. Markus Stumptner of the University of South Australia presented his research on a proof of concept of the OIIE.
If we can get enough momentum focusing on this area and find some SIs willing to take the OIIE to an owner/operator, perhaps we can finally prove the business case of asset data and systems interoperability.
I’ve been invited to two Aras community events over the past two years. Prior to that, my PLM market knowledge was dominated by three companies. To be honest, I’d never even heard of the company. With one visit and a few interviews, I knew there was something different and better here. (See this report from this year’s event Agentic AI, SaaS, Community—The Aras Community Gathering.)
Aras holds a smaller market position (based on conversations, not market research—something I shun), but it offers something that larger companies don’t. Enterprise and manufacturing software developers usually require users to change their operations systems to fit within the constraints of the software system. Aras provides a more flexible system—something that both Aras product people and customers have told me.
Lauritsen worked for a partner called Minerva for many years prior to its acquisition by Aras. He has held a couple of positions within Aras, mostly in sales leadership. His background also includes programming and product management, providing him with a background to lead the company in its next iteration.
Aras was a founder-led company until growth required someone to provide professional organization and systems. That leader was Roque Martin. After four years, the board felt it was time for the next step. Lauritsen told me this next step is to incorporate AI into the offerings. In fact, he looks to have the company “supercharge with AI.” He obviously didn’t get into the AI weeds, but I gathered the impression that his product people are working with a variety of approaches for the best fit for each application.
He starts with the customer as he defines his vision of the company. PLM defines the best ways of working for the customer. He has the company working in its labs to find innovative ways to implement AI, both within the organization’s development team and in best practices for customers.
Interestingly, given my recent work with organizations seeking data interoperability, Aras is seeking ways to coexist with current enterprise solutions.
Many times conversations with company spokespeople center on the product. I asked Lauritsen to define business values provided to customers. He told me about two customers at about the same stage of market development. One used the Aras PLM solution to improve systems to increase quality. The other had a different problem—product development time to launch. Aras provided solutions to fit the business need of the client.
While researching for the interview, I saw that Lauritsen had been on the Danish national Judo team and remains on the national Judo board. Judo requires as much mind training as physical training. So, I had to ask how Judo helps his thought process as a leader and marketer. He laughed, saying the other Aras folks on the call had probably heard enough about Judo. He gave an example from strategic marketing. The principle of Judo is to use the opponent’s force against them. When you face a larger opponent, you know you cannot directly engage, but you must look for the weak point where you can leverage their size against them.