Complexity—the Enemy of Effectiveness

I recently wrote an article for my website about complexity within industrial technology. Engineering managers have stood at conferences pleading with the standards bodies and technology developers to find ways to simplify interfaces and connectivity.

The OPC Foundation keeps adding layers of companion specifications. ODVA members listened to engineers who need help implementing EtherNet/IP (or just Ethernet networks) and proceeded to ignore the plea. Paul Miller, an analyst at Forrester, reported on a survey in which 90% of executives cited data problems from their digital transformation; 71% reported measurement-related data problems.

Mattias Stenberg, head of Octave, the new software company spinning out of Hexagon, reported from another survey his group has performed that only one in five executives thought they were getting any value from digital transformation.

The Vice President of Product Development at the company where I worked in the 1970s (back in the day before layers of vice presidents) offered me a job to leave manufacturing and become his data manager. He was prescient. Forty-five years later, companies are still trying to manage data. Solutions have become more complex and technology has advanced exponentially, yet we still have problems gathering, refining, contextualizing, and using data.

These thoughts were generated from the Hexagon Live Global Conference I attended this week in Las Vegas. I have a lot of trouble wrapping my head around just who Hexagon is. Evidently, I’m not alone. But the company is making it easier by splitting off four groups into Octave.

The simplest, yet also most definitive, definition came from Hexagon Chairman Ola Rollén, recounting the company’s 25-year history of growth: “Hexagon is the world’s most sophisticated measuring tape.” Indeed, several of my interviews delved into the world of accurately measuring the very large and the very small. This year’s slogan: “When it has to be done right.”

The new ATS800 laser tracker can easily capture complex shapes with up to micron precision. The company released Autonomous Metrology Suite, software developed on its cloud-based Nexus platform that is designed to transform quality control across manufacturing industries worldwide. By removing all coding from coordinate measuring machine (CMM) workflows, it helps manufacturers speed up critical R&D and manufacturing processes as experienced metrologists become harder to find.

Hexagon and several partners are solving what has been an intractable and troubling problem—data locked into paper-based formats such as PDF files. Several demonstrated the ability to read unstructured text and PDF documents, use a form of AI to tag the data, and then extract it to a usable database. This is truly a great advance. Several workforce solutions designed to help companies attract younger workers into technical positions were demonstrated on the show floor.
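The extract-and-tag flow the demos described can be sketched in miniature. Here a rule-based extraction stands in for the AI tagging step, and the report text, field names, and table are my own invention, not any vendor's actual pipeline:

```python
import re
import sqlite3

# Hypothetical snippet of text extracted from a PDF inspection report
# (the values and field names are invented for illustration).
raw_text = """
Inspection Report
Part Number: ATS-4471
Measured Diameter: 12.503 mm
Result: PASS
"""

# "Tag" the data: a simple rule-based stand-in for the AI tagging step.
patterns = {
    "part_number": r"Part Number:\s*(\S+)",
    "diameter_mm": r"Measured Diameter:\s*([\d.]+)",
    "result": r"Result:\s*(\w+)",
}
record = {field: re.search(pat, raw_text).group(1)
          for field, pat in patterns.items()}

# Load the now-structured record into a usable database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE inspections (part_number TEXT, diameter_mm REAL, result TEXT)")
conn.execute("INSERT INTO inspections VALUES (?, ?, ?)",
             (record["part_number"], float(record["diameter_mm"]), record["result"]))
row = conn.execute("SELECT * FROM inspections").fetchone()
print(row)  # ('ATS-4471', 12.503, 'PASS')
```

The hard part in practice, of course, is that real documents vary wildly in layout, which is why the demos used AI rather than fixed rules for the tagging step.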

Stenberg talked of another problem executives cited—data silos that prevent people from using data to make good decisions. I have been writing about solutions designed to break through data silos for 25 years. I’m beginning to wonder if it is not a technology problem. Perhaps it’s a people problem.

Agentic AI, SaaS, Community—The Aras Community Gathering

The Aras ACE2025 Community Event in Boston closed two weeks ago. It has taken me that long to wrap my head around everything I learned. Normally there are many important-sounding words that seem so enlightening at the time, yet when I sit down to write I find no substance. In this case, there was so much substance that I had trouble filtering it down to the most important themes.

Let’s say that not only were the expected buzzwords in evidence, but the underlying concepts were demonstrably in use. Aras is a PLM (product lifecycle management) developer. They are solving problems that I had in the late 70s while working at a manufacturer—mainly, how to make usable sense of all the engineering data.

The principal phrase of the week was digital thread. They are all about the digital thread. Companies were also using Large Language Model (LLM) technology trained on their own data. Agentic AI rears its head and will become even more important with use. (See my interview with John Harrington of HighByte for more on agentic AI.)

Customer presentations that showcase actual use cases provide reality to the theory.

I sat in a presentation by the sensor manufacturer Sick. They have applied AI to unstructured data, turning it into useful structured data. Using Aras PLM, they have achieved better speed to market by finding product data via natural language query. They have instances of development times cut from three years to six months.

Another customer presentation came from Denso. Engineers find the digital thread from PLM to be a tool for collaboration. The connected flow of data ensures continuity from design to manufacturing to operations. Inconsistent data hurts the business. PLM is the heart of their digital strategy, with the BOM as centerpiece. Once again, an example of someone actually using generative AI trained on their own data to fill in gaps.

The highlight of customer applications came from my half-hour discussion with Tetsuya Kato, Manager of the Technical Management Group from SkyDrive in Japan—the Flying Car company. OK, it’s not the Toyota in your driveway suddenly flying to the store. But it’s close. Check out the goodies on their website.

He was hired to bring order to the product information system—in other words, to develop a better Manufacturing Bill of Materials (MBOM). They were using Teamcenter PLM with a system brought in by a consulting engineering firm. The system had many problems, was taking too long to implement, and forced SkyDrive to change its systems to fit the software.

Kato brought in Aras Connector to move engineering data from Teamcenter to the Aras PLM platform. He started the project in September, showed results in two months, and moved all the data in eight months. The Aras solution had all the features necessary for their manufacturing data, with the added benefit of flexibility that let them make the system work for them instead of the other way around.

Chief Technology Officer Rob McAveney asks “What if…”

McAveney noted Aras has 25 years of asking what if…

  • 2001 What if PLM could be a flexible, web-native platform?
  • 2005 What if PLM applications were built to work together? Integrated data, now called the digital thread.
  • 2011 What if impact analysis were an interactive experience? Wizard-style digital thread.
  • 2014 What if visual collaboration were available to everyone?
  • 2021 What if a SaaS delivery model came without compromise?
  • 2025 What if we could extend the reach of the digital thread? Take advantage of the Aras Effect (openness, reachability); Aras Portals; an app and product data platform; composable PLM apps; a low-code environment.

The digital thread + AI = Connected Intelligence:

The three areas of Connected Intelligence, pursued together, include:

  • Discover—a conversation with your data
  • Enrich—connect more data and people across the business
  • Amplify—maximize impact

Discover—natural language search, content synthesis, machine learning, and text to SQL (natural language prompt to query). What if a guided tour showed how to set effectivity conditions to sync multiple changes (context-aware help)? Then you could phase in changes with confidence, eliminating rework in the supply chain. What if you could assess the global supply of a sourced component before submitting a change request? You would avoid wasting time on changes. What if you could ask an AI assistant to identify common factors during root cause analysis? Persistent quality issues would become a thing of the past.
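Text to SQL is the most concrete of these capabilities. Here is a minimal rule-based sketch of the idea against a toy parts table; the real feature would use an LLM, and the table, columns, and prompt phrasing are my own illustration, not Aras's implementation:

```python
import re
import sqlite3

# Toy product database standing in for PLM data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE parts (name TEXT, status TEXT)")
conn.executemany("INSERT INTO parts VALUES (?, ?)",
                 [("bracket", "released"), ("housing", "in_review")])

def text_to_sql(prompt: str) -> str:
    """Translate one recognized natural-language phrasing into SQL."""
    m = re.match(r"show all parts with status (\w+)", prompt.lower())
    if not m:
        raise ValueError("unrecognized prompt")
    return f"SELECT name FROM parts WHERE status = '{m.group(1)}'"

sql = text_to_sql("Show all parts with status released")
rows = conn.execute(sql).fetchall()
print(rows)  # [('bracket',)]
```

An LLM-backed version would handle arbitrary phrasings and schemas, but the contract is the same: natural language in, an executable query over product data out.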

Enrich—entity recognition, contextual reasoning, topic modeling, and deep learning. What if missing or inconsistent links in the digital thread could be easily identified and corrected? Patterns would feed downstream analytics, and you would stop wasting effort on redone work. What if requirements could be automatically identified and ingested from reliable external data sources? You would see next-level requirements traceability with dynamic requirements. What if factory floor data could be linked to quality planning parameters? That is planning with a feedback loop.

Amplify—agentic AI, surrogate modeling, generative engineering, and reinforcement learning. What if an engineer-to-order business could be transformed by leveraging all your past engineering work to create a common variability model? Engineers would shift from individual customer projects to improving the full product line.

Secret Agent Man

I’m not talking about the Johnny Rivers theme for a late Sixties Saturday afternoon spy TV show. We’re talking software agents. Some may be secret, but none are men.

I once had an annual meeting with the CTO of a large automation company where I shared (non-privileged) information I’d gathered about the market while trying to learn what technologies I should be watching for.

With artificial intelligence (AI) and Large Language Models (LLMs) grabbing the spotlight at center stage, I’m watching for what technologies will make something useful from all the hype.

I’m looking for a return to the spotlight of these little pieces of software called agents. John Harrington, Co-Founder and Chief Product Officer at HighByte, an industrial software company, believes that five or so years from now, LLMs won’t be the game-changer in manufacturing that many expect. Instead, agentic AI is set to have a far bigger impact.

So, John and I had a brief conversation just before my last trip. It was timely given the nature of the trip—to a software conference where LLMs and agentic AI would be important topics, and not just in theory.

From Harrington: “Agentic AI is revolutionizing the tech industry by addressing AI’s biggest limitation—making decisions that are more human-like. AI agents are yet another application that analyzes and turns large amounts of data into actionable next steps, but this time they promise it will be different.”

He told me that agentic AI will become more “human-like,” going beyond LLMs. HighByte started up as an Industrial DataOps play at a time when I was just hearing about DataOps from the IT companies I followed. I told the startup team that they were entering a good niche. They have done well since then. They extended DataOps with Namespace work and now LLMs and agents.

“AI agents can enhance data operations by providing greater structure, but their success depends on analyzing contextualized data. Without proper context, the data they process lacks the depth needed for accurate insights and decision-making,” added Harrington.

Take an example. An agent can be a way to contextualize data and model an asset. Working with an LLM trained on data specific to the application, it can ask the LLM to scan the namespace to see if there are other assets in the database. HighByte’s hub can work through OPC, and it also works with Ignition from Inductive Automation or the PI database. It looks for patterns and can propose options as the engineer configures the application.
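The pattern-finding part of that scan can be sketched without any AI at all. This is my own toy illustration of grouping namespace tags into candidate asset models, with invented tag paths, not HighByte's actual behavior or API:

```python
import re

# Hypothetical namespace snapshot: tag paths as they might appear in a
# unified namespace. The names are illustrative assumptions.
namespace = [
    "plant1/line1/pump01/flow",
    "plant1/line1/pump01/pressure",
    "plant1/line2/pump02/flow",
    "plant1/line2/pump02/pressure",
    "plant1/line2/mixer01/speed",
]

def propose_asset_models(tags):
    """Group tags by asset type and collect the attributes each type has."""
    models = {}
    for tag in tags:
        *_, asset, attribute = tag.split("/")
        asset_type = re.sub(r"\d+$", "", asset)  # pump01 -> pump
        models.setdefault(asset_type, set()).add(attribute)
    return models

models = propose_asset_models(namespace)
print(models["pump"])  # attributes shared by pump01 and pump02
```

An agent working with an LLM would go further, matching fuzzier naming conventions and proposing a full asset model for the engineer to accept or edit.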

Not shy in his forecast, Harrington says the future is agents. They can affect and act on data. They can reach out to a control engineer, an operator, or a quality group. Each is a targeted AI tool focused on one small thing. Perhaps there’s a maintenance agent, or one for OEE, or one for line quality on a work cell. Don’t think of monolithic code in the cloud. Rather, think of smaller routines that could even work together helping the business, like Jarvis in Iron Man. Data is food for these agents, and HighByte’s business is data.

I’ve been impressed with HighByte’s growth and sustainability, and with the fact that they’ve managed to remain independent for so long. Usually software companies want to build fast and sell fast. Watch for more progress as HighByte marries agentic AI with data.

API Builder for Intelligence Hub

HighByte seems to be making more of a splash lately with its DataOps plus Universal Namespace developments. Several years ago I spotted DataOps as an important technology application. Usage still grows.

This news from HighByte concerns the ability to build custom APIs.

HighByte released HighByte Intelligence Hub version 4.1 with Callable Pipelines, enabling users to build custom APIs for their operations and industrial data sources. Callable Pipelines, together with the REST Data Server, functionally serve as an “Industrial Data API Builder” that allows users to build custom API endpoints to interact with industrial data. In addition to Pipelines enhancements, version 4.1 delivers native integration with Aspen InfoPlus.21, support for the new Amazon S3 Tables service, granular audit logging for regulated industries, and seamless event capture from file-based data sources.
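To make "Industrial Data API Builder" concrete, here is a toy dispatcher in the same spirit: named pipelines registered as callable endpoints. This is my conceptual sketch, not HighByte's actual API; the endpoint path and the OEE payload are invented:

```python
# Registry mapping endpoint paths to pipeline functions.
pipelines = {}

def pipeline(path):
    """Decorator that registers a function as a callable endpoint."""
    def register(fn):
        pipelines[path] = fn
        return fn
    return register

@pipeline("/api/v1/oee")
def oee(payload):
    # Toy OEE calculation: availability x performance x quality.
    a, p, q = payload["availability"], payload["performance"], payload["quality"]
    return {"oee": round(a * p * q, 3)}

def call(path, payload):
    """Dispatch a request to a registered pipeline, as a REST server would."""
    return pipelines[path](payload)

result = call("/api/v1/oee", {"availability": 0.9, "performance": 0.95, "quality": 0.98})
print(result)  # {'oee': 0.838}
```

The point of the real feature is that the "pipeline" side works against live industrial data sources, so the custom endpoint becomes a clean contract between OT data and whatever IT application calls it.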

HighByte Intelligence Hub is an Industrial DataOps solution that contextualizes and standardizes industrial data from diverse sources at the edge to help bridge the gap between OT and IT systems, networks, and teams. HighByte leads the evolving Industrial DataOps market with the most complete solution to optimize the orchestration of usable industrial data across the enterprise. Highly regulated industries like life sciences and oil & gas have become the company’s fastest growing markets in the last twelve months.

The latest release also introduces Pipeline Debug, allowing users to test and troubleshoot individual pipeline stages without impacting the systems connected to the pipeline. Furthermore, the File connector has received major enhancements, including support for directory reads, SFTP, and character set decoding. The File connector is complemented by a set of new Pipeline stages to parse and write different file formats and to extract and process file metadata. Together, the File connector and Pipelines create a holistic and seamless approach to advanced file processing for industry.

HighByte Intelligence Hub version 4.1 is now commercially available. All new features and capabilities introduced in version 4.1 are included in standard pricing.

Yokogawa Enhances IT/OT Integration with OpreX Collaborative Information Server

I quit my full-time job to become an independent blogger in 2013. One of the best decisions I ever made. It didn’t cost me much income those first years, and it saved me much grief over the ensuing time.

The blog’s name changed to The Manufacturing Connection. I thought at the time technology advances and applications would be all about connection/connectivity. That’s still true.

This news from Yokogawa emphasizes advances toward the nirvana of bringing information technology and operations technology into closer relationship. The problem solved by this is, of course, data. Data makes the enterprise go round.

Yokogawa states this new release of OpreX Collaborative Information Server (CI Server) strengthens connectivity with a range of devices and applications to “support digital transformation.”

OpreX Collaborative Information Server brings together large volumes of data from various plant equipment and systems to enable the optimized management of production activities across an entire enterprise, and provide the environment needed to remotely monitor and manage operations from any location and make swift decisions.

Here is what’s new:

1. Addition of CI Gateway component—At renewable energy facilities and the like, gateways are deployed at various locations for the collection of data required for integrated monitoring and operations. And in all kinds of industries, the use of gateways for the relay of plant data to assemble higher-level applications is a common practice. In this release, a new component, CI Gateway, has been added. Compared to the previous version, this allows for simpler and more flexible implementation, making it easier to deploy OpreX Collaborative Information Server as a dedicated gateway.

I don’t know how many organizations I’ve been involved with where I’ve advocated RESTful APIs. This from Yokogawa.

2. Support of RESTful API—The RESTful API is a standard web application interface that is widely used in IT applications. With this release, it is now possible to access OpreX Collaborative Information Server data via the RESTful API. This facilitates access to operational technology (OT) data and leads to closer convergence between IT and OT systems, thereby realizing seamless integration and the efficient use of data. For example, it is now possible to construct a general-purpose web browser-based KPI dashboard that aggregates and displays data from various systems.
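As a sketch of the kind of dashboard aggregation this enables, assume the API returns tag values as JSON. The endpoint shape and field names here are my assumptions for illustration, not Yokogawa's documented schema:

```python
import json

# Hypothetical JSON body a RESTful API for the CI Server might return;
# tag names and values are invented for illustration.
response_body = """
{
  "tags": [
    {"name": "TURBINE1.POWER_MW", "value": 2.5},
    {"name": "TURBINE2.POWER_MW", "value": 2.0}
  ]
}
"""

data = json.loads(response_body)
# Aggregate OT data into a single KPI for a web dashboard.
total_mw = sum(tag["value"] for tag in data["tags"])
print(f"Total generation: {total_mw} MW")  # Total generation: 4.5 MW
```

Because the interface is plain HTTP and JSON, any general-purpose web dashboard can consume OT data this way without a proprietary client.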

And finally:

3. Enhanced IEC 61850 communication driver—IEC 61850 is an international protocol for communication networks/systems that is essential in the renewable energy industry. The enhanced support for this standard enables users to select safer interactive operations in place of certain operations that previously were executed automatically. For greater flexibility in operations, it is also now possible to utilize device report data as OpreX Collaborative Information Server data.

Creating an Adaptive Future for the Industrial Workforce

I have had a busy month. Good thing I didn’t take four days to travel to Orlando. I’m wrapping up my last interview from there today. A few more are pending if the media relations person can find a way to coordinate calendars.

This interview is with Kim Fenrich, ABB Global Product Marketing Manager, Process Automation, PC. He brought up the term “Digital Habitat”—something not found on the ABB website, but still an interesting concept.

The problem statement recognizes new people entering the industrial workforce. Many of these will not have much background in process operations. Meanwhile our digital technologies contain immense amounts of data that could be used to guide operators toward better decisions.

Fenrich described a concept called the Digital Habitat. This is the area alongside core process control. The core handles monitoring and optimization and houses process data. The data then gathers at the edge. In the ABB architecture, data at the edge becomes freely available to other applications, such as asset management and optimization.

Not all data is created equal. Some is “dirty” data that must be cleaned before use. Some is good data from trusted sources with solid metadata. Many applications ride atop the system to run analytics, support decision-making, and optimize operations. Sometimes operators are new, lacking operations experience and knowledge. Data science comes to the rescue, cleaning the data and providing interfaces to support these new workers. Sometimes the data science supports engineers in maintenance and reliability performing predictive analytics or enhancing asset management.
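The "clean the dirty data" step is mundane but essential. A minimal sketch of the idea, filtering obviously bad sensor readings and attaching basic metadata; the tag name, thresholds, and quality metric are my assumptions, not ABB's implementation:

```python
# Raw readings with two obvious sensor faults (-999.0 and 5300.0),
# values invented for illustration.
raw_readings = [72.4, 71.9, -999.0, 72.1, 5300.0, 72.3]

def clean(readings, low=0.0, high=150.0):
    """Keep only readings inside a plausible physical range."""
    return [r for r in readings if low <= r <= high]

good = clean(raw_readings)
record = {
    "tag": "REACTOR1.TEMP_F",                 # hypothetical tag name
    "values": good,
    "quality": len(good) / len(raw_readings),  # fraction of usable samples
}
print(record["values"])  # [72.4, 71.9, 72.1, 72.3]
```

Production systems would add timestamp alignment, unit conversion, and provenance metadata, but the principle is the same: applications downstream should only ever see data they can trust.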

ABB had a suite of applications called the Augmented Operator. The system does pattern mining. Perhaps the operator sees something new. They can ask the system, “Have you seen this before? If so, what happened and how was it resolved?” This greatly helps younger-generation operators.

Should the situation be new to the system, then it can run simulations to predict outcomes and resolutions.

In short, the system:

  • Frees up operators’ time for more meaningful work, such as using data and advanced analytics to optimise processes for energy efficiency and carbon emission savings.
  • Enables early warning of potential failures with AI-powered systems that can use real and historic data to offer troubleshooting solutions, much like a virtual assistant.
  • Simulates workflows to check outcomes and for training, and provides augmented reality (AR) headsets to access experts working offsite.

This is from the ABB website: The next step to achieve this reality is to fuse together the distributed control system’s operations technology and real-time control with the edge and newer IT technology, such as machine learning and AI, while also incorporating historical data and mining other data sources for pattern recognition and knowledge extraction. This will shift the automation system beyond real-time control alone to one that allows the operator to augment operations from day to day. It will be a journey, but humans working with technological systems that augment their cognitive capabilities can amplify their potential and provide huge value to both the workforce and the industry at large, as well as attract new generations to the sector.
