
OMAC and ei3 Release New Manufacturing Data Governance Guide 

OMAC was born about the same time that I left manufacturing to become an editor at Control Engineering. Originally the Open Modular Architecture Computer, it held center stage at the annual ARC Industry Forum for many years. The open computer as a PLC replacement, designed to drive down the high cost of controllers, never really made it. The organization did drive a number of initiatives benefiting users, especially PackML, a uniform way of describing a machine's processes to an operator.

Now called The Organization for Machine Automation and Control, it has found a home within the Packaging Machinery Manufacturers Institute (PMMI), which was also the temporary home of my old magazine, Automation World.

I haven’t seen or heard much from OMAC for quite some time. I’m glad to see this news and the release of some useful information.

OMAC, the leading industry association for machine automation and control, in collaboration with ei3, has released a new guide on industrial data sharing titled “Practical Considerations for Data Governance”. The guide provides expert insights and actionable recommendations to help organizations establish effective data governance and sharing practices.

The document covers several critical topics, including the importance of data governance, the need for a common language to facilitate data exchange across systems, and the various applications that use plant floor data. It examines the key components for plant floor data, the significance of data governance standards and organizations in manufacturing, and establishes a structure for data access and integration from multiple sources.

Spencer Cramer, OMAC’s Chairman, and the Founder and CEO of ei3, emphasized the critical need for organizations to collect data and transform it into actionable insights to optimize production and efficiency in the constantly evolving manufacturing landscape. “We are excited to launch this guide and provide the industry with a resource that outlines practical considerations for facilitating data sharing within and across organizations to improve processes, save costs, and mitigate errors,” he said.

Glad to see Mark Fondl still active and looking at how to use new technologies. Back when I was a new technology editor, Fondl and I had long discussions about the future of Ethernet as an industrial network. That was not a foregone conclusion in the late 90s. Now, it’s standard.

Mark Fondl, OMAC’s Digital Transformation Workgroup Leader and Vice President of Product Management at ei3, explained that plant floor data can be valuable for different user groups, and data scientists can help maximize its potential. He added, “To efficiently use plant floor data, an assessment should be made, and a team should be created to ensure all stakeholders are coordinated. The data governance policy should include both IT and OT, and guidelines should be provided for internal and external companies. The plan should be adaptable to changing capabilities to ensure its long-term success.”

OPC, MQTT, IoT, Edge, Power Future Manufacturing Technology

There was a time when I would take information from OPC Foundation and chat with the MQTT people and then return the favor. It was much like being in the midst of a religious war.

My response was (is) that the market will decide. Individual engineers will choose the solution that best fits their needs at the time. If both technologies have sufficient benefit to enough engineers to form a market, then both will survive. I think there is room in the market for both; they overlap in what they do, yet each provides unique benefits.

I’ve been thinking about this for a while since I’ve had so many other things to digest. The impetus came from a couple of directions—OPC Foundation President Stefan Hoppe’s editorial in the June newsletter and from Stacey Higginbotham’s IoT Newsletter recently that discussed edge.

Hoppe wrote, “Still to this day people only think of OPC UA merely as a secure protocol to move information. It is so much more than that. It is a modeling language in cloud applications and digital twins. It is capable of file transport (since 2009). Most people know that OPC UA started as an initiative in the OT world and expanded from the PLC control plane to SCADA and later to MES and ERP. More and more people are realizing that OPC UA via MQTT is the bridge between OT and IT and is able to push information directly into Microsoft and AWS cloud dashboards without the need for an adapter.”

From Data to Data Sources

Stacey Higginbotham, writing in Stacey on IoT: “Bringing AI to the farthest edge requires new computing.”

Stacey writes about IoT generally. Most of her topics are commercial/consumer and chips (her reporting background). She does follow the IoT trail into manufacturing at times. In this newsletter she broaches something I’ve been expounding for a long time: how edge devices have become smarter with better communications. Then the IT world came up with the term Edge, which is, of course, everything in manufacturing.

We’re in the midst of a computing shift that’s turning the back-and-forth between cloud and edge computing on its head. This new form of computing has been creeping to the forefront for the last few years, driven by digital transformations and complicated connected devices such as cars.

But the more recent hype around AI is providing the richest examples of this shift. And it will ultimately require new forms of computing in more places, changing both how we think about the edge and the types of computing we do there. In short, the rise of AI everywhere will lead to new forms of computing specialized for different aspects of the edge. I’m calling this concept the complex edge.

As part of this shift in computing, we have to become more nuanced about what we mean when we talk about the edge. I like to think of it as a continuum moving from the most compute and power-constrained devices such as sensors to the most powerful servers that happen to be located on premise in a factory. In the middle are devices such as tablets, smartphones, programmable logic controllers (PLCs), and gateways that might handle incoming data from PLCs or sensors.

Moreover, each of these devices along the continuum might run their own AI models and require their own specialized type of computing to compare the data coming into those models. For example, I’ve written about the need for sensors to get smarter and process more information directly.

Smart sensors turn to analog compute

Cameras or image sensors are popular examples of such devices. This vision sensor from Useful Sensors, which can do person detection on a $10 device, runs a simple algorithm that looks for people and counts them. At a higher level, which requires more processing power, sensors from Sony or chips from CEVA are able to detect specific movements, faces, or other options.

A few weeks ago at the Sensors Converge event, a company called Polyn Technology showed off a version of a chip designed to take raw data and quickly convert it into an insight. To quickly handle analog signals from the environment (such as vibrations or sound), the Polyn chip processes the signal in the analog domain and then sends the “insight” to another computer for further processing.
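As a toy illustration of the raw-signal-to-insight idea (this is a digital sketch of the concept, not Polyn’s actual analog method), here is how a single RMS value can stand in for an entire vibration waveform. The sample data is made up:

```python
import math

def rms(samples):
    """Root-mean-square of a vibration waveform: one compact
    'insight' value that can be forwarded downstream instead
    of the raw sample stream."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# A hypothetical unit-amplitude 50-cycle sine trace, 1000 samples
wave = [math.sin(2 * math.pi * 50 * t / 1000) for t in range(1000)]

print(round(rms(wave), 3))  # 0.707, i.e. 1/sqrt(2) for a sine
```

One number leaves the sensor instead of a thousand, which is the bandwidth argument for pushing processing toward the edge.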

Cameras are not only shooting pictures for QA purposes but also streaming video for applications ranging from industrial engineering to surveillance to predictive maintenance. That is a vast amount of data.

We have tools, but we will need more. Chips with built-in communication and analytics are a start.

Data Management and Analytics with Emerson Asset Management Software

Data management and analytics are two consistent current trends in the industrial/manufacturing technology market. This announcement from Emerson, now calling itself “a global software and engineering leader,” discusses how they brought these to asset management.

AMS Device Manager Data Server securely extends intelligent field device data to outside systems to make it easier for reliability and maintenance teams to further capitalize on modern advanced analytics software, providing a step change in operational efficiency and smart manufacturing.

AMS Device Manager Data Server publishes intelligent field device data nearly instantaneously to industrial software analytics solutions already in use by customers, eliminating the need for complex custom data integration and manual workarounds that often cause delayed results and siloed data. This data is relayed via secure industry protocols.

AMS Device Manager Data Server makes it easy to import critical instrument and valve data into common dashboarding tools and applications like Microsoft PowerBI, Emerson software tools such as the Plantweb Optics platform, Plantweb Insight, Aspen MTell and AspenTech Inmation, plant historians and others.

Honeywell User Group Recap–Many New Technologies, Applications

Honeywell Process Solutions held its annual HUG (Honeywell User Group) conference the week of June 19 in Orlando. I’ve taken some time to compile my many notes and think about the experience.

The marketing communications staff did an excellent job with media and analysts. Between presentations and 1:1 conversations, we had no time to waste.

I had not attended for a few years. For maybe three years I was in the influencer program with Hewlett Packard Enterprise and HPE Discover is the same week. That program was disbanded a year or so ago. That marked the end of my IT affiliations. Those companies figured out there was not a lot of money to be made in manufacturing.

There were many questions begging for answers as I traveled to Florida. What was Honeywell HIVE, and how does it relate to the ExxonMobil-initiated Open Process Automation group? What is Honeywell Digital Prime and what customer problems does it address? What successes has Honeywell achieved with sustainability initiatives? Honeywell was an early mobility developer. What has progressed in that regard? What role does Honeywell see for AR and VR?

Pramesh Maheshwari, CEO

He mentioned he’d been CEO of this group for only about nine months. Here are a few points of overview.

  • Honeywell is not replacing people with technology but helping them perform better
  • People have different learning styles and Honeywell products adapt to them
  • Digitalization is a significant customer requirement
  • Companies are on the Path to Net Zero Carbon
  • Focus on Digital Workforce Competency

Evan Van Hook, Chief Sustainability Officer

He looks at sustainability as similar to the Quality Revolution, where the goal was to produce quality outputs consistently, creating a culture of quality. His question: “Can we create a culture of sustainability?” Honeywell is taking a Lean approach—quality, delivery, inventory, cost, then add sustainability.

Lean is a systematic approach. The company overall has generated more than 6,500 projects over 13 years with ideas coming from the floor and everywhere else. Not a political statement, sustainability cuts costs and adds efficiency. A few milestone points:

  • 92% reduction of CO2
  • 70% improvement in energy efficiency
  • Restored 3,000 acres of land
  • Water savings
  • 4x industry average safety

Act your way into a new way of thinking—Lean—put sustainability into Lean

Tiffany Barnes – Digital Prime

I perhaps had the most difficulty understanding Digital Prime. This is Honeywell’s offering responding to the customer need for digital transformation, so the conversation with Tiffany Barnes from that group was most instructive. Part of my cognitive dissonance perhaps came from this being a new offering with only one part released so far.

Digital Prime is most easily described as a cloud-hosted digital twin of the DCS. Some of the customer pressures Digital Prime addresses include:

  • Risk of disruption, production downtime and plant safety
  • Pressure to reduce overall lifecycle cost
  • Do more with less through digitalization
  • Data overload
  • Reduced skilled workforce onsite

It is perhaps an irony that Honeywell would build a virtual infrastructure to help with system acceptance, then delete it upon that acceptance. Customers began looking at digital transformation programs and realized that all this data Honeywell had was useful. This grew into a digital twin.

Honeywell’s Digital Prime is the up-to-date digital twin for tracking, managing, and testing process control changes and system modifications. It brings the highest level of quality control to the smallest projects: An efficient, compliant, and collaborative solution for managing changes, factory acceptance tests, improved project execution and training.

Providing secure cloud-based connectivity and a virtual engineering platform, it’s a collaborative environment for managing and testing additions, patches, upgrades and other system changes:

  • Enabling functional reviews and impact analysis
  • Supporting remote FAT tests 
  • Providing a training tool
  • Documenting digital changes.

Joe Bastone — HIVE

Veteran editors and analysts were most curious about any Honeywell response to the initiatives undertaken by The Open Group to solve problems of economically and efficiently upgrading control systems.

This led to my intense interest in Honeywell HIVE and a subsequent conversation with Joe Bastone.

The problem lies with traditionally tightly coupled control hardware, software, and I/O.

Honeywell mostly solved the I/O problem years ago with its configurable I/O. That part of the control system continues to evolve.

The company then worked with a major customer on how to upgrade control software with minimal disruption. First, they worked out how to move the existing control software to a modern hardware platform, leaving all the I/O in place. They realized that was in reality a form of virtualization. Moving to a virtualized compute environment, effectively decoupling hardware and software, was the obvious next step. Their I/O was already virtualized and decoupled.

So, Honeywell HIVE solves the upgrade problem customers have been searching for an answer to.

Thanks to Joe for walking me through the technology evolution.

Sarang Gadre — Battery Technology

The well-documented issue with intermittent renewables (solar, wind) results from the laws of climate: the wind does not always blow and the sun does not always shine. Honeywell has had a commercial battery storage product, housed in shipping containers, for a while. Introduced to us at HUG was Ionic: a scalable, forklift-able virtual power plant with an energy control center in Experion. It is battery agnostic—you specify and buy your batteries of choice. The unit also features peak load shaving.

Naved Reza—Carbon Capture

I always enjoy conversations with Naved regarding sustainable technology solutions.

First up was a reference to the ExxonMobil Baytown deployment of one of Honeywell’s carbon capture technologies, the CO2 Fractionation and Hydrogen Purification System. This technology is expected to enable ExxonMobil to capture about 7 million tons of carbon dioxide (CO2) per year, the equivalent of the emissions of 1.5 million automobiles for one year.

Then we discussed Honeywell Ecofining—Renewable Fuel projects such as Diesel/Aircraft from biofuels. Also Ethanol to Jet and Methanol to Jet.

Aside from Baytown, there are a number of projects spanning carbon capture (CO2), blue hydrogen, renewables, and low-carbon processing.

Manas Dutta — SafetyWatch Mobility

Performing maintenance on a pump involves an average of 3.5 round trips for the technician. Using augmented reality (AR) platforms can save many hours by providing the right documentation and required tools up front.

I made this trip closely following both the Apple Vision Pro announcement along with all the AI chat hype. So I had to ask Manas for his take from the industrial viewpoint.

“AR/VR are excellent for training, especially as individualized based on AI feedback. AR/VR can also be useful for construction. When planning turnarounds, I can answer questions such as: can I get a crane in, do I need scaffolding, without a visit to the remote site.”

HiveMQ Expands Cloud-Based MQTT Platform Offering

MQTT is a fast, lightweight protocol for moving data. It originated with an IBM application and is now widely adopted in manufacturing and industrial applications when you don’t need the full information modeling of OPC UA. HiveMQ is a relatively new company specializing in MQTT application development. This new product from them is a starter kit for using a cloud-based MQTT solution.
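To give a sense of how lightweight MQTT publishing is, here is a minimal Python sketch. The plant/line/machine topic hierarchy and the sensor fields are hypothetical naming choices of mine, and the broker address in the comment is a placeholder, not a real endpoint:

```python
import json
import time

def make_reading(line: str, machine: str, temp_c: float):
    """Build an MQTT topic and JSON payload for one sensor reading.
    The plant1/line/machine hierarchy is a hypothetical naming scheme."""
    topic = f"plant1/{line}/{machine}/temperature"
    payload = json.dumps({"temp_c": temp_c, "ts": int(time.time())})
    return topic, payload

topic, payload = make_reading("line4", "filler", 71.3)
print(topic)  # plant1/line4/filler/temperature

# Actually publishing takes only a few more lines with the paho-mqtt
# package (pip install paho-mqtt); the hostname below is a placeholder:
#
#   import paho.mqtt.client as mqtt
#   client = mqtt.Client()
#   client.connect("your-cluster.hivemq.cloud", 1883)
#   client.publish(topic, payload, qos=1)
```

The topic string doubles as the data’s address and its context, which is much of why MQTT maps so naturally onto plant-floor hierarchies.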

HiveMQ announced the HiveMQ Cloud Starter plan, a dedicated, fully managed, self-service MQTT platform. HiveMQ Cloud allows customers to rapidly procure a production-ready MQTT platform based on hourly usage, with unlimited connections and devices.

With HiveMQ Cloud, companies can develop, test, deploy, and scale production IoT use cases without the large investment and complexity of maintaining their own infrastructure. Customers in any industry from transportation to connected products to manufacturing can get started quickly and easily for free, then scale as use cases expand and require additional features to enable security, observability, administration and data integrations. 

“We are proud to be the first and only MQTT platform on the market to offer multiple plans that are not capped by the number of connections to the platform,” said Dominik Obermaier, Co-founder and CTO, HiveMQ. “We feel this will be a game changer for the adoption of MQTT from the developer to the enterprise.” 

HiveMQ Cloud offers the following plans:

  • Serverless: Community-supported, multi-tenant MQTT broker ideal for experimenting and learning; 100 sessions free.
  • Starter: Self-service, single-tenant MQTT platform deployed in a Platform-as-a-Service environment; pay hourly as you grow.
  • Professional: Everything in Starter plus access to robust support, observability, security, and integration extensions; pay hourly as you grow or choose an annual subscription.
  • Enterprise: Dedicated MQTT infrastructure and custom systems integration for scaling sophisticated, enterprise-grade projects; pricing upon request.

The HiveMQ Cloud Serverless, Starter, Professional, and Enterprise plans address a wide range of customer needs for value-priced, cloud-based MQTT Platform-as-a-Service offerings to fit any deployment stage or budget. Any size customer from start-up to enterprise can start with the Serverless or Starter plans for pilot projects and POCs, and feel confident in their ability to scale to a comprehensive and feature-rich offering on the same platform.

Litmus Adds Digital Twin Support

There were a couple of process automation news items. Now, let’s switch to the industrial edge and talk data. Litmus announced the availability of digital twins for users of Litmus Edge and Litmus Edge Manager. Now manufacturers have a streamlined way to collect, contextualize, normalize and analyze data with visual representations of purpose-driven digital twin models.

Litmus’ infrastructure supports:

  • Asset and site twins
  • Use case driven models like energy monitoring, predictive maintenance, production optimization and quality control
  • Flexibility to support various digital twin models
  • Easy modeling of collected data
  • Real-time decision-making and simulation of different scenarios
