
API Builder for Intelligence Hub

HighByte seems to be making more of a splash lately with its DataOps and Unified Namespace developments. Several years ago I spotted DataOps as an important technology application, and usage still grows.

This news from HighByte concerns the ability to build custom APIs.

HighByte released HighByte Intelligence Hub version 4.1 with Callable Pipelines, enabling users to build custom APIs for their operations and industrial data sources. Callable Pipelines, together with the REST Data Server, functionally serve as an “Industrial Data API Builder” that allows users to build custom API endpoints to interact with industrial data. In addition to Pipelines enhancements, version 4.1 delivers native integration with Aspen InfoPlus.21, support for the new Amazon S3 Tables service, granular audit logging for regulated industries, and seamless event capture from file-based data sources.
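
To make the idea concrete, here is a rough sketch of what calling such a custom endpoint might look like from a client application. The host, port, endpoint path, and payload below are illustrative assumptions, not HighByte's documented interface.

```python
# Hypothetical sketch: invoking a custom endpoint exposed through a Callable
# Pipeline and the REST Data Server. The URL, port, path, and payload shape
# are assumptions for illustration, not HighByte's documented API.
import requests

HUB_URL = "https://intelligence-hub.example.com:8885"  # assumed REST Data Server address
ENDPOINT = "/data/pipelines/line3-oee"                 # assumed custom endpoint path

payload = {"assetId": "line-3", "start": "2025-01-01T00:00:00Z"}

response = requests.post(f"{HUB_URL}{ENDPOINT}", json=payload, timeout=10)
response.raise_for_status()

# A pipeline like this would return contextualized, modeled data rather than raw tags.
print(response.json())
```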

HighByte Intelligence Hub is an Industrial DataOps solution that contextualizes and standardizes industrial data from diverse sources at the edge to help bridge the gap between OT and IT systems, networks, and teams. HighByte leads the evolving Industrial DataOps market with the most complete solution to optimize the orchestration of usable industrial data across the enterprise. Highly regulated industries like life sciences and oil & gas have become the company’s fastest growing markets in the last twelve months.

The latest release also introduces Pipeline Debug, allowing users to test and troubleshoot individual pipeline stages without impacting the systems connected to the pipeline. Furthermore, the File connector has received major enhancements, including support for directory reads, SFTP, and character set decoding. The File connector is complemented by a set of new Pipeline stages to parse and write different file formats and to extract and process file metadata. Together, the File connector and Pipelines create a holistic and seamless approach to advanced file processing for industry.

HighByte Intelligence Hub version 4.1 is now commercially available. All new features and capabilities introduced in version 4.1 are included in standard pricing.

Yokogawa Enhances IT/OT Integration with OpreX Collaborative Information Server

I quit my full-time job to become an independent blogger in 2013. One of the best decisions I ever made. It didn’t cost me much income those first years, and it saved me much grief over the ensuing time.

The blog’s name changed to The Manufacturing Connection. I thought at the time technology advances and applications would be all about connection/connectivity. That’s still true.

This news from Yokogawa emphasizes advances toward the nirvana of bringing information technology and operations technology into closer relationship. The problem solved by this is, of course, data. Data makes the enterprise go round.

Yokogawa states this new release of OpreX Collaborative Information Server (CI Server) strengthens connectivity with a range of devices and applications to “support digital transformation.”

OpreX Collaborative Information Server brings together large volumes of data from various plant equipment and systems to enable the optimized management of production activities across an entire enterprise, and provide the environment needed to remotely monitor and manage operations from any location and make swift decisions.

Here is what’s new:

1. Addition of CI Gateway component—At renewable energy facilities and the like, gateways are deployed at various locations for the collection of data required for integrated monitoring and operations. And in all kinds of industries, using gateways to relay plant data to higher-level applications is common practice. In this release, a new component, CI Gateway, has been added. Compared to the previous version, this allows for simpler and more flexible implementation, making it easier to deploy OpreX Collaborative Information Server as a dedicated gateway.

I don’t know how many organizations I’ve been involved with where I’ve advocated RESTful APIs. This from Yokogawa.

2. Support of RESTful API—The RESTful API is a standard web application interface that is widely used in IT applications. With this release, it is now possible to access OpreX Collaborative Information Server data via the RESTful API. This facilitates access to operational technology (OT) data and leads to closer convergence between IT and OT systems, thereby realizing seamless integration and the efficient use of data. For example, it is now possible to construct a general-purpose web browser-based KPI dashboard that aggregates and displays data from various systems.
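
As a sketch of that dashboard use case, the snippet below shows how a web application might pull OT values over a REST interface. The base URL, endpoint path, query parameters, and response shape are assumptions for illustration, not Yokogawa's actual API; consult the OpreX Collaborative Information Server documentation for the real interface.

```python
# Hypothetical sketch of pulling OT data into a KPI dashboard via a RESTful API.
# The base URL, path, tag names, and response shape are illustrative assumptions.
import requests

CI_SERVER = "https://ci-server.example.com/api"  # assumed base URL
TAGS = ["PLANT1.TURBINE1.POWER_KW", "PLANT1.TURBINE2.POWER_KW"]  # assumed tag names

resp = requests.get(f"{CI_SERVER}/values", params={"tags": ",".join(TAGS)}, timeout=10)
resp.raise_for_status()

# Aggregate the returned values into a simple KPI for display on the dashboard.
values = [point["value"] for point in resp.json()]
print(f"Total generated power: {sum(values):.1f} kW")
```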

And finally:

3. Enhanced IEC 61850 communication driver—IEC 61850 is an international protocol for communication networks/systems that is essential in the renewable energy industry. The enhanced support for this standard enables users to select safer interactive operations in place of certain operations that previously were executed automatically. For greater flexibility in operations, it is also now possible to utilize device report data as OpreX Collaborative Information Server data.

Creating an Adaptive Future for the Industrial Workforce

I have had a busy month. Good thing I didn’t take four days to travel to Orlando. I’m wrapping up my last interview from there today. There are a few more pending if the media relations person can find a way to coordinate calendars.

This interview is with Kim Fenrich, ABB Global Product Marketing Manager, Process Automation, PC. He brought up the term “Digital Habitat”—something not found on the ABB website, but still an interesting concept.

The problem statement recognizes that new people are entering the industrial workforce, many of whom will not have much background in process operations. Meanwhile, our digital technologies contain immense amounts of data that could be used to guide operators toward better decisions.

Fenrich brought up a concept called the Digital Habitat. This is the area alongside the core process control, which contains monitoring and optimization and houses process data. The data then gathers at the edge. In the ABB architecture, data at the edge becomes freely available to other applications, such as asset management and optimization.

Not all data is created equal. Some is “dirty” data that must be cleaned before use; some is good data from trusted sources with solid metadata. These many applications ride atop the system to run analytics, support decision-making, and optimize operations. Sometimes operators are new and lack operations experience and knowledge; data science comes to the rescue, cleaning up the data and providing interfaces to support these new workers. Sometimes the data science supports engineers working in maintenance and reliability, performing predictive analytics or enhancing asset management.

ABB has a suite of applications called the Augmented Operator. The system does pattern mining. Perhaps the operator sees something new. They can ask the system, “Have you seen this before? If so, what happened and how was it resolved?” This greatly helps the younger-generation operator.

Should the situation be new to the system, then it can run simulations to predict outcomes and resolutions.

In short, the system:

  • Frees up operators’ time for more meaningful work, such as using data and advanced analytics to optimise processes for energy efficiency and carbon emission savings.
  • Enables early warning of potential failure with AI-powered systems that can use real and historic data to offer troubleshooting solutions, much like a virtual assistant.
  • Simulates workflows to check outcomes and for training, and supports augmented reality (AR) headsets for accessing experts working offsite.

This is from the ABB web site. The next step to achieve this reality is to fuse together the Distributed Control System’s operations technology and real-time control system with the Edge and newer IT technology, such as machine learning and AI, while also incorporating historical data and mining other data sources for pattern recognition and knowledge extraction. This will shift the automation system beyond real-time control alone to one that allows the operator to augment operations from day to day. It will be a journey, but humans working with technological systems to augment their cognitive capabilities can amplify their potential and provide huge value to both the workforce and the industry at large, as well as attract new generations to the sector.

HiveMQ Introduces HiveMQ Pulse, a Next-Generation Distributed Data Intelligence Platform

Even though I have worked in this industry a long time and have configured, programmed, and installed many software applications and devices, sometimes I have trouble reading through a press release and determining just what the company has done.

In this case, the email message included a sentence from the guy who runs the ProveIt! conference saying this product will be the talk of the show. So, I asked. They gave me a demo. OK, it’s pretty cool. I’m glad they are embedding some AI (or ML, which is a subset of AI) in the product. That is the price of entry today. But I did like the way they could configure data points on the fly, and the product’s use of namespace technology. It’s worth checking out depending upon your needs.

HiveMQ, the global leader in enterprise MQTT solutions, announced the launch of HiveMQ Pulse, a next-generation distributed data intelligence platform. With a flexible architecture uniquely suited to support a unified namespace (UNS) approach, HiveMQ Pulse enables organizations to manage, transform, govern, and derive insights from distributed devices and systems, providing a single, structured view of operational data across the enterprise. 

Use cases include:

  • Unified Data Management: Data cataloging, transformation, and governance ensure context is established and standardized across the enterprise. 
  • Real-time Actionable Insights: Works on data in motion for contextualization and better decision-making.
  • Distributed Intelligence: Make decisions close to the data source for immediate impact. Add AI/ML models to make more complex decisions.

Built on MQTT’s publish/subscribe architecture, HiveMQ Pulse efficiently moves data from edge to cloud, supporting real-time processing even in resource-constrained and high-throughput environments. It features flexible data storage for historical analysis, enabling rapid, informed decision-making at the edge. Unified namespace tools simplify integration by providing consistent, reusable data structures, while AI/ML integration empowers predictive insights and smarter operations. With scalable event-driven streaming and in-flight data transformation, HiveMQ Pulse delivers contextualized data exactly where and when it’s needed.
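
For readers newer to MQTT, here is a minimal sketch of the publish/subscribe pattern that underpins this architecture: a client subscribing to a unified-namespace-style topic with the paho-mqtt library. The broker address and topic hierarchy are assumptions for illustration; this shows the general MQTT pattern, not HiveMQ Pulse's own interface.

```python
# Minimal MQTT subscribe sketch using paho-mqtt 2.x (pip install paho-mqtt).
# Broker address and topic hierarchy are illustrative assumptions.
import paho.mqtt.client as mqtt

BROKER = "broker.example.com"              # assumed MQTT broker (e.g., a HiveMQ instance)
TOPIC = "acme/plant1/packaging/line3/oee"  # assumed UNS-style topic path

def on_connect(client, userdata, flags, reason_code, properties):
    # Subscribe once the broker acknowledges the connection.
    client.subscribe(TOPIC)

def on_message(client, userdata, message):
    # Each message carries data published into the namespace by edge clients.
    print(f"{message.topic}: {message.payload.decode()}")

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER, 1883)
client.loop_forever()
```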

HiveMQ Pulse is now available in private preview. Apply here.

How Ignition Fits Within the Purdue Model

I have written several times over the past decade about how the Industrial Internet of Things and such hardware advancements as the Edge have blown up the Purdue Pyramid Model of industrial architecture, especially as an information model.

Rather than a static hierarchy, information can flow from source to consumer bypassing layers as required.

However, the Purdue Model (and ISA95) certainly does describe different types of functions of a manufacturing enterprise. How these functions are distributed and how they interrelate have undergone subtle changes over the years, yet they remain relevant.

Inductive Automation recently published an article explaining how its Ignition platform fits within the Purdue model. It’s an interesting read. Certainly there remain levels of applicability within the model. It depends upon what you are building. 

Check out the article.

Yokogawa Releases OpreX Intelligent Manufacturing Hub

The automation side of Yokogawa has not contacted me for years. I’ve lost all my contacts there. Recently some news has come my way. This news touches on several things currently receiving media attention: it describes a data integration and visualization solution that incorporates robotic process automation (RPA).

Yokogawa Electric Corp. has announced the global release in all markets other than Japan of OpreX Intelligent Manufacturing Hub. By utilizing robotic process automation (RPA) implemented in a low-code / no-code environment or through customization by Yokogawa, this data integration solution can significantly reduce reporting time. OpreX Intelligent Manufacturing Hub covers the full range of key performance indicators (KPIs), workflows, and reporting at every level of the organization, from the C-suite to the plant floor, and employs a single database to integrate and display on dashboards data that customers need to make the right decision at the right time.

Main Features

  1. User-friendly dashboards that visualize data for decision makers at each layer of the organization
  2. Reduction in reporting time

The OpreX Intelligent Manufacturing Hub also allows for drilling down through data to find root causes and gain insights. It is suited for use in a wide variety of industries, from oil & gas to chemicals and pharmaceuticals.

Along with this solution, Yokogawa will provide holistic support and services through its global network that are essential for the success of any intelligent business tool project, including definition of specifications, training, maintenance, and technical support.
