PICMG Announces Release of New InterEdge Standard

  • Targeted at Open, Modular Process Control Systems
  • Modular compute, switch, and I/O architecture enables interoperable standard for industrial PCs, PLCs, and distributed control systems.
  • Supports IEC 61499 and IEC 61131 for compatibility with a wide range of automation systems.
  • Hot-swappable modules can be replaced or upgraded while the system is running, minimizing downtime and maintenance costs.

I am trying to understand this one. Reading it for the third time, I guess the Open Process Automation Forum decided that one of its defined components among its “standard of standards” needed to be a standard. Although this “open, modular process control system” sounds eerily familiar. I hope it does help move the industry forward.

PICMG, the consortium for open hardware specifications, announced the release of InterEdge, a modular architecture for process control systems (PCS). The IEC 61499 and IEC 61131-compatible InterEdge specification promises to revolutionize the industry with an interoperable, multi-vendor alternative to proprietary Industrial PCs (IPCs), Programmable Logic Controllers (PLCs), and Distributed Control Systems (DCSs).

Given that the OPAF initiative was begun by ExxonMobil engineering leaders, this quote is informative:

“Business needs evolve at an ever-increasing rate,” said Francisco Garcia, Americas Regional Instrument Lead at ExxonMobil Technology & Engineering Company and member of the InterEdge technical working group. “InterEdge delivers an interchangeable base hardware standard for industrial manufacturers looking to adapt to changing business needs. As a result, providers can deploy and scale dedicated physical assets and focus on value-added software and services.”

And from the press release:

InterEdge defines a vendor-neutral, open standard for edge computing and I/O module hardware. It segments hardware into Compute Modules, Switch Modules, and I/O Modules. All of these modules are connected via a common backplane, enabling easy customization and expansion of industrial automation functions.

An overview of the specification and an architecture diagram are available here. The full specification is available to purchase. 

And the reason for the standard:

By replacing proprietary edge devices, InterEdge eliminates vendor lock-in, simplifies integration and maintenance, and enables online upgrades, all of which contribute to significant cost savings.

Emerson Jumps Into The Software-Defined Automation Architecture Fray

  • Sees Boundless Automation as Industry Inflection Point to Address Data Barriers & Modernize Operations
  • Advanced software-defined automation architecture to integrate intelligent field, edge and cloud, unlocking a new era of productivity
  • Global automation leaders convene to learn about Boundless Automation at Emerson Exchange in Düsseldorf

I seem to have become something of a persona non grata with the new marketing regime at the Emerson Automation group. However, I picked up this news from its meeting last month in Düsseldorf, Germany. I found this statement by automation President and CEO Lal Karsanbhai interesting. It reflects the underlying philosophy I wanted to address when Dave and Jane and I started Automation World back in 2003: the world requires suppliers to go beyond proprietary control and leverage all the data for higher-level decision making.

“After decades of implementing evolving automation strategies, manufacturers recognize the need to extract greater value from data that is locked in a rigid and now outdated automation architecture,” said Emerson President and CEO Lal Karsanbhai. “The proliferation of data and the development of advanced software are moving us to an era of unprecedented productivity. Rich data and advanced software are converging to form the next major inflection point in the industry.”

Acknowledging the foundational problems we’ve identified for years, Emerson says it is “poised to transform industrial manufacturing with the next-generation automation architecture designed to break down data silos, liberate data and unleash the power of software with Boundless Automation.”

I applaud Emerson’s strategy, although I do wish it had been done along with the standards efforts of OPAF. But only a couple of competitors seem to be serious about that one. Further, I continue to find companies in my research still trying to break down the silos. I thought we had accomplished that 10 years ago. I guess not. We still have complex networks of Microsoft Excel spreadsheets and every department for itself on data definition and retention.

To address this challenge and help customers achieve their operational improvements, Emerson is introducing a vision and actionable strategy to push more computing power closer to where it's needed and establish the blueprint for a modern industrial computing environment. This environment includes flexibility to deploy software across the intelligent field; a modern, software-defined edge; and the cloud. All three domains will be connected through a unifying data fabric, helping to maintain data context, improve its usability and increase security.

Emerson’s modern, software-defined automation architecture will break down hierarchical networks, securely democratizing and contextualizing data for both people and the artificial intelligence (AI) engines that depend on a continuous flow of information.

Here are the components within Boundless Automation:

  • Intelligent Field: An intelligent field will simplify access to more data from more sources and a greater diversity of applications. With smarter devices and new connection technologies like 5G and APL, customers can streamline both connectivity from anywhere in the world and integration across the new architecture.
  • Edge: The new OT edge creates a modern, secure, low-latency computing environment, putting new software tools and actionable data closer to their users. This enhanced edge environment establishes a platform for IT and OT colleagues to innovate and collaborate more than ever before.
  • Cloud: The cloud will power complex operations and engineering capabilities on-premises and across the enterprise by providing vast analytical computing power, enterprise collaboration, attractive lifecycle costs and on-demand support and service.

The Open Group Welcomes Shell as Its Latest Platinum Member

I just released a podcast where I thought about standards, interoperability, and open technologies. This news came my way, speaking of open, that Shell Information Technology International has become a platinum member of The Open Group.

Shell has been a Member of The Open Group since 1997, and has contributed to its numerous Forums which enable collaboration to develop open technology standards and certifications. The company played a critical role in the foundation of The Open Group OSDU Forum that facilitates the development of transformational technology for the world’s changing Energy needs, and donated important intellectual property that formed the basis of the OSDU Data Platform. Shell also contributed to the inception of The Open Group Open Footprint Forum that focuses on creating an environmental footprint data model standard applicable to all industries.

The Open Group is a global consortium that enables the achievement of business objectives through technology standards. Its diverse membership of more than 900 organizations includes customers, systems and solutions suppliers, tool vendors, integrators, academics, and consultants across multiple industries.

Glad to see end user companies taking an active part in openness. Their support is the only way open technologies will grow.

Addressing the Increase in Wireless Demand with Frequency-Hopping Metasurfaces

We all know that the Industrial Internet of Things and other wireless devices are straining the wireless spectrum. Spectrum turns out to be a scarce resource. With continually growing communication requirements, both data and voice, this is a problem searching for a solution. (Unlike many things floating around these days that are solutions searching for a problem.)

I am publishing this entire release regarding research into something called metasurfaces, which could provide part of the solution to our spectrum-strangling problem.

Recent advances in communication systems, such as the increase in mobile phone users, the adoption of Internet-of-Things devices, and the integration of smart sensors in applications ranging from smart homes to manufacturing, have given rise to a surge in wireless traffic. Similar to how a roadway becomes congested with vehicles, the rising wireless traffic is causing congestion in the available frequency bands. New frequency bands have been introduced to accommodate more communication signals so that wireless devices can operate without severe interference with each other.

However, supporting a broad spectrum is challenging. Only a limited number of frequency bands are available, and adding more increases the complexity of wireless devices and infrastructure. One possible solution is to accommodate more signals within existing frequency bands by tuning them so they can be further distinguished from one another.

Now, in a new study published in Nature Communications, a team of researchers from Japan, led by Associate Professor Hiroki Wakatsuchi from Nagoya Institute of Technology, along with co-authors Ashif Aminulloh Fathnan and Associate Professor Shinya Sugiura of the University of Tokyo, has designed a metasurface that can distinguish wireless signals based on their frequency and pulse width.

In simpler terms, metasurfaces are engineered surfaces that can manipulate incident electromagnetic waves to achieve specific modifications, generating distinct signals. This ensures that signals are separated and do not interfere with each other, reducing the likelihood of congestion-related issues. These materials can be integrated into radio-frequency devices like antennas and filters to accommodate more users and devices within the same frequency spectrum.

The metasurface developed by researchers in this study distinguishes signals more effectively than traditional materials. “Conventionally, when the number of frequencies available was N, electromagnetic waves and related phenomena could be controlled in N manners, which is now markedly extended to the factorial number of N (i.e., N!),” explains Dr. Wakatsuchi.

The developed metasurface consists of several unit cells that respond to specific frequencies. By activating multiple unit cells, it becomes capable of handling signals across multiple frequency bands. The metasurface can be thought of as a filter that selectively transmits signals based on specific frequency sequences. The researchers liken this to frequency-hopping, where devices switch frequencies rapidly to avoid interference. However, in this case, the metasurface can be tuned to alter incoming signals based on their frequencies. This property makes it possible to receive and distinguish a variety of signals of different frequencies from wireless devices.
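The frequency-hopping analogy above can be sketched in a few lines. This is a toy model, not the researchers' actual physics: it just shows how a sequence-selective filter distinguishes signals by the *order* of their frequency hops, not merely by which frequencies they use (all frequency values here are hypothetical).

```python
# Toy model of a sequence-selective "metasurface" filter: it only
# passes a signal whose hop pattern matches the frequency sequence
# the surface is tuned to. Values are illustrative, in GHz.
def passes(tuned_sequence, incoming_hops):
    """Return True if the incoming hop pattern matches the
    sequence this surface is tuned to, in order."""
    return list(incoming_hops) == list(tuned_sequence)

surface = (2.4, 5.0, 3.5)  # hypothetical tuned hop order

print(passes(surface, (2.4, 5.0, 3.5)))  # True: matching order
print(passes(surface, (5.0, 2.4, 3.5)))  # False: same bands, wrong order
```

The second call is the key point: two signals occupying the same frequency bands remain distinguishable if their hop sequences differ.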

As a result, with the new metasurface, the number of signals that can be distinguished grows from a linear relationship to a factorial-based one. “If four or five frequencies are available, the number of signals to be distinguished increases from four or five to 24 or 120,” remarks Dr. Wakatsuchi, adding further, “Going ahead, this could help in more wireless communication signals and devices being made available even with limited frequency resources.”
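Dr. Wakatsuchi's arithmetic is easy to verify: if signals are distinguished by an ordered sequence of N distinct frequencies rather than by a single frequency, the count of distinguishable signals is the number of permutations, N!. A quick check of the quoted figures:

```python
from itertools import permutations
from math import factorial

# Hypothetical set of four available frequency bands (GHz)
freqs = [2.4, 3.5, 5.0, 6.0]

# Conventional: one distinguishable signal per frequency -> N
n_conventional = len(freqs)

# Sequence-based: one per ordered frequency sequence -> N!
n_sequences = len(list(permutations(freqs)))

print(n_conventional)             # 4
print(n_sequences)                # 24, i.e. factorial(4)
print(factorial(5))               # 120, the five-frequency case quoted
```

This matches the "four or five frequencies" example in the release: 4! = 24 and 5! = 120.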

According to the researchers, the number of devices connected to wireless networks per square kilometer is projected to increase from a million in 5G to 10 million in 6G by 2030. This substantial increase will inevitably strain existing frequency bands. However, with their capability to distinguish wireless signals, metasurfaces represent a novel approach to operate numerous Internet-of-Things sensors and communication devices without severe interference.

In the long run, this will be important for next-generation communication services, such as autonomous driving, smart factories, digital twin, cyber-physical systems, and behavior recognition systems!

Dr. Hiroki Wakatsuchi is an Associate Professor in the Department of Engineering at Nagoya Institute of Technology (NITech), Japan. He completed his Ph.D. at The University of Nottingham, UK, after which he did postdoctoral research at UC San Diego, USA. His research interests include electromagnetics, electronics, and communications. He has so far published 62 papers (49 papers between 2005 and 2023) with over 800 citations to his credit. Dr. Wakatsuchi was also a part of the Precursory Research for Embryonic Science and Technology (PRESTO) program in the Japan Science and Technology Agency (JST) until March 2023. Currently, he is involved with Fusion Oriented Research for Disruptive Science and Technology (FOREST), another JST program.

FDT Group Certifies First FDT 3.0 DTM From Flowserve Corporation

Modern flow control software driver based on FDT/DTM technology extends standardized device management to mobile and OPC UA applications.

Glad to see movement with the latest technology from the FDT Group. It has certified the first device-specific DTM based on the FDT 3.0 standard supporting the HART protocol, from Flowserve Corp. The newly certified Logix 3820 Series DTM is deployable with Flowserve positioners supporting HART 6/7, addressing flow control challenges in modern IIoT architectures.

FDT DTM certification to the FDT 3.0 specification and webUI is a process of rigorous compliance testing using dtmINSPECTOR5. The testing verifies the viability of the DTM's states; correct installation, de-installation, and multi-user environment capability; mandatory and optional user interface functionality and robustness; network scanning communication performance and the ability to import and export the topology; and audit trail capability.

FDT 3.0 DTMs are crucial to unlocking universal device integration, offering platform independence, mobility solutions, and a contemporary development environment that reduces costs and expedites the DTM certification process. Users gain secure, seamless data exchange and interrogation from the sensor to the cloud and can achieve new levels of information technology (IT) and operational technology (OT) integration.

“This certification is a milestone in market penetration and technology development,” says Steve Biegacki, FDT Group managing director. “Flowserve has always been a leader in flow control using DTM technology and now offers the first flow control management DTM standardized for IIoT architectures based on FDT 3.0 for HART applications. HART users can deploy this new DTM and reap the benefits by using an FDT 3.0-based device management tool, such as PACTware 6.1, and can enjoy an IT/OT data-centric model by deploying an FDT Server, extending the data reach to mobile applications and the enterprise.”

Compression Brings Bandwidth Boost to Vision Applications

For as long as I have been working with and covering vision technology in manufacturing, bandwidth has been the constraint on robust applications. A Canadian company called Pleora Technologies has introduced a patented lossless compression technology called RapidPIX that is said to increase data throughput by almost 70 percent while meeting the low-latency and reliability demands of machine vision and medical imaging applications.

RapidPIX is initially available on Pleora's new iPORT NTx-Mini-LC platform, which provides a compression-enabled drop-in upgrade of the widely deployed NTx-Mini embedded interface. With added compression, designers can deploy the iPORT NTx-Mini-LC to support low latency transmission of GigE Vision compliant packets at more than 1.5 Gbps throughput rates over existing 1 Gb Ethernet infrastructure. Manufacturers are designing the iPORT NTx-Mini-LC embedded interface with RapidPIX compression into X-ray panels for medical and dental imaging, contact image sensors (CIS), and industrial camera applications.

To speed time-to-market, the iPORT NTx-Mini-LC with RapidPIX Development Kit helps manufacturers develop system or camera prototypes and proofs-of-concept easily and rapidly, often without undertaking hardware development.
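The claimed numbers are internally consistent, which is worth a back-of-the-envelope check: a lossless codec that shrinks data by almost 70 percent before transmission raises the *effective* throughput of a fixed 1 Gbps link by the same factor. A rough sketch (the 70 percent figure is from the release; real gains will vary with image content):

```python
# Back-of-the-envelope: effective throughput over a fixed link
# when lossless compression shrinks data before transmission.
link_rate_gbps = 1.0      # existing 1 Gb Ethernet infrastructure
compression_gain = 0.70   # "almost 70 percent" boost, per the release

effective_gbps = link_rate_gbps * (1 + compression_gain)
print(round(effective_gbps, 2))  # 1.7, consistent with ">1.5 Gbps"
```

So 1 Gbps of wire capacity carrying data compressed by that ratio yields roughly 1.7 Gbps of uncompressed image throughput, in line with the "more than 1.5 Gbps" claim.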
