Distribution Management Standardizes on Aruba Network Infrastructure

I’ve helped a number of people, including myself, with the installation of networking and WiFi in homes. When I consulted with an organization with a large building and many users, I ran into Aruba. About five years ago I started attending HPE conferences and getting deeper dives into robust networking. Perhaps the best story was meeting the IT head of the European PGA tour who talked about the WiFi requirements for hosting the Ryder Cup.

Aruba is one of my go-to sources for networking technology and application news these days. Today's news concerns a rugged network solution that includes IoT and robots and other things near and dear to my heart. Use it as an example of what you can accomplish with this equipment, whether in distribution or other industrial or manufacturing settings.

Aruba announced that Distribution Management, a leading fulfillment and supplies network, is standardizing on Aruba wireless, switching, and security solutions across all of its locations to enable digital transformation initiatives to support the company’s rapid growth and increasing customer demand. 

With headquarters in St. Charles, Missouri, and five distribution centers across the U.S., Distribution Management operates a supplies and order fulfillment network that can reach 99 percent of the country within one-to-two days. The company also operates a Foreign Trade Zone that services international needs, and offers managed services for printer fleets, dispatching technicians for troubleshooting and repair, as well as supplying printer repair parts.

As Distribution Management has grown and seen increased demand from customers, modernizing and streamlining operations and improving efficiencies and employee productivity have become key objectives. According to Tom Huck, Distribution Management’s Director of Infrastructure Technologies, the company realized it needed to upgrade its network infrastructure to achieve these objectives.

“We knew that the underlying network foundation would be critical to advancing all of the modernization and expansion efforts we had in mind,” Huck said. “It was pretty clear our existing network couldn’t accommodate our growing needs.”

As the company planned for expansion into larger warehouse spaces, with an increasing number of IoT devices on the network, and new distribution models that would include replacing traditional conveyor equipment with robots, it realized its legacy Cisco network had to be replaced. After evaluating new solutions from both Cisco and Aruba, Distribution Management chose to standardize on Aruba across all of its sites.

Working with partners Insight and InterVision, Distribution Management began installing Aruba access points, mobility controllers, and access switches, as well as ClearPass, so the organization can authenticate every wired and wireless device that accesses the network and begin implementing consistent role-based policies. 
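Role-based policies of the kind ClearPass enforces can be pictured as a lookup from an authenticated device's role to its network privileges. The sketch below is purely illustrative Python, not ClearPass configuration syntax; the role names, VLANs, and service names are invented for the example.

```python
# Illustrative sketch (NOT ClearPass syntax): the kind of role-based
# policy decision a network access control system makes once a wired or
# wireless device has authenticated. All roles/VLANs here are hypothetical.

ROLE_POLICIES = {
    "warehouse-robot":  {"vlan": 30, "allow": ["order-api", "mqtt"]},
    "handheld-scanner": {"vlan": 20, "allow": ["wms", "print"]},
    "employee-laptop":  {"vlan": 10, "allow": ["corp", "internet"]},
}

def authorize(device_role: str) -> dict:
    """Return the network policy for an authenticated device's role."""
    policy = ROLE_POLICIES.get(device_role)
    if policy is None:
        # Unknown or unclassified devices land in a quarantine VLAN.
        return {"vlan": 999, "allow": []}
    return policy
```

The point of the pattern is that access follows the device's role rather than its physical port or SSID, so a robot and a laptop on the same access point receive different network privileges.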

According to Jim Adelmann, network administrator for Distribution Management, the new Aruba network is accelerating the organization’s warehouse digital transformation, allowing the IT team to connect a variety of crucial devices including robots to the network for automated inventory management. The robots use the wireless network to check in and “call home” for real-time order details that indicate where to go within the warehouse and how to process orders.

Other devices and applications benefiting from the Aruba network include Zebra barcode printers and wrist-mounted TC52 devices, which warehouse employees use to scan, pick, and pack inventory; QubeVu hardware, which measures and analyzes the dimensions of packages moving through the warehouse so they can be processed properly; RingCentral softphones for videoconferencing; and employee laptops and mobile devices.

From an IT perspective, Adelmann said the Aruba network is providing the kind of reliability, redundancy, and always-on connectivity that Distribution Management needed to move forward confidently with its expansion and modernization efforts.

“Had we stuck with Cisco, we would have spent twice the amount of time getting the network up and running. With Aruba, network management is so easy – we had everything rolling within a week,” Adelmann said.

Adelmann also noted that the modularity of the Aruba solutions allowed the IT team to stage everything ahead of time and that the uptime has been “through the roof,” freeing up the IT team to focus on more strategic initiatives. Added Huck, “Lost productivity translates into lost revenue. That’s why having a solid network foundation was so vital to supporting our digital initiatives and meeting our growth and expansion needs.”

To date, Distribution Management has installed the Aruba infrastructure in three of its five distribution centers and plans to continue its roll-out to its additional two centers in the near future. In 2021, the company also plans to continue its move to role-based access control with ClearPass, and will evaluate Aruba Central for management, as well as Aruba User Experience Insight sensors to provide additional troubleshooting and diagnostic capabilities. 

Said Huck, “We now feel confident that as our business grows and evolves, our network can grow with it, supporting whatever initiatives we undertake.”

IoT Starts With Sensors, Here Is a Bunch of Sensor News

Sometimes similar news comes in bunches, a little bit like a graph of an FFT. News from Swift Sensors and ABB takes us from Covid to Space and back.

Swift Sensors Launches Sub-Zero Temperature Sensor to Meet COVID-19 Vaccine Monitoring and Storage Requirements

24/7 cloud-based wireless monitoring ensures vaccines are stored throughout the cold chain in the required sub-zero temperature ranges down to -100°C.

Swift Sensors, a provider of industrial IoT sensor solutions, launched a secure wireless vaccine storage unit monitoring and alert system to enable medical facilities and pharmacies to remotely monitor COVID-19 vaccine storage temperatures, automate data logging, and respond quickly in case of an equipment problem or power failure.

As vaccine suppliers and public health agencies expand the number of locations for vaccine delivery, pharmacies and clinics must quickly and safely store the vaccines to preserve the vaccines’ efficacy, prevent waste, and comply with data monitoring regulations. Swift Sensors has developed a wireless sensor system to achieve these goals.

“Data loggers have historically been used in cold-chain monitoring of vaccines, pharmaceuticals, and other critical perishable items. However, they lack the low cost, simplicity, and connectivity of wireless sensors connected to the internet,” said Ray Almgren, Swift Sensors CEO. “Our new sub-zero temperature sensor delivers an all-in-one, cost-effective solution for the safe and fast distribution and delivery of much-needed vaccines.”

Each Swift Sensors vaccine package includes at least one wireless remote temperature sensor to relay storage temperature data to an included wireless gateway.

The gateway sends data to a secure cloud-based Swift Sensors Console account. Pharmacy and clinic managers can view temperatures in real time on their computer or mobile device. They can also receive instant alerts via text, voice or email if the storage unit temperature exceeds established thresholds.
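The alerting logic described here amounts to checking each reading against a per-unit temperature range and notifying when it falls outside. A minimal sketch follows; the unit names and ranges are hypothetical stand-ins, not Swift Sensors defaults, and real thresholds would come from the vaccine manufacturer and CDC guidance.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Threshold:
    low_c: float
    high_c: float

# Hypothetical ranges for illustration only; actual storage ranges come
# from the vaccine manufacturer and CDC guidance.
THRESHOLDS = {
    "ultra-cold-freezer": Threshold(low_c=-90.0, high_c=-60.0),
    "standard-freezer":   Threshold(low_c=-25.0, high_c=-15.0),
}

def check_reading(unit: str, temp_c: float) -> Optional[str]:
    """Return an alert message if a reading is out of range, else None."""
    t = THRESHOLDS[unit]
    if not (t.low_c <= temp_c <= t.high_c):
        return (f"ALERT: {unit} reads {temp_c:.1f} C "
                f"(allowed {t.low_c} to {t.high_c})")
    return None
```

In a deployed system this check would run in the cloud console against each gateway report, with the alert string fanned out over text, voice, or email.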

“Pharmacies and clinics can use our new sub-zero temperature sensor to monitor the super-cold temperatures the Pfizer vaccine requires or use our standard wireless remote temperature sensor to monitor Moderna vaccine storage conditions,” Almgren said. “Installation typically requires only a few minutes, and the device batteries last six to eight years.”

The Swift Sensors Console stores historical temperature readings so pharmacies and clinics can easily comply with CDC and state health department data logging requirements, without having to spend employee time manually recording or updating temperature data. 

ABB sensor onboard SpaceX rocket to detect greenhouse gas emissions 

An optical sensor manufactured by ABB was deployed with the successful launch of satellite Hugo from GHGSat, the emerging leader in greenhouse gas sensing services in space.

The ABB-supplied optical sensor can map methane emissions from space at a resolution 100 times higher than any other sensor. Whereas previously only larger regions could be surveyed, the greater granularity now allows the source of emissions to be identified for the first time. An additional nine units are currently being manufactured at ABB for launch by the end of 2022, when they will fly aboard the first private satellite constellation dedicated to emission measurement.

Space offers the ideal location to freely monitor emissions across jurisdictions and quantitatively report on improvements. The ABB sensors will provide valuable insights to help governments and industries around the world meet their emission reduction targets and reduce the effects of global warming.

With its involvement in the Canadian SCISAT mission and the Japanese GOSAT series of satellites, ABB has been at the forefront of greenhouse gas sensing from space for more than two decades. ABB optical equipment already in space has accumulated more than 100 years of reliable operation. Since 2003, the SCISAT sensor has tracked long-term, subtle composition changes in the earth’s atmosphere, down to parts per trillion, of more than 70 molecules and pollutants. Weather agencies across the world base their predictions on ABB equipment flying onboard the US National Oceanic and Atmospheric Administration (NOAA) weather satellites (NPP and JPSS), which saves lives by improving the timeliness and accuracy of weather forecasts for up to seven days.

ABB is also a global leader in earthbound continuous emission monitoring with over 60,000 systems installed in more than 50 countries worldwide. Continuous Emissions Monitoring Systems (CEMS) continuously record and evaluate emission data across all industries. They provide important information for the environmental and economic operation of production facilities. The range includes the ACF5000 that accurately and reliably monitors up to 15 gas components simultaneously.

New ABB emission monitoring solution helps the maritime industry achieve decarbonization targets

The launch of ABB’s CEMcaptain will help shipping comply with the sulphur emission regulations that took effect in 2020 and keep their CO2 footprint in check.

In January 2020, the low sulphur and nitrogen oxide emission limits in the International Maritime Organization regulations became effective worldwide. CEMcaptain is a powerful emissions monitoring system from ABB designed to help the maritime industry meet these new regulations and become more sustainable. Its measurement and digital capabilities increase on-board safety, provide process optimization, and substantially reduce ownership costs. By consistently achieving 98 percent and higher uptime, the new system not only requires less maintenance effort but also saves time otherwise spent on handling non-compliance issues.

Designed with busy mariners and a regularly changing crew in mind, CEMcaptain is a multi-component analyzer system that continuously provides real-time data offering reliable measurement of emissions with the highest stability. Operating in even the harshest of conditions it integrates analyzer modules and sample handling components in a standalone cabinet, making installation easy.  

Equipped with ABB’s renowned Uras26 non-dispersive IR gas analyzer, CEMcaptain simultaneously and continuously measures sulphur dioxide (SO2) and carbon dioxide (CO2) in line with regulation requirements. Each analyzer has two separate gas paths to allow for continuous CO2/SO2 measurement of separate streams, with up to four different components per analyzer module.

Fast fault reporting, diagnosis and repair are achieved via the on-site and remote digital services which help operators get closer to 100 percent availability for their gas analysis instrumentation. Dynamic QR codes are integrated into the ABB CEMcaptain system display panel. All relevant diagnostic information can be collected from the analyzer via a scanned code and transferred to ABB support. This means that maritime instrumentation technicians can send real-time information to an ABB service expert to get immediate guidance on appropriate maintenance. ABB Ability™ Remote Assistance with secured connectivity direct to ABB support is also offered for real-time solutions to problems. These features reduce the costly training of changing crews as well as the number of experts required on board. They also increase on-board safety by reducing crew exposure to emissions. 

CEMcaptain GAA610-M is approved by all major classification societies (DNV GL, ABS Group, Lloyds Register, Bureau Veritas, ClassNK, Korean Register). 

Intelligent Agents as a booster for European production

  • Artificial Intelligence coordinates multi-agent systems
  • Implementing European projects on the demonstrator in Kaiserslautern

The Chief Technology Officer of a major automation supplier once told me that an important technology I should keep an eye on was intelligent agents. Indeed, the poor little software object rarely gets star billing on the program. But the technology does exist. This information about multi-agent systems came to me last month; it encompasses a European smart factory initiative that bears watching.

A consortium of seventeen European partners is developing multi-agent systems for autonomous modular production in the research project called MAS4AI (Multi-Agent Systems for pervasive Artificial Intelligence to assist humans in modular production environments). The European Union (EU) has funded the project with almost 6 million euros.

MAS4AI is a project focused on selected sectors of industry, planning their smart digital transformation over the next three years using the tools of Artificial Intelligence (AI). The aim is to achieve resilient production that can react flexibly to changing requirements or disruptions in value creation networks. The underlying basis is the large variety of products in lot size 1 (highly customized, single-unit production) in complex manufacturing operations.

Single agents acting in concert
Multi-agent systems are an area of distributed artificial intelligence research in which several differently specialized “intelligent” and mostly autonomous software components (agents or bots) act in a coordinated manner to jointly solve a problem. The researchers are working toward the long-term goal of stable production, which among other things relies on Shared Production and Production-as-a-Service. Communication, synchronization, and coordination of skills (production capabilities) are needed in a production network to implement this vision. This coordination will be performed by AI processes in the future. The European project partners envision a future production that can be distributed in European networks (like GAIA-X).
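The skill-coordination idea can be made concrete with a toy example: each agent advertises its skills, and a task is routed to the least-loaded agent that offers the required skill. This is a minimal sketch of the general pattern, not the MAS4AI implementation; the agent names, skills, and queue lengths are invented.

```python
# Toy illustration of skill-based coordination among production agents.
# Agents, skills, and queue depths are hypothetical.

AGENTS = {
    "milling-cell": {"skills": {"mill", "drill"}, "queue": 2},
    "assembly-bot": {"skills": {"assemble"},      "queue": 0},
    "paint-cell":   {"skills": {"paint"},         "queue": 1},
}

def assign(task_skill: str) -> str:
    """Pick the least-loaded agent that offers the required skill."""
    capable = [name for name, info in AGENTS.items()
               if task_skill in info["skills"]]
    if not capable:
        raise LookupError(f"no agent offers skill {task_skill!r}")
    return min(capable, key=lambda name: AGENTS[name]["queue"])
```

A real multi-agent system replaces the central `min()` with negotiation or bidding among the agents themselves, but the outcome is the same: tasks find capable, available resources without a fixed, hand-wired routing.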



People make the decisions
Scientists and engineers from Greece, Germany, Italy, Lithuania, the Netherlands, Poland and Spain are initially working on a modular system architecture and a communication structure to create the foundation on which to integrate industrial AI services for smart production. In the process, human participants will always retain control over the AI technologies. The prerequisite for this is to have AI processes designed in a way that is always understandable to the operator. Only then can they be validated, optimized, or modified. Demonstrators oriented on a series of industrial use cases are being developed in MAS4AI. The use cases are in European industrial sectors of high added value, such as companies from the automotive industry, contract manufacturing, bicycle production, or wood processing.

Production Level 4 as the visionary basis
“MAS4AI fits perfectly into our concept of Production Level 4, which is based on production-bots and modular networks. Our concept envisions future production resources that offer their capabilities (skills) to the networks and autonomously (self-directed) call up the products,” said Prof. Martin Ruskowski, Chairman of the Executive Board of SmartFactory-KL, Head of DFKI’s Innovative Factory Systems research, and Chair of the department of Machine Tools and Controls at TU Kaiserslautern. “The products in our vision know their attributes and their current production progress. Such products search their own way among the skills to complete their own production. This may take place in a facility, but also in a Europe-wide network.”

Four scientific and technological goals

The consortium is developing the following four topics:

  1. Multi-agent systems for the distribution of AI components at various levels of a hierarchy. The key idea is to control interaction between agents on a task-specific basis with agents integrated to form an overall system.
  2. AI agents that use knowledge-based representations with semantic web technologies. Every agent can detect what skills it has to offer and those of other agents and, in this way, decide what action should be executed. This also makes it easier to integrate people into the production, because the data is also prepared in a way that is understandable to them.
  3. AI agents for the hierarchical planning of production processes. Processes are broken down into individual steps and optimally reassembled according to the current requirements. Disturbances in the flow can be compensated.
  4. Model-based AI agents for Machine Learning (ML). These hybrid models are designed to combine human knowledge about physical processes with data acquired for machines.
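Point 3 above, hierarchical planning that decomposes a process into steps and compensates for disturbances, can be sketched as a simple re-planner. This is an illustrative Python toy under invented product, step, and resource names, not project code.

```python
# Sketch of hierarchical planning (point 3): break a process into steps,
# assign each step a resource, and re-plan around disturbed (unavailable)
# resources. All names below are hypothetical.

PROCESS = {
    "gearbox": ["mill-housing", "assemble-gears", "paint"],
}

RESOURCES = {
    "mill-housing":   ["mill-1", "mill-2"],
    "assemble-gears": ["asm-1"],
    "paint":          ["paint-1"],
}

def plan(product, unavailable):
    """Assign each step the first available resource, skipping disturbed ones."""
    schedule = []
    for step in PROCESS[product]:
        options = [r for r in RESOURCES[step] if r not in unavailable]
        if not options:
            raise RuntimeError(f"no resource left for step {step!r}")
        schedule.append((step, options[0]))
    return schedule
```

When a disturbance takes `mill-1` offline, the planner transparently reassembles the same step sequence on `mill-2`, which is the "compensated" behavior the consortium describes.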

A fundamental concept in MAS4AI is the integration of all smart components (machines with attributes like self-direction, self-description, and self-learning abilities) in a holistic system architecture. This facilitates easy development and use of industrial AI technologies. Software developers, system integrators, and end users will all benefit because the hurdle for the use of AI is low. “We expect this to generate revolutionary ideas for business models as well as brand new market opportunities,” said Ruskowski.

Partners:

  • Deutsches Forschungszentrum für Künstliche Intelligenz GmbH, Germany
  • Nederlandse Organisatie voor Toegepast Natuurwetenschappelijk Onderzoek, Netherlands
  • University of Patras – Laboratory for Manufacturing Systems & Automation, Greece
  • Fundacion Tecnalia Research and Innovation, Spain
  • Asociacion De Investigacion Metalurgica del Noroeste, Spain
  • University of Silesia, Poland
  • Volkswagen AG, Germany
  • SCM Group Spa, Italy
  • SC Baltik Vairas, Lithuania
  • VDL Industrial Modules, Netherlands
  • Fersa Bearings S.A., Spain
  • Semaku B.V., Netherlands
  • Symvouloi Kai Proionta Logismikou, Greece
  • flexis AG, Germany
  • Sisteplant, S. L., Spain
  • D.M.D. Computers SRL, Italy
  • Smart Manufacturing Competences Centre INTECHCENTRAS, Lithuania

ONF Announces Aether 5G Connected Edge Cloud Platform

Many industry pundits and observers seem to not understand all the ramifications and potentials for 5G. I’ve listened to podcasts from John Gruber at Daring Fireball and the guys at Accidental Tech Podcast talk about how 5G isn’t providing the anticipated boost for data speeds for their new iPhone 12s. But 5G provides for so much more than that.

I’ve had an opportunity to talk with people from the new Open Networking Foundation and check out this open-source community springing up. Here is a recent press release. Open source is burgeoning right now. Cynics say it’s just a way for big companies to cut development costs. I think it goes much deeper than that given licensing protocols and the spread of technology. This one is interesting and poised to take (among other things) Industrial Internet of Things to a deeper level.

The Open Networking Foundation (ONF) announced that ONF’s Aether 5G Connected Edge Cloud platform is being used as the software platform for the $30M DARPA Pronto project, pursuing research to secure future 5G network infrastructure.

DARPA is funding ONF to build, deploy and operate the network to support research by Cornell, Princeton and Stanford universities in the areas of network verification and closed-loop control. ONF will enhance and deploy its open source Aether software platform as the foundation for the Pronto research work, and in turn the research results will be open sourced back into Aether to help advance Aether as a platform for future secure 5G network infrastructure.

Aether – 5G Connected Edge Cloud Platform

Aether is the first open source 5G Connected Edge Cloud platform. Aether provides mobile connectivity and edge cloud services for distributed enterprise networks as a cloud managed offering. Aether is an open source platform optimized for multi-cloud deployments, and it simultaneously supports wireless connectivity over licensed, unlicensed and lightly-licensed (CBRS) spectrum.

Aether is a platform for enabling enterprise digital transformation projects. Coupling robust cellular connectivity with connected edge cloud processing creates a platform for supporting Industrial Internet-of-Things (IIoT) and Operational Technology (OT) services like robotics control, onsite inference processing of video feeds, drone control and the like.

Given Aether’s end-to-end programmable architecture coupled with its 5G and edge cloud capabilities, Aether is well suited for supporting the Pronto research agenda.

Aether Beta Deployment

ONF has operationalized and is running a beta production deployment of Aether.  This deployment is a single unified cloud managed network interconnecting the project’s commercial partners AT&T, Ciena, Intel, Google, NTT, ONF and Telefonica. This initial deployment supports CBRS and/or 4G/LTE radio access at all sites, and is cloud managed from a shared core running in the Google public cloud.

The University campuses are being added to this Aether deployment in support of Pronto. Campus sites will be used by Pronto researchers to advance the Pronto research, serving as both a development platform and a testbed for use case experimentation. The Aether footprint is expected to grow on the university campuses as Aether’s 5G Connected Edge Cloud capabilities are leveraged both for research on additional use cases as well as for select campus operations.

Aether Ecosystem
A growing ecosystem is backing Aether, collectively supporting the development of a common open source platform that can serve as an enabler for digital transformation projects, while also serving as a common platform for advanced research poised to help unlock the potential of the programmable network for more secure future 5G infrastructure.

“At Google Cloud, we are working closely with the telecom ecosystem to help enable 5G transformation, accelerated by the power of cloud computing. We are pleased to support the Open Networking Foundation’s work to extend the availability of 5G and edge capabilities via an open source platform.” 

Shailesh Shukla, VP and GM, Networking, Google Cloud

“Cornell is deploying Aether on campus to bring private 5G/LTE connectivity services with edge cloud capabilities into our research facilities.  We expect private 5G/LTE with connected edge cloud to become an important and integral part of our research infrastructure for many research and operational groups on the campus.  We also see the value of interconnecting a nation-wide leading infrastructure with Stanford, Princeton and ONF for collaborative research among university researchers across the country.”

David Lifka, Vice President for Information Technologies and CIO, Cornell University

“Princeton University is deploying Aether on campus in the Computer Science Department in order to support the Pronto research agenda and offer it as an experimental infrastructure for other research groups. This deployment will enable private 5G/LTE connectivity and edge cloud services and will complement Princeton’s existing P4 enabled infrastructure on campus. We plan to also explore how some of our mission critical production use cases can be supported on a private 5G Connected Edge Cloud.”

Jay Dominick, Vice President & CIO, Princeton University

“Ciena is pleased to be an early collaborator on the ONF’s Aether project.  We have an Aether site running in our 5G lab in Montreal, and we are excited by the prospect of helping enterprises leverage the 5G and edge cloud capabilities of Aether to help build transformative solutions.”

Stephen Alexander, Senior Vice President and Chief Technology Officer, Ciena

 “Intel is an active participant of the ONF’s innovative Aether project to advance the development of 5G and edge cloud solutions on high volume servers. ONF has been leading the industry with advanced open source implementations in the areas of disaggregated Mobile Core, e.g. the Open Mobile Evolved Core (OMEC), and we look forward to continuing to innovate by applying proven principles of disaggregation, open source and AI/ML with Aether, the Enterprise 5G/LTE Edge-Cloud-as-a-Service platform. As open source, Aether will help accelerate the availability of innovative edge applications. Aether will be optimized to leverage powerful performance, AI/ML, and security enhancements, which are essential for 5G and available in Intel® Xeon® Scalable Processors, network adapters and switching technologies, including Data-Plane Development Kit (DPDK), Intel® Software Guard Extensions (Intel SGX), and Intel® Tofino™ Programmable Ethernet Switch.”

Pranav Mehta, Vice President of Systems and Software Research, Intel Labs

Learn More

The Aether ecosystem is open to researchers and other potential partners who wish to build upon Aether, and we welcome inquiries regarding collaboration.  You can learn more at the Aether website.

About the Open Networking Foundation:

The Open Networking Foundation (ONF) is an operator-led consortium spearheading disruptive network transformation. Now the recognized leader in open source solutions for operators, the ONF first launched in 2011 as the standard bearer for Software Defined Networking (SDN). Led by its operator partners AT&T, China Unicom, Deutsche Telekom, Google, NTT Group and Türk Telekom, the ONF is driving vast transformation across the operator space. For further information visit http://www.opennetworking.org

CIP Security Updated to Support User Level Authentication

This release has been sitting in my Dropbox for a month or so. It’s still worth noting especially since security became news a couple of times in the past few weeks.

ODVA announces that user level authentication has been added to CIP Security, the cybersecurity network extension for EtherNet/IP. Previous versions of the CIP Security specification included key security properties such as a broad trust domain across a group of devices, data confidentiality, device authentication, device identity, and device integrity. CIP Security now adds a narrow trust domain by user and role, an improved device identity that includes the user, and user authentication.

As IT and OT converge in industrial automation, the ability for controls engineers, IT administrators, and maintenance operators to securely access and modify device parameters grows even more critical. Device level security is a building block requirement of IIoT to protect critical assets and people from potential physical and increasingly likely financial harm. To meet this requirement, the robust CIP Security User Authentication Profile will provide user level authentication with a fixed user access policy based on well-defined roles and basic authorization via both local and central user authentication. CIP Security’s ability to authenticate via the device or through a central server allows for simplicity in smaller, simple systems and efficiency in large, complicated installations.

CIP Security already included robust, proven, and open security technologies including TLS (Transport Layer Security) and DTLS (Datagram Transport Layer Security); cryptographic protocols used to provide secure transport of EtherNet/IP traffic, hashes or HMAC (keyed-Hash Message Authentication Code) as a cryptographic method of providing data integrity and message authentication to EtherNet/IP traffic; and encryption as a means of encoding messages or information in such a way as to prevent reading or viewing of EtherNet/IP data by unauthorized parties. The new CIP™ User Authentication Profile provides user-level authentication for CIP communication at the application layer. In the future, CIP Security may make use of a CIP authorization profile that will enhance CIP to provide additional security properties such as general, flexible authorization where access policy can be based on any attribute of the user and/or system and potentially extending CIP Security to support other non-EtherNet/IP networks.

The new User Authentication Profile makes use of several open, common, ubiquitous technologies, including OAuth 2.0 and OpenID Connect for cryptographically protected token-based user authentication, JSON Web Tokens (JWT) as proof of authentication, usernames and passwords, and already existing X.509 certificates to provide cryptographically secure identities to users and devices. It uses a cryptographically secure user authentication session ID, generated by the target on presentation of a valid JWT by the user, to map between an authentication event and the messages sent by a user for CIP communications. The user authentication session ID is transmitted over EtherNet/IP using (D)TLS and a confidentiality-enabled cipher suite per CIP Security’s EtherNet/IP confidentiality profile.
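The flow described above, verify a cryptographically signed token, then mint a secure session ID that maps subsequent messages back to the authenticated user, can be sketched with Python's standard library. This is a simplified illustration of the general pattern, not the CIP Security wire format or token layout; the shared key and claims are invented, and a real JWT would be verified against X.509 certificate chains rather than a demo HMAC key.

```python
import hashlib
import hmac
import os

# Simplified illustration (NOT the CIP Security wire format): verify an
# HMAC-signed token, then issue a random session ID that the target maps
# to the authenticated user for subsequent messages.

SECRET = b"shared-demo-key"  # hypothetical stand-in for the real trust anchor

def sign(claims: bytes) -> bytes:
    """Compute the HMAC-SHA256 tag over the token claims."""
    return hmac.new(SECRET, claims, hashlib.sha256).digest()

def open_session(claims: bytes, tag: bytes, sessions: dict) -> bytes:
    """Validate the token; on success mint a session ID bound to the user."""
    if not hmac.compare_digest(sign(claims), tag):
        raise PermissionError("token verification failed")
    session_id = os.urandom(16)  # cryptographically random, per the spec's intent
    sessions[session_id] = claims
    return session_id
```

The session ID does the same job as in the profile: after one authentication event, every later message carries only the ID, which the target looks up to recover the user's identity and role.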

“User authentication is another critical step in the development of CIP Security, a key network extension that is a part of the complete EtherNet/IP industrial communication ecosystem. CIP Security, as a part of a defense in depth approach, is designed as an effective deterrence to malicious cyber attackers who are looking for targets to disrupt plant operations,” stated Jack Visoky, EtherNet/IP System Architecture Special Interest Group (SIG) vice-chair. “With connected infrastructure and automation systems, CIP Security is more critical than ever before to protect valuable investments and production of essential products around the world from malicious cybersecurity attacks” said Dr. Al Beydoun, President and Executive Director of ODVA. “ODVA will continue to invest in the future development of CIP Security and EtherNet/IP to ensure that end users are protected from physical and financial harm perpetrated by bad actors.”

Through this update, CIP Security now offers even stronger device level security with a narrow trust domain by user and role, an improved device identity including the user, and fixed user authentication. ODVA continues to work to make sure that CIP Security stays on the cutting edge of device defense to best protect critical industrial automation assets to make sure that the promise of IIoT and Industry 4.0 can be fully achieved. Visit odva.org to obtain the latest version of The EtherNet/IP Specification including CIP Security.

OPC Foundation In The News

The OPC Foundation released a lot of news at the end of 2020. Its working groups have been busy.

  • Cloud Library with CESMII
  • Field Level Communications
  • ECLASS Standard for M2M Communication
  • Update to the PLCopen IEC 61131-3 specification

Cloud Library

The OPC Foundation, in collaboration with CESMII, is pleased to announce the launch of the “OPC UA Cloud Library” Joint Working Group (JWG). The goal of the JWG is to specify how OPC UA information models of machines, SCADA and Manufacturing Execution Systems will be stored in and accessed from a cloud-based database.  Such a database will enable manufacturers to draw from a wide range of OPC UA information models and profiles for use in their pre-built shopfloor and business digitalization applications.

Collaboration between the OPC Foundation and the Smart Manufacturing Institute is a natural fit given their complementary efforts. On one hand, the US government-backed Smart Manufacturing Institute sets out to accelerate the adoption of Smart Manufacturing by businesses of all sizes by enabling frictionless movement of information (data and context) between real-time operations and the people and systems that create value in their organizations. On the other hand, the OPC Foundation created a globally adopted open data interoperability standard via its OPC UA specification. The specification’s information modeling capabilities and secure, scalable communications made it a cornerstone of Industrie 4.0 and virtually every other national Industrial IoT initiative. By working together, CESMII and the OPC Foundation aim to enable the broadest range of US manufacturers and beyond to innovate and go to market in their digital transformation using the right data modeling foundation.

Field Level Communications

The OPC Foundation announced that its Field Level Communications Initiative has reached a significant milestone, completing its initial release candidate focused on the Controller-to-Controller (C2C) use case. In addition, a technical paper has been published that explains the technical approach and the basic concepts for extending OPC UA to the field level for all use cases and requirements in Factory and Process Automation.

Peter Lutz, Director Field Level Communications of the OPC Foundation says: “We are happy about the progress that our working groups have made over the last months, despite COVID-19 and the associated restrictions. The initial release candidate is a major achievement because it is used to build prototypes and to create test specifications that will be converted to corresponding test cases for the OPC UA certification tool (CTT). Furthermore, it lays the foundation for specification enhancements to also cover the Controller-to-Device (C2D) and Device-to-Device (D2D) use cases in the next step.”

Since the start of the Field Level Communications Initiative in November 2018, more than 300 experts from over 60 OPC Foundation member companies have signed up for the various technical working groups to develop the technical concepts and specification content for extending the OPC UA framework to field-level communications, including Determinism, Motion, Instruments, and Functional Safety.

Cooperation with ECLASS

An important step for interoperability in the field of M2M communication: the OPC Foundation and ECLASS e.V. signed a cooperation agreement.

The goal of this cooperation is to combine the power of the OPC UA and ECLASS standards to better enable M2M interoperability via seamless communication of data and semantics using a standardized set of interfaces. To serve as the basis for semantic interoperability across full product life cycles in an international application environment, a manufacturer- and industry-independent standard for product description is needed. Once created, such a standard can serve as a semantic reference for the Internet of Things. The ECLASS standard, developed by ECLASS e.V., meets these requirements in a unique way.

The OPC UA standard enables secure transmission of data and facilitates the definition and dynamic exchange of its underlying structure via robust OPC UA information modeling functionality. Standardized information models implemented using OPC UA are called OPC UA Companion Specifications, which, taken together, can serve as common libraries of Information Models. Products utilizing OPC UA Companion Specifications enable seamless third-party data interoperability in the operating phase of the product lifecycle. Today, ECLASS identifiers are already being used in various Companion Specifications.
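
How ECLASS identifiers give OPC UA properties a shared semantic reference can be sketched as a simple mapping. ECLASS assigns each dictionary concept a globally unique identifier (an IRDI); the property names and IRDI values below are illustrative assumptions, not taken from a published Companion Specification:

```python
# Sketch of attaching ECLASS semantic references to OPC UA properties.
# The mapping below is invented for illustration; real Companion
# Specifications define which properties carry which ECLASS identifiers.
from typing import Optional

# An IRDI is a globally unique dictionary reference,
# e.g. strings of the shape "0173-1#02-XXXXXX#00N".
SEMANTIC_IDS = {
    "Manufacturer": "0173-1#02-AAO677#002",  # example IRDI, for illustration
    "SerialNumber": "0173-1#02-AAM556#002",  # example IRDI, for illustration
}


def semantic_id(property_name: str) -> Optional[str]:
    """Look up the ECLASS reference for an OPC UA property, if one is mapped."""
    return SEMANTIC_IDS.get(property_name)
```

With such references in place, two systems that have never exchanged schemas can still agree that a given property means "manufacturer", because both resolve the same dictionary entry.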

Update to PLCopen OPC UA for IEC61131-3

The OPC Foundation, in collaboration with PLCopen, announced the release of v1.02 of the “OPC UA for IEC61131-3” specification. Building on the first version of the specification, the joint working group added: 

  • support for all datatypes defined in the 3rd edition of IEC61131-3
  • an optimized, machine-readable version of the information model (i.e. nodesetfile) 
  • compliance with enhanced specification templates to support the tool chain used to generate validated information models
  • inclusion in the global online searchable specification reference 
  • OPC Foundation Compliance Test Tool (CTT)  test cases for validation of vendor implementations of “OPC UA for IEC61131-3”

Founded in 2008, this joint working group has the goal of expressing IEC 61131-3 information models using OPC UA. By doing so, an IEC 61131-3 PLC project that is loaded onto different control platforms can be displayed in a standardized form and made available for communication via the controllers’ OPC UA servers.
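
The heart of such a mapping is translating IEC 61131-3 data types into OPC UA built-in types so that any client sees the PLC project in a standardized form. The subset below is a sketch from general knowledge of the two type systems; the normative table lives in the “OPC UA for IEC61131-3” specification itself:

```python
# Illustrative mapping of a few IEC 61131-3 elementary data types to OPC UA
# built-in types, in the spirit of "OPC UA for IEC61131-3". This subset is
# a sketch; consult the specification for the complete, normative table.
IEC_TO_OPCUA = {
    "BOOL":   "Boolean",
    "SINT":   "SByte",
    "INT":    "Int16",
    "DINT":   "Int32",
    "LINT":   "Int64",
    "REAL":   "Float",
    "LREAL":  "Double",
    "STRING": "String",
}


def to_opcua_type(iec_type: str) -> str:
    """Translate an IEC 61131-3 elementary type name to its OPC UA counterpart."""
    try:
        return IEC_TO_OPCUA[iec_type]
    except KeyError:
        raise ValueError(f"no mapping defined for {iec_type!r}")
```

Because every vendor's server applies the same mapping, a client browsing a `DINT` variable on one platform and on another sees the same OPC UA `Int32` node either way.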

“This first step harmonizes the access of the project running in the controller,” says Eelco van der Wal, Managing Director of the PLCopen organization. “With this an unprecedented transparency is created in the communication in industrial automation, enabling the configuration of the communication much faster and independent of the network and suppliers. For this reason, many suppliers have implemented this, providing their users with the ease of use in communication.”

In addition to the server specification “UA for IEC61131-3,” the group also worked very successfully on the client specification, originally released in 2014. Implementing this functionality on a controller makes it possible to initiate a communication session to any other available OPC UA server. The controller can exchange complex data structures horizontally with other controllers, independent of the fieldbus system used, or vertically with other systems via an OPC UA service-oriented architecture, such as an MES/ERP system, in order to collect data or write new production orders to the cloud.