Biden Memo and Research Highlight Recent Security Updates

A number of security-related news items came my way during the past couple of weeks. The Biden administration memo brought a surge of comments; I’ve included one from Marty Edwards. Several company research teams have also published interesting and useful findings on threats and vulnerabilities.

  • MITRE Engenuity ATT&CK Evaluations
  • Google on Measuring Risk in Open Source
  • Open Source Security Foundation Adds Members
  • Claroty Research Team82 Finds ICS Vulnerabilities
  • Industry Veteran Marty Edwards Shares Thoughts on Biden’s Security Memo

Engenuity ATT&CK Evaluations

MITRE Engenuity released results from its first round of independent ATT&CK Evaluations for Industrial Control Systems (ICS). The evaluations examined how cybersecurity products from five ICS vendors detected the threat of Russian-linked Triton malware.

The malware targets safety systems, preventing operators from responding to failures, hazards and other unsafe conditions, potentially causing physical destruction.

The evaluations use ATT&CK for ICS, a MITRE-curated knowledge base of adversary tactics, techniques, and procedures based on known threats to industrial control systems.

The evaluations, which were paid for by the participating vendors, included products from Armis, Claroty, Microsoft (via its CyberX acquisition), Dragos, and the Institute for Information Industry.

“MITRE Engenuity’s ATT&CK Evaluations program is built on the backbone of MITRE’s integrity and commitment to making the world a safer, more secure place,” said Frank Duff, general manager of the ATT&CK Evaluations program. “Vendors trust us to improve their offerings, and the community trusts that we’ll provide transparency into the technology that is necessary to make the best decisions for their unique environment. Unlike closed door assessments, we use a purple teaming approach with the vendor to optimize the evaluation process. MITRE experts provide the red team while the vendor provides the blue team to ensure complete visibility, while allowing the vendor to learn directly from ATT&CK experts.” 

Google Measuring Risk in Open Source

by Kim Lewandowski, Azeem Shaikh, Laurent Simon, Google Open Source Security Team

Contributors to the Scorecards project, an automated security tool that produces a “risk score” for open source projects, have accomplished a lot since our launch last fall. Today, in collaboration with the Open Source Security Foundation community, we are announcing Scorecards v2. We have added new security checks, scaled up the number of projects being scored, and made this data easily accessible for analysis.

Since last fall, Scorecards’ coverage has grown; we’ve added several new checks, following the Know, Prevent, Fix framework proposed by Google earlier this year, to prioritize our additions.

Contributors with malicious intent or compromised accounts can introduce potential backdoors into code. Code reviews help mitigate such attacks. With the new Branch-Protection check, developers can verify that the project enforces mandatory code review from another developer before code is committed.
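
As a rough illustration of what this check verifies, the sketch below queries GitHub’s REST API for a branch’s protection settings and reports whether pull request reviews are required. It is a minimal sketch, not the Scorecards implementation; the repository and branch names are placeholders, and it assumes a token with permission to read branch protection.

```python
# Minimal sketch (not the Scorecards implementation): check whether a branch
# requires pull request reviews via GitHub's REST API. Assumes a token with
# permission to read branch protection; repo/branch names are placeholders.
import os
import requests

OWNER, REPO, BRANCH = "example-org", "example-repo", "main"  # placeholders

resp = requests.get(
    f"https://api.github.com/repos/{OWNER}/{REPO}/branches/{BRANCH}/protection",
    headers={
        "Authorization": f"token {os.environ['GITHUB_TOKEN']}",
        "Accept": "application/vnd.github.v3+json",
    },
    timeout=10,
)

if resp.status_code == 200:
    reviews = resp.json().get("required_pull_request_reviews")
    print("Mandatory code review enforced" if reviews else "No review requirement")
else:
    # A 404 typically means branch protection is not enabled (or no access).
    print(f"Could not read branch protection (HTTP {resp.status_code})")
```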

Despite best efforts by developers and peer reviews, vulnerable code can enter source control and remain undetected. We have added checks to detect whether a project uses Fuzzing and SAST tools as part of its CI/CD system.

A common CI/CD solution used by GitHub projects is GitHub Actions. A danger with these action workflows is that they may handle untrusted user input, meaning an attacker can craft a malicious pull request to gain access to the privileged GitHub token and, with it, the ability to push malicious code to the repo without review. To mitigate this risk, Scorecards’ Token-Permissions prevention check now verifies that GitHub workflows follow the principle of least privilege by making GitHub tokens read-only by default.

To date, the Scorecards project has scaled up to evaluate security criteria for over 50,000 open source projects. To scale the project, we undertook a massive redesign of our architecture, moving to a PubSub model that achieves horizontal scalability and higher throughput. This fully automated tool periodically evaluates critical open source projects and exposes the Scorecards check information through a public BigQuery dataset that is refreshed weekly.

This data can be retrieved using the bq command line tool.
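
For readers who prefer a client library over the bq CLI, here is a minimal sketch in Python using the google-cloud-bigquery package. The table path in the query is a placeholder; check the Scorecards documentation for the current public dataset name, and note that running the query requires Google Cloud credentials.

```python
# Minimal sketch: pull a few rows from the public Scorecards BigQuery dataset.
# The table path below is a placeholder; see the Scorecards docs for the
# current public dataset name. Requires the google-cloud-bigquery package
# and application default credentials.
from google.cloud import bigquery

client = bigquery.Client()

# Placeholder table path; substitute the dataset published by the project.
query = "SELECT * FROM `openssf.scorecardcron.scorecard_latest` LIMIT 10"

for row in client.query(query).result():
    print(row)
```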

Scorecards data for available projects is now included in the recently announced Google Open Source Insights project and is also showcased in the OpenSSF Security Metrics project. The data on these sites shows that there are still important security gaps to fill, even in widely used packages like Kubernetes.

There are a few big enhancements we’re especially excited about:

  • Scorecards Badges – GitHub badges to show off compliance
  • Integration with CI/CD and GitHub Code Scanning Results
  • Integration with the Allstar project – a GitHub App for enforcing security policies

Open Source Security Foundation Adds 10 Members

OpenSSF, a cross-industry collaboration to secure the open source ecosystem, announced new membership commitments to advance open source security education and best practices. New members include Accurics, Anchore, Bloomberg Finance, Cisco Systems, Codethink, Cybertrust Japan, OpenUK, ShiftLeft, Sonatype and Tidelift. 

The new Scorecards v2 is also available now; it adds new security checks, scales up the number of projects being scored, and makes the data easily accessible for analysis. Scorecards is gaining adoption for automating analysis and trust decisions on the security posture of open source projects.

OpenSSF’s working groups include Securing Critical Projects, Security Tooling, Identifying Security Threats, Vulnerability Disclosures, Digital Identity Attestation, and Best Practices.

Claroty Finds Critical Vulnerabilities

Claroty, the industrial cybersecurity company, launched Team82, its new research arm that provides indispensable vulnerability and threat research to Claroty customers and defenders of industrial networks worldwide. Additionally, Team82 released a new report on critical vulnerabilities found in cloud-based management platforms for industrial control systems (ICS), highlighting the rise of ICS in the cloud and the growing need to secure cloud implementations in industrial environments. 

In its latest report, “Top-Down and Bottom-Up: Exploiting Vulnerabilities in the OT Cloud Era,” Team82 researched the exploitability of cloud-based management platforms responsible for monitoring ICS, and developed techniques to exploit vulnerabilities in automation vendor CODESYS’ Automation Server and vulnerabilities in the WAGO PLC platform. Team82’s research mimics the top-down and bottom-up paths an attacker would take to either control a Level 1 device in order to eventually compromise the cloud-based management console, or the reverse, commandeer the cloud in order to manipulate all networked field devices. 

The new Team82 Research Hub includes the team’s latest research reports, a vulnerability dashboard for tracking the latest disclosures, its coordinated disclosure policy for working with affected vendors, its public PGP Key for securely and safely exchanging vulnerability and research information, and other resources. 

To access the Team82 Research Hub, visit claroty.com/team82.

Read the report, “Top-Down and Bottom-Up: Exploiting Vulnerabilities In the OT Cloud Era.”

Marty Edwards, Tenable, on Biden Memo

You can find Edwards’ thoughts at this blog site. Below are some excerpts.

Recent activity from the Biden Administration represents a watershed moment in the establishment of baseline standards for preparing, mitigating and responding to attacks that impact the critical infrastructure we all rely on.

The most substantive thrust of these government actions is recognizing and acting on the accelerated trend of reconnaissance and attack by establishing the Industrial Control Systems (ICS) Cybersecurity Initiative. The ICS Initiative is a voluntary, collaborative effort between the federal government and the critical infrastructure community to protect U.S. critical infrastructure “by encouraging and facilitating deployment of technologies and systems that provide threat visibility, indications, detection, and warnings, and that facilitate response capabilities for cybersecurity in essential control system and operational technology networks,” with a primary goal of “greatly expand[ing] deployment of these technologies across priority critical infrastructure.”

Tenable encourages CISA and the U.S. government to take an open, technology-neutral, standards-based approach in the development of these goals. Core elements for consideration as the most appropriate and successful methods of disrupting attack paths and securing critical infrastructure and OT environments revolve around three key pillars:

  • Visibility: Gain full visibility and deep situational awareness across your converged IT/OT environment.
  • Security: Protect your industrial infrastructure from advanced cyberthreats and risks posed by hackers and malicious insiders.
  • Control: Take full control of your operations network by continuously tracking ALL changes to any ICS device.

Linux Foundation Launches Research, Training, and Tools to Advance Adoption of Software Bill of Materials

My latest podcast topic contains thoughts on open source. This announcement from The Linux Foundation merges open source with the latest cybersecurity concerns through several launches around the Software Bill of Materials (SBOM). The industry continues to take small steps toward security. When a community gathers to work on a solution, it’s a big help.

Home to the industry’s most supported open standard for exchanging information about what is in software – SPDX – the Linux Foundation brings its complete resources to bear to support private and public sector supply chain security.

The Linux Foundation, the nonprofit organization enabling mass innovation through open source, today announced new industry research, a new training course, and new software tools to accelerate the adoption of Software Bill of Materials (SBOMs). 

President Biden’s recent Executive Order on Improving the Nation’s Cybersecurity referenced the importance of SBOMs in protecting and securing the software supply chain.

The de-facto industry standard, and most widely used approach today, is called Software Package Data Exchange (SPDX). SPDX evolved organically over the last ten years to suit the software industry, covering issues like license compliance, security, and more. The community consists of hundreds of people from hundreds of companies, and the standard itself is the most robust, mature, and adopted SBOM in the market today. 

“As the architects of today’s digital infrastructure, the open-source community is in a position to advance the understanding and adoption of SBOMs across the public and private sectors,” said Mike Dolan, Senior Vice President and General Manager, Linux Foundation Projects. “The rise in cybersecurity threats is driving a necessity that the open-source community anticipated many years ago to standardize on how we share what is in our software. The time has never been more pressing to surface new data and offer additional resources that help increase understanding about how to generate and adopt SBOMs.”

An SBOM is an account of the components contained in a piece of software. It can be used to ensure developers understand what software is being shared throughout the supply chain and in their projects or products, and it supports the systematic review of each component’s licenses to clarify what obligations apply to the distribution of the supplied software.

SBOM Readiness Survey

Linux Foundation Research is conducting the SBOM Readiness Survey. It will examine obstacles to SBOM adoption and the future actions required to overcome them as they relate to the security of software supply chains. The recent US Executive Order on Cybersecurity emphasizes SBOMs, and this survey will help identify industry gaps in SBOM application. Survey questions address tooling, security measures, and industries leading in producing and consuming SBOMs, among other topics. For more information about the survey and to participate, please visit {Hilary blog}.

New Course: Generating a Software Bill of Materials

The Linux Foundation is also announcing a free, online training course, Generating a Software Bill of Materials (LFC192). This course provides foundational knowledge about the options and tools available for generating SBOMs and how to use them to improve the ability to respond to cybersecurity needs. It is designed for directors, product managers, open-source program office staff, security professionals, and developers in organizations building software. Participants will walk away with an understanding of the minimum elements of an SBOM, how those elements can be assembled, and some of the open-source tooling available to support the generation and consumption of an SBOM.

New Tools: SBOM Generator

Also announced today is the availability of the SPDX SBOM generator, a command-line interface (CLI) tool that generates SBOM information, including the components, licenses, copyrights, and security references of your software, using the SPDX v2.2 specification and aligning with the current known minimum elements from NTIA. Currently, the CLI supports GoMod (Go), Cargo (Rust), Composer (PHP), DotNet (.NET), Maven (Java), NPM (Node.js), Yarn (Node.js), PIP (Python), Pipenv (Python), and Gems (Ruby). It is easy to embed in automated processes such as continuous integration (CI) pipelines and is available for Windows, macOS, and Linux.
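
As a sketch of what embedding the generator in a CI step might look like, the Python wrapper below shells out to the CLI and fails the pipeline if generation fails. The binary name and the flags used are assumptions for illustration; check the tool’s own help output for the exact interface.

```python
# Minimal sketch: run the SPDX SBOM generator as part of a CI step.
# The binary name and the -p/-o flags are assumptions for illustration;
# consult the tool's help output for the exact interface.
import subprocess
import sys


def generate_sbom(project_dir: str, output_dir: str) -> None:
    """Invoke the SBOM generator CLI and fail loudly on error."""
    result = subprocess.run(
        ["spdx-sbom-generator", "-p", project_dir, "-o", output_dir],
        capture_output=True,
        text=True,
    )
    if result.returncode != 0:
        # Surface the tool's error output and fail the pipeline.
        print(result.stderr, file=sys.stderr)
        sys.exit(result.returncode)
    print(f"SBOM files written to {output_dir}")


if __name__ == "__main__":
    generate_sbom(".", "./sbom-output")
```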

About the Linux Foundation

Founded in 2000, the Linux Foundation is supported by more than 1,000 members and is the world’s leading home for collaboration on open-source software, open standards, open data, and open hardware. Linux Foundation’s projects are critical to the world’s infrastructure, including Linux, Kubernetes, Node.js, and more.  The Linux Foundation’s methodology focuses on leveraging best practices and addressing the needs of contributors, users, and solution providers to create sustainable models for open collaboration.

Proprietary Software Is An Evolutionary Dead End

The thing that gets everyone working together, each person doing their own self-interested thing, makes the whole better. That was Matt Mullenweg, WordPress founder and CEO of Automattic, on his podcast Distributed, discussing distributed work and open source software.

WordPress.org is an open source content management tool. WordPress.com is a for-profit company that sells tools and services for WordPress. Automattic owns both, along with other companies (having acquired Tumblr last year). WordPress powers about 38% of the world’s websites, including this one.

Mullenweg’s podcast shares his experiences with distributed work (the company’s only office is the one it acquired with Tumblr). Automattic gets the benefit of talented people no matter where they reside. And the asynchronous work and other policies it has adopted have created a loyalty where turnover is very low.

Back to open source.

The IT companies discovered long ago the immense benefits of open standards, open APIs, and open source. Microsoft, HPE, Dell Technologies, and many others have put a lot of their code into open source. They allow employees to work on open source projects. Why? Just as Mullenweg said, the community makes everything better. Can you build a company with a technological foundation on open source? Check out Red Hat. Or, WordPress.com.

Interviews and posts over the past few months point to momentum in the industrial market, always a technological laggard, toward open standards and open source. I, for one, hope it not only continues but grows. End users are finally getting enough clout to get their voices heard by suppliers.

Check out the Open Process Automation Forum work. And the advances made by the Linux Foundation. And the Industrial Internet Consortium. We are getting close to the end users’ desired state of interoperability.

This will not be bad for the proprietary vendors–at least those who can adapt and think ahead. Customer lock-in only works for a while until the customer finally feels it’s been gouged too much and bites the bullet for the pain of change. The end game for the benefit of the market and for society is more efficient and productive end user manufacturers. Sometimes we forget that.

LF Edge Organization Adds Members, Projects, SIG

  • LF Edge welcomes Foxconn Industrial Internet (FII), HCL Technologies, OpenNebula, Robin.io, and Shanghai Open Source Information Technology Association to its robust roster of community members working to advance open source at the edge
  • FII sponsors Smart Agriculture SIG within the Open Horizon Project 
  • Home Edge’s Coconut release leverages EdgeX Foundry to enable more data storage options across different devices

Momentum within the open-source community continues to build. New members are joining and new work projects add utility. LF Edge, an umbrella organization within the Linux Foundation that aims to establish an open, interoperable framework for edge computing independent of hardware, silicon, cloud, or operating system, has announced the addition of four new general members (FII, HCL, OpenNebula, and Robin.io) and one new Associate member (Shanghai Open Source Information Technology Association). Additionally, Home Edge has released its third platform update with new Data Storage and Multi-NAT Edge Device Communications (MNDEC) features.

“We are pleased to see continued growth among the LF Edge ecosystem,” said Arpit Joshipura, general manager, Networking, Edge and IoT, the Linux Foundation. “Open edge technology crosses verticals and is poised to become larger than the cloud native ecosystem. We are honored to be at the forefront of this budding industry and welcome all of our new members.”

Approaching its two-year mark as an umbrella project, LF Edge currently comprises nine projects – Akraino, Baetyl, Fledge, EdgeX Foundry, Home Edge, Open Horizon, Project EVE, Secure Device Onboard (SDO), and State of the Edge – that support emerging edge applications across areas such as non-traditional video and connected things that require lower latency, and faster processing and mobility. LF Edge helps unify a fragmented edge market around a common, open vision for the future of the industry.

One of the project’s newest general members, FII, plans to sponsor a Smart Agriculture Special Interest Group (SIG) within the Open Horizon project and soon, an LF Edge Smart Agriculture End User Solution Group (EUSG).  Open Horizon SIGs are designed to identify use cases within an industry or vertical, develop end-to-end reference solutions, and promote the solutions, use cases, and the group itself.  LF Edge EUSGs gather user requirements through cross-project discussion with end users and generate demos, white papers, and messaging in order to prioritize project features and inform project roadmaps.

Home Edge’s third release, Coconut, is now available. Coconut includes new features such as Multi-NAT communications (which enables discovery of devices in different NATs) and Data Storage. In collaboration with EdgeX Foundry, a centralized device can be designated as a primary device to store the data from different devices. Home Edge expects its next release will be available in 2021 and will include real-time data analytics features. More information on Home Edge Coconut is available here.

LF Edge sessions from the 2020 Open Networking and Edge Summit (ONES) virtual experience are available on demand.

LF Edge’s “On the Edge with LF Edge” webinar series, launched this year, has produced seven episodes to date, with overview discussions of each of the projects.

More details on LF Edge, including how to join as a member, details on specific projects and other resources, are available.

“HCL Technologies is excited to join the Linux Foundation Edge community. We look forward to collaborating with industry leaders to help create and promote Edge Computing frameworks for wider adoption, leveraging our expertise and investments in IoT, 5G and Cloud technologies,” said GH Rao, President, Engineering R&D Services, HCL Technologies.

“We are delighted to join the LF Edge community as part of our ONEedge initiative”, said Constantino Vazquez, Chief Operations Officer, OpenNebula. “OpenNebula, which has traditionally been used for Private Clouds, has now become a True Hybrid Cloud platform with a number of powerful features for making deployments at the edge much easier for organizations that want to rely on open-source technologies and retain the freedom of being able to use on-demand the providers, locations and resources that they really need.”

“Robin.io is excited to join the Linux Foundation Edge,” said Partha Seetala, CEO and founder of Robin.io. “Open edge technologies is a large and exciting opportunity for Robin technology and cloud-native ecosystem; we are honored to be at the forefront of the enabling and promoting of Kubernetes-based Edge Computing frameworks for wider adoption.”

“The Shanghai Open Source Information Technology Association has joined LF Edge to implement open source innovation in China, promote the establishment of an open source ideological and cultural system that is compatible with the innovative application of the Digital Economy, and promote the design of laws, regulations and institutional frameworks that are conducive to the ecological development of open source.”

ONF Announces Aether 5G Connected Edge Cloud Platform

Many industry pundits and observers seem not to understand all the ramifications and potential of 5G. I’ve listened to John Gruber at Daring Fireball and the guys at Accidental Tech Podcast talk about how 5G isn’t providing the anticipated boost in data speeds for their new iPhone 12s. But 5G provides for so much more than that.

I’ve had an opportunity to talk with people from the new Open Networking Foundation and check out this open-source community springing up. Here is a recent press release. Open source is burgeoning right now. Cynics say it’s just a way for big companies to cut development costs. I think it goes much deeper than that given licensing protocols and the spread of technology. This one is interesting and poised to take (among other things) Industrial Internet of Things to a deeper level.

The Open Networking Foundation (ONF) announced that ONF’s Aether 5G Connected Edge Cloud platform is being used as the software platform for the $30M DARPA Pronto project, pursuing research to secure future 5G network infrastructure.

DARPA is funding ONF to build, deploy and operate the network to support research by Cornell, Princeton and Stanford universities in the areas of network verification and closed-loop control. ONF will enhance and deploy its open source Aether software platform as the foundation for the Pronto research work, and in turn the research results will be open sourced back into Aether to help advance Aether as a platform for future secure 5G network infrastructure.

Aether – 5G Connected Edge Cloud Platform

Aether is the first open source 5G Connected Edge Cloud platform. Aether provides mobile connectivity and edge cloud services for distributed enterprise networks as a cloud managed offering. Aether is an open source platform optimized for multi-cloud deployments, and it simultaneously supports wireless connectivity over licensed, unlicensed and lightly-licensed (CBRS) spectrum.

Aether is a platform for enabling enterprise digital transformation projects. Coupling robust cellular connectivity with connected edge cloud processing creates a platform for supporting Industrial Internet-of-Things (IIoT) and Operational Technology (OT) services like robotics control, onsite inference processing of video feeds, drone control and the like.

Given Aether’s end-to-end programmable architecture coupled with its 5G and edge cloud capabilities, Aether is well suited for supporting the Pronto research agenda.

Aether Beta Deployment

ONF has operationalized and is running a beta production deployment of Aether.  This deployment is a single unified cloud managed network interconnecting the project’s commercial partners AT&T, Ciena, Intel, Google, NTT, ONF and Telefonica. This initial deployment supports CBRS and/or 4G/LTE radio access at all sites, and is cloud managed from a shared core running in the Google public cloud.

The university campuses are being added to this Aether deployment in support of Pronto. Campus sites will be used by Pronto researchers to advance the Pronto research, serving as both a development platform and a testbed for use case experimentation. The Aether footprint is expected to grow on the university campuses as Aether’s 5G Connected Edge Cloud capabilities are leveraged both for research on additional use cases and for select campus operations.

Aether Ecosystem

A growing ecosystem is backing Aether, collectively supporting the development of a common open source platform that can serve as an enabler for digital transformation projects, while also serving as a common platform for advanced research poised to help unlock the potential of the programmable network for more secure future 5G infrastructure.

“At Google Cloud, we are working closely with the telecom ecosystem to help enable 5G transformation, accelerated by the power of cloud computing. We are pleased to support the Open Networking Foundation’s work to extend the availability of 5G and edge capabilities via an open source platform.”

Shailesh Shukla, VP and GM, Networking, Google Cloud

“Cornell is deploying Aether on campus to bring private 5G/LTE connectivity services with edge cloud capabilities into our research facilities. We expect private 5G/LTE with connected edge cloud to become an important and integral part of our research infrastructure for many research and operational groups on the campus. We also see the value of interconnecting a nation-wide leading infrastructure with Stanford, Princeton and ONF for collaborative research among university researchers across the country.”

David Lifka, Vice President for Information Technologies and CIO, Cornell University

“Princeton University is deploying Aether on campus in the Computer Science Department in order to support the Pronto research agenda and offer it as an experimental infrastructure for other research groups. This deployment will enable private 5G/LTE connectivity and edge cloud services and will complement Princeton’s existing P4 enabled infrastructure on campus. We plan to also explore how some of our mission critical production use cases can be supported on a private 5G Connected Edge Cloud.”

Jay Dominick, Vice President & CIO, Princeton University

“Ciena is pleased to be an early collaborator on the ONF’s Aether project. We have an Aether site running in our 5G lab in Montreal, and we are excited by the prospect of helping enterprises leverage the 5G and edge cloud capabilities of Aether to help build transformative solutions.”

Stephen Alexander, Senior Vice President and Chief Technology Officer, Ciena

 “Intel is an active participant of the ONF’s innovative Aether project to advance the development of 5G and edge cloud solutions on high volume servers. ONF has been leading the industry with advanced open source implementations in the areas of disaggregated Mobile Core, e.g. the Open Mobile Evolved Core (OMEC), and we look forward to continuing to innovate by applying proven principles of disaggregation, open source and AI/ML with Aether, the Enterprise 5G/LTE Edge-Cloud-as-a-Service platform. As open source, Aether will help accelerate the availability of innovative edge applications. Aether will be optimized to leverage powerful performance, AI/ML, and security enhancements, which are essential for 5G and available in Intel® Xeon® Scalable Processors, network adapters and switching technologies, including Data-Plane Development Kit (DPDK), Intel® Software Guard Extensions (Intel SGX), and Intel® Tofino™ Programmable Ethernet Switch.”

Pranav Mehta, Vice President of Systems and Software Research, Intel Labs

Learn More

The Aether ecosystem is open to researchers and other potential partners who wish to build upon Aether, and we welcome inquiries regarding collaboration.  You can learn more at the Aether website.

About the Open Networking Foundation:

The Open Networking Foundation (ONF) is an operator-led consortium spearheading disruptive network transformation. Now the recognized leader for open source solutions for operators, the ONF first launched in 2011 as the standard bearer for Software Defined Networking (SDN). Led by its operator partners AT&T, China Unicom, Deutsche Telekom, Google, NTT Group and Türk Telekom, the ONF is driving vast transformation across the operator space. For further information visit http://www.opennetworking.org

NI Joins Open Manufacturing Platform Organization

I first heard about the Open Manufacturing Platform during my last trip to Germany (well, my last business trip anywhere) last February. I wrote about it here – Open Manufacturing Platform Expands. This effort, led by Microsoft and BMW and joined by ZF, Bosch, and ABInBev, “helps manufacturers leverage advanced technologies to gain greater operational efficiencies, factory output, customer loyalty, and net profits.” That’s a tall order. These are companies that I’ve seen leverage technology for improvements over the years. This should be an advancement.

There are two news items relating to OMP this month: NI, through its recent acquisition OptimalPlus, has joined the organization, and one of the OMP’s working groups has published a new deliverable.

NI says that it has joined OMP “with the goal of establishing an architecture and standards for auto manufacturers to better leverage and automate analytics to improve quality, reliability and safety.”

I had an opportunity to interview Michael Schuldenfrei, NI Fellow and OptimalPlus CTO, about smart manufacturing, what OptimalPlus adds to NI, and OMP. The roots of OptimalPlus lie in enterprise software for semiconductor manufacturing. An early customer was Qualcomm, which used the software to collect and analyze data from its numerous manufacturing plants. The company branched out into assemblies with a new customer, Nvidia. Later it added mechatronics to its portfolio, which was a good tie-in with NI.

Rather than become just another smart manufacturing application focusing on machines, OptimalPlus brings its focus to the product being manufactured. Given NI’s strength in test and measurement, this was a definite synergy. As I have written before here and here, this enterprise software addition to NI’s portfolio is just what the company needs to advance a level.

Michael told me he was an early advocate for OMP after seeing how his technology worked with Tier 1 automotive suppliers to drive the digital transformation process.

NI announced that its latest acquisition, OptimalPlus, has joined the Open Manufacturing Platform (OMP), a consortium led by BMW, Microsoft, ZF, Bosch and ABInBev that helps manufacturers leverage advanced technologies to gain greater operational efficiencies, factory output, customer loyalty, and net profits.

The OMP’s goals include creating a “Manufacturing Reference Architecture” for platform-agnostic, cloud-based data collection, management, analytics and other applications. This framework will provide a standard way to connect to IoT devices on equipment and define a semantic layer that unifies data across disparate data sources. All in all, this has the potential to create a rich, open-source ecosystem that enables faster and easier adoption of smart manufacturing technologies.

In the same way that interpreters at the United Nations help delegates communicate and make new policies, standardized data formats accelerate the adoption of big data and machine learning, creating a universal translator between multiple machine and process types. OptimalPlus, now part of NI, will bring to OMP its vast domain expertise in automotive manufacturing processes and provide leading production companies with actionable insights and adaptive methods from its big data analytics platform.

“We’re honored to be invited to join the prestigious Open Manufacturing Platform, which plays a key role in helping manufacturers all over the world innovate,” said Uzi Baruch, VP of NI’s Transportation business unit. “With pressure mounting to ensure quality and prevent faulty parts from shipping, it’s important that manufacturers have access to the transformative powers of AI, machine learning and big data analytics. We’re excited to collaborate with industry leaders in the OMP consortium to help manufacturers evolve and optimize their processes.”

AI and advanced analytics help to streamline manufacturing, reduce costs and improve quality, reliability and safety. OMP makes it easier for manufacturers to deploy this technology across their operations and fulfill the promise of smart manufacturing.

White Paper: Insights Into Connecting Industrial IoT Assets

The second bit of news describes a first deliverable from the OMP as it progresses toward its objective.

OMP announced delivery of a critical milestone with the publication of its first white paper. The IoT Connectivity Working Group, chaired by Sebastian Buckel and co-chaired by Dr. Veit Hammerstingl of the BMW Group, authored Insights Into Connecting Industrial IoT Assets. Contributions from member companies Capgemini, Cognizant, Microsoft, Red Hat, and ZF present a consensus view of the connectivity challenges and best practices in IIoT as the 4th industrial revolution unfolds. This paper is the initial publication laying out an approach to solving connectivity challenges while providing a roadmap for future OMP work.

Manufacturing at an Inflection Point

The intersection of information technology (IT) and operational technology (OT), as well as the advent of the Internet of Things (IoT), presents opportunities and threats to the entire manufacturing sector. In manufacturing, multiple challenges complicate the connection of sensors, actuators, and machines to a central data center. The lack of common standards and the prevalence of proprietary interfaces lead each engineer to solve similar problems, introducing inefficiencies and forcing the same learning curve to be climbed over and over. The long renewal cycles of shop floor equipment, software, and processes leave gaps in modern technology adoption and encourage a general avoidance of significant institutional change. This initial publication begins to tackle these problems and lays the groundwork for future, more detailed work.

IT/OT Convergence

Each connectivity challenge will have a range of diverse constituents, and this paper addresses issues faced by individuals and teams across job functions. Operational technology (OT) professionals are responsible for the commissioning, operation, and maintenance of shop floor equipment. Information technology (IT) personnel look after overall data processing, the hardware and software infrastructure, and enterprise-wide IT strategy. General managers and logistics teams are typically aligned at a corporate level, coordinating processes across a network of plants. Each of these functions includes roles spanning from hands-on operational to strategic and managerial. The unique demands of each role will require connectivity solutions that are forward-thinking and value-accretive, yet practical and implementable with minimal incremental investment.

Industrial IoT Challenges

The paper also explores IIoT devices’ critical real-time needs for repeatability and high availability. An example is an AI model that optimizes the parameters of a bending machine based on the current air temperature and humidity. Connection failures or high latencies can lead to stopped or interrupted processes, or to products of insufficient quality.
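
To make the bending-machine example concrete, here is a hypothetical sketch of the kind of guard such a control loop needs: recommended parameters are applied only if the prediction arrives within a latency budget, and a connection failure or a late response pauses the process rather than risking out-of-spec parts. The model, names, and thresholds are invented for illustration and are not taken from the OMP paper.

```python
# Hypothetical sketch of the bending-machine example: apply AI-recommended
# parameters only if the prediction arrives within a latency budget;
# otherwise pause the process rather than risk out-of-spec parts.
# All names and thresholds are illustrative, not from the OMP paper.
import time

LATENCY_BUDGET_S = 0.1  # illustrative real-time budget


def predict_bending_params(temperature_c: float, humidity_pct: float) -> dict:
    """Stand-in for a call to a cloud- or edge-hosted model."""
    # A trivial placeholder relationship; a real model would be trained
    # on historical process data.
    return {"press_force_kn": 50 + 0.2 * temperature_c - 0.05 * humidity_pct}


def control_step(temperature_c: float, humidity_pct: float):
    start = time.monotonic()
    try:
        params = predict_bending_params(temperature_c, humidity_pct)
    except ConnectionError:
        return None  # connection failure: stop/interrupt the process
    if time.monotonic() - start > LATENCY_BUDGET_S:
        return None  # response too late: stale parameters risk bad parts
    return params


if __name__ == "__main__":
    print(control_step(temperature_c=22.5, humidity_pct=40.0))
```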

Manufacturing throughput requirements vary from the low bandwidth of simple sensors sending small packets to the much higher bandwidth required for streaming data for video analytics, vibration sensors, or AR/VR visualization. A holistic connectivity solution can address this complexity successfully, spanning from the individual devices on the shop floor up through edge gateways and servers to the central data center or cloud resources such as compute and storage.

Network Levels

Networks are usually customized to their precise environment and the desired function, and therefore can be very complex.

In the white paper, we discuss the functions of each of the network levels, their benefits and limitations, and security considerations. Additional sections of the document cover common challenges in IIoT, connectivity levels, basic principles for successful connectivity solutions, communication types, and best practices for program implementation.