Intent to form the Open Metaverse Foundation

I picked up this bit of news last month from a blog post at the Open Metaverse Foundation by Royal O’Brien, published on December 15, 2022. I had written a couple of things on the “industrial metaverse,” speculating about what is actually new and what could realistically happen. The Linux Foundation has begun an effort to bring companies together to work on definitions and security issues. Befitting a Linux Foundation project, openness is both the plea and the work.

In October, we brought top experts from diverse sectors together with leaders from many of the projects across the Linux Foundation to discuss what it will take to transform the emerging concept of the Metaverse from promise to reality—from digital assets, simulations and transactions, to artificial intelligence, networking, security and privacy, and legal considerations. 

One thing I found interesting is the list of interest groups as initially defined. It offers a glimpse of what they think constitutes a metaverse market.

  • Users
  • Transactions
  • Digital assets
  • Virtual worlds and simulation
  • Artificial intelligence
  • Networking
  • Security and privacy
  • Legal and policy

They are looking for members. I am curious about which companies will join and work on this project. Of course, one thing I won’t discover is which companies join in order to slow down the process.

ABB and Red Hat Partner for Scalable Digital Solutions

Like some of its large industrial competitors, ABB is quickly building out industrial software solutions. A friend who is a financial analyst told me that Wall Street and other investors prize software right now. A company focused on instrumentation and automation platforms doesn’t evoke the same eyes full of longing and desire as one that adds software.

In this announcement, ABB and Red Hat, the open source enterprise software company, are partnering to deliver ABB automation and industrial software solutions at the intersection of information technology (IT) and operational technology (OT), equipping the industrial ecosystem with extended deployment capabilities and greater agility. This is consistent with ABB’s vision of the evolution of process automation.

  • ABB will deliver digital solutions to customers on-demand and at scale using Red Hat OpenShift
  • Customers will be better able to harness the potential of data-based decisions by using applications that can be deployed flexibly from the edge to the cloud

The partnership enables virtualization and containerization of automation software with Red Hat OpenShift to provide advanced flexibility in hardware deployment, optimized according to application needs. It also provides efficient system orchestration, enabling real-time, data-based decision making at the edge and further processing in the cloud.
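To make the containerization idea more concrete, here is a minimal sketch of how a containerized automation or analytics workload could be deployed to an OpenShift/Kubernetes cluster using the official Kubernetes Python client. The image name, namespace, labels, and resource requests are hypothetical placeholders, not ABB’s or Red Hat’s actual packaging.

```python
# Minimal sketch: deploying a containerized edge application to an
# OpenShift/Kubernetes cluster with the official Kubernetes Python client.
# The image, namespace, and labels below are hypothetical placeholders.
from kubernetes import client, config

def deploy_edge_app() -> None:
    config.load_kube_config()  # reuses your local kubeconfig / `oc login` context
    apps = client.AppsV1Api()

    container = client.V1Container(
        name="edge-analytics",
        image="registry.example.com/edge-analytics:1.0",  # placeholder image
        resources=client.V1ResourceRequirements(
            requests={"cpu": "500m", "memory": "512Mi"},
        ),
    )
    template = client.V1PodTemplateSpec(
        metadata=client.V1ObjectMeta(labels={"app": "edge-analytics"}),
        spec=client.V1PodSpec(containers=[container]),
    )
    deployment = client.V1Deployment(
        api_version="apps/v1",
        kind="Deployment",
        metadata=client.V1ObjectMeta(name="edge-analytics"),
        spec=client.V1DeploymentSpec(
            replicas=1,
            selector=client.V1LabelSelector(match_labels={"app": "edge-analytics"}),
            template=template,
        ),
    )
    apps.create_namespaced_deployment(namespace="edge-demo", body=deployment)

if __name__ == "__main__":
    deploy_edge_app()
```

The same manifest could run as a single replica on a small edge node or be scaled out across a larger cluster, which is the kind of flexibility the announcement emphasizes.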

Red Hat OpenShift, the industry’s leading enterprise Kubernetes platform, with Red Hat Enterprise Linux as its foundation, provides ABB with a single consistent application platform, from small single-node systems to scaled-out hyperconverged clusters at the industrial edge, which simplifies development and management efforts for ABB’s customers.

“This exciting partnership with Red Hat demonstrates ABB’s commitment to meet customer needs by seeking alliances with other innovative market leaders,” said Bernhard Eschermann, Chief Technology Officer, ABB Process Automation. “The alliance with Red Hat will see ABB continue helping our customers improve their operations as they navigate a rapidly evolving digital landscape. It will give them access to the tools they need to integrate plantwide IT and OT, while reducing risks and optimizing performance.” 

Red Hat OpenShift increases the deployment flexibility and scalability of ABB Ability Edgenius, a comprehensive edge platform for industrial software applications, together with ABB Ability Genix Industrial Analytics and AI Suite, an enterprise-grade platform and applications suite that leverages industrial AI to drive Industry 4.0 digital business outcomes for customers. ABB’s Edgenius and Genix can both be scaled seamlessly and securely across multiple deployments. With this partnership, ABB will have access to capabilities like zero-touch provisioning (remote configuration of networks), which can increase manageability and consistency across plant environments.

“Red Hat is excited to work with ABB to bring operational and information technology closer together to form the industrial edge. Together, we intend to streamline the transition from automated to autonomous operations and address current and future manufacturing needs using open-source technologies,” said Matt Hicks, executive vice president, Products and Technologies, Red Hat. “As we work to break down barriers between IT and the plant level, we look to drive limitless innovation and mark a paradigm shift in operational technology based on open source.” 

ODVA Wraps Up Annual General Meeting Detailing Much Activity

I’m sitting in the San Diego airport following my second post-pandemic conference. ODVA wrapped up its 2022 Annual General Meeting at lunch today, with technical committee sessions continuing the rest of the day. This organization may currently be the most active of its kind. Working groups met virtually during the two pandemic years following the 2020 meeting and may have been more productive than ever.

Technical Sessions

Yesterday, March 9, I sat in two technical sessions relevant to my interests. The first, “Edge to Cloud,” discussed the work being done to map CIP data to OPC UA. A large amount of detail has been worked out by the ODVA working group, as well as by a joint working group writing a companion specification with the OPC Foundation. Much field-level data that may not even be used by the control function carries content useful to other systems, many of which use the cloud for storage and retrieval.
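The companion specification will define the actual mapping, but the general pattern of surfacing field-level data over OPC UA is easy to sketch. The following minimal example, assuming the open-source asyncua library, publishes one value as an OPC UA variable; read_cip_tag() is a hypothetical placeholder for a CIP (EtherNet/IP) read and does not follow the joint working group’s mapping rules.

```python
# Illustrative sketch only: exposing a field-level value as an OPC UA variable.
# The real CIP-to-OPC UA mapping is defined by the ODVA / OPC Foundation
# companion specification; read_cip_tag() is a hypothetical placeholder.
import asyncio
from asyncua import Server

async def read_cip_tag() -> float:
    # Stand-in for a CIP (EtherNet/IP) read of a device tag.
    return 21.7

async def main() -> None:
    server = Server()
    await server.init()
    server.set_endpoint("opc.tcp://0.0.0.0:4840/edge-gateway/")

    idx = await server.register_namespace("http://example.com/cip-gateway")
    device = await server.nodes.objects.add_object(idx, "FieldDevice")
    temperature = await device.add_variable(idx, "Temperature", 0.0)

    async with server:
        while True:
            await temperature.write_value(await read_cip_tag())
            await asyncio.sleep(1.0)

if __name__ == "__main__":
    asyncio.run(main())
```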

The second technical session concerned using CIP networks in process automation applications. ODVA originally developed DeviceNet, a fieldbus most useful for discrete applications, and even EtherNet/IP has found most of its uses in factory automation. Process automation users, though, have also discovered a need for EtherNet/IP (a CIP network). The technology that makes it enticing for them is the Advanced Physical Layer (APL). This network can address the identified requirements, including safety, hazardous areas, configuration, process improvement, secure remote access, and 24/7 uptime. Work continues to define and implement standards.

ODVA Growth

Al Beydoun, executive director of ODVA, and Adrienne Meyer, VP of operations, reviewed the association’s many activities of the past two years.

  • Grew membership to greater than 365
  • Focused on growth in China
  • Development work for EtherNet/IP over TSN
  • CIP Safety was recertified with IEC
  • Collaboration continued with Fieldcomm Group and FDT Group
  • Worked with OPC Foundation
  • Worked on xDS device descriptions
  • Extensive online training and promotion.

The technical committees recorded the activities of 80 SEs and TDEs, completed two publication cycles in 2020 and three in 2021 (one of which concerned APL), and recorded 27 volume revisions. They also worked on standards for resource-constrained devices, process industry requirements, and Time Sensitive Networking (TSN).

User Requirements from P&G

Paul Maurath, Technical Director for Process Automation at Procter & Gamble’s Central Engineering, presented the user’s view of automation. I will dispense with the suspense. His conclusion: “Help us manage complexity.”

Maurath told the story of setting up a test process cell in the lab. They used it to test and demonstrate Ethernet APL devices and the network. They discovered that APL worked; the controller didn’t see any issues. The discouraging discovery was the amount of configuration required and the complexity of setup. He referred to an E&I technician working the shift at 3 am on a Sunday morning. A call comes in. A device is down. With a regular HART / 4-20 mA device, the tech has the tools. But with an Ethernet device, configuration can be a problem.

Conclusion:

  • There is a need for new technology to deliver functionality and simplicity
  • Standards are great
  • Please keep end users in mind when developing standards and technology

ARC Advisory Group Glimpses the Future

Harry Forbes, research director for ARC Advisory Group, devoted a substantial part of his keynote to open source. “There is,” he noted, “an IT technology totally overlooked by OT—open source software.” He principally cited the Linux Foundation. You’ll find news and comments from LF throughout this blog. I see great value in this technology. That an ARC researcher also sees its power was somewhat of a surprise, though. “It’s not software that’s eating the world,” said Forbes, “it is open source eating the world.”

The problem to solve, as detailed in presentations at the last ARC Industry Forum (and, I think, also worked on by the Open Process Automation Forum, which also appears often on this blog), is the need to decouple hardware and software, allowing easier software updates through containers (Docker, Kubernetes) and virtual machines.

Is that the future? I’m not sure where the vendors are that will bring this innovation, but I’m sure that many users would welcome it.

Conclusion

ODVA appears to be thriving. It is at the forefront of pushing new standards. It is looking ahead to new technologies. It is growing membership and mindshare. The staff also assembled an outstanding event.

Digital Twin Consortium Announces Digital Twin System Interoperability Framework

I applaud these efforts to improve and increase digital interoperability through industry or formal standards and open source. These efforts over many years, even ones that pre-date digital, have delivered progress not only in technology but in the lives of users. This announcement comes from the Digital Twin Consortium, a project of the Object Management Group (OMG).

Digital Twin Consortium (DTC) announced the Digital Twin System Interoperability Framework. The framework characterizes the multiple facets of system interoperability based on seven key concepts to create complex systems that interoperate at scale.

“Interoperability is critical to enable digital twins to process information from heterogeneous systems. The Digital Twin System Interoperability Framework seeks to address this challenge by facilitating complex system of systems interactions. Examples include scaling a smart building to a smart city to an entire country, or an assembly line to a factory to a global supply chain network,” said Dan Isaacs, CTO, Digital Twin Consortium.

The seven key concepts of the DTC Digital Twin System Interoperability Framework are:

1 System-Centric Design – enables collaboration across and within disciplines—mechanical, electronic, and software—creating systems of systems within a domain and across multiple domains.

2 Model-Based Approach – with millions and billions of interconnections implemented daily, designers can codify, standardize, identify, and reuse models in various use cases in the field.

3 Holistic Information Flow – facilitates an understanding of the real world for optimal decision-making, where the “world” can be a building, utility, city, country, or other dynamic environment.

4 State-Based Interactions – the state of an entity (system) encompasses all the entity’s static and dynamic attribute values at a point in time (a small sketch follows this list).

5 Federated Repositories – optimal decision-making requires accessing and correlating distributed, heterogeneous information across multiple dimensions of a digital twin, spanning time and lifecycle.

6 Actionable Information – ensures that information exchanged between constituent systems enables effective action.

7 Scalable Mechanisms – ensures interoperability mechanism(s) are inherently scalable from the simplest interoperation of two systems to the interoperability of a dynamic coalition of distributed, autonomous, and heterogeneous systems within a complex and global ecosystem.
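Concept 4 is the one I find easiest to picture as a data structure. Here is a minimal sketch, with made-up attribute names, of a state snapshot that captures an entity’s static and dynamic attribute values at a point in time; it is illustrative only and not part of the DTC framework.

```python
# Minimal sketch of concept 4 (state-based interactions): a snapshot of an
# entity's static and dynamic attribute values at a point in time.
# Attribute names are illustrative, not taken from the DTC framework.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class EntityState:
    entity_id: str
    timestamp: datetime
    static_attributes: dict   # e.g., model number, rated capacity
    dynamic_attributes: dict  # e.g., flow, vibration, running status

pump_state = EntityState(
    entity_id="pump-101",
    timestamp=datetime.now(timezone.utc),
    static_attributes={"model": "XYZ-500", "rated_flow_lpm": 250},
    dynamic_attributes={"flow_lpm": 231.4, "vibration_mm_s": 1.8, "running": True},
)
print(pump_state)
```

Two systems that exchange snapshots like this, rather than streams of raw signals, can interoperate without knowing each other’s internals, which is the point of the concept.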

“The Digital Twin System Interoperability Framework enables USB-type compatibility and ease for all systems connected to the Internet and private networks, which until now, has been the domain of system integrators,” said Anto Budiardjo, CEO, Padi.io. “This means system integrators can concentrate on designing applications rather than point-to-point integrations.”

Biden Memo and Research Highlight Recent Security Updates

A number of security-related news items came my way during the past couple of weeks. The Biden administration memo brought a surge of comments; I’ve included one from Marty Edwards. Several companies that research vulnerabilities have turned up interesting and useful findings about threats and weaknesses.

  • MITRE Engenuity ATT&CK Evaluations
  • Google on Measuring Risk in Open Source
  • Open Source Security Foundation Adds Members
  • Claroty Research Team82 Finds ICS Vulnerabilities
  • Industry Veteran Marty Edwards Shares Thoughts on Biden’s Security Memo

Engenuity ATT&CK Evaluations

MITRE Engenuity released results from its first round of independent ATT&CK Evaluations for Industrial Control Systems (ICS). The evaluations examined how cybersecurity products from five ICS vendors detected the threat of Russian-linked Triton malware.

The malware targets safety systems, preventing officials from responding to failures, hazards and other unsafe conditions, potentially causing physical destruction. 

The evaluations use ATT&CK for ICS, a MITRE-curated knowledge base of adversary tactics, techniques, and procedures based on known threats to industrial control systems.

The evaluations, which were paid for by the participating vendors, included products from Armis; Claroty; Microsoft (via its CyberX acquisition); Dragos; and the Institute for Information Industry.

“MITRE Engenuity’s ATT&CK Evaluations program is built on the backbone of MITRE’s integrity and commitment to making the world a safer, more secure place,” said Frank Duff, general manager of the ATT&CK Evaluations program. “Vendors trust us to improve their offerings, and the community trusts that we’ll provide transparency into the technology that is necessary to make the best decisions for their unique environment. Unlike closed door assessments, we use a purple teaming approach with the vendor to optimize the evaluation process. MITRE experts provide the red team while the vendor provides the blue team to ensure complete visibility, while allowing the vendor to learn directly from ATT&CK experts.” 

Google Measuring Risk in Open Source

by Kim Lewandowski, Azeem Shaikh, Laurent Simon, Google Open Source Security Team

Contributors to the Scorecards project, an automated security tool that produces a “risk score” for open source projects, have accomplished a lot since our launch last fall. Today, in collaboration with the Open Source Security Foundation community, we are announcing Scorecards v2. We have added new security checks, scaled up the number of projects being scored, and made this data easily accessible for analysis.

Since last fall, Scorecards’ coverage has grown; we’ve added several new checks, following the Know, Prevent, Fix framework proposed by Google earlier this year, to prioritize our additions.

Contributors with malicious intent or compromised accounts can introduce potential backdoors into code. Code reviews help mitigate such attacks. With the new Branch-Protection check, developers can verify that the project enforces mandatory code review by another developer before code is committed.
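As a rough usage sketch, a single check such as Branch-Protection can be run against a repository with the scorecard command-line tool; wrapped in Python it might look like the following. The flag names and JSON fields reflect the Scorecards documentation as I recall it and may have changed, so treat them as assumptions to verify.

```python
# Rough sketch: running the Scorecards Branch-Protection check on one repo.
# Assumes the `scorecard` binary is installed and GITHUB_AUTH_TOKEN is set;
# flag names and JSON fields are from memory and may have changed.
import json
import subprocess

result = subprocess.run(
    [
        "scorecard",
        "--repo=github.com/ossf/scorecard",  # any public GitHub repository
        "--checks=Branch-Protection",
        "--format=json",
    ],
    capture_output=True,
    text=True,
    check=True,
)
report = json.loads(result.stdout)
for check in report.get("checks", []):
    print(check.get("name"), check.get("score"))
```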

Despite best efforts by developers and peer reviews, vulnerable code can enter source control and remain undetected. We have added checks to detect if a project uses Fuzzing and SAST tools as part of its CI/CD system.

A common CI/CD solution used by GitHub projects is GitHub Actions. A danger with these action workflows is that they may handle untrusted user input, meaning an attacker can craft a malicious pull request to gain access to the privileged GitHub token and, with it, the ability to push malicious code to the repo without review. To mitigate this risk, Scorecards’ Token-Permissions prevention check now verifies that GitHub workflows follow the principle of least privilege by making GitHub tokens read-only by default.

To date, the Scorecards project has scaled up to evaluate security criteria for over 50,000 open source projects. In order to scale this project, we undertook a massive redesign of our architecture and adopted a PubSub model, which achieved horizontal scalability and higher throughput. This fully automated tool periodically evaluates critical open source projects and exposes the Scorecards check information through a public BigQuery dataset that is refreshed weekly.

This data can be retrieved using the bq command line tool.
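Beyond the bq tool, the same public dataset can be queried from Python with the google-cloud-bigquery client. This is a minimal sketch; the dataset and table name are written from memory and should be checked against the current OpenSSF documentation before use.

```python
# Minimal sketch: querying the public Scorecards dataset from Python.
# Requires `pip install google-cloud-bigquery` and authenticated GCP credentials.
# The dataset/table path is from memory and may differ; verify it against
# the current OpenSSF / Scorecards documentation.
from google.cloud import bigquery

client = bigquery.Client()
query = "SELECT * FROM `openssf.scorecardcron.scorecard_latest` LIMIT 5"
for row in client.query(query).result():
    print(dict(row))  # each row holds one project's Scorecards results
```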

Scorecards data for available projects is now included in the recently announced Google Open Source Insights project and is also showcased in the OpenSSF Security Metrics project. The data on these sites shows that there are still important security gaps to fill, even in widely used packages like Kubernetes.

There are a few big enhancements we’re especially excited about:

  • Scorecards Badges – GitHub badges to show off compliance
  • Integration with CI/CD and GitHub Code Scanning Results
  • Integration with the Allstar project – a GitHub App for enforcing security policies

Open Source Security Foundation Adds 10 Members

OpenSSF, a cross-industry collaboration to secure the open source ecosystem, announced new membership commitments to advance open source security education and best practices. New members include Accurics, Anchore, Bloomberg Finance, Cisco Systems, Codethink, Cybertrust Japan, OpenUK, ShiftLeft, Sonatype and Tidelift. 

The new Scorecard 2.0 is also available now; it includes new security checks, scales up the number of projects being scored, and makes this data easily accessible for analysis. The Scorecard is gaining adoption for automating analysis and trust decisions about the security posture of open source projects.

Its working groups include Securing Critical Projects, Security Tooling, Identifying Security Threats, Vulnerability Disclosures, Digital Identity Attestation, and Best Practices.  

Claroty Finds Critical Vulnerabilities

Claroty, the industrial cybersecurity company, launched Team82, its new research arm that provides indispensable vulnerability and threat research to Claroty customers and defenders of industrial networks worldwide. Additionally, Team82 released a new report on critical vulnerabilities found in cloud-based management platforms for industrial control systems (ICS), highlighting the rise of ICS in the cloud and the growing need to secure cloud implementations in industrial environments. 

In its latest report, “Top-Down and Bottom-Up: Exploiting Vulnerabilities in the OT Cloud Era,” Team82 researched the exploitability of cloud-based management platforms responsible for monitoring ICS, and developed techniques to exploit vulnerabilities in automation vendor CODESYS’ Automation Server and vulnerabilities in the WAGO PLC platform. Team82’s research mimics the top-down and bottom-up paths an attacker would take to either control a Level 1 device in order to eventually compromise the cloud-based management console, or the reverse, commandeer the cloud in order to manipulate all networked field devices. 

The new Team82 Research Hub includes the team’s latest research reports, a vulnerability dashboard for tracking the latest disclosures, its coordinated disclosure policy for working with affected vendors, its public PGP Key for securely and safely exchanging vulnerability and research information, and other resources. 

To access the Team82 Research Hub, visit claroty.com/team82.

Read the report, “Top-Down and Bottom-Up: Exploiting Vulnerabilities In the OT Cloud Era.”

Marty Edwards, Tenable, on Biden Memo

You can find Edwards’ thoughts at this blog site. Below are some excerpts.

Recent activity from the Biden Administration represents a watershed moment in the establishment of baseline standards for preparing, mitigating and responding to attacks that impact the critical infrastructure we all rely on.

The most substantive thrust of these government actions is recognizing and acting on the accelerated trend of reconnaissance and attack by establishing the Industrial Control Systems (ICS) Cybersecurity Initiative. The ICS Initiative is a voluntary, collaborative effort between the federal government and the critical infrastructure community to protect U.S. critical infrastructure “by encouraging and facilitating deployment of technologies and systems that provide threat visibility, indications, detection, and warnings, and that facilitate response capabilities for cybersecurity in essential control system and operational technology networks,” with a primary goal of “greatly expand[ing] deployment of these technologies across priority critical infrastructure.”

Tenable encourages CISA and the U.S. government to take an open, technology-neutral, standards-based approach in the development of these goals. Core elements for consideration as the most appropriate and successful methods of disrupting attack paths and securing critical infrastructure and OT environments revolve around three key pillars:

Visibility: Gain full visibility and deep situational awareness across your converged IT/OT environment.

Security: Protect your industrial infrastructure from advanced cyberthreats and risks posed by hackers and malicious insiders.

Control: Take full control of your operations network by continuously tracking ALL changes to any ICS device.

Linux Foundation Launches Research, Training, and Tools to Advance Adoption of Software Bill of Materials

My latest podcast topic contains thoughts on open source. This announcement from The Linux Foundation merges open source with the latest concerns about cybersecurity through several product launches regarding the Software Bill of Materials (SBOM). The industry continues to take small steps toward security. When a community gathers to work on a solution, it’s a big help.

Home to the industry’s most supported open standard for exchanging information about what is in software – SPDX – the Linux Foundation brings its complete resources to bear to support private and public sector supply chain security 

The Linux Foundation, the nonprofit organization enabling mass innovation through open source, today announced new industry research, a new training course, and new software tools to accelerate the adoption of Software Bill of Materials (SBOMs). 

President Biden’s recent Executive Order on Improving the Nation’s Cybersecurity referenced the importance of SBOMs in protecting and securing the software supply chain.

The de-facto industry standard, and most widely used approach today, is called Software Package Data Exchange (SPDX). SPDX evolved organically over the last ten years to suit the software industry, covering issues like license compliance, security, and more. The community consists of hundreds of people from hundreds of companies, and the standard itself is the most robust, mature, and adopted SBOM in the market today. 

“As the architects of today’s digital infrastructure, the open-source community is in a position to advance the understanding and adoption of SBOMs across the public and private sectors,” said Mike Dolan, Senior Vice President and General Manager Linux Foundation Projects. “The rise in cybersecurity threats is driving a necessity that the open-source community anticipated many years ago to standardize on how we share what is in our software. The time has never been more pressing to surface new data and offer additional resources that help increase understanding about how to generate and adopt SBOMs.” 

An SBOM is an account of the components contained in a piece of software. It can be used to ensure developers understand what software is being shared throughout the supply chain and in their projects or products and supports the systematic review of each component’s licenses to clarify what obligations apply to the distribution of the supplied software.
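To make that definition tangible, here is a toy sketch that serializes two components into SPDX-style tag/value lines. The field set loosely follows the NTIA minimum elements (supplier, component name, version) plus a declared license; it is illustrative only, not a conformant SPDX 2.2 generator.

```python
# Toy sketch: emitting a tiny SBOM in SPDX-style tag/value form.
# The field set loosely follows the NTIA minimum elements; this is
# illustrative only and not a conformant SPDX 2.2 generator.
components = [
    {"name": "libexample", "version": "1.4.2",
     "supplier": "Example Org", "license": "Apache-2.0"},
    {"name": "tinyparser", "version": "0.9.0",
     "supplier": "Example Org", "license": "MIT"},
]

lines = ["SPDXVersion: SPDX-2.2", "DataLicense: CC0-1.0", "DocumentName: demo-app-sbom"]
for c in components:
    lines += [
        "",
        f"PackageName: {c['name']}",
        f"PackageVersion: {c['version']}",
        f"PackageSupplier: Organization: {c['supplier']}",
        f"PackageLicenseDeclared: {c['license']}",
    ]

print("\n".join(lines))
```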

SBOM Readiness Survey

Linux Foundation Research is conducting the SBOM Readiness Survey. It will examine obstacles to adoption for SBOMs and future actions required to overcome them related to the security of software supply chains. The recent US Executive Order on Cybersecurity emphasizes SBOMs, and this survey will help identify industry gaps in SBOM application. Survey questions address tooling, security measures, and industries leading in producing and consuming SBOMs, among other topics. For more information about the survey and to participate, please visit {Hilary blog}. 

New Course: Generating a Software Bill of Materials

The Linux Foundation is also announcing a free, online training course, Generating a Software Bill of Materials (LFC192). This course provides foundational knowledge about the options and the tools available for generating SBOMs and how to use them to improve the ability to respond to cybersecurity needs. It is designed for directors, product managers, open-source program office staff, security professionals, and developers in organizations building software. Participants will walk away with the ability to identify the minimum elements for an SBOM, how they can be assembled, and an understanding of some of the open-source tooling available to support the generation and consumption of an SBOM.

New Tools: SBOM Generator

Also announced today is the availability of the SPDX SBOM generator, which uses a command-line interface (CLI) to generate SBOM information, including components, licenses, copyrights, and security references of your software, using the SPDX v2.2 specification and aligning with the current known minimum elements from NTIA. Currently, the CLI supports GoMod (Go), Cargo (Rust), Composer (PHP), DotNet (.NET), Maven (Java), NPM (Node.js), Yarn (Node.js), PIP (Python), Pipenv (Python), and Gems (Ruby). It is easy to embed in automated processes such as continuous integration (CI) pipelines and is available for Windows, macOS, and Linux.

About the Linux Foundation

Founded in 2000, the Linux Foundation is supported by more than 1,000 members and is the world’s leading home for collaboration on open-source software, open standards, open data, and open hardware. Linux Foundation’s projects are critical to the world’s infrastructure, including Linux, Kubernetes, Node.js, and more.  The Linux Foundation’s methodology focuses on leveraging best practices and addressing the needs of contributors, users, and solution providers to create sustainable models for open collaboration.
