SPDX Becomes Internationally Recognized Standard for Software Bill of Materials

The idea of a software bill of materials seems to be gaining traction. The Linux Foundation has had a group working on a standard, and this news details the effort's success.

The Linux Foundation, the nonprofit organization enabling mass innovation through open source, today announced the Software Package Data Exchange (SPDX) specification has been published as ISO/IEC 5962:2021 and recognized as the open standard for security, license compliance, and other software supply chain artifacts.

Intel, Microsoft, Philips, Sony, Texas Instruments, Synopsys and VMware are just a handful of the companies using SPDX to communicate Software Bill of Materials (SBOM) information in policies or tools to ensure compliant, secure development across global software supply chains.

“SPDX plays an important role in building more trust and transparency in how software is created, distributed and consumed throughout supply chains. The transition from a de-facto industry standard to a formal ISO/IEC JTC 1 standard positions SPDX for dramatically increased adoption in the global arena,” said Jim Zemlin, executive director, the Linux Foundation. “SPDX is now perfectly positioned to support international requirements for software security and integrity across the supply chain.” 

Ninety percent (90%) of a modern application is assembled from open source software components. An SBOM accounts for the software components contained in an application — open source, proprietary, or third-party — and details their quality, license, and security attributes. SBOMs are used as a part of a foundational practice to track and trace components across software supply chains. SBOMs also help to proactively identify software component issues and risks, and establish a starting point for their remediation.
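For a concrete sense of what such an accounting looks like, here is a minimal, hypothetical package entry in SPDX's tag-value format (the package name, version, and license values are invented for illustration):

```text
SPDXVersion: SPDX-2.2
DataLicense: CC0-1.0
SPDXID: SPDXRef-DOCUMENT
DocumentName: example-app-sbom
Creator: Tool: example-sbom-tool

PackageName: libexample
SPDXID: SPDXRef-Package-libexample
PackageVersion: 1.4.2
PackageDownloadLocation: NOASSERTION
PackageLicenseConcluded: MIT
PackageCopyrightText: NOASSERTION
ExternalRef: SECURITY cpe23Type cpe:2.3:a:example:libexample:1.4.2:*:*:*:*:*:*:*
```

Each component in the application gets an entry like this, so license obligations and known-vulnerability lookups (via the security reference) can be automated across the whole dependency tree.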

SPDX evolved organically over the last ten years through the collaboration of hundreds of companies, including the leading Software Composition Analysis (SCA) vendors – making it the most robust, mature, and adopted SBOM standard. 

To learn more about how companies and open source projects are using SPDX, see the recordings from the “Building Cybersecurity into Software Supply Chain” Town Hall held on August 18th.

ISO/IEC JTC 1 is an independent, non-governmental international organization based in Geneva, Switzerland. Its membership represents more than 165 national standards bodies with experts who share knowledge and develop voluntary, consensus-based, market relevant international standards that support innovation and provide solutions to global challenges.

Supporting Comments

Intel

“Software security and trust are critical to our Industry’s success. Intel has been an early participant in the development of the SPDX specification and utilizes SPDX both internally and externally for a number of software use-cases,” said Melissa Evers, Vice President – Software and Advanced Technology Group, General Manager of Strategy to Execution, Intel.

Microsoft

“Microsoft has adopted SPDX as our SBOM format of choice for software we produce,” says Adrian Diglio, Principal Program Manager of Software Supply Chain Security at Microsoft. “SPDX SBOMs make it easy to produce U.S. Presidential Executive Order compliant SBOMs, and the direction that SPDX is taking with the design of their next gen schema will help further improve the security of the software supply chain.”

Siemens

“With ISO/IEC 5962:2021 we have the first official standard for metadata of software packages. It’s natural that SPDX is that standard, as it’s been the de facto standard for a decade. This will make license compliance in the supply chain much easier, especially because several open source tools like FOSSology, ORT, scancode and sw360 already support SPDX,” said Oliver Fendt, senior manager, open source at Siemens.

Sony

“The Sony team uses various approaches to managing open source compliance and governance,” says Hisashi Tamai, Senior Vice President, Deputy President of R&D Center, Representative of the Software Strategy Committee, Sony Group Corporation. “An example is the use of an OSS management template sheet that is based on SPDX Lite, a compact subset of the SPDX standard. It is important for teams to be able to quickly review the type, version and requirements of software, and using a clear standard is a key part of this process.”

Synopsys

“The Black Duck team from Synopsys has been involved with SPDX since its inception, and I personally had the pleasure of coordinating the activities of the project’s leadership for more than a decade. Representatives from scores of companies have contributed to the important work of developing a standard way of describing and communicating the content of a software package,” said Phil Odence, General Manager, Black Duck Audits.

VMware

“SPDX is the essential common thread among tools under the Automating Compliance Tooling (ACT) Umbrella. SPDX enables tools written in different languages and for different software targets to achieve coherence and interoperability around SBOM production and consumption. SPDX is not just for compliance, either; the well-defined and ever-evolving spec is also able to represent security and supply chain implications. This is incredibly important for the growing community of SBOM tools as they aim to thoroughly represent the intricacies of modern software,” said Rose Judge, ACT TAC Chair and open source engineer at VMware.

Wind River

“The SPDX format greatly facilitates the sharing of software component data across the supply chain. Wind River has been providing a Software Bill of Materials (SBOM) to its customers using the SPDX format for the past 8 years. Often customers will request SBOM data in a custom format. Standardizing on SPDX has enabled us to deliver a higher quality SBOM at a lower cost,” said Mark Gisi, Wind River Open Source Program Office Director and OpenChain Specification Chair.

Founded in 2000, The Linux Foundation is supported by more than 1,000 members and is the world’s leading home for collaboration on open source software, open standards, open data, and open hardware. The Linux Foundation’s projects are critical to the world’s infrastructure including Linux, Kubernetes, Node.js, RISC-V, Hyperledger, Jenkins, and more. The Linux Foundation’s methodology focuses on leveraging best practices and addressing the needs of contributors, users and solution providers to create sustainable models for open collaboration.

NI Unlocks the Power of Test Data and Software

NI Connect, its annual user conference, was held virtually again this year. NI announced several product advances, a couple of which relate to advanced driver-assistance systems but have wider applicability, I’m sure. First was a brief discussion of digital thread—something NI was doing before the buzzword was invented. I loved co-founder Jeff Kodosky’s many years of technical discussions of software-defined instruments and data traces through the software.

To quote from this year’s announcement, “NI’s software-connected approach creates a more complete enterprise data and insight chain, collecting and connecting the data that accelerates digital transformation, enabling customers to optimize every step of the product life cycle.”

“A digital thread of data across each phase of the product life cycle delivers powerful insights to enhance product performance,” said NI CEO Eric Starkloff. “At NI, our software-connected approach unlocks the power of test, from early research to the production floor and beyond. We continue to aggressively invest in the technology to make this compelling vision a reality.”

Product announcements include:

  • Streamlined SystemLink Software Interface to Increase Efficiency — By connecting test systems and test data to enterprise outcomes, SystemLink software substantially accelerates each phase of the product life cycle. With a unified view of test operations in design validation and production environments, SystemLink manages and simplifies test scheduling, resource utilization, system health and maintenance. The latest software enhancements include new UI customization options, simplified product navigation and expanded asset health monitoring capabilities. The result is test insight acceleration, more efficient use of assets and reduced cost of test.
  • New LabVIEW 2021 to Improve Interoperability with Python and MathWorks MATLAB Software — Open-source software is increasingly important as systems become more diverse and complex. NI’s 2021 version of LabVIEW, the leading software platform for building test and measurement systems, features improved interoperability with Python and MathWorks MATLAB software, improved support for version control using Git and usability enhancements. These updates make it easier for engineers to connect disparate systems and hardware to accelerate innovation, especially in the design and validation environments.
  • PXI Hardware Solution to Enable Software-Connected Workflow in a Smaller, Cost-Effective Package — Like open-source software, modular hardware is also increasingly important to flexibly connect with existing systems and workflows. PXI hardware delivers openness, software options, modularity and I/O coverage for customers seeking to develop adaptive and scalable systems. NI’s first 2-slot PXI chassis delivers these benefits in a smaller, more cost-effective package. Modular hardware like PXI enables a software-connected workflow to achieve better results.
  • NI Collaboration with Seagate to Deliver First-of-Its-Kind In-Vehicle Edge Storage and Data Transfer Service — The next generation of autonomous vehicles requires more real road data than ever before, making efficient data storage exceedingly important. NI and Seagate Technology Holdings, a world leader in data storage infrastructure solutions, announced a new collaboration to enhance data storage services, including a first-of-its-kind advanced driver-assistance systems (ADAS) record offering. This in-vehicle data storage as a service (STaaS), powered by Seagate’s Lyve Mobile edge storage and data transfer service, enables original equipment manufacturers (OEMs) and suppliers to modernize their data storage strategy from self-managed to STaaS, leading to reduced costs and efficient storage.
  • NI Ettus USRP X410 Software Defined Radio Platform to Accelerate Wireless Innovation — The next generation of wireless technologies, 5G and 6G, are poised to transform the way people and systems connect, making test data insights that much more important. Because wireless technologies are becoming increasingly complex, advanced tools to support research and prototyping are needed. The new NI Ettus USRP X410 Software Defined Radio Platform is high performance and fully open source, allowing engineers to achieve a faster time to prototype and accelerate wireless innovation.

Seeq Expands Machine Learning Features for Process Engineering and Data Science Integration

Machine Learning (ML) is a flavor of Artificial Intelligence (AI). This news release from Seeq illuminates a bit of the mystery surrounding much discussion of the topic.

Seeq Corporation released R52 with new features to support the use of machine learning innovation in process manufacturing organizations. These features enable organizations to deploy their own or third-party machine learning algorithms into the advanced analytics applications used by front line process engineers and subject matter experts, thus scaling the efforts of a single data scientist to many front-line OT employees. 

New Seeq capabilities include Add-on Tools, Display Panes, and User-defined Functions, each of which extend Seeq’s predictive, diagnostic, and descriptive analytics. The result is faster development and deployment of easy-to-use algorithms and visualizations for process engineers. With R52, end users will also be able to schedule Seeq Data Lab notebooks to run in the background, fulfilling a top customer request.

Seeq customers include companies in the oil and gas, pharmaceutical, chemical, energy, mining, food and beverage, and other process industries. Investors in Seeq—which has raised over $100M to date—include Insight Ventures, Saudi Aramco Energy Ventures, Altira Group, Chevron Technology Ventures, Cisco Investments, and Next47, the venture group for Siemens.

As a complement to the new extensibility features, Seeq data scientists are working with customers to develop and deploy machine learning algorithms tailored to the industrial process domain. Current areas of focus include automatically detecting performance changes in monitored assets, identifying causal relationships among process variables, and improving diagnostics by identifying and labeling patterns within a data set. For example, a super-major oil & gas company is using Seeq extensibility features to give process engineers easy access to a neural-network algorithm created by its data science team, helping reduce greenhouse gas emissions.
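As a rough illustration of the first focus area (detecting performance changes in a monitored signal), the sketch below flags points that deviate sharply from a trailing baseline. This is a toy rolling z-score check, not Seeq's actual algorithm; the window size and threshold are arbitrary choices.

```python
from statistics import mean, stdev

def detect_shifts(values, window=20, threshold=3.0):
    """Flag indices whose value deviates from the trailing window
    by more than `threshold` standard deviations. A toy stand-in
    for asset performance-change detection, not Seeq's algorithm."""
    flagged = []
    for i in range(window, len(values)):
        baseline = values[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(values[i] - mu) / sigma > threshold:
            flagged.append(i)
    return flagged

# A steady signal with slight noise, then a step change at index 30:
signal = [10.0] * 30 + [14.0] * 10
signal = [v + 0.01 * ((i % 5) - 2) for i, v in enumerate(signal)]
print(detect_shifts(signal))  # the step at index 30 is flagged
```

In practice such a check would run against historian data rather than a hand-built list, but the shape of the problem is the same: establish a baseline, then surface excursions for an engineer to review.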

“Analytics software for manufacturing organizations is an area overdue for innovation,” says Steve Sliwa, CEO and Co-Founder of Seeq. “Spreadsheets replaced pen and paper 30 years ago for analytics and haven’t changed much since. By leveraging big data, machine learning and computer science innovations, Seeq is enabling a new generation of software-led insights.”

Seeq first shipped easy to use machine learning-enabled features in 2017 in Seeq Workbench, and then in 2020 introduced Seeq Data Lab for Python scripting and access to any machine learning algorithm. This support for multiple audiences—with no code/low code features for process engineers and a scripting environment for data scientists engaged in feature engineering and data reduction efforts—democratized access to machine learning innovation.

Seeq’s approach to integrating machine learning features in its applications addresses many of the reasons data science initiatives fail in manufacturing organizations:

  • Seeq connects to all underlying data sources—historian, contextual, manufacturing applications, or other data sources—for data cleansing and modeling.
  • Seeq supports the connected, two-way, interaction of plant data and process engineering expertise in OT departments with the data science and algorithm expertise in IT departments.
  • Seeq provides a complete solution for algorithm development, updating and improving algorithms over time, employee collaboration and knowledge capture, and publishing insights for faster decision making. 

In addition to Seeq Data Lab support for machine learning code and libraries, Seeq also enables access to the Seeq/Python library by third-party machine learning solutions including Microsoft Azure Machine Learning, Amazon SageMaker, and open source offerings such as Anaconda. For example, a manufacturer using Amazon SageMaker is evaluating their machine learning insights with Seeq to create work orders in their SAP system.

ROKLive Into the Clouds but Weaker at the Edge

This week’s conference was called ROKLive. It is the annual Rockwell software conference, which has naturally morphed over the years; it began as training for distributor tech specialists. This year it was virtual only and seemed a bit more general, with themes that could be cast as corporate strategy toward software and the cloud.

Software Strong

CEO Blake Moret has been making (for Rockwell Automation) bold moves during his tenure, effectively remaking the company. Rockwell has always been a hardware product company; software was developed as necessary to support the hardware products. You cannot, for example, sell a PLC without programming software, or a drive without configuration software. Then a couple of small acquisitions moved the company tentatively into software in what was described to me as an experiment. Many of those acquisitions failed to fulfill the ambitions of the software leaders at the time.

Building the Cloud

Software is no longer relegated to an experiment. It has become a core part of the business. Rockwell made a huge investment in PTC, gaining access to the ThingWorx IoT platform and Kepware. This enabled a restructuring of the software group, getting products to market quickly and surely contributing to both the bottom line and customer satisfaction. For years Rockwell wanted to talk to me about asset management; then I’d remember that that meant helping customers keep track of Allen-Bradley spare parts in their cribs. Now, with the Fiix acquisition, Rockwell has gained cloud expertise in addition to an EAM and CMMS suite. Further into the cloud was the recently announced acquisition of Plex, giving Rockwell an updated MES product and even more cloud expertise.

At about the same time that I left magazine media and chose Manufacturing Connection (emphasis on connection) as a blog name, Rockwell Automation announced a corporate strategy called Connected Enterprise. I told the marketing executives at the time that great minds think alike <smile>. These investments flesh out that connected enterprise strategy building upon the Ethernet strategy established years ago.

Organization Alignment

Then there are the little corporate things I tend to notice. For many years the head of software was at the VP level, usually reporting to the SVP of control and automation. Now there is a senior VP whose title is Software and Control. Further, like many if not most organizations, Rockwell had an organizational structure with VPs in charge of businesses; if there was a technical or business reason for two products to work together, the result would tend to cost one of those VPs bonus money. Rockwell has corrected that flaw, adding incentives for executives to work together. That is a very good organizational move to advance the overall business and technology strategy.

Weak on the Edge

An IT conference was held the week before, and one of its themes was “edge-to-cloud.” At least one presentation at ROKLive also discussed edge-to-cloud. I’ve already pointed out the beginnings of a cloud strategy at Rockwell, and you would expect Rockwell to be an “edge” company. I attended a session on that topic, though, and came away less than enthused; the discussion included industrial PCs (IPCs) and a card in a PLC. If I were in business development at a company with a solid edge compute solution, knowing Rockwell’s newfound penchant for strong partnerships, I’d be on the phone with an SVP or CTO with a pitch.

Linux Foundation Launches Research, Training, and Tools to Advance Adoption of Software Bill of Materials

My latest podcast topic contains thoughts on open source. This announcement from The Linux Foundation merges open source with the latest concerns about cybersecurity through several product launches related to the Software Bill of Materials (SBOM). The industry continues to take small steps toward security, and when a community gathers to work on a solution, it’s a big help.

Home to the industry’s most supported open standard for exchanging information about what is in software – SPDX – the Linux Foundation brings its complete resources to bear to support private and public sector supply chain security 

The Linux Foundation, the nonprofit organization enabling mass innovation through open source, today announced new industry research, a new training course, and new software tools to accelerate the adoption of Software Bill of Materials (SBOMs). 

President Biden’s recent Executive Order on Improving the Nation’s Cybersecurity referenced the importance of SBOMs in protecting and securing the software supply chain.

The de-facto industry standard, and most widely used approach today, is called Software Package Data Exchange (SPDX). SPDX evolved organically over the last ten years to suit the software industry, covering issues like license compliance, security, and more. The community consists of hundreds of people from hundreds of companies, and the standard itself is the most robust, mature, and adopted SBOM standard in the market today.

“As the architects of today’s digital infrastructure, the open-source community is in a position to advance the understanding and adoption of SBOMs across the public and private sectors,” said Mike Dolan, Senior Vice President and General Manager Linux Foundation Projects. “The rise in cybersecurity threats is driving a necessity that the open-source community anticipated many years ago to standardize on how we share what is in our software. The time has never been more pressing to surface new data and offer additional resources that help increase understanding about how to generate and adopt SBOMs.” 

An SBOM is an account of the components contained in a piece of software. It can be used to ensure developers understand what software is being shared throughout the supply chain and in their projects or products and supports the systematic review of each component’s licenses to clarify what obligations apply to the distribution of the supplied software.
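That systematic license review lends itself to simple tooling. As a sketch (not any particular vendor's implementation), the function below pulls (package, concluded license) pairs out of an SPDX 2.2 tag-value document; the field names follow the SPDX spec, while the packages themselves are invented:

```python
def package_licenses(spdx_text):
    """Extract (package, concluded license) pairs from an SPDX
    tag-value document, a minimal sketch of the license review
    an SBOM enables."""
    pairs, current = [], None
    for line in spdx_text.splitlines():
        line = line.strip()
        if line.startswith("PackageName:"):
            current = line.split(":", 1)[1].strip()
        elif line.startswith("PackageLicenseConcluded:") and current:
            pairs.append((current, line.split(":", 1)[1].strip()))
            current = None
    return pairs

sbom = """\
PackageName: libfoo
PackageVersion: 2.1.0
PackageLicenseConcluded: Apache-2.0
PackageName: libbar
PackageVersion: 0.9.3
PackageLicenseConcluded: GPL-3.0-only
"""
print(package_licenses(sbom))
# prints [('libfoo', 'Apache-2.0'), ('libbar', 'GPL-3.0-only')]
```

A real consumer would use an SPDX parsing library and also check copyright text and security references, but even this much is enough to diff a release's licenses against an allow-list.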

SBOM Readiness Survey

Linux Foundation Research is conducting the SBOM Readiness Survey. It will examine the obstacles to SBOM adoption and the future actions required to overcome them as they relate to the security of software supply chains. The recent US Executive Order on Cybersecurity emphasizes SBOMs, and this survey will help identify industry gaps in SBOM application. Survey questions address tooling, security measures, and the industries leading in producing and consuming SBOMs, among other topics. For more information about the survey and to participate, please visit {Hilary blog}.

New Course: Generating a Software Bill of Materials

The Linux Foundation is also announcing a free, online training course, Generating a Software Bill of Materials (LFC192). This course provides foundational knowledge about the options and tools available for generating SBOMs and how to use them to improve the ability to respond to cybersecurity needs. It is designed for directors, product managers, open-source program office staff, security professionals, and developers in organizations building software. Participants will walk away able to identify the minimum elements for an SBOM, explain how those elements can be assembled, and describe some of the open-source tooling available to support the generation and consumption of SBOMs.

New Tools: SBOM Generator

Also announced today is the availability of the SPDX SBOM generator, which uses a command-line interface (CLI) to generate SBOM information, including components, licenses, copyrights, and security references of your software, using the SPDX v2.2 specification and aligning with the current known minimum elements from NTIA. Currently, the CLI supports GoMod (Go), Cargo (Rust), Composer (PHP), DotNet (.NET), Maven (Java), NPM (Node.js), Yarn (Node.js), PIP (Python), Pipenv (Python), and Gems (Ruby). It is easy to embed in automated processes such as continuous integration (CI) pipelines and is available for Windows, macOS, and Linux.
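Invocation is of roughly this shape; the flag names here reflect the project's documentation at the time of writing and may change between versions, so treat them as illustrative rather than authoritative:

```text
# Point the generator at a project directory and choose an output
# directory for the resulting SPDX document (flags illustrative):
./spdx-sbom-generator -p ./my-project -o ./sbom-output
```

Because the tool reads the package manager's own manifest and lock files, the same command works unchanged in a CI pipeline, producing a fresh SBOM on every build.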


Financial Risks When Delaying PLM Upgrades

Senior management has always been reluctant to invest in technology, and especially in upgrades once a technology is in place. I have seen instances where management lays off the senior engineers who implemented something like Advanced Process Control or Manufacturing Execution Systems, keeping a recently graduated engineer to maintain the system, if even that. Management sees only a large salary cost reduction; rarely is maintaining momentum seen as a virtue.

I have been in way too many of these discussions in my career and have seen the results one way or another. There have been instances where a company had to hire back the laid-off engineer at higher consulting rates to get the system back up and running properly.

So, this report from CIMdata detailing research on PLM software upgrading was hardly surprising. Disturbing, perhaps, but not surprising.

Digital transformation is a popular topic, and CIMdata has written much about it. While many still wonder whether digital transformation is real or just the latest buzzword, many industrial companies are taking its promise very seriously.

While it is clear to everyone within the PLM community that PLM is foundational to a meaningful digitalization program (or digital transformation strategy), this truth is not always understood by senior leadership within companies. While CIMdata believes that the level of investment in digital transformation is appropriate, our research and experience find that executive awareness of digital transformation’s dependency on PLM is lacking. This lack of understanding of its association with PLM-related investment, sustainability, and impacts on business performance and benefits puts many digital transformation programs at risk of becoming yet another program of the month.

This research on obsolescence identified areas that increase the cost of a technology refresh and found that heavy customization tops the list. This aligns with CIMdata’s experience in the field and is why companies strive to stay closer to out-of-the-box with their PLM implementations. CIMdata’s view is that customization can add significant value to a PLM implementation, but it needs to be business- or cost-justified and deliver an appropriate return on investment over the long term (i.e., even through subsequent solution upgrades).

A new study from CIMdata exposes the financial risk many organizations face when they take PLM upgrades for granted. According to the study, the cost of upgrades with legacy PLM vendors can average between $732,000 and $1.25 million. The study – which compares industry heavyweights such as Dassault, PTC, and Siemens – finds the Aras PLM platform is easiest to keep current. Aras users upgrade more frequently, over a shorter duration, and at less cost than other leaders in the space. 

What’s behind PLM obsolescence? According to CIMdata, “A sustainable PLM solution is one that can meet current and future business requirements with an acceptable return on investment (ROI) via incremental enhancements and upgrades.” But as the research clearly shows, many companies using PLM software are not staying current. The five reasons are:

1. Technically Impossible. Typically, after an arduous deployment and the customization necessary to meet the business’s current needs, the software is no longer capable of being upgraded.
2. No ROI. If an upgrade takes a year and costs close to a million dollars, the cost and impact to the business are so outrageous it can’t be justified.
3. No Budget. Not having the budget is a real concern, but often the lack of budget is a mistake—a mis-prioritization of what’s important to your organization’s future growth, often combined with a high percentage of the overall budget being consumed by technical debt.
4. Companies Overinvest and Therefore Are Committed. The only thing worse than spending large amounts of money on the wrong thing is doubling down and spending more, expecting a better experience. The pandemic has accelerated the need to change—to expect transformation with less risk, less cost, and greater ROI that will lead to greater business resiliency. Throwing good money after bad is no longer tolerated; there is more focus on the bottom line and doing more with less.
5. Leadership Doesn’t Understand the Dependency of Digital Transformation on PLM. If your PLM system hasn’t been upgraded in years and isn’t the foundation for continuous digital transformation efforts, there is an absolute lack of understanding of how PLM can transform a business.