The idea of a software bill of materials seems to be gaining traction. The Linux Foundation has had a group working on a standard. This news details the success of the effort.
The Linux Foundation, the nonprofit organization enabling mass innovation through open source, today announced that the Software Package Data Exchange (SPDX) specification has been published as ISO/IEC 5962:2021 and recognized as the open standard for security, license compliance, and other software supply chain artifacts.
Intel, Microsoft, Philips, Sony, Texas Instruments, Synopsys, and VMware are just a handful of the companies using SPDX to communicate Software Bill of Materials (SBOM) information in policies or tools to ensure compliant, secure development across global software supply chains.
“SPDX plays an important role in building more trust and transparency in how software is created, distributed and consumed throughout supply chains. The transition from a de-facto industry standard to a formal ISO/IEC JTC 1 standard positions SPDX for dramatically increased adoption in the global arena,” said Jim Zemlin, executive director, the Linux Foundation. “SPDX is now perfectly positioned to support international requirements for software security and integrity across the supply chain.”
Ninety percent (90%) of a modern application is assembled from open source software components. An SBOM accounts for the software components contained in an application — open source, proprietary, or third-party — and details their quality, license, and security attributes. SBOMs are used as a part of a foundational practice to track and trace components across software supply chains. SBOMs also help to proactively identify software component issues and risks, and establish a starting point for their remediation.
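To make the idea concrete, here is a minimal sketch of what SBOM data looks like in SPDX's tag-value format. The document name, namespace URL, tool name, and timestamp below are illustrative placeholders, not part of any real SBOM; only the field names come from the SPDX specification.

```text
SPDXVersion: SPDX-2.2
DataLicense: CC0-1.0
SPDXID: SPDXRef-DOCUMENT
DocumentName: example-app-1.0
DocumentNamespace: https://example.com/spdx/example-app-1.0
Creator: Tool: example-sbom-generator
Created: 2021-09-09T12:00:00Z

PackageName: zlib
SPDXID: SPDXRef-Package-zlib
PackageVersion: 1.2.11
PackageDownloadLocation: https://zlib.net/zlib-1.2.11.tar.gz
PackageLicenseConcluded: Zlib
```

A full SBOM would carry one such package block per component, which is what lets tooling trace license and security attributes across the supply chain.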
SPDX evolved organically over the last ten years through the collaboration of hundreds of companies, including the leading Software Composition Analysis (SCA) vendors – making it the most robust, mature, and adopted SBOM standard.
To learn more about how companies and open source projects are using SPDX, watch the recordings from the “Building Cybersecurity into Software Supply Chain” Town Hall held on August 18th.
ISO/IEC JTC 1 is an independent, non-governmental international organization based in Geneva, Switzerland. Its membership represents more than 165 national standards bodies with experts who share knowledge and develop voluntary, consensus-based, market relevant international standards that support innovation and provide solutions to global challenges.
“Software security and trust are critical to our Industry’s success. Intel has been an early participant in the development of the SPDX specification and utilizes SPDX both internally and externally for a number of software use-cases,” said Melissa Evers, Vice President – Software and Advanced Technology Group, General Manager of Strategy to Execution, Intel.
“Microsoft has adopted SPDX as our SBOM format of choice for software we produce,” says Adrian Diglio, Principal Program Manager of Software Supply Chain Security at Microsoft. “SPDX SBOMs make it easy to produce U.S. Presidential Executive Order compliant SBOMs, and the direction that SPDX is taking with the design of their next gen schema will help further improve the security of the software supply chain.”
“With ISO/IEC 5962:2021 we have the first official standard for metadata of software packages. It’s natural that SPDX is that standard, as it’s been the de facto standard for a decade. This will make license compliance in the supply chain much easier, especially because several open source tools like FOSSology, ORT, scancode and sw360 already support SPDX,” said Oliver Fendt, senior manager, open source at Siemens.
“The Sony team uses various approaches to managing open source compliance and governance,” says Hisashi Tamai, Senior Vice President, Deputy President of R&D Center, Representative of the Software Strategy Committee, Sony Group Corporation. “An example is the use of an OSS management template sheet that is based on SPDX Lite, a compact subset of the SPDX standard. It is important for teams to be able to quickly review the type, version and requirements of software, and using a clear standard is a key part of this process.”
“The Black Duck team from Synopsys has been involved with SPDX since its inception, and I personally had the pleasure of coordinating the activities of the project’s leadership for more than a decade. Representatives from scores of companies have contributed to the important work of developing a standard way of describing and communicating the content of a software package,” said Phil Odence, General Manager, Black Duck Audits.
“SPDX is the essential common thread among tools under the Automating Compliance Tooling (ACT) Umbrella. SPDX enables tools written in different languages and for different software targets to achieve coherence and interoperability around SBOM production and consumption. SPDX is not just for compliance, either; the well-defined and ever-evolving spec is also able to represent security and supply chain implications. This is incredibly important for the growing community of SBOM tools as they aim to thoroughly represent the intricacies of modern software,” said Rose Judge, ACT TAC Chair and open source engineer at VMware.
“The SPDX format greatly facilitates the sharing of software component data across the supply chain. Wind River has been providing a Software Bill of Materials (SBOM) to its customers using the SPDX format for the past 8 years. Often customers will request SBOM data in a custom format. Standardizing on SPDX has enabled us to deliver a higher quality SBOM at a lower cost,” said Mark Gisi, Wind River Open Source Program Office Director and OpenChain Specification Chair.
Founded in 2000, The Linux Foundation is supported by more than 1,000 members and is the world’s leading home for collaboration on open source software, open standards, open data, and open hardware. The Linux Foundation’s projects are critical to the world’s infrastructure including Linux, Kubernetes, Node.js, RISC-V, Hyperledger, Jenkins, and more. The Linux Foundation’s methodology focuses on leveraging best practices and addressing the needs of contributors, users and solution providers to create sustainable models for open collaboration.
When AutomationDirect was PLCDirect and control platforms were evolving rapidly with much technical innovation, I visited the company and its control developer in Knoxville, TN frequently. They were adding Ethernet and IT technologies. Great times. Then that part of the industry matured and AutomationDirect became a master electrical and automation distributor, while still keeping a foot in the automation development door.
This information came to me last week. Given all the interest in automation and sensor and OPC to the cloud, I thought this was interesting. AutomationDirect here discusses the PLC as an integral part of a cloud-based system. Good for them.
PLCs can now be directly integrated with cloud-based computing platforms, empowering end users and OEMs to quickly and easily add IIoT functionality to their systems.
Damon Purvis, PLC Product Manager at AutomationDirect, wrote an article for the August 2021 edition of Machine Design. The article, titled “Modern PLCs Simplify Cloud-Based IIoT,” discusses how the newest BRX PLCs can securely connect directly to the leading cloud platforms from AWS, Microsoft, and others.
Industrial automation systems created by end users and OEMs have long had some IIoT data connectivity capabilities—but getting to this data and working with it has often been a chore, prohibitively expensive, or both.
Cloud computing options have eliminated many of these barriers, providing a cost-effective way to deploy and scale up IIoT projects. This is especially the case now that the BRX PLC can connect natively to cloud services, without requiring intermediate layers of processing.
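"Native" cloud connectivity in this context generally means the controller itself speaks a broker protocol such as MQTT to the cloud service, rather than routing through a PC gateway. As a rough illustration of that data path from the device side, here is a Python sketch; the broker hostname, topic, and tag names are placeholders of my own, not AutomationDirect or BRX APIs.

```python
import json
import time

def build_telemetry(tag_values):
    """Package a dictionary of PLC tag reads as a JSON telemetry message
    with a Unix timestamp, the shape most cloud IoT services accept."""
    return json.dumps({"ts": int(time.time()), "tags": tag_values})

# Publishing with the paho-mqtt client (connection details are placeholders;
# cloud brokers such as AWS IoT Core require TLS and per-device credentials):
#
# import paho.mqtt.client as mqtt
# client = mqtt.Client()
# client.tls_set()  # enable TLS before connecting
# client.connect("broker.example.com", 8883)
# client.publish("plant/line1/telemetry",
#                build_telemetry({"temp_C": 71.5, "motor_on": True}))
```

The point of the "no intermediate layers" claim is that this publish step runs on the controller itself instead of on a separate gateway box.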
Working on the factory floor early in my career taught me how much typical manufacturing workers know and care about the company’s products. Consultants came from time to time, studied, rearranged, left. Not much useful happened. But the individual guys (in those days) on the line knew more about what was going on than most of the supervisors and all of management.
Therefore, an opportunity to talk with Paul Vragel, Founder and President of 4aBetterBusiness in Evanston, IL to discuss his experiences as a project engineer and integrator was too good to pass up. After all, the values he learned and still implements include these:
- Listen to people
- Engage employees
- Ask everyone to look for problems, with no fault assigned
- Assume employees have needed knowledge
Vragel told me, “My initial education and experience is in Naval Architecture and Marine Engineering – that is, ship design and construction. Building a ship involves building a hotel, a restaurant, a huge warehouse and a power plant, putting them all together, putting a propeller on it and sending it out on the ocean where there are no service stations. Ship design and construction is essentially a demanding, large-scale systems engineering project.”
After graduation from Webb Institute of Naval Architecture, he worked at Newport News Shipbuilding. After a year, two prior graduates of Webb Institute, working for Amoco Corporation, hired him, at the age of 22, to manage ship construction programs in Spain. “A couple weeks after I was hired, I was on a plane to Spain with my instruction set being, essentially, ‘figure out what you’re supposed to do, and do that.’”
After a year, one of the earlier ships built in the series came in to Lisbon for its guarantee drydocking and inspection. “When we opened one of the crankshaft bearings of the 30,000 hp main diesel engine, we saw the bearing material, which was supposed to be in the bearing, lying on the crankshaft journal, in pieces,” Vragel recalled.
Talk about a complex situation—the ship was built by a company controlled by the Spanish Government. They were the holder of the guarantee. The engine was built by a different company, also controlled by the Spanish Government. Amoco had a contract with the shipyard, not the engine builder. And the engine was built under license from a company in Denmark.
Vragel was there as an observer for the new construction department; the ship was under the control of the operations department. “I had no authority and no staff reporting to me. I had no technical knowledge of poured metal bearings in high-powered diesel engines, I didn’t speak Portuguese or Spanish, I was 23 years old, and the instruction from my boss was very simple: ‘Fix it!’ To add to the urgency of the ship being out of service, the shipyard in Lisbon, where the ship was located, was charging $30,000/day (about $250,000 in today’s dollars), just for being there.”
Vragel went to the engine builder in Spain who said, “We don’t think we have a problem – we think the Danes have a problem. They designed the engine, we just built it according to their instructions.”
Figuring that getting the Danish engineers down to Spain for a meeting wouldn’t be productive, he decided the only thing to do was to go into the plant and talk to the people who made the bearings. One problem – they only spoke Spanish, and he only spoke English. But there are lots of ways to communicate if you really want to. “I observed what they were doing, pointed, asked a lot of questions – they learned a little English, I learned a little Spanish – and we sketched out how the bearings were made.”
After a couple of days, he thought he had figured out the cause of the problem, but “I had the good sense to shut up. While our communication had become pretty good, I was sure that there were other parts of the process they knew about that we hadn’t touched on that might be part of the problem or solution. If I just told them what I thought, everything would stop there without awareness of those elements and we wouldn’t get an effective solution. But if I could work with them through the process so they saw the issues, the employees would bring those additional elements to the table. We would have a full understanding of the system, the employees would be part of the solution. In this way, employees would have ownership in the results.”
“And that’s exactly what happened. With a little more effort we found and fixed the causes of the problem (which was causing porosity in the bearing).”
“I had no authority, no technical expertise, no staff, I was 23 years old, I didn’t speak Portuguese or Spanish, and in a few days, working cross-language and cross-culture in an overseas plant I had never seen, in a technology in which I had no experience, we together achieved a solution that permanently raised their manufacturing capability – one that they owned.”
This key formative experience led to the beliefs on which 4aBetterBusiness was founded:
- We believe that employees are the world’s experts at knowing what they actually do every day – their local systems
- We believe that 90% of the issues in a company are embedded in the way these local systems work and work together
This lesson applies to 22-year-olds and 52-year-olds alike. Sometimes we get so wrapped up in our own ideas that we overlook an obvious source of great expertise.
Before the Industrial Internet Consortium changed its name (to Industry IoT Consortium), I had two news items from it. The first is a Networking Framework publication and the second a definition of trustworthiness in cyber-physical systems. Both appear to be worthwhile additions to the state of the art.
IIC Defines Trustworthiness for Cyber-Physical Systems
The IIC has published IIoT Trustworthiness Framework Foundations. This foundational document explains the key concepts and benefits of trustworthiness in context, relating it to the real-world supply chain and offering model approaches. Trustworthiness is essential to government and commercial organizations with cyber-physical systems impacting the safety and well-being of people and the environment. These systems include industrial control systems and almost all systems that use digital technology to sense or affect the environment.
“Trustworthiness, and confidence in that trustworthiness, are an essential aspect of cyber-physical systems,” said Marcellus Buchheit, President & CEO, Wibu-Systems USA, a Co-Chair of the IIC Trustworthiness Task Group and one of the authors of the document. “Inattention to trustworthiness can lead to loss of human life, long-term environmental impacts, interruption of critical infrastructure, or other consequences such as disclosure of sensitive data, destruction of equipment, economic loss, and reputation damage,” continued Buchheit.
The IIoT Trustworthiness Framework Foundations document defines trustworthiness as a combination of security, safety, reliability, resilience, and privacy and the tradeoffs made among them in the face of environmental disturbances, human errors, system faults, and attacks. Ultimately, trustworthiness depends on the strategic intent and motivation of an organization, particularly its top management, to create and operate systems that inspire trust by partners, customers, and other stakeholders, including the community.
“Trustworthiness is the degree of confidence one has that a system performs as expected. It requires an understanding of the system, including interactions and emergent properties,” said Frederick Hirsch, Strategy Consultant, Upham Security, Co-Chair of the IIC Trustworthiness Task Group, and one of the authors of the foundational document. “In the digital world, trust and trustworthiness are achieved by understanding and addressing concerns related to the trustworthiness characteristics appropriately for the context of the entire system. Providing evidence of this can give others confidence.”
IIoT stakeholders will make different decisions and tradeoffs depending on the nature and/or industry of the system. “Concerns in a factory are not the same as those for a hospital operating room,” said Bob Martin, Senior Principal Engineer, Cyber Solutions Innovation Center, The MITRE Corporation, Co-Chair of the IIC Trustworthiness Task Group, and one of the authors of the document. “Designers must understand the many considerations involved in defining the appropriate trustworthiness implementation, including the supply chain, assembly, operation, and maintenance of a system.”
The IIoT Trustworthiness Framework Foundations document builds on the Industrial Internet of Things Security Framework (IISF). It is part of the IIC’s Industrial Internet Reference Architecture (IIRA), which provides an architectural framework of Industrial IoT Systems.
You can find IIoT Trustworthiness Framework Foundations and a list of IIC members who contributed to it here. Watch a short overview video. Register for the webinar, Ensuring Trustworthy Industrial Systems on September 1, 2021 at noon PST or 7:00 pm PST.
IIC Publishes IIoT Networking Framework
The IIC announced the Industrial Internet of Things Networking Framework (IINF) publication. The framework guides IIoT stakeholders on designing and developing the appropriate networking solutions to enable industrial IoT (IIoT) applications and stimulate industrial digital transformation. It details the requirements, technologies, standards, and solutions for networking that support diverse applications and deployments across a broad range of IIoT sectors and vertical industries.
“An underlying network is the foundation of any IIoT solution. It includes technologies at the network layer and below as well as related capabilities for management and security,” said David Lou, Co-chair, IIC Networking Task Group, Chief Researcher, Huawei Technologies, and one of the primary authors of the framework. “An underlying network enables the exchange of data and control and forms the basis of digital transformation across industries.”
The framework serves as a guideline and toolbox for IIoT networking solution stakeholders who design, develop, deploy, or operate the solutions and end-users in many industries trying to network their assets or products.
“IIoT applications span a range of industrial sectors as well as business, usage, deployment, and performance perspectives,” said Jan Holler, Co-chair IIC Networking Task Group, Research Fellow, Ericsson, and one of the primary authors of the framework. “The IINF helps organizations sort through numerous networking technologies to ensure interoperability across industry sectors. It answers the fundamental question, ‘How do I design, deploy, and operate a successful networking solution for my industrial IoT applications?'”
The IINF includes use cases from several industrial sectors, including smart factories, mining, oil & gas, and smart grid, to illustrate the diversity of networking considerations. Networking technologies and standards are covered in-depth to help organizations address their concerns and technical requirements. Finally, the IINF includes best practices for IT architectural blueprints.
An old friend and several acquaintances found themselves adrift when a magazine closed. All being entrepreneurial, they started a website and newsletter—RAM Review (Reliability, Availability, Maintenance). Old friend Jane Alexander is the editor. Not meaning she’s old, just that we’ve known each other for many years.
I met Bob Williamson 10 or 12 years ago mostly around discussions of ISO 55000 on asset management. He wrote the lead essay for a recent email newsletter on workforce. Now, I have to admit that the only part of manufacturing I never worked in was maintenance and reliability. I did work with skilled trades when I was a sales engineer, though. I considered them geniuses for the way they could fix things. One of the points of Bob’s essay is taking care of things before they break and need help.
The main workforce discussion in media concerns remote or hybrid work. Many engineering roles can be performed remotely; many roles within manufacturing and production must be performed on site. With the current and projected future labor shortage, I like his closing paragraph except for the put-down of current operators. I knew plenty who cared for their machine or process. Of course, many didn’t; most likely a management failure. But cross-training people to be, at least to some degree, both competent operators and first-line RAM people seems to me a winning strategy. I’ve reprinted most of Bob’s essay below. You can read it on their website.
For many manufacturers, returning to traditional ways of work simply will not be an option. Something must change if they are to attract, hire, and retain a capable workforce. Therefore, I believe technology and desperately willing top-management teams will also help alter work cultures on factory floors. Respondents to the Manufacturing Alliance/Aon survey suggested offering “flexible working hours, compressed work weeks, split shifts, shift swapping, and part-time positions.” Use of such enticements with plant-floor workforces would look very different than use among the carpet dwellers in front offices.
We have another option, of course: Technology can automate our manufacturing processes, and much of it is far more affordable than it was a decade ago. In fact, given the rising cost of labor over the past decade, with increasing healthcare-cost burdens and skills shortages, many businesses have already automated some of their labor-intensive processes. The times we are in call for—make that scream for—large-scale automation. Yet, while process automation can be easier for large, deep-pocketed companies than for the smalls, it’s still a huge challenge.
There are four big hurdles to be overcome when automating manufacturing processes: availability, installation, sustainable reliability, and work-culture change. And remember, skills and labor shortages are widespread in these post-pandemic times. Moreover, despite the supply chain’s efforts to heal and keep up, manufacturers of automation technologies aren’t immune to the production-barrier ills that others face these days.
To repeat: RAM professionals are on manufacturing’s front line. Skill shortages may be affecting our ranks, but there are recruiting and training efforts underway in many companies to remedy the situation. In addition, we have technologies for carrying out data collection, analysis, and problem-solving somewhat remotely. However, the boots-on-the-ground parts of reliability and maintenance will not be virtual or remote.
So, consider this option: Recruit and train displaced production workers to wear some RAM “boots.” They’ll be familiar with industrial environments and the importance of plant equipment. Then, let’s train our current production workers to care more for their machines than they did in the past, and, in the process, become the eyes and ears for reliability, availability, and maintenance improvement.
I have four news items today. A couple are AI-related and a couple more concern the edge. All involve developing products with the latest tech: ABB, Micropsi, IOTech, and ThinkIQ.
ABB to deliver artificial intelligence modeling for data center energy optimization in Singapore.
ABB has signed up to a pilot study with ST Telemedia Global Data Centres (STT GDC) to explore how artificial intelligence (AI), machine learning (ML), and advanced analytics can optimize energy use and reduce a facility’s carbon footprint.
Singapore-headquartered STT GDC, which is one of the fastest growing global data center operators, is leveraging the digital transformation expertise of technology leader ABB as it bids to become net carbon-neutral by 2030.
ABB is conducting the pilot in two phases, beginning with initial data exploration, modeling, and validation, studying historical data to establish how digital solutions would impact existing operations and energy use. Once proven, it will be followed by AI control logic testing in a live data center environment. STT GDC aims to achieve at least 10 percent in energy savings from its cooling systems, which is the largest consumption of electrical power in a data center after IT equipment.
The ABB team is currently developing AI-based optimization models for the entire data center cooling plant, including the upstream chiller and distribution systems. The AI project is also unlocking new opportunities for efficiency improvement at a granular level within the data center. STT GDC will be able to use AI-generated insights, leveraging cutting-edge ABB Ability™ Genix for industrial analytics and AI, to track and analyze data generated by monitoring systems in the data center, and better facilitate dynamic cooling optimization.
Micropsi Industries’ AI-driven Control System Speeds Complex and Precise Robot Training and Deployment
Industrial and collaborative robots learn to perform camera-guided movements more quickly with the latest version of Micropsi Industries’ MIRAI robot control system. Using artificial intelligence (AI), MIRAI enables robots to flexibly react to variances in their tasks in real time by learning from humans. Variances in position, shape, surface properties or lighting conditions are a common challenge for robotic automation of machine tending, assembly or test applications. With MIRAI’s new “positioning skills” feature, giving examples of quality movements to the robot has become much easier, and the robot will generalize and understand what to do much more quickly.
With the new feature, MIRAI customers will notice quicker set-up times, down from 2-3 days per skill to about three hours. In addition, robot speeds have increased, which also enables shorter cycle times.
Companies wanting to use a robot to perform precise and complex skills—such as gripping and inserting a bendable or soft component, like a cable, into differently arranged sockets—would primarily use the MIRAI controller at the first and last decisive centimeters of a manufacturing step.
With MIRAI, preparing robots to perform tasks that include variances requires a human worker to guide the robot arm several times through typically occurring scenarios to show the robot to its destination, such as sockets in which freely hanging cables need to be inserted. A machine learning process then derives a motion intuition for the robot from the given examples. For a robot that is not required to follow specific paths to perform its task, MIRAI users can deploy the new positioning skills to teach the robot to find the destination even faster because a human worker needs only to show MIRAI the surroundings of the target with the camera. The robot then independently searches for the shortest path to the object.
IOTech launches Edge Builder to manage edge systems at scale
IOTech, the edge software company, announced the launch and availability of Edge Builder, its end-to-end management solution for edge systems. Edge Builder provides a comprehensive, flexible and open solution that simplifies and automates the management of edge systems at scale.
To ensure that Edge Builder addresses the market opportunity, IOTech has been working with a number of key partners and potential customers during the development phase of the product.
Designed to meet the specific needs of edge systems, Edge Builder provides light-touch provisioning and complete lifecycle management for both edge nodes and their applications. Currently it supports the deployment and management of containerized applications at the edge; in the future it will also support the deployment of native binary applications.
Edge systems are managed from a centralized Edge Builder controller that can be hosted either on-premise or in the cloud. Platform independence for both the managed nodes and the cloud environment on which the controller is deployed ensures flexibility and choice for Edge Builder users.
Edge Management at Scale - Solving the Big Problem in the IoT Room
ThinkIQ Enhances SaaS Platform with Stronger Connectivity, Analytics and Visualization
ThinkIQ, a pioneer of digital manufacturing transformation SaaS, announced major enhancements to its SaaS Manufacturing platform. The new offering strengthens the company’s leading Transformational Intelligence Platform and provides more powerful and simplified modeling technology to allow for faster time to solution, better analytics and visualization, and higher performance data processing.
Many transformational intelligence platforms are either pure development tools or are restricted to the feature set delivered with the platform. ThinkIQ’s latest enhancements deliver the best of both worlds: strong model integration combined with an extensible development platform that bridges the gap between traditional, on-premise OT technologies and strong cloud-enabled analytics.
These capabilities can be applied to any manufacturing and supply chain application and are particularly well-suited for hybrid, continuous and batch processes.
ThinkIQ’s SaaS Manufacturing cloud-based platform simplifies the creation of web-based applications and leverages the strengths of the Internet of Things, Big Data, Data Science, Semantic Modeling and Machine Learning. The platform collects data across the operation (existing and IIoT sensors) to provide actionable real time insights (e.g., identify correlations and root causes, traceability and yield issues, etc.). It creates a new level of capability beyond what independent disconnected operating environments can provide today.
To learn more about ThinkIQ, visit our website.