by Gary Mintchell | Sep 13, 2023 | Automation, Edge, Industrial Computers
Are you using an Arduino anywhere? It sits on my to-do list that never gets done. I’ve had all these Halloween ideas that atrophy in my mind. I’ve believed for a long time that there must be many industrial uses for a good, low-cost edge compute platform.
Here is news about Arduino joining the Amazon Web Services (AWS) Partner Network as an Independent Software Vendor (ISV) to further democratize embedded hardware for OEMs and Industrial Automation industries.
Arduino Cloud — built on AWS — hit a new milestone of 4 billion data messages per month and is mentioned in Gartner’s Hype Cycle for Infrastructure Platforms.
Arduino has joined the Amazon Web Services (AWS) Partner Network (APN) to deliver enterprise-grade Arduino PRO products that work with AWS for customers in commercial and industrial sectors. The APN is a global community of AWS Partners that leverage programs, expertise and resources to build, market and sell customer offerings.
In addition, the company’s device and data management service, Arduino Cloud, now processes 4 billion device messages every month from both individuals and businesses, a significant milestone for the 3-year-old service built on AWS.
Although companies recognize the immense potential of digital transformation at the edge, many feel the goal is beyond their reach because of a lack of solutions. Arduino Cloud offers both businesses and individuals an easy path to collect data, control the edge and gain insights from connected products without the need to build, deploy and maintain a custom IoT platform.
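To make that path concrete, here is a minimal sketch of what publishing a sensor value to Arduino Cloud looks like from the device side. It is illustrative only: the thing ID, Wi-Fi credentials, and sensor conversion are placeholders, the library calls are from the ArduinoIoTCloud library as I understand it, and in practice the Cloud web editor generates most of this registration code (thingProperties.h) for you.

```cpp
#include <ArduinoIoTCloud.h>
#include <Arduino_ConnectionHandler.h>

float roomTemperature;  // cloud variable mirrored in the Arduino Cloud dashboard

// Placeholder credentials; normally generated by the Arduino Cloud web editor.
WiFiConnectionHandler connection("YOUR_SSID", "YOUR_WIFI_PASSWORD");

void setup() {
  Serial.begin(9600);
  ArduinoCloud.setThingId("YOUR-THING-ID");
  // Publish the variable to the cloud roughly every 10 seconds.
  ArduinoCloud.addProperty(roomTemperature, READ, 10 * SECONDS, NULL);
  ArduinoCloud.begin(connection);
}

void loop() {
  ArduinoCloud.update();                    // service the cloud connection
  roomTemperature = analogRead(A0) * 0.1f;  // placeholder sensor conversion
}
```

From there, dashboards, alerts, and the AWS-side integrations described in this announcement sit on top of the same data stream, with no custom IoT platform to build or maintain.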
“Choosing Arduino Cloud for our business application slashed product development time by six months and saved us over $250,000 in engineering services,” said Adam Bishop, co-founder of ABM Vapor Monitoring. “Arduino PRO provides us with an end-to-end commercial platform. Using the Arduino Opta PLC connected to Arduino Cloud, we monitor commercial buildings across America to ensure regulated air quality standards are met. Arduino Cloud has been an instrumental partner in our journey to introduce new products to the market.”
Arduino joins a global network of over 100,000 AWS Partners from more than 150 countries, working with AWS to provide innovative solutions, solve technical challenges, win deals and deliver value to mutual customers. Customers will also experience streamlined support architecting edge-to-cloud integrated solutions, whether choosing Arduino Cloud, AWS cloud services or hybrid architectures.
“Today, industrial hardware and advanced cloud services exist in independent worlds with significant complexity,” said Guneet Bedi, Arduino’s SVP and GM. “By offering integration with the flexibility and scalability of AWS and pay-as-you-go pricing, businesses will be able to greatly reduce complexity and significantly accelerate their go-to-market with the scale of Arduino PRO.”
The open architecture at the core of every Arduino product provides a new preferred path to AWS for all microchips supported by Arduino. In addition, existing Arduino Cloud business customers now have an integration track for scaling to self-managed solutions on AWS, while existing AWS customers now have reference architectures to integrate Arduino products.
Arduino’s Investment to Enable Industrial Innovation
The Arduino PRO product line, introduced in 2020, meets the request from OEMs and industry integrators for a hardware ecosystem that lowers the barrier to entry and accelerates time to market. The Arduino PRO portfolio features 24 industrial-grade products, including the Portenta X8 Linux SOM and UL-certified Opta PLC. Currently, Arduino PRO technology is deployed by more than 2,000 businesses worldwide.
This announcement reinforces the commitment Arduino shared when announcing its Series B funding to chart a new strategic course that emphasized the expansion of its enterprise-scale offerings. More recently, the company named Bedi to head its U.S. operations, with two new offices focused on accelerating its B2B growth.
by Gary Mintchell | Aug 16, 2022 | Industrial Computers, News, Organizations
The announcement of a new Object Management Group program called Responsible Computing was reported here last May. This news fleshes out the skeleton of that announcement with the work of the inaugural meeting on June 29, 2022, which established six working groups and defined their focus. Responsible Computing is a new consortium of technology innovators working together to address sustainable development goals.
Stephen Mellor, Executive Vice-President of OMG and CTO of Responsible Computing, said, “We have all heard about sustainability, but how often have you seen a request for proposal that sets energy usage requirements or places limits on ‘dark data’? Responsible Computing will address these issues and more. The inaugural meeting set the stage for sustained work in multiple areas.”
Details of Six Working Groups
Data Center Working Group concentrates first on the building itself: reducing environmental impact with more efficient strategy and design, migrating to renewable energy sources, monitoring consumption and carbon footprint, and optimizing the reuse of waste from cooling and production. This working group will produce webinars, white papers, and best practice papers to help the IT community reach net-zero by 2030 in compliance with UN SDGs.
Infrastructure Working Group pursues greater efficiency with infrastructure (computers, networks, data centers) designed to deliver high-performing, sustainable operations, consolidate workloads that peak at different times to make more efficient use of resources, and achieve high utilization levels. This working group will produce webinars, success stories, and best practice papers on reusing technology, reducing electronic waste, and creating a circular economy.
Code Working Group aligns teams on software architecture, technology, programming language, and platform while anticipating and monitoring the total costs of running, supporting, and maintaining applications. It will produce white papers to help developers balance the trade-offs between accuracy, speed, and expense, including energy consumption, addressing the hidden energy impact of code, reducing data duplication, and improving cybersecurity. The group will also implement sustainability maturity assessment tools and KPIs to accelerate decision-making and pinpoint areas requiring more scrutiny during software development. There will also be ongoing training and workshops to reinforce shared sustainability goals and heighten team awareness of these issues.
Data Usage Working Group certifies that data is high quality. The group also works to ensure that organizations can trust processes and people, thereby reducing errors and misinterpretation of data by advocating intelligent workflows that leverage artificial intelligence and machine learning. The group will develop robust policies, guidelines, and practices for data governance (e.g., maintaining lineage and explainability), ongoing data usage risk assessment and risk mitigation, incident response, and data-breach remediation. It will also show organizations how to manage data lifecycle with accountable data-retention and destruction practices.
Systems Working Group will ensure that systems employ an integrated set of technologies to serve people by building ethical, privacy-preserving, secure, and resilient systems. Organizations must design systems with the environment, individuals, society, and the future in mind. Responsible systems are designed with a three-layered approach: a cultural ethos across the entire supporting organization, forensic technology that can monitor and detect issues to enable trust, and governance requirements to which the entire organization adheres. This working group will help organizations maintain the integrity of internal systems, achieve compliance with internal and external standards, monitor on an ongoing basis that they develop and use responsible systems, and reinforce corporate social responsibility to close the digital divide.
Impact Working Group will work to offset the impact on the planet across the ESG categories and level the playing field through sustainability, circularity, diversity, inclusion, climate, openness, and ethics. Six prime and measurable maturity characteristics represent the ability to achieve responsible impact: goal setting, scalability, replicability, a socially responsible business model and strategy, and being quantifiable and traceable to a UN SDG (United Nations Sustainable Development Goal).
Responsible Computing is a systemic approach aimed at addressing current and future challenges in computing, including sustainability, ethics, and professionalism, stemming from the belief that we need to start thinking about technology in terms of its impact on people and the planet.
by Gary Mintchell | Jul 5, 2022 | Automation, Industrial Computers, Internet of Things, Security
Emerson’s acquisitions have moved it more firmly into discrete manufacturing operations. This news of a new programmable automation controller family combines the benefits of control, automation, the industrial Internet of Things (IIoT), and analytics while “minimizing the need for specialized software engineering talent.” Automation suppliers have been on a fervent journey toward providing products that are easier to use for talent-strapped customers. It also brings in current requirements for security and open protocols.
Emerson, a global software, technology and engineering leader, announced the release of its PACSystems RSTi-EP CPE 200 programmable automation controllers (PAC). CPE 200 controllers will deliver large programmable logic controller (PLC) capability in a small, cost-effective, IIoT-ready form factor so machine manufacturers do not need to sacrifice performance for price.
With features that help speed time to use, the CPE 200 series offers security by design, open programming, and built-in open communications that simplify connectivity to external analytics software platforms while reducing cost and complexity for OEMs and end users.
“Gaining competitive edge in today’s marketplace means having the flexibility to connect to the wide array of equipment end users employ as part of their proprietary processes, and supporting secure, open connectivity to allow easy access to on-premises and cloud-hosted analytics platforms,” said Jeff Householder, president of Emerson’s machine automation solutions business. “The CPE 200 series controllers take advantage of Emerson’s cybersecure-by-design architecture, common programming capabilities, and IIoT readiness to provide options currently missing in legacy compact PLCs.”
The controllers offer open communications through native, pre-licensed support for OPC UA Secure and other common industrial protocols for flexible connectivity over high-speed Gigabit Ethernet. IEC 61131 programming languages and C, the world’s most popular and easiest-to-use programming language, help engineers write and run the high-performance algorithms that enable proprietary production strategies and advanced automation technologies.
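The announcement does not include sample code, but the kind of algorithm it alludes to, written in plain C/C++ so it could run alongside IEC 61131 logic, might look like the discrete PID step below. This is a generic, illustrative sketch, not Emerson’s PAC programming interface; the structure and function names are mine.

```cpp
// Illustrative only: a discrete PID step of the kind a machine builder might
// hand-code and call once per controller scan. Not Emerson-specific code.
struct PidState {
  double kp, ki, kd;        // proportional, integral, derivative gains
  double integral = 0.0;    // accumulated error
  double prev_error = 0.0;  // error from the previous scan
};

// setpoint and measurement are in engineering units; dt is the scan time in seconds.
double pid_step(PidState& s, double setpoint, double measurement, double dt) {
  const double error = setpoint - measurement;
  s.integral += error * dt;
  const double derivative = (error - s.prev_error) / dt;
  s.prev_error = error;
  return s.kp * error + s.ki * s.integral + s.kd * derivative;
}
```

The value of the open approach described above is that logic like this can live next to the ladder or structured text program and feed the same OPC UA data the analytics platforms consume.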
by Gary Mintchell | Jan 6, 2022 | Automation, Embedded Control, Industrial Computers, Technology
A couple of years ago, I was amazed to discover a conversation in Germany about PC-based control versus “old, proprietary PLCs”. Given that the conversation was in Germany, I assumed the “old” one to be Siemens and the new one to be relative to CODESYS and companies such as Wago and perhaps Beckhoff. Then I just saw a conversation on LinkedIn where an American magazine evidently re-ran an old programmable automation controller (PAC) versus programmable logic controller (PLC) argument. In both cases, the “old” PLC vendor rendered much of the argument moot by adopting PC-based technologies into its products.
The Open Process Automation Forum opened a new branch to the argument with the push for Software Defined Control Architecture. This is interesting. OPAF has progressed through definitions and standards—more on that in my next post. For this post, I’m reporting some news from, well, Germany, about an advance by a new company called Software Defined Automation. I wonder where this will lead us. It will be interesting. I have yet to see anything push Siemens and Rockwell off their thrones on the factory side or Emerson/Honeywell/Yokogawa/ABB on the process side. But, you never know.
Munich-Based Software Defined Automation (SDA) and VMware Implement Real-Time Virtual Programmable Logic Controllers (vPLCs)
The execution of deterministic real-time control on virtualized edge servers, combined with a comprehensive vPLC management interface in the cloud, aims to be a unique solution, enabling customers to improve productivity, utilization, and security while gaining independence from vendor-specific hardware and silicon.
The SDA solution will help improve industrial automation with the full virtualization of PLC controls on the VMware Edge Compute Stack, which supports virtual machines (VMs) and containers running on conventional IT servers at the edge. Real-time control runs in a VM, and the service will commission, monitor and manage vPLC instances on servers located in factories. The virtual real-time controllers, which will be installed and managed by SDA at the edge, have already been shown to achieve deterministic control cycle times of under 10 ms.
Many recent innovations developed by the IT industry have not been adopted in the area of PLCs. Traditional PLC implementations in hardware are costly and lack scalability. Since the emergence of the IEC 61131-3 standard in the early 1990s, PLC technology has advanced very gradually. Current trends improve PLCs’ memory and processing power while shrinking their size. Yet the technology still relies on individually programmed PLCs, monitored on site, that must be taken out of operation in order to change the code, leading to operational downtime and reliability risks. This common practice is due to the lack of alternative technologies and tools that could reduce the software limitations of PLCs and free them from the need to be manually managed on site by automation engineers.
Virtual machines and containers transform hardware systems into software systems, in which all elements run on local off-the-shelf IT infrastructure. The VMware Edge Compute Stack in combination with SDA’s vPLC management and monitoring services will enable improved security, reliability and resilience while allowing for intelligent and deterministic real-time responsiveness.
The vPLC solution aims to bring the benefits of cloud systems to the shopfloor, increase resilience and security, while preserving real-time capabilities.
The solution is based on a hybrid architecture between a cloud system and an industrial workload at the edge. The hardware resources located at the edge will be used efficiently by VMware’s Edge Compute Stack, which manages the resources according to each vPLC’s needs. SDA is working on extending this technology stack with a management system for fully virtualized PLCs based on CODESYS technology to incorporate the industrial control layer in software. The management system will simultaneously hold virtual PLC twins in the cloud.
The offering can then help generate value for all sorts of industrial processes controlled by PLCs. Software-based PLC implementations will end up being more flexible, simplifying delivery logistics and reducing software commissioning time. The vPLC’s runtime at the edge can be updated over the cloud via the SDA management console. vPLCs will be handled as IT workloads, and state-of-the-art IT best practices will be applied to bolster automation IT security. Furthermore, the integrated monitoring service ensures that allowed vPLC response time thresholds are not exceeded.
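Neither SDA’s runtime nor its management console is public, so the sketch below is only a generic illustration of the timing discipline involved: a software PLC scan that runs on a fixed 10 ms period and flags any cycle that overruns its budget, which is the kind of threshold the monitoring service would watch. Everything here is plain C++ with no SDA or VMware APIs.

```cpp
// Minimal sketch of a fixed-cycle soft-PLC loop with overrun detection.
#include <chrono>
#include <iostream>
#include <thread>

int main() {
  using clock = std::chrono::steady_clock;
  constexpr auto cycle = std::chrono::milliseconds(10);  // target scan time

  auto next = clock::now() + cycle;
  for (int scan = 0; scan < 1000; ++scan) {
    // ... read inputs, evaluate control logic, write outputs ...

    if (clock::now() > next) {
      std::cout << "scan " << scan << " exceeded its 10 ms budget\n";
    }
    std::this_thread::sleep_until(next);  // hold the fixed cycle
    next += cycle;
  }
}
```

A production vPLC would of course add real-time scheduling, I/O drivers, and redundancy on top of this; the point of the sketch is only that cycle-time budgets become an observable IT metric once the controller is software.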
Dr. Josef Waltl, CEO and co-founder of SDA, stated, “Today’s technological advances in software and cloud computing allow management of real-time control systems in a pure software fashion. The SDA vPLC service is able to meet sub-10 ms performance, required for many of the industrial applications currently controlled by conventional PLCs.”
Muneyb Minhazuddin, vice president of edge computing, VMware, notes, “The pandemic has shown how vulnerable manufacturers still are at the edge despite having implemented the latest Industry 4.0 and cloud technologies. It’s the last mile that is still dependent on human intervention and vendor hardware, yet it is a vital part of production process controls that needs to be addressed. Together with SDA, VMware Edge Compute Stack will help manufacturers optimize PLCs in a time of semiconductor shortages, enabling resiliency, flexibility and effectiveness at the very heart of their edge operations.”
by Gary Mintchell | Dec 17, 2021 | Automation, Embedded Control, Industrial Computers, Software
A fervent topic of discussion in German automation circles, especially among those who wish to displace Siemens from its leading position, is software-based control. An early leader in this technology is Beckhoff Automation. This press release, made public this week, gives some of the technology’s historical foundation. Yes, it’s commercial. But, yes, it’s interesting to see where we’ve been in order to speculate on where we’re heading.
The TwinCAT automation software suite from Beckhoff has reached its 25th anniversary in the market. Ubiquitous in automation today, TwinCAT has served as a powerful resource for engineers since 1996 – a quarter of a century. In addition, the underlying PC-based control technology from Beckhoff has been going strong since 1986, marking 35 years in the industry. TwinCAT, short for The Windows Control and Automation Technology, provides numerous benefits from its robust software functionality. The advantages of TwinCAT stem from its modular expandability, which extends to support for innovations such as integrated machine vision and artificial intelligence.
The first software generation, TwinCAT 2, introduced in 1996, is still available and maintained, which is proof of its continuity and compatibility with current systems. Windows served as the operating system, and the PLC programming was adapted to meet the requirements of the IEC 61131-3 standard. This introduced the ability to implement an industrial control system on a “regular” PC with a standard operating system.
Another milestone was the decision to align the TwinCAT programming environment with the world’s predominant IT programming environment. Microsoft Visual Studio is used for all major IT software developments, and Beckhoff also used this tool to develop TwinCAT 2 software. So why not develop PLC software applications with Visual Studio as well? The subsequent TwinCAT 3 software generation was introduced in 2010 and delivered to customers from 2011 on – which makes for another 10-year anniversary and another track record of success in the field.
The integration of the TwinCAT automation tools into Visual Studio established a completely new type of engineering environment. With the ability to use additional programming standards such as C/C++ and MATLAB/Simulink, further possibilities emerged for more efficient code generation for machines and systems. This approach has also gained widespread acceptance in the automation industry.
In addition to programming, TwinCAT offers an I/O configuration interface for a wide variety of fieldbus systems – first and foremost EtherCAT as well as more than 30 other communication protocols. Motion control applications from simple PTP movements to sophisticated CNC and robot kinematics are just as much part of the ongoing evolution as safety functions, image processing for machine vision and machine learning. With the advent of Industrie 4.0 and the Industrial Internet of Things (IIoT), it quickly became clear that the cloud, long established in IT, would also become a major factor in the automation market. To provide this functionality for customers, Beckhoff launched hardware and software solutions for IoT and cloud connectivity in 2015, followed by data analytics tools in 2018.
by Gary Mintchell | Nov 30, 2021 | Automation, Industrial Computers, News, Organizations, Technology
Let’s mix open source, alliances, collaboration, and the future of computing—quantum—into a new Linux Foundation Alliance. This is more future than 5G and IoT, but this is something we need to pay attention to. It’ll be here before you know it.
The Linux Foundation, the nonprofit organization enabling mass innovation through open source, today announced the new QIR Alliance, a joint effort to establish an intermediate representation with the goal of facilitating interoperability within the quantum ecosystem and providing a representation suitable for current and future heterogeneous quantum processors. Founding members include Honeywell, Microsoft, Oak Ridge National Laboratory, Quantum Circuits Inc. and Rigetti Computing.
QIR, or Quantum Intermediate Representation, is based on the popular open source LLVM compiler toolchain (https://llvm.org/). QIR specifies a set of rules for representing quantum programs within the LLVM IR. Examples of QIR applications include using the standard LLVM infrastructure to write quantum optimizers that operate on QIR and target it to specific hardware backends, or linking it with classical high-performance libraries for quantum simulation.
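To give a feel for what “a quantum program inside LLVM IR” means, the sketch below uses LLVM’s C++ IRBuilder to emit a module in which a qubit is a pointer to an opaque type and a gate is an external function call. The QIR specification defines the exact intrinsic names and calling conventions; the `__quantum__qis__h__body` name here follows that naming convention but should be treated as illustrative rather than quoted from the spec.

```cpp
// A minimal sketch (not an official QIR Alliance sample): building an LLVM
// module that represents a one-qubit program as external "quantum" calls,
// which is the general shape QIR takes.
#include <llvm/IR/IRBuilder.h>
#include <llvm/IR/LLVMContext.h>
#include <llvm/IR/Module.h>
#include <llvm/Support/raw_ostream.h>

int main() {
  llvm::LLVMContext ctx;
  llvm::Module mod("qir_sketch", ctx);
  llvm::IRBuilder<> b(ctx);

  // Qubits are modeled as pointers to an opaque struct type.
  auto *qubitTy = llvm::StructType::create(ctx, "Qubit");
  auto *qubitPtr = llvm::PointerType::getUnqual(qubitTy);

  // Declare an external "apply Hadamard" operation (illustrative name).
  auto hadamard = mod.getOrInsertFunction(
      "__quantum__qis__h__body",
      llvm::FunctionType::get(b.getVoidTy(), {qubitPtr}, false));

  // Entry point that applies the gate to qubit 0 (encoded as a null pointer).
  auto *mainTy = llvm::FunctionType::get(b.getVoidTy(), false);
  auto *entry = llvm::Function::Create(
      mainTy, llvm::Function::ExternalLinkage, "program_main", &mod);
  b.SetInsertPoint(llvm::BasicBlock::Create(ctx, "entry", entry));
  b.CreateCall(hadamard, {llvm::ConstantPointerNull::get(qubitPtr)});
  b.CreateRetVoid();

  mod.print(llvm::outs(), nullptr);  // dump the textual IR
  return 0;
}
```

Because the program is ordinary LLVM IR, standard passes, optimizers, and backends can be pointed at it, which is exactly the interoperability argument the Alliance is making.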
“We expect there to be exciting advances in how classical and quantum computations can interact at the hardware level. The QIR Alliance will provide a single representation that can be used for both today’s restricted capabilities and the more powerful systems of the future,” said Bettina Heim, principal software engineering manager, Microsoft. “This will allow the community to experiment with and develop optimizations and code transformations that work in a variety of use cases.”
Quantum development SDKs and languages appear and evolve at a fast pace, along with new quantum processors whose capabilities are unique and distinct from each other. To provide interoperability between new languages and new hardware capabilities and reduce development effort from all parties, it is imperative for the ecosystem to develop and share a forward-looking intermediate representation that works with present and future quantum hardware.
“Quantum technology is still quite nascent, but the promise grows every day,” said Seth Newberry, executive director of the Joint Development Foundation. “The QIR Alliance is poised to enable the open and technical development necessary to realize these promises. We’re very happy to provide a forum for this work.”
Honeywell
“The Quantum-Intermediate Representation Alliance, also known as QIRA, is a key piece of the quantum computing ecosystem that enables quantum hardware suppliers and quantum software suppliers to reduce redundant efforts involved in implementing programming languages across quantum computer architectures,” said Alex Chernoguzov, Honeywell Quantum Chief Engineer.
Oak Ridge National Laboratory
“ORNL is thrilled to be a part of the Quantum Intermediate Representation Alliance, which aims to develop a unified LLVM-based intermediate representation for quantum computing. A consistent IR of quantum programs will enable interoperability between quantum applications and hardware devices, making quantum computing more usable to researchers and developers. We look forward to contributing to the QIR specification and the associated compiler toolchain under this partnership,” said Thien Nguyen, Quantum Computer Science Researcher, Oak Ridge National Laboratory.
Quantum Circuits Inc.
“At QCI, we are very pleased to be participating in the QIR Alliance. The QIR approach represents a revolutionary advance in the representation of quantum circuits, enabling users to take full advantage of the unique capabilities of quantum computing systems across a variety of different hardware platforms,” said Tom Lubinski, Chief Software Architect of Quantum Circuits Inc.
Rigetti
“Rigetti has pioneered hybrid system architectures that are quickly becoming the predominant approach for cloud-based quantum computing,” said David Rivas, SVP Systems & Services at Rigetti Computing. “The QIR Alliance is focusing on precisely the interface between quantum and classical compute, enabling rapid advances in quantum programming language design and execution systems. We’re thrilled to be working closely with this community to design the necessary compiler technology and develop implementations for Rigetti hardware.”
About Joint Development Foundation
Launched in 2015, the Joint Development Foundation is an independent non-profit organization that provides the corporate and legal infrastructure to enable groups to quickly establish and operate standards and source code development collaborations. More information about the Joint Development Foundation is available at http://www.jointdevelopment.org/.
About the Linux Foundation
Founded in 2000, the Linux Foundation is supported by more than 1,000 members and is the world’s leading home for collaboration on open source software, open standards, open data, and open hardware. The Linux Foundation’s projects are critical to the world’s infrastructure, including Linux, Kubernetes, Node.js, and more. The Linux Foundation’s methodology focuses on leveraging best practices and addressing the needs of contributors, users and solution providers to create sustainable models for open collaboration.