This week brought several conferences to my attention, each taking a few hours rather than a few days. One was Dell Technologies World. I was first involved with Dell through its Influencer program for the Internet of Things and Edge groups. Both of those groups are gone; the only people I remember are selling laptops now. Only a few years ago, Michael Dell spoke of the IoT group in his keynote. This year…nada.
However, there was still much of interest this year. Just as its competitors have been building out as-a-Service strategies, Dell Technologies has announced just such a strategy. Expect this trend to spread.
Businesses can buy and deploy IT with a simple, consistent cloud experience across the industry’s broadest IT portfolio
- Project APEX simplifies how Dell Technologies customers can consume IT as-a-Service
- Dell Technologies Cloud Console gives customers a single self-service interface to manage every aspect of their cloud and as-a-Service journey
- Dell Technologies Storage as-a-Service will be deployed and managed on-premises by Dell Technologies
- Dell Technologies Cloud Platform advancements make cloud compute resources accessible with instance-based offerings, lowering the barrier to entry and extending subscription availability
Dell Technologies expands its as-a-Service capabilities with Project APEX to simplify how customers and partners access Dell technology on-demand—across storage, servers, networking, hyperconverged infrastructure, PCs and broader solutions.
Project APEX will unify the company’s as-a-Service and cloud strategies, technology offerings, and go-to-market efforts. Businesses will have a consistent as-a-Service experience wherever they run workloads including on-premises, edge locations and public clouds.
“Project APEX will give our customers choice, simplicity and a consistent experience across PCs and IT infrastructure from one trusted partner—unmatched in the industry,” said Jeff Clarke, chief operating officer and vice chairman, Dell Technologies. “We’re building upon our long history of offering on-demand technology with this initiative. Our goal is to give customers the freedom to scale resources in ways that work best for them, so they can quickly respond to changes and focus less on IT and more on their business needs.”
“By the end of 2021, the agility and adaptability that comes with as-a-Service consumption will drive a 3X increase in demand for on-premises infrastructure delivered via flexible consumption/as-a-Service solutions,” said Rick Villars, group vice president, Worldwide Research at IDC.
Dell Technologies as-a-Service and cloud advancements
The new Dell Technologies Cloud Console will provide the foundation for Project APEX and will deliver a single, seamless experience for customers to manage their cloud and as-a-Service journey. Businesses can browse the marketplace and order cloud services and as-a-Service solutions to quickly address their needs. With a few clicks, customers can deploy workloads, manage their multi-cloud resources, monitor their costs in real-time and add capabilities.
Available in the first half of next year, Dell Technologies Storage as-a-Service (STaaS) is an on-premises, as-a-Service portfolio of scalable and elastic storage resources that will offer block and file data services and a broad range of enterprise-class features. STaaS is designed for OPEX transactions and allows customers to easily manage their STaaS resources via the Dell Technologies Cloud Console.
Dell continues to expand its Dell Technologies Cloud and as-a-Service offerings with additional advances:
- Dell Technologies Cloud Platform instance-based offerings— Customers can get started with hybrid cloud for as low as $47 per instance per month with subscription pricing, making it easy to buy and scale cloud resources with pre-defined configurations through the new Dell Technologies Cloud Console.
- Geographic Expansions— Dell Technologies is extending Dell Technologies Cloud Platform subscription availability to the United Kingdom, France and Germany, with further global expansion coming soon.
- Dell Technologies Cloud PowerProtect for Multi-cloud— This fully-managed service helps customers protect their data and applications across public clouds in a single destination via a low latency connection to the major public clouds. Businesses save costs through the PowerProtect appliance’s deduplication technology and realize additional savings with zero egress fees when retrieving their data from Microsoft Azure.
- Pre-approved Flex On Demand pricing— The pre-configured pricing makes it simpler for customers to select and deploy Dell Technologies solutions with a pay-per-use experience. Dell Technologies partners globally will receive a rebate up to 20% on Flex On Demand solutions.
Continued sustainability focus
Dell Technologies Project APEX will help companies retire infrastructure in a secure and environmentally friendly manner. Dell manages the return and refurbishing of used IT gear while also helping to support customers’ own sustainability goals. The company is making additional strides in achieving its Progress Made Real goals by:
- Reselling 100% of the returned leased assets.
- Refurbishing and reselling 89% of working assets in the North America and EMEA regions.
- Reselling 10% of non-working assets to Environmental Disposal Partners who repair, reuse, resell and recycle each asset. Last year, Dell recycled 240,257 kilograms of metal, glass and plastics through this program.
- Helping customers resell or recycle their excess hardware and prepare to return leased equipment in a secure and environmentally conscious manner through Dell Asset Resale & Recycling Services.
- Dell Technologies Cloud Console is available now as a public preview in the United States with EMEA availability planned for the first quarter of 2021.
- Dell Technologies Storage as-a-Service will be available in the U.S. in the first half of 2021.
- Dell Technologies Cloud Platform instance-based offerings with subscription pricing are available in the United States, France, Germany and the U.K. Dell Technologies Cloud PowerProtect for Multi-cloud is now available in the U.S., U.K. and Germany.
- Flex On Demand is available in select countries in North America, Europe, Latin America and the Asia-Pacific region.
ABB has been making some news lately. I’ve run projects and supervised project managers in my career, but never those really large capital projects in the energy industries; my largest was maybe $10 million in 1985 dollars. But when a company claims a new methodology can reduce capital expenditure by up to 40 per cent and compress delivery schedules by up to 30 per cent, even I notice. Reading through the release, the key element I picked out had to do with decoupling hardware and software in the automation system. This is something I’ve learned about from ABB’s competitors, and something I expect to hear much more about.
- ABB introduces a new, agile method for industrial project execution to help energy customers adapt to challenging market conditions
- Capital expenditure reductions of up to 40 per cent are expected, and delivery schedules are expected to be compressed by up to 30 per cent
ABB Adaptive Execution integrates expert teams, new technologies, agile processes, shared learnings, and proven methodologies into a single, streamlined project execution experience for all stakeholders involved in major capital investment projects.
In the energy sector, it is not uncommon for large capital projects to significantly exceed budget and experience extensive delays. ABB Adaptive Execution addresses major inefficiencies that result in cost and schedule overruns. It enables greater visibility across all layers of a project, unlocking significant project value improvements across the energy sector. With digitalization and collaboration at its core, Adaptive Execution is expected to reduce automation-related capital expenditure by up to 40 per cent, compress delivery schedules by up to 30 per cent, and cut start-up hours by up to 40 per cent.
Using virtualization, Adaptive Execution removes the need for engineering on site and reduces the physical hardware required for a control and automation system. By decoupling hardware and software, Adaptive Execution lowers the time and overall setup costs, cutting the number of engineering hours spent on project installation, commissioning and testing by up to 85 per cent.
Peter Terwiesch, President of ABB Industrial Automation said: “2020 has been a year of disruptions across the global energy industry. With falling oil prices, challenges induced by the current lockdowns and the rising demand for sustainable energy investments, companies are looking for new ways to reduce cost, schedule and risk for major projects in this low capex environment. Harnessing advances in digitalization, ABB Adaptive Execution responds to this need and enables capital projects to thrive in the new normal.”
Harnessing efficient modular design, combined with standardized, repeatable processes and shared and effective deployment of infrastructure, tools, and resources, ABB Adaptive Execution centralizes collaboration across all project stakeholders — from a project’s inception through to its successful completion.
Brandon Spencer, President of ABB Energy Industries commented: “Adaptive Execution will change the way in which customers, Engineering Procurement Construction (EPC) contractors and vendors interact. We can create better business value for our customers by creating an environment where everyone can do his or her own part with confidence, empowering delivery teams to achieve more, in less time. This is the key to overall project success.”
I had been talking Open Source with Bill Lydon, who, like me, has been around many places in manufacturing and media in his career, most recently as editor of InTech. He referred me to an article he wrote on automation.com.
This release actually points to two important technologies that will combine to improve management’s ability to operate a more profitable and efficient plant. One is open source. The other is DataOps. I had begun hearing about DataOps from IT companies and was just beginning to wonder about applications specific to industry when I was approached by John Harrington, one of the founders, with the initial story of HighByte.
I have written several things about DataOps. Here is one from my visit last year to the Hitachi Vantara conference, and another on companies’ failure to leverage their data.
On to the actual news:
HighByte announced the release of HighByte Intelligence Hub version 1.2. The release expands the software’s support for open standards and open platforms, including MQTT, Sparkplug, and OpenJDK. Open standards and platforms simplify industrial data integrations and accelerate the time-to-value for Industry 4.0 solutions.
“The MQTT Sparkplug specification is critical to ensuring information interoperability and governance in the plant and throughout the enterprise,” said HighByte CEO Tony Paine. “By deploying HighByte Intelligence Hub with Sparkplug support, our customers are able to standardize and streamline their data infrastructures from factory to cloud.”
HighByte Intelligence Hub is the first DataOps solution purpose-built for industrial environments. DataOps is a new approach to data integration and security that aims to improve data quality, reduce time spent preparing data for analysis, and encourage cross-functional collaboration within data-driven organizations.
In addition to MQTT Sparkplug and OpenJDK support, HighByte Intelligence Hub version 1.2 includes model and instance categorization, enhanced security, and more flexible presentation of information models and output formats.
The software is available as an annual subscription and is priced per instance or per site. highbyte.com
From The HighByte Blog
The future of Industry 4.0 is open: open standards, open platforms, and open thinking. In today’s ecosystem, realizing the full potential of Industry 4.0 requires a mesh of products working together to fulfill each layer of the technology stack. Open standards and platforms simplify these integrations and speed up the time-to-value for Industry 4.0 solutions.
Open Standards. This release adds Sparkplug support over MQTT. Sparkplug is an open standard built on top of MQTT that defines the message format for Sparkplug-enabled applications. Many industrial sensors and systems have adopted the Sparkplug specification with MQTT as a means of integrating systems due to Sparkplug’s prescriptive topic namespace, payload definition, and state management. Using Sparkplug, customers can instantly consume and publish data models to and from other Sparkplug-enabled systems.
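The "prescriptive topic namespace" the paragraph above mentions is concrete enough to sketch. Below is a minimal, illustrative Python helper (not HighByte's implementation) that builds Sparkplug B topic strings of the form `spBv1.0/<group_id>/<message_type>/<edge_node_id>[/<device_id>]`:

```python
# Sketch of the Sparkplug B topic namespace; illustrative only.
SPARKPLUG_NAMESPACE = "spBv1.0"

# Message types defined by the Sparkplug specification for node (N)
# and device (D) birth, death, data, and command messages.
MESSAGE_TYPES = {"NBIRTH", "NDEATH", "DBIRTH", "DDEATH",
                 "NDATA", "DDATA", "NCMD", "DCMD"}

def sparkplug_topic(group_id, message_type, edge_node_id, device_id=None):
    """Build a Sparkplug B topic string; rejects unknown message types."""
    if message_type not in MESSAGE_TYPES:
        raise ValueError(f"unknown Sparkplug message type: {message_type}")
    parts = [SPARKPLUG_NAMESPACE, group_id, message_type, edge_node_id]
    if device_id is not None:
        parts.append(device_id)
    return "/".join(parts)
```

For example, `sparkplug_topic("PlantA", "DDATA", "Edge1", "Press01")` yields `"spBv1.0/PlantA/DDATA/Edge1/Press01"`. Because every Sparkplug-enabled system agrees on this structure and on the payload definition, subscribers can discover and consume data models without per-vendor integration work.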
Open Platform. This release also supports OpenJDK v14, a free and open-source implementation of Java, extending the reach of HighByte Intelligence Hub to any OpenJDK-enabled platform. OpenJDK support for the underlying Java virtual machine ensures the longevity of the solution and reduces its cost of ownership for our customers.
HighByte Intelligence Hub version 1.2 offers the following new features:
- Enhanced security. Securely connect and configure HighByte Intelligence Hub using HTTPS.
- More flexible output formats. JSON output can now be further customized, allowing users to flatten or expand the hierarchy. Flexible presentation of information models is essential when supporting multiple use cases and target applications. While MQTT is becoming the de facto protocol for IoT data and many applications support it, each application has nuances in how they expect JSON to be structured. Applications and data storage engines also have unique needs regarding update frequency and how much information is included in the update. Flexible presentation of information models addresses these interoperability challenges.
- Publish only data that changes. Enable a flow to only publish model data that has changed, reducing the amount of data sent to applications.
- Easily organize models and instances. Models and model instances can now be organized into categories, making it easier to manage hundreds or thousands of models across your enterprise. The organization of models and instances is critical as companies scale the size of their deployments.
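The "flatten or expand the hierarchy" and "publish only data that changes" features above can be illustrated with a small sketch. The function names and model shape below are hypothetical, not HighByte's API; they simply show what flattening a nested model and report-by-exception publishing mean in practice:

```python
def flatten(model, prefix="", sep="."):
    """Flatten a nested model into dot-delimited keys (illustrative only)."""
    flat = {}
    for key, value in model.items():
        path = f"{prefix}{sep}{key}" if prefix else key
        if isinstance(value, dict):
            flat.update(flatten(value, path, sep))
        else:
            flat[path] = value
    return flat

def changed_values(previous, current):
    """Return only the key/value pairs that changed since the last publish."""
    return {k: v for k, v in current.items() if previous.get(k) != v}
```

With a model like `{"Line1": {"speed": 120, "downtime": 3}}`, `flatten` produces `{"Line1.speed": 120, "Line1.downtime": 3}`, a shape many analytics tools prefer, and `changed_values` lets a flow publish only the metrics that actually moved, reducing traffic to downstream applications.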
Tony Paine Blog Post
Communication within a start-up is pretty straightforward. If you have a question about a new product launch, you go directly to the owner or CEO. Problems with a design flaw? Talk to your lead engineer. As that business scales, your lines of communication become more complex. You may need to send information through multiple channels to get an answer. Without an easy way to send or retrieve information, it might get lost or misinterpreted or you may wait days for an answer. Anyone who has worked in that environment knows the inherent challenges.
Similarly, when organizations implement new industrial IoT solutions, they may work fine at first but become less effective as the project or company scales. The more capabilities you add, the more connections you create throughout your data systems.
For example, today you might need one or two pieces of production data, such as downtime or line speed, from a machine that feeds information into a business intelligence system and an analytics software package from different vendors. As your organization grows, accessing this information becomes more complex because you now have thousands of connection points. Each time you add an application to the system, you need to build connections between the new software and the other systems with which it must communicate. This dramatically increases integration costs and slows deployments.
To scale your industrial IoT implementation, you need a unified namespace. A unified namespace is a software solution that acts as a centralized repository of data, information, and context where any application or device can consume or publish data needed for a specific action. Without a centralized data repository, it could take months to deploy a new analytics application across the entire enterprise versus hours with a unified namespace.
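To make the unified-namespace idea above concrete, here is a toy in-memory sketch. A real deployment would sit on an MQTT broker rather than a Python dictionary; this only shows the publish/subscribe/consume contract that a centralized repository offers:

```python
from collections import defaultdict

class UnifiedNamespace:
    """Toy centralized repository: any client can publish to a topic path
    or subscribe with a callback. Illustrative sketch only."""

    def __init__(self):
        self._values = {}                      # latest value per topic
        self._subscribers = defaultdict(list)  # callbacks per topic

    def publish(self, topic, payload):
        """Store the latest value and notify every subscriber of the topic."""
        self._values[topic] = payload
        for callback in self._subscribers[topic]:
            callback(topic, payload)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def consume(self, topic):
        """Read the last-known value without waiting for a new publish."""
        return self._values.get(topic)
```

The point of the pattern is that a new analytics application only needs one connection, to the namespace, instead of point-to-point links to every machine and system; it can subscribe to `site/area/line1/speed` without knowing which PLC or gateway produced the value.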
For nearly two decades, MQTT has served as an effective messaging protocol that allows any program or device to publish data but doesn’t offer interoperability between third-party devices and applications. Technology companies have brought data interoperability to MQTT devices and applications through the development of the Sparkplug specification.
At HighByte, we view our unified namespace as a middleware solution that allows users to collect data from various sources, add context so there’s meaning to it, and transform it to a format that other systems can understand. That is why we are adding support for Sparkplug in the upcoming October release of HighByte Intelligence Hub.
This is where you begin to unlock the real value of machine learning because you now have the connectivity you need to optimize your systems, devices, and processes in real time and scale your IoT capabilities without costly, time-consuming implementations.
The Open Process Automation Forum strives for a software-defined industrial control system where the hardware and software are dissociated. The specific reason is that upgrades become less expensive. Software must be upgraded more often than hardware in a control system. If the two are tied together as in all proprietary control systems, then upgrades run on a continuum from painful to impossibly expensive.
I’ve been puzzling out this press release from Schneider Electric about new control software dubbed EcoStruxure Automation Expert. The company says, “it is the world’s first software-centric industrial automation system.” I’m not sure that claim would stand up exactly, but it seems to me that this is a step on that journey toward dissociating software and hardware in the control system. Executives have told me in the past few years that achieving this is an essential long-term strategy.
Any comments you all have about this are welcome (as long as they’re civil and enlightening).
The press release is written in the tone of a challenge to the rest of the industry to write “apps” that will run on this standards-based (IEC-61499) system.
Schneider Electric promises to unleash a new wave of innovation by championing the widespread adoption of open automation standards, unveiling its vision for universal automation with EcoStruxure Automation Expert, “a new category of software-centric industrial automation system.”
The company claims closed, proprietary automation platforms restrict the adoption of best-of-breed technologies, present challenges to integrating third-party components, and are expensive to upgrade and maintain. Industry has suffered from a lack of adaptability, modularization and interoperability, which is stunting innovation.
Universal automation is the world of plug and produce automation software components based on the IEC61499 standard that solve specific customer problems in a proven way. Adoption of an IEC61499-based standardized automation layer, common across vendors, will provide limitless opportunities for growth and modernization across industry.
By greatly extending the capabilities of existing IEC61131-based systems and enabling an app-store-like model for automation software components, Schneider Electric believes that the advancements possible in the Fourth Industrial Revolution will be fully realized. As its benefits become visible, Schneider Electric believes other vendors will adopt the universal automation approach, and end users will soon begin to demand it from their automation suppliers and ecosystem.
“The IT world has realized the benefits of open operating platforms; now it’s industry’s turn,” said Peter Herweck, executive vice president industrial automation, Schneider Electric. “Industrial automation architectures have done a good job of advancing industry to where we are today, but they are not capable of providing the agility and resilience that are paramount for modern industrial operations. To fully realize the promise of the Fourth Industrial Revolution, we need to reimagine our technology model by opening our platforms, decoupling software from hardware, and radically improving system agility and scalability.”
EcoStruxure Automation Expert is a new category of industrial automation system with IEC61499 at its core. EcoStruxure Automation Expert:
- Enables automation applications to be built using asset-centric, portable, proven-in-use software components, independent of the underlying hardware infrastructure.
- Allows the user to distribute applications to any system hardware architecture of choice — highly distributed, centralized, or both — with minimal to no additional programming effort.
- Supports established software best practices to simplify the creation of automation applications that interoperate with IT systems.
The first release of EcoStruxure Automation Expert supports traditional automation platforms, such as Modicon PLCs, Altivar variable speed drives and PCs. Completing the line-up, a virtualized software controller running in Docker-powered Linux containers supports distributed information and control systems across edge computing architectures.
Leveraging the object-oriented nature of IEC61499, software components known as Composite Automation Types (CATs) are used to model assets by combining real-time control functions with other facets, such as the human machine interface. This asset-centric approach delivers unprecedented cost and performance gains and frees engineers to innovate by automating low-value work and eliminating task duplication across tools. Benchmarking of EcoStruxure Automation Expert against today’s automation systems has demonstrated a 2 to 7X reduction in the time it takes to perform traditional automation tasks.
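The object-oriented idea behind Composite Automation Types can be caricatured in a few lines of code. The sketch below is purely hypothetical: the class and method names are mine, not Schneider Electric's, and real IEC 61499 components are event-driven function blocks rather than Python classes. It only illustrates the asset-centric point that one component bundles an asset's control logic with its other facets, such as an HMI view:

```python
class CompositeAutomationType:
    """Hypothetical stand-in for a CAT: one software component
    per asset, combining control and HMI facets."""

    def __init__(self, asset_name):
        self.asset_name = asset_name

    def control_step(self, inputs):
        """Real-time control facet: compute outputs from inputs."""
        raise NotImplementedError

    def hmi_view(self, state):
        """HMI facet: render the asset's state for an operator display."""
        raise NotImplementedError

class Pump(CompositeAutomationType):
    """Illustrative pump asset with simple on/off level control."""

    def __init__(self, asset_name, setpoint):
        super().__init__(asset_name)
        self.setpoint = setpoint

    def control_step(self, inputs):
        # Run the pump while the measured level is below the setpoint.
        return {"run": inputs["level"] < self.setpoint}

    def hmi_view(self, state):
        return f"{self.asset_name}: {'RUNNING' if state['run'] else 'STOPPED'}"
```

Because the control logic and display travel together in one reusable component, instantiating a second pump means reusing the type rather than re-engineering both the controller program and the HMI screen, which is where the claimed elimination of task duplication across tools would come from.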
EcoStruxure Automation Expert’s support for mainstream IT best practices enables step-change improvements in asset and workforce efficiency using advanced technologies like predictive maintenance and digital twin. The system also reduces total cost of ownership by incorporating legacy systems with a wrap-and-reuse approach.
“EcoStruxure Automation Expert is the first step in the journey toward universal automation,” said Fabrice Jadot, senior vice president, next generation automation, Schneider Electric. “To fully realize the potential of next-generation industries, we must embrace a new way of thinking. Working to common, open standards is vital to ensuring multivendor interoperability and seamless interfaces from supply chain through manufacturing and production to the end customer. Now is the time for all vendors to fully embrace open implementations with code and function portability to become more connected. Today is the first step in a new direction. We invite industrial developers everywhere to create their own software components and solutions based on the IEC61499 standard, which can easily interoperate with EcoStruxure Automation Expert.”
Emerson also joins the OPC Foundation Field Level Communications (FLC) initiative to drive a holistic approach to sensor and device level communications across process and factory automation industries.
This news could be a big win for the OPC Foundation. On the other hand, it sometimes happens that big companies join standards efforts in order to delay adoption. Sometimes, though, big companies see benefits for themselves from standards developments; it saves them a ton of internal development time and money. The news that Emerson is increasing its commitment to OPC and working on the Field Level Communications development can give a needed boost to the effort. We can hope.
The OPC Foundation is proud to announce Emerson has joined its Board of Directors and sincerely welcomes Peter Zornio, Chief Technology Officer for Emerson Automation Solutions, as Emerson’s representative on the Board of Directors. Emerson is one of the world’s largest automation suppliers, providing engineering services and automation technologies to process and discrete manufacturing industries.
Emerson has a long history with the OPC Foundation. As one of its founding members, Emerson played an important role in the development and adoption of the first OPC data connectivity standard and contributed to the development of OPC UA, today’s open data interoperability standard. Emerson supports OPC UA initiatives by participating in OPC Foundation working groups and by adopting OPC UA in a wide variety of its family of products.
Peter Zornio says: “OPC technology is well established in the automation space as the de facto standard for application-level communications. It also provides integration between operations technology (OT) and the IT world, including cloud-based environments. We look forward to growing that role, as well as working with the OPC FLC initiative on expanding OPC technology into real-time communications between control and field-connected devices. OPC is the best candidate to have a single communication standard cover the entire scope of automation architecture from intelligent field devices to the cloud.”
Peter Lutz, Director FLC Initiative: “Currently, the OPC Foundation is extending use of OPC UA down to the level of sensors and devices on the shop floor via its Field Level Communications (FLC) initiative. The value that a major player like Emerson brings to this initiative is important from both technical and market messaging perspectives. First, OPC Foundation Working Groups benefit from extensive Emerson field level expertise. Second, with its strong support for ongoing OPC UA standard development for use in both process and discrete industries, Emerson helps send a clear message to the market: OPC UA plays an equally important role in both verticals.”
Stefan Hoppe, President and Executive Director, OPC Foundation commented “It is gratifying to welcome a company of Emerson’s stature to the OPC Foundation Board of Directors. Working together with Emerson and our other valued board members, the OPC Foundation is now better positioned to deliver on its directive to provide the world with the best single data connectivity and interoperability standard for use throughout the enterprise, regardless of the industry sector.”
One of the things I miss about attending conferences is the Moleskine-type 5.5 x 8.5-inch notebooks the organizers hand out. I have not purchased a notebook for years. I take notes with a pen (Uniball Signo Micro 207) in a journal-type notebook. I can make sketches, mindmaps, draw arrows from thought to thought, and so forth. By the way, studies continue to show that writing enhances memory.
Another aspect of a conference I miss is hearing application stories from people who apply the technology. It’s a chance to go beyond technology to human factors, decision making, management, and the like. This press release is from the Inductive Automation Ignition Community Conference, where they hand out awards—this year virtually—to the best applications. You should get an idea or two.
Inductive Automation has selected the recipients of its Ignition Firebrand Awards for 2020. The announcements were made at the virtual Ignition Community Conference (ICC), which took place online on September 15.
The Ignition Firebrand Awards recognize system integrators and industrial organizations that use the Ignition software platform to create innovative new projects. Ignition by Inductive Automation® is an industrial application platform with tools for the rapid development of solutions in human-machine interface (HMI), supervisory control and data acquisition (SCADA), manufacturing execution systems (MES), and the Industrial Internet of Things (IIoT).
The Ignition Firebrand Awards are presented every September at ICC. The award-winning projects are selected from the ICC Discover Gallery, which features the best 18 Ignition projects submitted by integrators and industrial organizations.
“This year’s Firebrand Award projects all made a big impression on us,” said Don Pearson, chief strategy officer for Inductive Automation. “The engineers came up with custom solutions featuring mobility, edge computing, MQTT, stronger remote connections, interfaces with ERP, and much more. The results included significant cost savings, greater efficiency, and enterprise-wide data sharing. Terrific projects!”
The 2020 Ignition Firebrand Award winners:
- Brock Solutions replaced seven SCADA systems with one for Toronto Pearson International Airport’s baggage handling system. The new system has more than 280,000 tags, enables greater efficiency and data-sharing, and is the ideal platform for future expansion. See the video.
- Controtek Solutions leveraged Ignition to connect 93 sites to a central SCADA system for Manila Water Company in the Philippines. The new system provides remote data gathering, operations monitoring, and enterprise integration. The solution makes extensive use of maps and data-rich dashboards. See the video.
- Flexware Innovation implemented a new SCADA/MES system at a California production facility for Veoneer, a leading global supplier to the automotive industry. Veoneer makes advanced safety systems for vehicles. The new SCADA/MES system provides more data and gives Veoneer the ability to quickly make changes itself. See the video.
- The Integration Group of Americas (TIGA) created a SCADA system that’s saving more than $500,000 per year for WaterBridge, a water-management company serving the oil & gas industry. TIGA’s solution included greater mobility, edge computing, and MQTT for 65 saltwater disposal facilities. See the video.
- Vertech provided the SCADA/MES system for a large greenfield plant in New Jersey for AriZona Beverages. Six distinct Ignition projects were created, including ISA-88 batch control, interfacing with SAP via Sepasoft, downtime analysis, and dashboards. See the video.
- Waste Management, North America’s leading provider of integrated environmental solutions with more than 20 million customers, used Ignition Perspective to create a mobile-first SCADA system for its landfills. The new solution provides a significant improvement in day-to-day operations. See the video.
Information on all 18 Discover Gallery projects can be seen here.
The ICC 2020 Keynote and other conference videos can be seen here.