Dell Technologies Accelerates as-a-Service Strategy

This week brought several conferences to my attention, each lasting a few hours rather than a few days. One was Dell Technologies World. I first got involved with Dell through its influencer program for Internet of Things and Edge. Both of those groups are gone; the only people I still recognize are selling laptops. Only a few years ago, Michael Dell spoke of the IoT group in his keynote. This year…nada.

However, there was still much of interest this year. Its competitors have been building out as-a-Service strategies, and now Dell Technologies has announced one of its own. Expect this trend to spread.

Highlights

Businesses can buy and deploy IT with a simple, consistent cloud experience across the industry’s broadest IT portfolio

News summary

  • Project APEX simplifies how Dell Technologies customers can consume IT as-a-Service
  • Dell Technologies Cloud Console gives customers a single self-service interface to manage every aspect of their cloud and as-a-Service journey
  • Dell Technologies Storage as-a-Service will be deployed and managed on-premises by Dell Technologies
  • Dell Technologies Cloud Platform advancements make cloud compute resources accessible with instance-based offerings, lowering the barrier to entry and extending subscription availability

Dell Technologies expands its as-a-Service capabilities with Project APEX to simplify how customers and partners access Dell technology on-demand—across storage, servers, networking, hyperconverged infrastructure, PCs and broader solutions.

Project APEX will unify the company’s as-a-Service and cloud strategies, technology offerings, and go-to-market efforts. Businesses will have a consistent as-a-Service experience wherever they run workloads including on-premises, edge locations and public clouds.

“Project APEX will give our customers choice, simplicity and a consistent experience across PCs and IT infrastructure from one trusted partner—unmatched in the industry,” said Jeff Clarke, chief operating officer and vice chairman, Dell Technologies. “We’re building upon our long history of offering on-demand technology with this initiative. Our goal is to give customers the freedom to scale resources in ways that work best for them, so they can quickly respond to changes and focus less on IT and more on their business needs.”

“By the end of 2021, the agility and adaptability that comes with as-a-Service consumption will drive a 3X increase in demand for on-premises infrastructure delivered via flexible consumption/as-a-Service solutions,” said Rick Villars, group vice president, Worldwide Research at IDC.

Dell Technologies as-a-Service and cloud advancements

The new Dell Technologies Cloud Console will provide the foundation for Project APEX and will deliver a single, seamless experience for customers to manage their cloud and as-a-Service journey. Businesses can browse the marketplace and order cloud services and as-a-Service solutions to quickly address their needs. With a few clicks, customers can deploy workloads, manage their multi-cloud resources, monitor their costs in real-time and add capabilities.

Available in the first half of next year, Dell Technologies Storage as-a-Service (STaaS) is an on-premises, as-a-Service portfolio of scalable and elastic storage resources that will offer block and file data services and a broad range of enterprise-class features. STaaS is designed for OPEX transactions and allows customers to easily manage their STaaS resources via the Dell Technologies Cloud Console.

Dell continues to expand its Dell Technologies Cloud and as-a-Service offerings with additional advances:

  • Dell Technologies Cloud Platform instance-based offerings— Customers can get started with hybrid cloud for as low as $47 per instance per month with subscription pricing, making it easy to buy and scale cloud resources with pre-defined configurations through the new Dell Technologies Cloud Console.
  • Geographic Expansions— Dell Technologies is extending Dell Technologies Cloud Platform subscription availability to the United Kingdom, France and Germany with further global expansion coming soon.
  • Dell Technologies Cloud PowerProtect for Multi-cloud— This fully-managed service helps customers protect their data and applications across public clouds in a single destination via a low latency connection to the major public clouds. Businesses save costs through the PowerProtect appliance’s deduplication technology and realize additional savings with zero egress fees when retrieving their data from Microsoft Azure.
  • Pre-approved Flex On Demand pricing— The pre-configured pricing makes it simpler for customers to select and deploy Dell Technologies solutions with a pay-per-use experience. Dell Technologies partners globally will receive a rebate of up to 20% on Flex On Demand solutions.

Continued sustainability focus

Dell Technologies Project APEX will help companies retire infrastructure in a secure and environmentally friendly manner. Dell manages the return and refurbishing of used IT gear while also helping to support customers’ own sustainability goals. The company is making additional strides in achieving its Progress Made Real goals by:

  • Reselling 100% of the returned leased assets.
  • Refurbishing and reselling 89% of working assets in the North America and EMEA regions.
  • Reselling 10% of non-working assets to Environmental Disposal Partners who repair, reuse, resell and recycle each asset. Last year, Dell recycled 240,257 kilograms of metal, glass and plastics through this program.
  • Helping customers resell or recycle their excess hardware and prepare to return leased equipment in a secure and environmentally conscious manner through Dell Asset Resale & Recycling Services

Availability

  • Dell Technologies Cloud Console is available now as a public preview in the United States with EMEA availability planned for the first quarter of 2021.
  • Dell Technologies Storage as-a-Service will be available in the U.S. in the first half of 2021.
  • Dell Technologies Cloud Platform instance-based offerings with subscription pricing are available in the United States, France, Germany and the U.K. Dell Technologies Cloud PowerProtect for Multi-cloud is now available in the U.S., U.K. and Germany.
  • Flex On Demand is available in select countries in North America, Europe, Latin America and the Asia-Pacific region.

ABB Adaptive Execution For Capital Projects

ABB has been making some news lately. I’ve run projects and supervised project managers in my career, but never the really large capital projects of the energy industries; mine topped out around $10 million in 1985 dollars. But when they claim a new methodology can cut capital expenditure by up to 40 per cent and compress delivery schedules by up to 30 per cent, even I notice. Reading through the release, the key element I picked out had to do with decoupling hardware and software in the automation system. This is something I’ve learned from ABB’s competitors, and something I expect to hear much more about.

  • ABB introduces new, agile method to industrial project execution to help energy customers adapt to challenging market conditions
  • Capital expenditure is expected to be reduced by up to 40 per cent, and delivery schedules are expected to be compressed by up to 30 per cent

ABB Adaptive Execution integrates expert teams, new technologies, agile processes, shared learnings, and proven methodologies into a single, streamlined project execution experience for all stakeholders involved in major capital investment projects.

In the energy sector, it is not uncommon for large capital projects to significantly exceed budget and experience extensive delays. ABB Adaptive Execution addresses major inefficiencies that result in cost and schedule overruns. It enables greater visibility across all layers of a project, unlocking significant project value improvements across the energy sector. With digitalization and collaboration at its core, Adaptive Execution is expected to reduce automation-related capital expenditure by up to 40 per cent, compress delivery schedules by up to 30 per cent, and cut start-up hours by up to 40 per cent.

Using virtualization, Adaptive Execution removes the need for engineering on site and reduces the physical hardware required for a control and automation system. By decoupling hardware and software, Adaptive Execution lowers the time and overall setup costs, cutting the number of engineering hours spent on project installation, commissioning and testing by up to 85 per cent.

Peter Terwiesch, President of ABB Industrial Automation said: “2020 has been a year of disruptions across the global energy industry. With falling oil prices, challenges induced by the current lockdowns and the rising demand for sustainable energy investments, companies are looking for new ways to reduce cost, schedule and risk for major projects in this low capex environment. Harnessing advances in digitalization, ABB Adaptive Execution responds to this need and enables capital projects to thrive in the new normal.”

Harnessing efficient modular design, combined with standardized, repeatable processes and shared and effective deployment of infrastructure, tools, and resources, ABB Adaptive Execution centralizes collaboration across all project stakeholders — from a project’s inception through to its successful completion. 

Brandon Spencer, President of ABB Energy Industries commented: “Adaptive Execution will change the way in which customers, Engineering Procurement Construction (EPC) contractors and vendors interact. We can create better business value for our customers by creating an environment where everyone can do his or her own part with confidence, empowering delivery teams to achieve more, in less time. This is the key to overall project success.”

www.abb.com

Strategic Partnership to Deliver a Digital Reliability Platform

If I receive a press release without either “partnership” or “AI” in it, it will be the featured news for a week. Of course, I’m all about digital. Now that we’re past all the releases about what technology companies are doing to help protect workers from Covid, those are the keywords. This news highlights an interesting partnership between AVEVA and Chemicals Business, SCG to promote something called “digital reliability.” And reliability has been a key requirement for technology ever since the invention of the plow.

Strategic collaboration between the organizations extends the Engineering Digital Twin and harnesses AI-Infused Asset Performance Management to prevent unplanned downtime

AVEVA and Chemicals Business, SCG, one of the largest petrochemical companies in Thailand and a key industry player in Asia, have announced a strategic partnership to develop a Digital Reliability Platform (DRP), a complete asset performance management (APM) solution to predict equipment health, monitor performance, and enable advanced maintenance across its operations to eliminate unplanned downtime. The DRP was completed through collaborative effort. The partnership matched the company’s broader digital transformation imperative to become a data-driven organization, to advance its position as a leader in the petrochemical industry, and to take the DRP solutions to market.

Asset reliability is critical for asset intensive businesses such as petrochemicals. Unplanned shutdowns cause significant negative impacts on petrochemical value chains. Digital transformation initiatives enable businesses to address this risk by harnessing data to build and deploy an advanced APM solution to monitor critical assets and predict failure towards a goal of zero unplanned shutdowns. The solution integrates online and offline equipment data to visualize plant performance, enhance workforce efficiency, and apply artificial intelligence (AI) for predictive maintenance and resolution.

“This is a great achievement for Chemicals Business, SCG since reliability is a critical element to our business. With the innovative approach of the Digital Reliability Platform, we will ensure that we can eliminate the business risks posed by unplanned downtime. In our quest for a partner, AVEVA was the only company to provide an end-to-end solution spanning engineering, operations, and maintenance. With the DRP, we have successfully brought together big data, AI, machine learning, and predictive analytics into a practical solution that will empower our workers and improve our performance,” said Mr. Mongkol Hengrojanasophon, Vice President – Olefins Business and Operations, Chemicals Business, SCG.

“Moreover, this partnership will include launching the Digital Reliability Platform Solutions to the market. This would be the first complete and unique digital solutions which combine both breakthrough technology and industrial specific information,” added Mr. Mongkol.

“Our strategic partnership with Chemicals Business, SCG is a major milestone for us in leveraging the strength of our portfolio to deliver value through digital transformation. We are proud to be part of this collaboration that improves operational efficiency and reliability to achieve zero unplanned downtime by maximizing asset availability with predictive and prescriptive maintenance. The standardized systems and processes defined through this collaboration will also result in improved workforce efficiency,” said Ravi Gopinath, Chief Cloud Officer and Chief Product Officer at AVEVA.

The Digital Reliability Platform will bring together digital innovations and practitioner knowledge to increase work efficiency and safety to establish a new competitive standard within the industry.

Podcast 215 – Pursuing Quality

Podcast 215: I moved to a new state a few months ago and have been searching, in vain, for a good local, independent coffee shop with ethically traded coffee. So I go to Starbucks a few times a week. The concept of quality at Starbucks is not the coffee, which is probably why people doctor it with flavored sugars and milk. Its quality has always been the environment. One of my first jobs was with Airstream, manufacturer of quality recreational vehicles. Everyone in the company was aware of the need for quality.

The question for you today is are you contributing to building quality, ethical products that serve your customers and society?

On a personal development note, I leave you with Seven Daily Habits from Richard Koch in The 80/20 Principle.

Open Source Meets DataOps

I had been talking Open Source with Bill Lydon, who, like me, has been around many places in manufacturing and media in his career, most recently as editor of InTech. He referred me to an article he wrote on automation.com.

This release actually points to two important technologies that will combine to improve management’s ability to operate a more profitable and efficient plant. One is open source. The other is DataOps. I had begun hearing about DataOps from IT companies and was just beginning to wonder about applications specific to industry when I was approached by John Harrington, one of the company’s founders, with the initial story of HighByte.

I have written several things about DataOps. One came from my visit last year to the Hitachi Vantara conference; another reported on companies’ failure to leverage their data.

On to the actual news:

HighByte announced the release of HighByte Intelligence Hub version 1.2. The release expands the software’s support for open standards and open platforms, including MQTT, Sparkplug, and OpenJDK. Open standards and platforms simplify industrial data integrations and accelerate the time-to-value for Industry 4.0 solutions.
 
“The MQTT Sparkplug specification is critical to ensuring information interoperability and governance in the plant and throughout the enterprise,” said HighByte CEO Tony Paine. “By deploying HighByte Intelligence Hub with Sparkplug support, our customers are able to standardize and streamline their data infrastructures from factory to cloud.”
 
HighByte Intelligence Hub is the first DataOps solution purpose-built for industrial environments. DataOps is a new approach to data integration and security that aims to improve data quality, reduce time spent preparing data for analysis, and encourage cross-functional collaboration within data-driven organizations.
 
In addition to MQTT Sparkplug and OpenJDK support, HighByte Intelligence Hub version 1.2 includes model and instance categorization, enhanced security, and more flexible presentation of information models and output formats.
 
The software is available as an annual subscription and is priced per instance or per site. highbyte.com  
 

From The HighByte Blog

The future of Industry 4.0 is open: open standards, open platforms, and open thinking. In today’s ecosystem, realizing the full potential of Industry 4.0 requires a mesh of products working together to fulfill each layer of the technology stack. Open standards and platforms simplify these integrations and speed up the time-to-value for Industry 4.0 solutions.

Open Standards. This release adds Sparkplug support over MQTT. Sparkplug is an open standard built on top of MQTT that defines the message format for Sparkplug-enabled applications. Many industrial sensors and systems have adopted the Sparkplug specification with MQTT as a means of integrating systems due to Sparkplug’s prescriptive topic namespace, payload definition, and state management. Using Sparkplug, customers can instantly consume and publish data models to and from other Sparkplug-enabled systems.
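The "prescriptive topic namespace" mentioned above is one of Sparkplug's main contributions on top of plain MQTT. As a rough sketch (real Sparkplug payloads are Protocol Buffers, which this deliberately omits), here is how the Sparkplug B topic structure can be built up; the plant and device names are hypothetical:

```python
# Minimal sketch of the Sparkplug B topic namespace (spBv1.0).
# Covers only the topic structure, not the protobuf payload format.

SPARKPLUG_NAMESPACE = "spBv1.0"
MESSAGE_TYPES = {"NBIRTH", "NDEATH", "DBIRTH", "DDEATH",
                 "NDATA", "DDATA", "NCMD", "DCMD"}

def sparkplug_topic(group_id, message_type, edge_node_id, device_id=None):
    """Build a Sparkplug B topic string.

    Topic form: spBv1.0/<group_id>/<message_type>/<edge_node_id>[/<device_id>]
    Device-level message types (DBIRTH, DDEATH, DDATA, DCMD) need a device_id.
    """
    if message_type not in MESSAGE_TYPES:
        raise ValueError(f"unknown Sparkplug message type: {message_type}")
    if message_type.startswith("D") and device_id is None:
        raise ValueError(f"{message_type} requires a device_id")
    parts = [SPARKPLUG_NAMESPACE, group_id, message_type, edge_node_id]
    if device_id is not None:
        parts.append(device_id)
    return "/".join(parts)

# A device-level data message from a hypothetical plant topology:
print(sparkplug_topic("PlantA", "DDATA", "Line1Gateway", "Packer03"))
# spBv1.0/PlantA/DDATA/Line1Gateway/Packer03
```

Because every Sparkplug-enabled application agrees on this topic shape and on the birth/death lifecycle messages, subscribers can discover and interpret publishers without per-vendor integration work.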
 
Open Platform. This release also supports OpenJDK v14, a free and open-source implementation of Java, extending the reach of HighByte Intelligence Hub to any OpenJDK-enabled platform. OpenJDK support for the underlying Java virtual machine ensures the longevity of the solution and reduces its cost of ownership for our customers.
 
Key Features 
HighByte Intelligence Hub version 1.2 offers the following new features:

  • Enhanced security. Securely connect and configure HighByte Intelligence Hub using HTTPS.
  • More flexible output formats. JSON output can now be further customized, allowing users to flatten or expand the hierarchy. Flexible presentation of information models is essential when supporting multiple use cases and target applications. While MQTT is becoming the de facto protocol for IoT data and many applications support it, each application has nuances in how they expect JSON to be structured. Applications and data storage engines also have unique needs regarding update frequency and how much information is included in the update. Flexible presentation of information models addresses these interoperability challenges.
  • Publish only data that changes. Enable a flow to only publish model data that has changed, reducing the amount of data sent to applications.
  • Easily organize models and instances. Models and model instances can now be organized into categories, making it easier to manage hundreds or thousands of models across your enterprise. The organization of models and instances is critical as companies scale the size of their deployments.
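To make the "flatten or expand the hierarchy" option in the output-formats feature above concrete, here is a generic illustration of flattening a nested JSON model into dotted keys. This is an assumption-laden sketch of the general technique, not HighByte's actual implementation:

```python
# Generic sketch: flatten a nested JSON-style model into a single-level
# dict with dotted keys. Illustrative only, not HighByte's code.

def flatten(obj, parent_key="", sep="."):
    """Recursively flatten nested dicts into one level of dotted keys."""
    items = {}
    for key, value in obj.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            items.update(flatten(value, new_key, sep=sep))
        else:
            items[new_key] = value
    return items

# Hypothetical machine model with nested context:
nested = {"machine": {"line": "Line1", "speed": {"value": 1200, "units": "rpm"}}}
print(flatten(nested))
# {'machine.line': 'Line1', 'machine.speed.value': 1200, 'machine.speed.units': 'rpm'}
```

A consuming application that expects flat key/value pairs (many time-series databases do) can then ingest the same model that another application consumes in its nested form.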

Tony Paine Blog Post

Communication within a start-up is pretty straightforward. If you have a question about a new product launch, you go directly to the owner or CEO. Problems with a design flaw? Talk to your lead engineer. As that business scales, your lines of communication become more complex. You may need to send information through multiple channels to get an answer. Without an easy way to send or retrieve information, it might get lost or misinterpreted or you may wait days for an answer. Anyone who has worked in that environment knows the inherent challenges.

Similarly, when organizations implement new industrial IoT solutions, they may work fine at first but become less effective as the project or company scales. The more capabilities you add, the more connections you create throughout your data systems.
 
For example, today you might need one or two pieces of production data, such as downtime or line speed, from a machine that feeds information into a business intelligence system and an analytics software package from different vendors. As your organization grows, accessing this information becomes more complex because you now have thousands of connection points. Each time you add an application to the system, you need to build connections between the new software and the other systems with which it must communicate. This dramatically increases integration costs and slows deployments.
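The scaling problem described above is easy to quantify: wiring N systems directly to one another requires on the order of N*(N-1)/2 connections, while routing everything through a central hub requires only N. A quick back-of-envelope comparison:

```python
# Why point-to-point integration costs explode as systems are added:
# direct links grow quadratically, hub-and-spoke links grow linearly.

def point_to_point(n):
    """Number of links if all n systems connect directly to each other."""
    return n * (n - 1) // 2

def hub_and_spoke(n):
    """Number of links if each of n systems connects only to a central hub."""
    return n

for n in (5, 20, 50):
    print(f"{n} systems: {point_to_point(n)} direct links vs {hub_and_spoke(n)} hub links")
# 5 systems: 10 direct links vs 5 hub links
# 20 systems: 190 direct links vs 20 hub links
# 50 systems: 1225 direct links vs 50 hub links
```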
 
To scale your industrial IoT implementation, you need a unified namespace. A unified namespace is a software solution that acts as a centralized repository of data, information, and context where any application or device can consume or publish data needed for a specific action. Without a centralized data repository, it could take months to deploy a new analytics application across the entire enterprise versus hours with a unified namespace.
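The unified-namespace idea described above can be sketched as a tiny in-memory publish/subscribe store where producers publish contextualized values under hierarchical topics and consumers subscribe by prefix. This is a toy model under my own assumptions; a real deployment would sit on an MQTT broker or similar middleware, and all names here are hypothetical:

```python
# Toy in-memory unified namespace: a central repository of data plus
# context, with prefix-based subscriptions. Illustrative sketch only.

class UnifiedNamespace:
    def __init__(self):
        self._store = {}        # topic -> latest record
        self._subscribers = {}  # topic prefix -> callback

    def publish(self, topic, value, **context):
        """Store a value with its context and notify matching subscribers."""
        record = {"value": value, **context}
        self._store[topic] = record
        for prefix, callback in self._subscribers.items():
            if topic.startswith(prefix):
                callback(topic, record)

    def subscribe(self, prefix, callback):
        """Register a callback for any topic under the given prefix."""
        self._subscribers[prefix] = callback

    def read(self, topic):
        """Return the latest record for a topic, or None if unknown."""
        return self._store.get(topic)

uns = UnifiedNamespace()
seen = []
uns.subscribe("plantA/line1/", lambda topic, record: seen.append(topic))
uns.publish("plantA/line1/packer/speed", 1200, units="rpm", source="PLC-7")
print(uns.read("plantA/line1/packer/speed")["value"])  # 1200
```

The key point is that a new analytics application only needs to know the namespace and the topic hierarchy, not the native interface of every device and system feeding it.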
 
For nearly two decades, MQTT has served as an effective messaging protocol that allows any program or device to publish data but doesn’t offer interoperability between third-party devices and applications. Technology companies have brought data interoperability to MQTT devices and applications through the development of the Sparkplug specification.
 
At HighByte, we view our unified namespace as a middleware solution that allows users to collect data from various sources, add context so there’s meaning to it, and transform it to a format that other systems can understand. That is why we are adding support for Sparkplug in the upcoming October release of HighByte Intelligence Hub.

This is where you begin to unlock the real value of machine learning because you now have the connectivity you need to optimize your systems, devices, and processes in real time and scale your IoT capabilities without costly, time-consuming implementations.
