OnRobot Adds Software Product To Its Cobot Portfolio

The “Subscribe” link goes to a MailChimp sign-up page. I have stopped using MailChimp due to its obnoxious marketing tactics, and WordPress has stopped its service of emailing notices of new posts. I am now using the web page and email service Hey, developed by Basecamp. Please visit world.hey.com/garymintchell to register for the newsletter. There is no tracking or other privacy-invading tech.

Industrial automation products have blended hardware and software for many years, and companies likewise find the need to develop software products that augment their hardware. News has reached me from a couple of different segments of the market with similar emphases. This one is from OnRobot, maker of end-of-arm tooling and other hardware for collaborative robots, the so-called “cobots.” This appears to be a valuable adjunct to its main product line.

WebLytics brings remote monitoring, device diagnostics, and data analytics capabilities to OnRobot’s line of collaborative application-focused hardware solutions.

The company describes this as “a unique production monitoring, device diagnostics, and data analytics solution designed to enhance productivity and minimize downtime.”

Capable of monitoring the performance of multiple collaborative applications simultaneously and in real-time, WebLytics gathers equipment data from both robots and tools and transforms it into easy-to-understand, visualized device and application-level intelligence.

“The launch of WebLytics is an important landmark for OnRobot, our customers, and our global integrator network,” said Enrico Krog Iversen, CEO of OnRobot. “WebLytics is the first software solution to provide real-time, application-focused data for collaborative applications across major robot brands. As our first software product, WebLytics marks the beginning of OnRobot’s journey into robot software and completes our vision of providing a One Stop Shop for collaborative applications on both the hardware and software side.”

For end users and integrators, WebLytics not only eliminates manual data collection — it provides actionable insights into how well a collaborative application is performing, offering live device diagnostics, alerts and preventive maintenance measures to keep costly robot cell downtime to a minimum. 

This software appears to solve one of the common drawbacks of OEE programs: getting consistent, accurate reporting of actual data directly from the machine rather than relying on manual entry.

Integrating the globally recognized overall equipment effectiveness (OEE) industry standard, WebLytics identifies trends in the robot cell in real time, including patterns, peaks, and disturbances in application productivity. OEE measures the percentage of manufacturing time that is truly productive – a score of 100% indicates that the collaborative application is producing only good parts, as fast as possible, and with no downtime. Leveraging these OEE measures, WebLytics can determine whether the manufacturing process is running at optimal speed and can monitor and analyze the quality of application cycles – key insights for manufacturers of all sizes.
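To make the OEE arithmetic concrete, here is a minimal sketch of the standard calculation (Availability x Performance x Quality) in Python. It is purely illustrative, not WebLytics code, and the shift numbers are invented for the example.

```python
# Minimal sketch of the standard OEE calculation: Availability x Performance x Quality.
# Illustrative only -- not WebLytics code; the example numbers below are made up.

def oee(planned_time_min, run_time_min, ideal_cycle_time_min, total_count, good_count):
    availability = run_time_min / planned_time_min                       # share of planned time actually running
    performance = (ideal_cycle_time_min * total_count) / run_time_min    # actual vs. ideal output rate
    quality = good_count / total_count                                   # share of parts that are good
    return availability * performance * quality, availability, performance, quality

# Example: an 8-hour shift (480 min) with 60 min of downtime, a 0.5-min ideal cycle,
# 800 parts produced, 780 of them good.
score, a, p, q = oee(planned_time_min=480, run_time_min=420,
                     ideal_cycle_time_min=0.5, total_count=800, good_count=780)
print(f"Availability {a:.1%}, Performance {p:.1%}, Quality {q:.1%}, OEE {score:.1%}")
```

A score of 100% would require the cell to run for the whole planned time, at the ideal cycle time, with zero rejects; the example above works out to roughly 81% OEE.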

WebLytics can report on utilization of the robot arm and OnRobot tools such as grippers, vision cameras, and sensors, as well as the number of safety stops initiated, and the number of grip cycles performed while an application is running.

When changes are made to a robot cell, such as adjusting the speed of a robot or the settings on a gripper, WebLytics can also automatically report on the impact of those changes on application performance.

If anomalies occur in the collaborative application after deployment, WebLytics enables users to analyze the data collected directly from the robot(s) and tools and report on its findings using customizable dashboards.

Customer testimonial: We’ve been writing about automated data collection for 20 years. Each year the industry moves another step forward.

Laszlo Papp, Product Manager & Sales Engineer at Wamatec Hungary Kft., tested WebLytics on machine tending, pick & place, and palletizing applications: “In this fast-paced world, time is everything. When cycle time is really important, WebLytics helps you identify the small mistakes that cause time wastage,” he said. “WebLytics can also save a lot of time for yourself and for your production line by making it easy to schedule all maintenance and product changes. My favorite function was the dashboard. I really liked how WebLytics allowed me to monitor all my applications, my cobots/robots, and my end-of-arm-tools using one platform that provides real time monitoring, data collection and line charting. WebLytics makes optimizing all applications much easier than before.”  

Access to WebLytics is provided through a secure, intuitive browser-based user interface that displays OEE measures and user-defined KPIs on customizable dashboards, giving an immediate and transparent view into real-time and historical application performance.

The WebLytics server can be deployed on a shop floor’s local network or added to a virtual network that connects to the robot cell. Collected data is stored locally on the WebLytics server. Meanwhile, WebLytics’ built-in web server is always accessible from the shop floor network or from anywhere in the world via secure HTTPS connection.

WebLytics also creates new revenue opportunities for system integrators, by providing the software required to offer their customers data-backed custom service agreements and engineering services for cell optimization.

Standards and Open Source News

Open source predominates in IT. One can find open source growing within OT. I expect more as the younger generation of engineers takes over from the Boomers. My generation has laid a great foundation of standards. These make things better for engineers just trying to get a job done with inadequate resources. A few news items have piled up in my queue. Here is a CESMII announcement followed by several from the Linux Foundation.

SME and CESMII Join Forces to Accelerate Smart Manufacturing Adoption

SME, a non-profit professional association dedicated to advancing manufacturing, and CESMII – The Smart Manufacturing Institute, are partnering to align their resources and educate the industry, helping companies boost productivity, build a strong talent pipeline, and reduce manufacturers’ carbon footprint.

CESMII and SME will address the “digital divide” by connecting manufacturers to technical knowledge. These efforts will especially help small and medium-size companies—a large part of the supply network—to overcome the cost and complexity of automation and digitization that has constrained productivity and growth initiatives. 

“The prospect of the Fourth Industrial Revolution catalyzing the revitalization of our manufacturing productivity in the U.S. is real, but still aspirational, and demands a unified effort to accelerate the evolution of this entire ecosystem,” said John Dyck, CEO, CESMII. “We couldn’t be happier to join with SME on this important mission to combine and align efforts with the best interest of the employers and educators in mind.”

Smart Manufacturing Executive Council

The first joint initiative is the formation of a new national Smart Manufacturing Executive Council. It will engage business and technology executives, thought leaders, and visionaries as a “think tank” advocating for the transformation of the ecosystem. It will build on each organization’s history of working with industry giants who volunteer their time and impart their knowledge to benefit the industry.

Members of the council will act as ambassadors to drive the national conversation and vision for smart manufacturing in America. Working with policy makers and others, the council will unify the ecosystem around a common set of interoperability, transparency, sustainability and resiliency goals and principles for the smart manufacturing ecosystem.

Focus on Manufacturing Workforce

The need for richer, scalable education and workforce development is more important than ever.

SME’s training organization, Tooling U-SME, is the industry’s leading learning and development solutions provider, working with thousands of companies, including more than half of all Fortune 500 manufacturers as well as 800 educational institutions across the country. CESMII has in-depth training content on smart manufacturing technology, business practices, and workforce development. Leveraging Tooling U-SME’s extensive reach into industry and academia, the synergistically combined CESMII and Tooling U-SME training portfolios and new content collaborations will expedite smart manufacturing adoption, driving progress through transformational workforce development.

Through this national collaboration, Tooling U-SME will become a key partner for CESMII for advancing education and workforce development around smart manufacturing. 

“Manufacturers are looking for a more effective, future-proof approach to upskill their workforce, and we believe that the best way to accomplish that is for CESMII and Tooling U-SME to work together,” said Conrad Leiva, Vice President of Ecosystem and Workforce Education at CESMII. “This partnership brings together the deep domain expertise and necessary skills with the know-how to package education, work with employers and schools, and effectively deliver it at scale nationally.”

Linux Foundation Announces NextArch Foundation

The Linux Foundation announced the NextArch Foundation. The new Foundation is a neutral home for open source developers and contributors to build next-generation architecture that can support compatibility between an increasing array of microservices. 

Cloud-native computing, Artificial Intelligence (AI), the Internet of Things (IoT), Edge computing and much more have led businesses down a path of massive opportunity and transformation. According to market research, the global digital transformation market size was valued at USD 336.14 billion in 2020 and is expected to grow at a compound annual growth rate (CAGR) of 23.6% from 2021 to 2028. But a lack of intelligent, centralized architecture is preventing enterprises, and the developers who are creating innovation based on these technologies, from fully realizing their promise.

“Developers today have to make what feel like impossible decisions among different technical infrastructures and the proper tool for a variety of problems,” said Jim Zemlin, Executive Director of the Linux Foundation. “Every tool brings learning costs and complexities that developers don’t have the time to navigate, yet there’s the expectation that they keep up with accelerated development and innovation. NextArch Foundation will improve ease of use and reduce the cost for developers to drive the evolution of next-generation technology architectures.”

Next-generation architecture describes a variety of innovations in architecture, from data storage and heterogeneous hardware to engineering productivity, telecommunications and much more. Until now, there has been no ecosystem to address this massive challenge. NextArch will leverage infrastructure abstraction solutions through architecture and design and automate development, operations and project processes to increase the autonomy of development teams. Enterprises will gain easy to use and cost-effective tools to solve the problems of productization and commercialization in their digital transformation journey.

Linux Foundation and Graviti Announce Project OpenBytes to Make Open Data More Accessible to All

The Linux Foundation announced the new OpenBytes project spearheaded by Graviti. Project OpenBytes is dedicated to making open data more available and accessible through the creation of data standards and formats. 

Edward Cui is the founder of Graviti and a former machine learning expert within Uber’s Advanced Technologies Group. “For a long time, scores of AI projects were held up by a general lack of high-quality data from real use cases,” Cui said. “Acquiring higher quality data is paramount if AI development is to progress. To accomplish that, an open data community built on collaboration and innovation is urgently needed. Graviti believes it’s our social responsibility to play our part.”

By creating an open data standard and format, Project OpenBytes can reduce data contributors’ liability risks. Dataset holders are often reluctant to share their datasets publicly due to their lack of knowledge on various data licenses. If data contributors understand their ownership of data is well protected and their data will not be misused, more open data becomes accessible.
 
Project OpenBytes will also create a standard format of data published, shared, and exchanged on its open platform. A unified format will help data contributors and consumers easily find the relevant data they need and make collaboration easier. These OpenBytes functions will make high-quality data more available and accessible, which is significantly valuable to the whole AI community and will save a large amount of monetary and labor resources on repetitive data collecting.
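The announcement does not publish the OpenBytes format itself, but to show why a shared descriptor helps both contributors and consumers, here is a purely hypothetical sketch of what a unified dataset record could carry. The field names are invented for illustration and are not the OpenBytes specification.

```python
# Hypothetical sketch of a unified open-dataset descriptor.
# NOT the OpenBytes specification; field names are invented for illustration only.
from dataclasses import dataclass, field

@dataclass
class DatasetDescriptor:
    name: str                    # human-readable dataset name
    version: str                 # version string so consumers can pin a release
    license: str                 # explicit license, addressing contributor liability concerns
    maintainer: str              # contact for questions about the data
    modality: str                # e.g. "image", "lidar", "text"
    annotation_format: str       # e.g. "bounding-box", "segmentation-mask"
    files: list[str] = field(default_factory=list)  # relative paths or URIs to the data

example = DatasetDescriptor(
    name="street-scenes-mini",
    version="1.0.0",
    license="CC-BY-4.0",
    maintainer="data-team@example.org",
    modality="image",
    annotation_format="bounding-box",
    files=["images/0001.jpg", "labels/0001.json"],
)
print(f"{example.name} {example.version} is shared under {example.license}")
```

The point of a common record like this is that a consumer can discover, license-check, and load any contributed dataset the same way, which is what makes a shared open-data platform useful.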

The largest tech companies have already realized the potential of open data and how it can lead to novel academic machine learning breakthroughs and generate significant business value. However, there isn’t a well-established open data community with neutral and transparent governance across various organizations in a collaborative effort. Under the governance of the Linux Foundation, OpenBytes aims to create data standards and formats, enable contributions of good-quality data and, more importantly, be governed in a collaborative and transparent way.

Linux Foundation Announces Security Enhancements to its LFX Community Platform to Protect Software Supply Chain

The Linux Foundation announced it has enhanced its free LFX Security offering so open source projects can secure their code and reduce non-inclusive language.

The LFX platform hosts community tools for security, fundraising, community growth, project health, mentorship and more. It supports projects and empowers open source teams to write better, more secure code, drive engagement and grow sustainable ecosystems.

The LFX Security module now includes automatic scanning for secrets-in-code and non-inclusive language, adding to its existing comprehensive automated vulnerability detection capabilities. Software security firm BluBracket has contributed this functionality to open source software projects under LFX as part of its mission of making software safer and more secure. This functionality builds on contributions from Snyk, a leader in developer security, and now makes LFX the leading vulnerability detection platform for the open source community.

The need for community-supported and freely available code scanning is clear, especially in light of recent attacks on core software projects and the recent White House Executive Order calling for improved software supply chain security. LFX is the first and only community tool designed to make software projects of all kinds more secure and inclusive.

LFX Security now includes:
● Vulnerabilities Detection: Detect vulnerabilities in open source components and dependencies, and provide fixes and recommendations for those vulnerabilities. LFX tracks how many known vulnerabilities have been found in open source projects, identifies whether those vulnerabilities have been fixed in code commits, and then reports on the number of fixes per project through an intuitive dashboard. Fixing known open source vulnerabilities in open source projects helps cleanse software supply chains at their source and greatly enhances the quality and security of code further downstream in development pipelines. Snyk has provided this functionality for the community and helped open source software projects remediate nearly 12,000 known security vulnerabilities in their code.
● Code Secrets: Detect secrets-in-code such as passwords, credentials, keys and access tokens both pre- and post-commit. These secrets are used by hackers to gain entry into repositories and other important code infrastructure. BluBracket is the leading provider of secrets detection technology in the industry and has contributed these features to the Linux Foundation LFX community.
● Non-Inclusive Language: Detect non-inclusive language used in project code, which is a barrier in creating a welcoming and inclusive community. BluBracket worked with the Inclusive Naming Initiative on this functionality.
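To give a sense of what the secrets-in-code scanning described above involves, here is a toy, regex-based sketch in Python. It illustrates the general technique only; it is not BluBracket’s or LFX’s implementation, and production scanners use far larger pattern sets plus entropy analysis and pre-commit hooks.

```python
# Toy illustration of secrets-in-code scanning. Not BluBracket or LFX code.
import re
import sys

# A few well-known secret patterns; real scanners ship hundreds of these.
PATTERNS = {
    "AWS access key ID": re.compile(r"AKIA[0-9A-Z]{16}"),
    "private key header": re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),
    "hard-coded password": re.compile(r"(?i)password\s*=\s*['\"][^'\"]{6,}['\"]"),
}

def scan(path):
    """Return (path, line number, label) for every suspected secret in a file."""
    findings = []
    with open(path, errors="ignore") as handle:
        for lineno, line in enumerate(handle, 1):
            for label, pattern in PATTERNS.items():
                if pattern.search(line):
                    findings.append((path, lineno, label))
    return findings

if __name__ == "__main__":
    for path in sys.argv[1:]:
        for found_path, lineno, label in scan(path):
            print(f"{found_path}:{lineno}: possible {label}")
```

Run against a repository’s files (for example from a pre-commit hook), anything it prints is a candidate secret that should be rotated and scrubbed from history.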

But That’s Not All, You Can Get Twice As Many If

I wondered how long it would take for someone to ask. What happens when an independent software developer with a widely adopted product sells out to a particular supplier? The acquirer proclaims the product will remain independent, but will it really refrain from absorbing it into its own proprietary portfolio?

Then GE Digital sent an email to me at a very old address that I still have forwarded (note to self: delete that old address): “Get Proficy Historian for up to 90% less than OSI—or swap your current OSI or eDNA license at no cost.”

Wow. This sounded a bit like a late-night cable TV ad. “But wait, if you act now you can get two for this amazing low price of $19.99.”

Sorry to poke a little fun at GE Digital. The question is, does this ad reflect a real marketplace concern? Or, is it a weak marketing gimmick? I don’t know. But I have been curious. What do you think?

Gary,

We’re serious about helping you get the most from your OT data. So we’re giving you two offers to choose from: 

Offer 1: Get Proficy Historian at our lowest price ever—up to 90% less than OSI 

Offer 2: Swap your current OSI, AVEVA, or eDNA historian license for a Proficy Historian license at absolutely no additional cost. You’ll then also enjoy product support at 50% off your previous rate with OSI, AVEVA, or eDNA.

Thousands of companies worldwide depend on Proficy Historian for gathering their most important data. Take advantage of:

  • Modern visualization and data in context with Historian Analysis app
  • Up to 100 million tags with Proficy Historian Enterprise
  • Support for server horizontal scalability and mirroring
  • Third-party cloud connectivity
  • Too many other features to list

Emerson Morphs Software Strategy in Transaction with AspenTech

Consolidation continues to rock the industrial software market. I recorded a podcast ruminating on the changes and where the market might be heading. Just last week, I added a blog post about a new CTO at NI. Previously, NI had reported strategy shifts to emphasize software as a key supplement and extension to its strong data acquisition and analytics portfolio. Independent software developers and longtime stalwarts in the market, PAS and OSIsoft, sold to Hexagon and AVEVA, respectively. Iconics is now part of Mitsubishi. Infor went to Koch Industries. One wonders where AWS, Azure, and Google Cloud might fit into this mix.

Perhaps you have seen speculation about the financial moves of Emerson and AspenTech. Some people extolled this as a sign of Emerson expanding into software. They missed the point. Emerson has tried software before, with an acquisition followed by a divestiture. Remember that Emerson made a big, eventually public play for Rockwell Automation. It’s obviously hungry for growth of some kind through acquisition.

Meanwhile, AspenTech has resided on shaky financial foundations for quite a while. New management has constructed a firmer structure, but the company still needed a capital infusion.

And again, Emerson had completed a couple of software acquisitions, but I think the financial managers figured out that mixing a software business into a hardware business is tricky. I don’t mean illegal tricky. I mean keeping track of the differing financial models tricky.

I find this financial transaction between Emerson and AspenTech intriguing. It seems to follow the path forged by Schneider Electric when it was trying to figure out what to do with the software portfolio it had inherited through a number of acquisitions, followed by swallowing Invensys. Emerson took two software acquisitions plus AspenTech, stirred the pot with a whisk, and came away with 55% ownership of a new company dubbed New AspenTech.

Expect to see further consolidation.

Emerson to Receive 55% Stake of New AspenTech

  • AspenTech Shareholders to Receive Approximately $87 Per Share in Cash and 0.42 Shares of New AspenTech for each AspenTech Share, Providing Upside through 45% Stake
  • New AspenTech Expected to Drive Double-Digit Annual Spend Growth, Best-in-Class Profitability, Strong Free Cash Flow and Be Positioned to Pursue and Complete Strategic Transactions
  • Emerson Reaffirms Fiscal Year 2021 Underlying Sales Guidance of 5% to 6% and Adjusted EPS Guidance of $4.06 to $4.08

Emerson and AspenTech announced that the companies have entered into a definitive agreement to contribute Emerson’s industrial software businesses – OSI Inc. and the Geological Simulation Software business – to AspenTech to create a diversified, high-performance industrial software leader with greater scale, capabilities and technologies (“new AspenTech”). Emerson will also contribute $6.0 billion in cash to new AspenTech, which will be received by AspenTech shareholders, in exchange for a 55% stake in new AspenTech. New AspenTech will offer a highly differentiated industrial software portfolio with the capabilities to support the entire lifecycle of complex operations across a wide range of industry verticals, including design and engineering, operations, maintenance and asset optimization.

The new company, which will retain the name AspenTech, enables Emerson to realize significant synergies and accelerate its software strategy to drive meaningful value creation. A majority ownership position in a highly valued, pure-play industrial software leader will give Emerson the platform and flexibility to strategically deploy capital for growth through continued investment and M&A. The transaction continues Emerson’s long history of delivering shareholder value. New AspenTech will be fully consolidated into Emerson financials and is expected to be accretive to adjusted EPS after year one.

MES Adds SPC and Recipe Module Updates

I don’t seem to receive many MES updates. This one comes from Sepasoft, a sister company to Inductive Automation, so it naturally features connectivity to Ignition by Inductive Automation. The updates include Statistical Process Control 3.0 and Settings & Changeover 3.0 (formerly Recipe & Changeover).

Statistical Process Control 3.0 Module for Enterprise-Ready Connectivity

The Statistical Process Control (SPC) 3.0 module extends Manufacturing Execution System (MES) operations, allowing scalable solutions from the plant floor all the way up to executive-level operations, regardless of location.

The new SPC 3.0 update adds important new features:

• Sample entry documents that can be integrated with Standard Operating Procedures (SOPs) and custom sample entry forms are now available to reduce time while increasing consistency with your sample data.

• Highly optimized data collection eliminates or reduces the time required for operators and QA staff to enter sample data.

• Central MES Configuration reduces or eliminates common SPC configuration errors regarding sample definitions, equipment, materials, etc.

• Scalability and increased transparency are now an added benefit with our MES enterprise-wide management of sample definitions and enterprise-wide analysis of results features.

• Samples can be scheduled in concert with OEE, Track & Trace Operations, and Batch Procedure functionality.
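For readers new to SPC, the sketch below shows the kind of arithmetic a module like this automates: estimate 3-sigma control limits for subgroup means from a stable baseline run, then flag later samples whose mean drifts outside those limits. It is a simplified, generic illustration with made-up numbers, not Sepasoft’s implementation.

```python
# Generic Shewhart-style X-bar check with 3-sigma limits. Illustrative only;
# not Sepasoft code, and all measurements below are invented.
import math
import statistics

baseline = [  # hypothetical fill-weight subgroups (grams) from a stable run
    [50.1, 49.8, 50.0, 50.2, 49.9],
    [50.0, 50.1, 49.7, 50.3, 50.0],
    [49.9, 50.2, 50.0, 49.8, 50.1],
    [50.2, 49.9, 50.1, 50.0, 49.8],
]

subgroup_means = [statistics.mean(g) for g in baseline]
grand_mean = statistics.mean(subgroup_means)

# Simple estimate of the standard error of a subgroup mean from within-subgroup spread.
pooled_sd = statistics.mean(statistics.stdev(g) for g in baseline)
se_mean = pooled_sd / math.sqrt(len(baseline[0]))

ucl = grand_mean + 3 * se_mean   # upper control limit
lcl = grand_mean - 3 * se_mean   # lower control limit

new_subgroup = [50.4, 50.6, 50.5, 50.7, 50.6]   # a later sample that has drifted upward
m = statistics.mean(new_subgroup)
status = "out of control" if not (lcl <= m <= ucl) else "in control"
print(f"limits = ({lcl:.2f}, {ucl:.2f}); new sample mean = {m:.2f} -> {status}")
```

Real X-bar charts typically use tabulated constants rather than this quick pooled estimate, but the idea is the same: a sample mean outside the limits signals that something in the process has changed and is worth investigating.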

Settings & Changeover 3.0 Module for Enterprise-Ready Connectivity

The Settings & Changeover 3.0 update adds various important new features and improvements:

• Software performance gets a massive upgrade due to less demand on the server, database, and network.

• Prevent loss of production or quality by centrally managing machine settings, equipment configuration, and security access rights. Identify deviations as they happen and quickly resolve critical issues with role-based management.

• Changes made to any MES configuration are automatically recorded in a changelog to help you meet challenges of regulatory compliance. Quickly zero in on significant changes with access to a complete audit trail evidencing the location, date, type of change, responsible person, and more.

• Optimize your document organization while increasing your production efficiency with our Artifacts feature, which replaces the need for paper-based procedures and vastly improves upon your Standard Operating Procedures (SOPs).

• The Settings & Changeover module is now compatible with OEE and Track & Trace 3.0 modules.

• Enterprise-wide management of machine settings and analysis of results.

HPE Hastens Transition To Data Management Company

Hewlett Packard Enterprise (HPE) held a Web event Sept. 28 to announce extensions and enhancements to its GreenLake edge-to-cloud platform. One commentator during the “deep dive” sessions opined that HPE is becoming a “data management company.” In other words, it is transitioning from a hardware company to a software and as-a-Service company. And the pace of the change during the past two years is picking up. Quite frankly, I’m surprised at the speed of the changes in the company over that brief period of time.

The announcements in summary: 

  • HPE unveils new cloud services for the HPE GreenLake edge-to-cloud platform
  • The HPE GreenLake platform now has more than 1,200 customers and $5.2 billion in total contract value
  • HPE takes cyberthreats and ransomware head-on with new cloud services to protect customers’ data from edge to cloud
  • HPE pursues the big data and analytics software market – forecast by IDC to reach $110B by 2023 – with the industry’s first cloud-native unified analytics and data lakehouse cloud services optimized for hybrid environments

Following is information from HPE’s press release. 

HPE GreenLake edge-to-cloud platform combines control and agility so customers can accelerate innovation, deliver compelling experiences, and achieve superior business outcomes

Hewlett Packard Enterprise (NYSE: HPE) today announced a sweeping series of new cloud services for the HPE GreenLake edge-to-cloud platform, providing customers unmatched capabilities to power digital transformation for their applications and data. This represents HPE’s entry into two large, high-growth software markets – unified analytics and data protection. Together, these innovations further accelerate HPE’s transition to a cloud services company and give customers greater choice and freedom for their business and IT strategy, with an open and modern platform that provides a cloud experience everywhere. The new offerings, which add to a growing portfolio of HPE GreenLake cloud services, allow customers to innovate with agility, at lower costs, and include the following:

  • HPE GreenLake for analytics – open and unified analytics cloud services to modernize all data and applications everywhere – on-premises, at the edge, and in the cloud

  • HPE GreenLake for data protection – disaster recovery and backup cloud services to help customers take ransomware head-on and secure data from edge-to-cloud

  • HPE Edge-to-Cloud Adoption Framework and automation tools – a comprehensive, proven set of methodologies, expertise, and automation tools to accelerate and de-risk the path to a cloud experience everywhere

“The big data and analytics software market, which IDC predicts will reach $110 billion by 2023, is ripe for disruption, as customers seek a hybrid solution for enterprise datasets on-premises and at the edge,” said Antonio Neri, president and CEO, at HPE. “Data is at the heart of every modernization initiative in every industry, and yet organizations have been forced to settle for legacy analytics platforms that lack cloud-native capabilities, or force complex migrations to the public cloud that require customers to adapt new processes and risk vendor lock-in. The new HPE GreenLake cloud services for analytics empower customers to overcome these trade-offs and gives them one platform to unify and modernize data everywhere. Together with the new HPE GreenLake cloud services for data protection, HPE provides customers with an unparalleled platform to protect, secure, and capitalize on the full value of their data, from edge to cloud.” 

HPE continues to accelerate momentum for the HPE GreenLake edge-to-cloud platform. The HPE GreenLake platform now has more than 1,200 customers and $5.2 billion in total contract value. In HPE’s most recent quarter, Q3 2021, HPE announced that the company’s Annualized Revenue Run Rate was up 33 percent year-over-year, and as-a-service orders up 46 percent year-over-year. Most recently, HPE announced HPE GreenLake platform wins with Woolworths Group, Australia and New Zealand’s largest retailer, and the United States National Security Agency.

HPE GreenLake Rolls Out Industry’s First Cloud-Native Unified Analytics and Data Lakehouse Cloud Services Optimized for Hybrid Environments

HPE GreenLake for analytics enables customers to accelerate modernization initiatives for all data, from edge to cloud. Available on the HPE GreenLake edge-to-cloud platform, the new cloud services are built to be cloud-native and avoid complex data migrations to the public cloud by providing an elastic, unified analytics platform for data and applications on-premises, at the edge, and in public clouds. Now analytics and data science teams can leverage the industry’s first cloud-native on-premises solution, scale up Apache Spark lakehouses, and speed up AI and ML workflows. The new HPE GreenLake cloud services include the following:

  • HPE Ezmeral Unified Analytics: The industry’s first unified, modern analytics and data lakehouse platform, optimized for on-premises deployment and spanning edge to cloud.

  • HPE Ezmeral Data Fabric Object Store: Industry’s first Kubernetes-native object store optimized for analytics performance, providing access to data sets edge to cloud.

  • Expanding HPE Ezmeral Partner Ecosystem: The HPE Ezmeral Partner Program delivers a rapidly growing set of validated full-stack solutions from ISV partners that enable customers to build their analytics engines. This includes new support from NVIDIA, Pepperdata and Confluent, and open-source projects such as Apache Spark. HPE has added 37 ISV partners to the HPE Ezmeral Partner Program since it was first introduced in March 2021, delivering additional ecosystem stack support of core use cases and workloads for customers, including big data and AI/ML use cases.
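As a generic illustration of the kind of Spark lakehouse workload these analytics services target (this is not HPE Ezmeral code; the object-store path and columns are hypothetical), a minimal PySpark job might look like this:

```python
# Generic PySpark sketch: SQL over a parquet-backed lakehouse table.
# Not HPE Ezmeral code; the object-store path and schema are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("lakehouse-demo").getOrCreate()

# Read a parquet dataset from S3-compatible object storage.
events = spark.read.parquet("s3a://demo-bucket/plant-events/")
events.createOrReplaceTempView("plant_events")

# Plain SQL over the same data an ML pipeline would consume.
daily_stops = spark.sql("""
    SELECT line_id, date(event_time) AS day, count(*) AS stops
    FROM plant_events
    WHERE event_type = 'safety_stop'
    GROUP BY line_id, date(event_time)
    ORDER BY day
""")
daily_stops.show()
spark.stop()
```

The appeal of the lakehouse model described above is that the same table serves both this kind of SQL reporting and downstream AI/ML workflows without copying the data into a separate warehouse.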

HPE Takes Cyberthreats and Ransomware Head-On with New HPE GreenLake Cloud Services to Protect Customers’ Data from Edge to Cloud

HPE today entered the rapidly growing data protection-as-a-service market with HPE GreenLake for data protection, new cloud services designed to modernize data protection from edge to cloud, overcome ransomware attacks, and deliver rapid data recovery.

  • HPE Backup and Recovery Service: Backup as a service offering that provides policy-based orchestration and automation to backup and protect customers’ virtual machines across hybrid cloud, and eliminates the complexities of managing backup hardware, software, and cloud infrastructure.

  • HPE GreenLake for Disaster Recovery: Following the close of the Zerto acquisition, HPE plans to deliver Zerto’s industry-leading disaster recovery as a service through HPE GreenLake, to help customers recover in minutes from ransomware attacks. Zerto provides best-in-class restore times without impacting business operations for all recovery scenarios.

HPE Accelerates Adoption of Cloud-Everywhere Operating Models with Proven Framework and Data-Driven Intelligence and Automation Tools

HPE also today announced a proven set of methodologies and automation tools to enable organizations to take a data-driven approach to achieve the optimal cloud operating model across all environments:

  • The HPE Edge-to-Cloud Adoption Framework leverages HPE’s expertise in delivering solutions on-premises, to meet a broad spectrum of business needs for customers across the globe. HPE has identified several critical areas that enterprises should evaluate and measure to execute an effective cloud operating model. These domains, which include Strategy and Governance, People, Operations, Innovation, Applications, DevOps, Data, and Security, form the core of the HPE Edge-to-Cloud Adoption Framework.

  • The cloud operational experience is enhanced with the industry’s leading AI Ops for infrastructure, HPE InfoSight, that now constantly observes applications and workloads running on the HPE GreenLake edge-to-cloud platform. The new capability, called HPE InfoSight App Insights, detects application anomalies, provides prescriptive recommendations, and keeps the application workloads running disruption free. HPE CloudPhysics delivers data-driven insights for smarter IT decisions across edge-to-cloud, enabling IT to optimize application workload placement, procure right-sized infrastructure services, and lower costs.

HPE GreenLake Announcement Event

Please visit the HPE Discover More Network to watch the HPE GreenLake announcement event, including the keynote from Antonio Neri, HPE president and CEO, live on September 28 at 8:00 am PT or anytime on-demand.

Product Availability

HPE GreenLake for analytics and HPE GreenLake for data protection will be available in 1H 2022.

The HPE Edge-to-Cloud Adoption Framework is available now.

HPE provides additional information about HPE product and services availability in the following blogs: 

HPE GreenLake for analytics

HPE GreenLake for data protection

HPE Edge-to-Cloud Adoption Framework

Hewlett Packard Enterprise (NYSE: HPE) is the global edge-to-cloud company that helps organizations accelerate outcomes by unlocking value from all of their data, everywhere. Built on decades of reimagining the future and innovating to advance the way people live and work, HPE delivers unique, open and intelligent technology solutions delivered as a service – spanning Compute, Storage, Software, Intelligent Edge, High Performance Computing and Mission Critical Solutions – with a consistent experience across all clouds and edges, designed to help customers develop new business models, engage in new ways, and increase operational performance.