Executives at ThinkIQ have talked with me a few times. They have an interesting story around what they call a SaaS-based Continuous Intelligence Platform. They also released some news last week.
ThinkIQ announced enhancements to its platform that provide more Continuous Intelligence capabilities for supply chain and production optimization within material-centric manufacturing.
ThinkIQ is the first platform-based Continuous Intelligence solution on the market and can be operationalized at many levels: supply chain, product quality, process improvement, and any time-sensitive process enhanced by the ability to respond to what’s happening right now throughout the entire manufacturing process. Analytics are woven directly into operational processes that can take or trigger actions when specific conditions are met. This ranges from time-sensitive alerts that guide employees on what to do next, to fully automated processes that trigger downstream actions without human intervention.
Continuous Intelligence closes the gap between what is happening in your operations now and the information and insights available. This accelerates how effectively people and processes respond to rapidly changing conditions.
With ThinkIQ, manufacturers can collect, analyze, and share information in a way that was not previously possible. ThinkIQ combines continuous data ingestion with a well-defined model that fits manufacturers and their supply chains, achieving the goal of current data with meaning. The latest release adds a number of new features:
- The ability to close the loop from the edge to the cloud, where AI can be done, and back to the edge, where actions can be taken. This is a key step toward enabling the autonomous, self-driving supply chain.
- Several modeling enhancements including automatic propagation of model configuration information which enables adoption of corporate standards.
- Attributes on organizations and places. This is critical for continuous roll up of corporate, business-unit, and plant KPIs. ThinkIQ supports rollup of any KPI, including operational, environmental, safety, and financial metrics.
- A strengthened GraphQL API for tighter integration with third-party applications.
- A new Model Browser that makes it easy to find anything anywhere in the model.
- New attribute expressions for simple and weighted moving averages. The expressions account for how data interpolated between recorded points should be interpreted and do not require the input data stream to be evenly sampled in time.
- Improved robustness and performance of our on-premises gateways and connectors.
- Cybersecurity enhancements in line with our SOC 2 compliance program.
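ThinkIQ’s expression language isn’t documented in this release note, but the unevenly sampled moving average is worth unpacking. Below is a minimal sketch of a time-weighted average over an irregular stream, assuming (as the release implies) that values are linearly interpolated between recorded points; the function names and sample data are mine, not ThinkIQ’s:

```python
from bisect import bisect_left

def time_weighted_average(times, values, window_start, window_end):
    """Time-weighted mean over [window_start, window_end], assuming
    linear interpolation between unevenly spaced samples."""
    def interp(t):
        # Interpolated value at an arbitrary time t (flat beyond the ends)
        i = bisect_left(times, t)
        if i == 0:
            return values[0]
        if i == len(times):
            return values[-1]
        t0, t1 = times[i - 1], times[i]
        v0, v1 = values[i - 1], values[i]
        return v0 + (v1 - v0) * (t - t0) / (t1 - t0)

    # Window boundaries plus every recorded sample inside the window
    pts = [window_start] + [t for t in times
                            if window_start < t < window_end] + [window_end]
    vals = [interp(t) for t in pts]

    # Trapezoidal integration, divided by the window length
    area = sum((pts[k + 1] - pts[k]) * (vals[k] + vals[k + 1]) / 2
               for k in range(len(pts) - 1))
    return area / (window_end - window_start)

# Irregularly sampled temperature readings (seconds, °C)
times = [0, 10, 35, 40, 70]
values = [20.0, 22.0, 21.0, 25.0, 24.0]
print(time_weighted_average(times, values, 0, 70))  # ≈ 22.82
```

The trapezoidal weighting is what makes the result independent of sampling density: a sensor that happens to report ten times during a transient does not dominate the average.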
“The supply chain has been in disarray since the pandemic started, and accurate materials tracking is more important than ever before for manufacturers,” said Niels Andersen, CTO and CPO of ThinkIQ. “With our enhanced Continuous Intelligence solution, our customers can respond and adapt their manufacturing processes automatically based on ever-changing conditions leveraging contextualized, time-sensitive data.”
Steve Pavlosky, Principal Product Manager for Proficy Historian and Data at the Edge, talked with me about this latest release of Proficy Historian by GE Digital. I asked him for the important points. He responded, “It’s our extreme scalability of up to 100 million tags.” Further, this release allows customers to choose their cloud environment and how much data they want sent to it. It sounds like there are some cool things coming later in the year, and I’m glad to hear about the progress. With OSIsoft now part of AVEVA, I am interested in how market competition may (or may not) change.
Here is the release.
GE Digital today announced the availability of its latest version of Proficy Historian, a historian software solution that collects industrial time-series and Alarms & Events (A&E) data to analyze asset and process performance to drive business value. Proficy Historian 2022 has a flexible and scalable architecture – from sensor to enterprise – that makes it foundational for Industrial Internet deployments.
Used by thousands of companies around the world, Proficy Historian has helped a global chemical company create a single industrial data repository across its plants for improved visibility and insights, delivering a 20% increase in capacity; a large power monitoring and diagnostics center achieve tens of millions in cost savings for customers in just one year along with 5% reduction in unplanned downtime and 20% reduction in IT infrastructure costs; and a leading industrial gas company reduce costs by consolidating to one historian and eliminating more than 100 servers.
This new version boosts large-scale deployments with enhanced system management and connectivity, increases the value of data with a new Asset Model associated with Historian data, and significantly improves collection throughput and encryption. Proficy Historian 2022 also features improved system management, with a modern single administrator across the Proficy portfolio that increases productivity, and provides new capabilities for managing multiple systems from a “single pane of glass.”
“GE Digital has made significant strides with Proficy Historian 2022. With features like decentralized data collection, excellent data volume handling, scalability from on-premise to hybrid Cloud to full Cloud, plus remote management, and an OPC UA Server, Proficy Historian is now one of the leading historian products on the market,” said Joe Perino, Principal Analyst, LNS Research. “No longer is there a default choice in historians. If GE Digital’s Proficy Historian is not on your short list, it certainly should be.”
A centralized collector configuration within Proficy Historian 2022 allows companies to utilize remote data collectors to reduce maintenance costs and downtime. This simplified enterprise-wide management makes Proficy Historian the best solution on the market for widely distributed data collection in large water utilities, oil & gas companies, power generation and grid operators, and multi-plant manufacturing. In addition, it provides horizontal scalability, so all clients have access to all data without the need for consolidation in a central enterprise historian.
My first business trip involving an airplane and a car in 18 months took me to Houston in November for Automation Fair, the Rockwell Automation user conference and trade show. They offered five press conferences via remote conferencing, but I felt the urge to visit with people in person. Several thousand visitors wandered the show floor along with me. And I sat in the press conferences in person with a couple of editors from Control, a couple of analysts from ARC Advisory Group, an editor I didn’t know for one session, and an editor from Automation World for another. It felt good to be back, but this was hardly like old times. I was not rushed from appointment to appointment—I had no appointments.
The content was not like old times. No motor control or programmable controllers, although I did look up a PLC product person on the show floor to dive into a couple of things. The press conferences were somewhat IT oriented with cybersecurity and cloud, workforce issues around culture and diversity, and sustainability. Following are summaries of the press conferences and of three news items released at the show.
Cybersecurity Steps Needed for 2022
No discussion of industrial technology can begin without considering cybersecurity. Angela Rapko (Regional Vice President, Lifecycle Services, Rockwell Automation), Shoshana Wodzisz (Manager, Product Security, Rockwell Automation), and Theodore Haschke (Manager, Business Development, Functional Safety & Cybersecurity, TUV Rheinland) talked standards with us. High-profile cyber and ransomware attacks rocked the manufacturing industry in 2021 and raised government attention to the need for stronger oversight to protect businesses worldwide. Global cybersecurity standards have been established based on guidance from industry leaders for both the IT and OT level, but adoption still lags. The session covered how businesses can use these standards to improve security in 2022, and why OT can’t be left behind when updating best practices.
Leveraging Culture and DEI as a Competitive Advantage
Bobby Griffin (Chief Diversity, Equity & Inclusion Officer, Rockwell Automation) and Becky House (Senior Vice President & Chief People & Legal Officer, Rockwell Automation) discussed how many companies have put a more intentional focus on company culture and DEI – but how do you know you’re having the right impact? Diversity, equity, and inclusion are core principles at Rockwell, each with an associated KPI, and managers’ compensation is tied to them. Among other things, check out the senior leadership page on the Rockwell website. There are women on it. And a couple of other faces that are not old white men. There is a refreshing mix of ages, genders, and ethnicities.
Why Cloud? Why Now? Three Factors Driving Adoption of SaaS-Based Solutions
I could understand the discussion of cybersecurity, which can be expected given the several-year-old vision of Rockwell regarding the Connected Enterprise. The discussion of computing in the cloud would never have happened with a straight face even three years ago. Maybe two. Let us consider two very recent acquisitions of cloud-based companies—Plex and Fiix. Brian Shepherd (Senior Vice President, Software & Control, Rockwell Automation), James Novak (Chief Executive Officer, Fiix), and Bill Berutti (Chief Executive Officer, Plex) joined us for a discussion of the companies, products, and benefits of cloud. Yet another sign of a rapidly changing Rockwell Automation.
Using Data to Drive Productivity and Sustainability
Rockwell Automation has had sustainability goals and solutions for many years. This topic remains a key focus for the corporation. Tom O’Reilly (Vice President, Sustainability, Rockwell Automation) and Arvind Rao (Director, Product Management & Head of Industry Solutions, Rockwell Automation) met with us to discuss how “customers and investors are demanding that we do business in ways that are more productive and more sustainable.” Operational data and analytics can reduce waste, improve quality, and reduce energy, all while driving increased productivity and delivering results against sustainability initiatives.
Three Strategies for Creating an Agile and Flexible Workforce
Rachael Conrad (Vice President & General Manager, Customer Support & Maintenance, Rockwell Automation) and Sherman Joshua (Director, Workforce & Competency, Lifecycle Services, Rockwell Automation) revealed Rockwell’s key strategies for creating an agile and flexible workforce post-pandemic and how manufacturers can leverage their workforce as their greatest asset.
New Initiatives to Bolster Cybersecurity Offering for Customers
Rockwell Automation, Inc. announced new investments to enhance its information technology (IT) and operational technology (OT) cybersecurity offering. These initiatives include strategic partnerships with Dragos, Inc. and CrowdStrike, as well as the establishment of a new Cybersecurity Operations Center in Israel.
Rockwell and Dragos, a global leader in cybersecurity for industrial control systems (ICS)/OT environments, have announced a partnership that combines Rockwell’s global industry, application, and ICS domain expertise with Dragos’s world-class technology, professional services, and threat intelligence services. The partnership will focus on incident response services and threat intelligence.
Rockwell and CrowdStrike, a leader in cloud-delivered endpoint and workload protection, have formed a partnership to deliver end-to-end cybersecurity and network service solutions to customers. The partnership will combine CrowdStrike’s cloud-native, AI-powered Falcon platform with Rockwell’s global deployment, network architecture, support, OT, and managed services capabilities to deliver differentiated solutions that address customer cybersecurity pain points.
Rockwell Automation Expands Supply Chain Services with Acquisition of AVATA
Rockwell Automation, Inc. has acquired AVATA, a leading services provider for supply chain management, enterprise resource planning, and enterprise performance management solutions. AVATA has significant domain expertise in enterprise applications and is a leading consultant and systems integrator for Oracle cloud software applications.
By significantly improving end-to-end supply chain visibility and management, AVATA, together with Kalypso, Rockwell’s industrial digital transformation services business, will help further unlock the value of information technology/operational technology (IT/OT) convergence that Rockwell can deliver to customers. AVATA will be integrated into Kalypso, which is a part of Rockwell’s Lifecycle Services business.
AVATA supports Rockwell’s recent cloud-native investments, building on its open architecture to extend the digital thread and enable powerful integrations with other leading technologies, now including Plex and Oracle Cloud.
Rockwell Automation and Battery Pioneer Cadenza Innovation to Explore Driving Energy Storage and Advance Sustainability
Rockwell Automation has begun collaborating with Cadenza Innovation, the award-winning provider of safe, low cost, and energy-dense Lithium-ion-based storage solutions, to define a strategic relationship including a shared goal of building the industry’s highest performance battery cell production lines.
During 2022 the companies intend to collaborate to develop a customer cloud portal to manage deployed distributed energy resources, an end-to-end battery manufacturing execution system (MES), and equipment automation to support the expansion of Cadenza Innovation’s battery manufacturing in the US and abroad.
Rockwell Automation and Cadenza Innovation intend to create a full digital thread that feeds information from business systems to the factory floor and subsequently out to the field-deployed energy storage systems to ‘close the loop’ by feeding data from the field back into Cadenza Innovation’s connected operations. This, in turn, will ensure peak performance of customer systems.
I remember talking with a friend about ten years ago about the explosion of database technology beyond SQL. The feeling within industrial circles evidently was that the historian and SQL were the only tools needed. Many were also skeptical of cloud architectures. Today’s news concerns a couple of announcements from a company called Datadobi that specializes in helping you manage your unstructured data, even across multiple clouds. May the force be with you!
Datadobi Software Enhancements Power Agile Multi-Cloud Expansion, Flexible Data Reorganization, Lower Costs
Datadobi announced enhancements to its vendor-neutral unstructured data mobility engine with the introduction of DobiMigrate’s API. Version 5.13 will allow organizations to programmatically configure unstructured data migrations using the API.
This latest update complements Datadobi’s enterprise GUI and enables automation of file and object migrations as well as reorganization and clean-up projects.
Using the DobiMigrate API, customers and partners can now extend existing automated storage provisioning workflows with the necessary data migration steps. Organizations can first use the storage system APIs to provision a new group of on-premises or cloud storage and then use the DobiMigrate API to set up the NAS or object migration. Following the cutover to the new storage, the administrator can then again use storage systems APIs to deprovision the original storage.
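The provision, migrate, cut over, deprovision sequence described above reads naturally as a script. DobiMigrate’s actual endpoints are not documented in this announcement, so the sketch below stubs every API call with an in-memory stand-in; only the orchestration order reflects the workflow Datadobi describes:

```python
# Hypothetical orchestration of a storage refresh. Each function body is a
# stand-in for a real call to the storage system's API or the DobiMigrate
# API; names and payloads are invented for illustration.

def provision_storage(name):
    """Stand-in for a storage-system API call that creates a new share."""
    return {"name": name, "state": "online"}

def migrate(source, destination):
    """Stand-in for a DobiMigrate API call that copies the data and
    returns an auditable migration record."""
    return {"from": source["name"], "to": destination["name"],
            "status": "cut over"}

def deprovision_storage(share):
    """Stand-in for a storage-system API call that retires the old share."""
    share["state"] = "retired"

def refresh_storage(old_share_name, new_share_name):
    old = {"name": old_share_name, "state": "online"}
    new = provision_storage(new_share_name)   # step 1: provision new storage
    record = migrate(old, new)                # step 2: migrate, then cut over
    deprovision_storage(old)                  # step 3: retire the source
    return record, old

record, old = refresh_storage("nas01-home", "nas02-home")
print(record["status"], old["state"])  # → cut over retired
```

The point of exposing this as an API rather than a GUI is exactly that a `refresh_storage`-style function can be dropped into an existing provisioning pipeline and run unattended.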
The DobiMigrate process is fully auditable and comes with patent-pending chain-of-custody technology that provides a detailed report on what files were migrated, their content, and timestamps. DobiMigrate has also been reviewed and received attestation for Service Organizations Control (SOC) 2 Type 1 compliance by KPMG. The attestation follows a review of Datadobi’s operations, support, and engineering processes, demonstrating Datadobi’s focus on high standards for integrity, security, and confidentiality of customer data.
“Due to the scale and complexity of unstructured data in today’s heterogeneous storage environments, enterprises can no longer rely on outdated tools and manual practices to execute data management projects. Organizations must trust specialist tools powered by automation to gain an understanding of their environments and move data accordingly,” said Carl D’Halluin, CTO, Datadobi. “Datadobi’s API allows for a seamless data management experience built with the speed and integrity needed to conduct business today.”
“Every day we see organizations challenged by an overwhelming volume of unstructured data. As data proliferation continues, successful enterprises will look to data management solutions like DobiMigrate’s API to tailor data projects, enhance mobility capabilities, and find value in their data,” said Bill O’Brien, Director of Data Center Storage and Data Protection at AHEAD. “We look forward to continuing to provide top-of-the-line data management to our enterprise clients using Datadobi’s solutions.”
Datadobi Validates Google Cloud Storage as an Endpoint for Data Management
Datadobi announced it has validated Google Cloud Storage as an endpoint for data replication in addition to data migration. As a result, users of the DobiProtect software suite can now replicate data held on any S3-compatible object storage to Google Cloud Storage or vice versa. The news comes shortly after the company announced support for Azure Blob storage in the DobiProtect software suite.
Datadobi’s validation of Google Cloud Storage enables organizations to build a true multi-cloud strategy with their data available at multiple Hyperscalers. Customers can host their data both on any S3 object storage service and on Google Cloud Storage, and use DobiProtect to set up permanent data replication between these two clouds.
Nearly 92% of enterprises now have a multi-cloud strategy. With cross-cloud replication, data is held in two different sets of data centers, deployed on two completely different cloud storage software and hardware stacks, and operated by different companies, different people, and different service procedures. DobiProtect now allows users to boost data protection by entrusting their data to two cloud storage stacks.
Datadobi can enable the initial migration and then set up the subsequent replication stream. The migration is a one-time copy that ends in the cutover step, during which data authority is moved from the source to the destination storage service or system. During this cutover, the Datadobi software provides a chain of custody containing auditable proof of which data was moved during the migration and confirming the integrity of the data.
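DobiProtect’s replication engine is proprietary, but the core idea of one-way replication between two object stores can be sketched generically. The in-memory dictionaries below stand in for an S3-compatible bucket and a Google Cloud Storage bucket, each mapping object keys to content hashes (ETags); nothing here is Datadobi’s actual implementation:

```python
def plan_replication(source, destination):
    """Return the keys that must be copied so `destination` mirrors `source`.
    Each store is modeled as {key: etag}; a key needs copying when it is
    missing from the destination or its content hash differs."""
    return sorted(k for k, etag in source.items()
                  if destination.get(k) != etag)

def replicate(source, destination):
    # One-way sync: copy only what the plan says is stale or missing
    for key in plan_replication(source, destination):
        destination[key] = source[key]
    return destination

s3_bucket  = {"a.csv": "111", "b.csv": "222", "c.csv": "333"}
gcs_bucket = {"a.csv": "111", "b.csv": "999"}   # b.csv stale, c.csv missing

print(plan_replication(s3_bucket, gcs_bucket))  # → ['b.csv', 'c.csv']
replicate(s3_bucket, gcs_bucket)
```

Comparing content hashes rather than timestamps is what lets a tool like this produce an auditable statement of exactly which objects differed and were copied.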
US Presidential Candidate Ross Perot years ago described a “giant sucking sound” using a typical businessperson’s view of government. Well, I think that a digital picture of today’s supply chains would show a giant clogging mess, like a kitchen garbage disposal gone wrong. Regardless, Google Cloud released this supply chain digital twin to show just such a condition.
We in manufacturing and production need to pay attention to these giant enterprise IT companies. They keep encroaching on our territory. At the rate we are going, industrial technology will someday be absorbed into enterprise IT.
Google Cloud today announced the launch of Supply Chain Twin, a purpose-built industry solution that lets companies build a digital twin, a virtual representation of their physical supply chain, by orchestrating data from disparate sources to get a more complete view of suppliers, inventories, and other information. In addition, the Supply Chain Pulse module, also announced today, can be used with Supply Chain Twin to provide real-time dashboards, advanced analytics, alerts on critical issues like potential disruptions, and collaboration in Google Workspace.
The majority of companies do not have complete visibility of their supply chains, resulting in retail stock-outs, aging manufacturing inventory, or weather-related disruptions. In 2020, out-of-stock items alone cost the retail industry an estimated $1.14 trillion. The past year-and-a-half of supply chain disruptions related to COVID-19 has further proven the need for more up-to-date insights into operations, inventory levels, and more.
“Siloed and incomplete data is limiting the visibility companies have into their supply chains,” said Hans Thalbauer, Managing Director, Supply Chain & Logistics Solutions, Google Cloud. “The Supply Chain Twin enables customers to gain deeper insights into their operations, helping them optimize supply chain functions—from sourcing and planning, to distribution and logistics.”
With Supply Chain Twin, companies can bring together data from multiple sources, all while requiring less partner integration time than traditional API-based integration. Some customers have seen a 95% reduction in analytics processing time, with times for some dropping from 2.5 hours down to eight minutes. Data types supported in Supply Chain Twin include:
- Enterprise business systems: Better understand operations by integrating information such as locations, products, orders, and inventory from ERPs and other internal systems.
- Supplier and partner systems: Gain a more holistic view across businesses by integrating data from suppliers, such as stock and inventory levels, and partners, such as material transportation status.
- Public sources: Understand your supply chain in the context of the broader environment by connecting contextual data from public sources, such as weather, risk, or sustainability-related data, including public datasets from Google.
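Google has not published Supply Chain Twin’s data model, but the basic move the three bullets describe, joining feeds from disparate systems into one record per site, can be illustrated in a few lines of Python; the field names and values below are invented for the example:

```python
# Hypothetical records from three source systems, keyed by site
erp       = {"plant_a": {"sku": "X1", "on_hand": 120},
             "plant_b": {"sku": "X1", "on_hand": 30}}
suppliers = {"plant_a": {"inbound": 50}, "plant_b": {"inbound": 0}}
weather   = {"plant_b": {"alert": "ice storm"}}

def build_twin(erp, suppliers, weather):
    """Join the three feeds into one record per site - the 'twin' view."""
    twin = {}
    for site in erp:
        twin[site] = {**erp[site],
                      **suppliers.get(site, {}),   # partner data, if any
                      **weather.get(site, {})}     # public context, if any
    return twin

twin = build_twin(erp, suppliers, weather)
print(twin["plant_b"])
```

The value claimed for the product is that this join happens continuously across live feeds rather than in one-off API integrations, but the shape of the unified record is the same.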
Once customers are up-and-running on Supply Chain Twin, the Supply Chain Pulse module enables further visibility, simulations, and collaboration features:
- Real-time visibility and advanced analytics: Drill down into key operational metrics with executive performance dashboards that make it easier to view the status of the supply chain.
- Alert-driven event management and collaboration across teams: Set mobile alerts that trigger when key metrics reach user-defined thresholds, and build shared workflows that allow users to quickly collaborate in Google Workspace to resolve issues.
- AI-driven optimization and simulation: Trigger AI-driven algorithm recommendations to suggest tactical responses to changing events, flag more complex issues to the user, and simulate the impact of hypothetical situations.
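The alert-driven piece of Supply Chain Pulse is, conceptually, a threshold check over the unified metrics. A toy version, with metric names and thresholds invented for the illustration:

```python
def check_alerts(metrics, thresholds):
    """Fire an alert for any metric that falls below its user-defined
    minimum threshold."""
    return [f"ALERT: {name} = {value} (threshold {thresholds[name]})"
            for name, value in metrics.items()
            if name in thresholds and value < thresholds[name]]

metrics = {"on_time_delivery_pct": 87.5, "fill_rate_pct": 96.0}
thresholds = {"on_time_delivery_pct": 90.0, "fill_rate_pct": 95.0}
print(check_alerts(metrics, thresholds))
```

In the product, the interesting part is what happens downstream of the check, routing the alert into a shared Google Workspace workflow, but the trigger itself is this simple comparison against user-defined thresholds.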
“At Renault, we are innovating on how we run efficient supply chains. Improving visibility to inventory levels across our network is a key initiative,” said Jean-François Salles, Supply Chain Global Vice President at Renault Group. “By aggregating inventory data from our suppliers and leveraging Google Cloud’s strength in organizing and orchestrating data, with solutions like the Supply Chain Twin we expect to achieve a holistic view. We aim to work with Google tools to manage stock, improve forecasting, and eventually optimise our fulfillment.”
“End-to-end visibility across the entire supply chain is a top priority for supply chain professionals to optimize planning, real-time decision making and monitoring,” said Simon Ellis, Program Vice President at IDC. “Google Cloud’s approach to a digital twin of the supply chain spans internal, external, and partner data networks without complex integrations. This approach can help organizations to better plan, monitor, collaborate and respond at scale.”
Customers are deploying Supply Chain Twin via Google Cloud partners
Retailers, manufacturers, CPG firms, healthcare networks, and other logistics-heavy companies can deploy Supply Chain Twin by working directly with Google Cloud’s partner ecosystem. For example, system integration partners such as Deloitte, Pluto7, and TCS, can help customers integrate the Supply Chain Twin and relevant datasets into their existing infrastructure.
In addition, data partners, such as Climate Engine, Craft, and Crux can augment Supply Chain Twin by providing geospatial, sustainability, and risk management data sets for a more complete macroenvironment view. Finally, application partners such as Anaplan, Automation Anywhere, and project44 can provide information from their platforms into Supply Chain Twin to help customers better understand product lifecycles, track shipments across carriers, predict ETAs, and more.
Supply Chain Twin and the Supply Chain Pulse module are globally available today in Preview. For pricing and availability, customers should talk to their Google Cloud sales representative. For more information on Supply Chain Twin, visit here.
Google Cloud accelerates organizations’ ability to digitally transform their business with the best infrastructure, platform, industry solutions and expertise. We deliver enterprise-grade solutions that leverage Google’s cutting-edge technology – all on the cleanest cloud in the industry. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.
Hewlett Packard Enterprise (HPE) held a Web event Sept. 28 to announce extensions and enhancements to its GreenLake edge-to-cloud platform. One commentator during the “deep dive” sessions opined that HPE is becoming a “data management company.” In other words, it is transitioning from a hardware company to a software and as-a-Service company. And the pace of the change during the past two years is picking up. Quite frankly, I’m surprised at the speed of the changes in the company over that brief period of time.
The announcements in summary:
- HPE unveils new cloud services for the HPE GreenLake edge-to-cloud platform
- The HPE GreenLake platform now has more than 1,200 customers and $5.2 billion in total contract value
- HPE takes cyberthreats and ransomware head-on with new cloud services to protect customers’ data from edge to cloud
- HPE pursues big data and analytics software market – forecasted by IDC to reach $110B by 2023 – with industry’s first cloud-native unified analytics and data lakehouse cloud services optimized for hybrid environments
Following is information from HPE’s press release.
HPE GreenLake edge-to-cloud platform combines control and agility so customers can accelerate innovation, deliver compelling experiences, and achieve superior business outcomes
Hewlett Packard Enterprise (NYSE: HPE) today announced a sweeping series of new cloud services for the HPE GreenLake edge-to-cloud platform, providing customers unmatched capabilities to power digital transformation for their applications and data. This represents HPE’s entry into two large, high-growth software markets – unified analytics and data protection. Together, these innovations further accelerate HPE’s transition to a cloud services company and give customers greater choice and freedom for their business and IT strategy, with an open and modern platform that provides a cloud experience everywhere. The new offerings, which add to a growing portfolio of HPE GreenLake cloud services, allow customers to innovate with agility, at lower costs, and include the following:
- HPE GreenLake for analytics – open and unified analytics cloud services to modernize all data and applications everywhere – on-premises, at the edge, and in the cloud
- HPE GreenLake for data protection – disaster recovery and backup cloud services to help customers take ransomware head-on and secure data from edge-to-cloud
- HPE Edge-to-Cloud Adoption Framework and automation tools – a comprehensive, proven set of methodologies, expertise, and automation tools to accelerate and de-risk the path to a cloud experience everywhere
“The big data and analytics software market, which IDC predicts will reach $110 billion by 2023, is ripe for disruption, as customers seek a hybrid solution for enterprise datasets on-premises and at the edge,” said Antonio Neri, president and CEO, at HPE. “Data is at the heart of every modernization initiative in every industry, and yet organizations have been forced to settle for legacy analytics platforms that lack cloud-native capabilities, or force complex migrations to the public cloud that require customers to adapt new processes and risk vendor lock-in. The new HPE GreenLake cloud services for analytics empower customers to overcome these trade-offs and give them one platform to unify and modernize data everywhere. Together with the new HPE GreenLake cloud services for data protection, HPE provides customers with an unparalleled platform to protect, secure, and capitalize on the full value of their data, from edge to cloud.”
HPE continues to accelerate momentum for the HPE GreenLake edge-to-cloud platform. The HPE GreenLake platform now has more than 1,200 customers and $5.2 billion in total contract value. In HPE’s most recent quarter, Q3 2021, HPE announced that the company’s Annualized Revenue Run Rate was up 33 percent year-over-year, and as-a-service orders up 46 percent year-over-year. Most recently, HPE announced HPE GreenLake platform wins with Woolworths Group, Australia and New Zealand’s largest retailer, and the United States National Security Agency.
HPE GreenLake Rolls Out Industry’s First Cloud-Native Unified Analytics and Data Lakehouse Cloud Services Optimized for Hybrid Environments
HPE GreenLake for analytics enables customers to accelerate modernization initiatives for all data, from edge to cloud. Available on the HPE GreenLake edge-to-cloud platform, the new cloud services are built to be cloud-native and avoid complex data migrations to the public cloud by providing an elastic, unified analytics platform for data and applications on-premises, at the edge, and in public clouds. Now analytics and data science teams can leverage the industry’s first cloud-native solution on-premises, scale up Apache Spark lakehouses, and speed up AI and ML workflows. The new HPE GreenLake cloud services include the following:
- HPE Ezmeral Unified Analytics: The industry’s first unified, modern analytics and data lakehouse platform, optimized for on-premises deployment and spanning edge to cloud.
- HPE Ezmeral Data Fabric Object Store: Industry’s first Kubernetes-native object store optimized for analytics performance, providing access to data sets edge to cloud.
- Expanding HPE Ezmeral Partner Ecosystem: The HPE Ezmeral Partner Program delivers a rapidly growing set of validated full-stack solutions from ISV partners that enable customers to build their analytics engines. This includes new support from NVIDIA, Pepperdata and Confluent, and open-source projects such as Apache Spark. HPE has added 37 ISV partners to the HPE Ezmeral Partner Program since it was first introduced in March 2021, delivering additional ecosystem stack support of core use cases and workloads for customers, including big data and AI/ML use cases.
HPE Takes Cyberthreats and Ransomware Head-On with New HPE GreenLake Cloud Services to Protect Customers’ Data from Edge to Cloud
HPE today entered the rapidly growing data protection-as-a-service market with HPE GreenLake for data protection, new cloud services designed to modernize data protection from edge to cloud, overcome ransomware attacks, and deliver rapid data recovery.
- HPE Backup and Recovery Service: A backup-as-a-service offering that provides policy-based orchestration and automation to back up and protect customers’ virtual machines across hybrid cloud, eliminating the complexities of managing backup hardware, software, and cloud infrastructure.
- HPE GreenLake for Disaster Recovery: Following the close of the Zerto acquisition, HPE plans to deliver Zerto’s industry-leading disaster recovery as a service through HPE GreenLake, to help customers recover in minutes from ransomware attacks. Zerto provides best-in-class restore times without impacting business operations for all recovery scenarios.
HPE Accelerates Adoption of Cloud-Everywhere Operating Models with Proven Framework and Data-Driven Intelligence and Automation Tools
HPE also today announced a proven set of methodologies and automation tools to enable organizations to take a data-driven approach to achieve the optimal cloud operating model across all environments:
- The HPE Edge-to-Cloud Adoption Framework leverages HPE’s expertise in delivering solutions on-premises, to meet a broad spectrum of business needs for customers across the globe. HPE has identified several critical areas that enterprises should evaluate and measure to execute an effective cloud operating model. These domains, which include Strategy and Governance, People, Operations, Innovation, Applications, DevOps, Data, and Security, form the core of the HPE Edge-to-Cloud Adoption Framework.
- The cloud operational experience is enhanced with the industry’s leading AIOps for infrastructure, HPE InfoSight, which now constantly observes applications and workloads running on the HPE GreenLake edge-to-cloud platform. The new capability, called HPE InfoSight App Insights, detects application anomalies, provides prescriptive recommendations, and keeps application workloads running disruption-free. HPE CloudPhysics delivers data-driven insights for smarter IT decisions from edge to cloud, enabling IT to optimize application workload placement, procure right-sized infrastructure services, and lower costs.
HPE GreenLake Announcement Event
Please visit the HPE Discover More Network to watch the HPE GreenLake announcement event, including the keynote from Antonio Neri, HPE president and CEO, live on September 28th at 8:00 am PT or anytime on-demand.
HPE GreenLake for analytics and HPE GreenLake for data protection will be available in 1H 2022.
The HPE Edge-to-Cloud Adoption Framework is available now.
HPE provides additional information about HPE product and services availability in the following blogs:
- HPE GreenLake for analytics
- HPE GreenLake for data protection
- HPE Edge-to-Cloud Adoption Framework
Hewlett Packard Enterprise (NYSE: HPE) is the global edge-to-cloud company that helps organizations accelerate outcomes by unlocking value from all of their data, everywhere. Built on decades of reimagining the future and innovating to advance the way people live and work, HPE delivers unique, open and intelligent technology solutions delivered as a service – spanning Compute, Storage, Software, Intelligent Edge, High Performance Computing and Mission Critical Solutions – with a consistent experience across all clouds and edges, designed to help customers develop new business models, engage in new ways, and increase operational performance.