HPE Shows Company’s Investment In People, Environment, and Doing Business the Right Way

This recap of Hewlett Packard Enterprise’s (HPE) annual Living Progress Report for 2019 wraps up my thoughts on, and coverage of, the many virtual conferences I experienced in June. The communications teams at all of these companies worked hard and had to experiment in real time to deliver the best alternative to shutting down completely.

These thoughts center on ethics—something that, given my experience in business, I thought I’d never be writing about. If there were two institutions within which I worked where ethics was merely a word in the dictionary, they were business and church.

Thankfully that situation is changing, and this report from HPE is encouraging. I’ve met many people within the company. I don’t think this is superficial marketing-speak.

The report demonstrates HPE’s ongoing commitment to being a force for good by equipping customers with sustainable technology solutions, upholding HPE’s own high Environmental, Social and Governance (ESG) standards across its value chain, and prioritizing company culture to fuel business outcomes by unlocking the innovation of team members.

“Our team members’ passion, ingenuity and resilience enable us to create technology solutions to tackle the many pressing challenges facing society today,” said HPE President and CEO Antonio Neri. “Current discussions around systemic racism, inclusion, and diversity demonstrate the importance of taking bold actions to create a more equitable and sustainable future. We are proud of our progress and committed to do more as a company and in partnership with our peers, customers, and partners.”

In 2019, HPE intensified its strategic focus on culture and launched the “Work That Fits Your Life” program to support a more inclusive workplace that values team members’ experiences both at work and outside of it. New benefits include six months of paid parental leave for mothers and fathers, career reskilling and transition support, and a company-wide shortened workday once a month on “Wellness Friday”.

From the release:

HPE is investing in human capital because it wants to be a place where people can learn, develop skills and do career-defining work. HPE’s Executive Committee developed the “Work That Fits Your Life” program in partnership with its Board of Directors, 54% of whom identify with one or more diverse groups. One of the key goals of the program was to drive inclusion in the workplace. And the company began tying diversity metrics to executive compensation to further embed inclusion and diversity into the organization.

In 2019, HPE’s employee engagement score rose 10 percent and has risen an unprecedented 18 percent since 2017, and its team members clocked their one millionth hour of company-supported volunteer time since 2016. In addition, HPE offered opportunities to engage with more social impact activities through the inaugural HPE Accelerating Impact initiative.

This year’s Living Progress Report also, for the first time, details racial diversity statistics of its team member population, an important step in being transparent and addressing systemic bias and inequality in our society.

HPE announced that it would make its entire portfolio available as-a-service by 2022, a consumption model that can bring significant energy efficiency gains and cost savings to its customers by eliminating overprovisioning and allowing customers to pay for only what they use. In addition, service-based models allow HPE to maintain chain of custody over equipment to ensure recovery and refurbishment, reducing physical waste and the need to source substances of concern. In 2019, 88% of the nearly four million assets returned to HPE’s Technology Renewal Centers were given a new life, but shifting to more consumption-based solutions is predicted to dramatically reduce the consumption of unnecessary IT assets.
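
The consumption-model argument above can be sketched in a few lines. Here is a minimal illustration, with invented monthly usage figures and a made-up unit cost (none of these numbers come from HPE), of why paying only for actual consumption can beat provisioning for peak demand:

```python
# Illustrative only: the usage figures and unit cost below are invented, not HPE numbers.
monthly_usage = [40, 55, 48, 90, 60, 52]   # capacity units actually consumed each month
unit_cost = 100.0                          # cost per capacity unit per month

# Traditional model: buy enough capacity for peak demand, then pay for it every month.
peak = max(monthly_usage)
overprovisioned = peak * unit_cost * len(monthly_usage)

# Consumption model: billed only on what was actually used.
pay_per_use = sum(monthly_usage) * unit_cost

savings_pct = 100 * (overprovisioned - pay_per_use) / overprovisioned
print(f"overprovisioned: {overprovisioned:.0f}  pay-per-use: {pay_per_use:.0f}  "
      f"savings: {savings_pct:.0f}%")
```

The gap between the two models is exactly the idle capacity bought to cover peaks, which is what the release means by “eliminating overprovisioning.”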

HPE remained on track to meet all of its 2025 climate targets, having reduced its carbon footprint by 47% in just four years. The company also introduced a new emissions reduction target for its transportation logistics footprint – aiming to reduce the footprint by 35% by 2025. It also continued to see opportunity to help customers thrive in a carbon constrained world – with efficient IT products and services representing nearly USD $7.7 billion in revenue in 2019.

HPE continued to hold suppliers to high environmental, social and ethical standards. In 2019, HPE’s supply chain audit and assurance improvement program touched over 133,000 workers and the company guided 51% of its suppliers on how to set their own science-based climate targets. In addition, HPE sought to promote inclusion and diversity through its supply chain by spending approximately USD $1 billion with small enterprises and minority, women and veteran-owned businesses in the United States.

Getting Your Software App Delivered as-a-Service

The news in brief: New HPE GreenLake cloud services deliver an agile, lower cost, and consistent cloud experience everywhere.

We’re living in an as-a-service and edge-to-cloud world (to paraphrase the Material Girl). When Antonio Neri assumed leadership of the storied (at least part of the storied) Hewlett Packard, he grasped both that reality and that HPE had most of the tools to get there. A couple of acquisitions, a bolstered executive leadership team, and now the unveiling. There remains more work on the financial end for him, but I think HPE is positioned for growth in this arena.

Last year at Discover, HPE pushed the GreenLake idea on us. This year, its capabilities and possibilities are greatly expanded. And for my industrial / production readers–this applies as much to you as to Enterprise IT. It’s getting blurry at the Edge, apps like MES are moving to the cloud (actually, probably all have moved there), and the roles of Enterprise IT and Manufacturing IT are also blurring at the edges.

It’s a new world–and I don’t mean just post-Covid.

Following is from the release:

Hewlett Packard Enterprise today announced significant advancements to the company’s edge-to-cloud platform-as-a-service strategy, through next-generation cloud services and an accelerated delivery experience for HPE GreenLake. The new HPE GreenLake cloud services, which span container management, machine learning operations, VMs, storage, compute, data protection, and networking, help customers transform and modernize their applications and data – the majority of which live on premises, in colocation facilities, and increasingly at the edge.

“Now more than ever, given current market conditions, organizations have an urgent need to connect and leverage all of their applications and data in order to transform their businesses, support their employees, and serve their customers,” said Antonio Neri, President and CEO, Hewlett Packard Enterprise. “As we enter the next phase of the cloud market, customers require an approach that enables them to innovate and modernize all of their applications and workloads, including those at the edge and on premises. By delivering a consistent cloud experience everywhere through HPE GreenLake cloud services, and software designed to accelerate transformation, HPE is uniquely positioned to help customers harness the full power of their information, wherever it resides.”

Today, organizations are at a crossroads in their digital transformation efforts. According to IDC, despite the growth and adoption of public clouds, 70 percent of applications remain outside of the public cloud. Due to several factors, including application entanglement, data gravity, security and compliance, and unpredictable costs, organizations have struggled to move the majority of the applications that run their businesses to public clouds. Forced to support two operating models, organizations face additional costs, complexity and inefficiency, limited agility and innovation, and the inability to capitalize on information everywhere.

HPE delivers a unique approach to solving this dilemma by providing HPE GreenLake cloud services to customers in the environment of their choice – from edge to cloud – with a consistent operating model and with visibility and governance across all enterprise applications and data.

HPE GreenLake cloud services also provide customers with a superior economic model. Unlike public cloud vendors, which charge customers to get data back on premises, HPE charges no data egress fees. HPE GreenLake’s flexible as-a-service model and robust cost and compliance analytics tools allow customers to preserve cash flow, control spend, and prioritize investments that are aligned to business priorities.

“HPE GreenLake gives us 100% uptime, and the predictable pricing model is already helping us cut costs,” said Ed Hildreth, Manager of IT Distributed Systems, Mohawk Valley Health System. “Thanks to the cloud-like experience, when we needed to quickly activate additional features and resources in response to the COVID-19 pandemic, we were able to easily roll this out with no time delay. We are extremely pleased with HPE GreenLake and plan to leverage this model once again for new hospitals within our health system.”

Introducing New HPE GreenLake Cloud Services for Distributed Environments

HPE now offers cloud services for containers, machine learning operations, virtual machines, storage, compute, data protection and networking. All cloud services are accessible via a self-service point-and-click catalogue on HPE GreenLake Central, a platform where customers can learn about, price, and request a trial on each cloud service; spin up instances and clusters in a few clicks; and manage their multi-cloud estate from one place. They can all be deployed and run in the customers’ environment.

Based on pre-integrated building blocks, the new HPE GreenLake cloud services are now available in small, medium, and large configurations, delivered to customers from order to run in as few as 14 days. Partners and customers benefit from pre-configured reference architectures and pricing to speed time to consuming cloud services.
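
As a rough sketch of how a catalog of pre-integrated building blocks in small/medium/large sizes might be modeled, consider the following. The service names, sizes, and node counts here are all invented for illustration; this is not HPE’s actual catalog or API:

```python
# Hypothetical sketch of a self-service catalog built from pre-integrated building
# blocks in small/medium/large configurations. All names and numbers are invented.
from dataclasses import dataclass

@dataclass(frozen=True)
class ServiceConfig:
    service: str   # e.g. "containers", "ml-ops", "virtual-machines"
    size: str      # "small" | "medium" | "large"
    nodes: int     # pre-sized capacity for this configuration

CATALOG = {
    ("containers", "small"):  ServiceConfig("containers", "small", 4),
    ("containers", "medium"): ServiceConfig("containers", "medium", 8),
    ("ml-ops", "small"):      ServiceConfig("ml-ops", "small", 6),
}

def request_service(service: str, size: str) -> ServiceConfig:
    """Look up a pre-configured option instead of designing a deployment from scratch."""
    try:
        return CATALOG[(service, size)]
    except KeyError:
        raise KeyError(f"no pre-configured option for {service}/{size}") from None

order = request_service("containers", "medium")
print(order)
```

The point of the pattern is that every orderable combination is pre-validated, which is what makes an order-to-run time measured in days rather than months plausible.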

HPE GreenLake is one of the fastest-growing businesses in HPE, with over USD $4 billion in total contract value.

  • Cloud services for Containers – These new HPE GreenLake cloud services, powered by HPE Ezmeral Container Platform, provide the flexibility to run containerized applications in data centers, colocation facilities, multiple public clouds, and at the edge.
  • Cloud services for Machine Learning Operations – Through HPE GreenLake, customers can subscribe to a workload-specific solution built on the HPE Ezmeral Container Platform and HPE Ezmeral ML Ops for the entire ML lifecycle.
  • Cloud services for Virtual Machines, Storage, and Compute – For customers who want a private cloud experience, HPE is launching HPE GreenLake cloud services for virtual machines, storage and compute. With provisioning of instances in five clicks, these easy-to-deploy services also provide visibility into usage and spend, and active capacity planning with powerful consumption analytics in the HPE GreenLake Central management platform.
  • Cloud services for Data Protection – For customers looking to modernize data protection, HPE is making data backup and recovery effortless and automated for every SLA – from rapid recovery to long-term retention. These new cloud services through HPE GreenLake include secure and efficient on-premises backup and an enterprise cloud backup service, HPE Cloud Volumes Backup, which enables backup and recovery to/from the cloud without egress costs or lock-in, and with the agility to activate data for recovery, test/dev, and analytics.
  • Cloud services for the Intelligent Edge – Today, more than ever, customers are looking to reduce CapEx to simplify their budget process and better predict and manage network operational costs. Aruba’s new Managed Connectivity Services, now available as cloud services through HPE GreenLake, provide the industry’s first complete Network as a Service offering, and bring cloud agility to the edge with the recently introduced Aruba ESP (Edge Services Platform).

CII and MIMOSA Join Forces to move Interoperability Forward for Capital Projects

CII and MIMOSA sign Memorandum of Understanding (MOU) to use the Open Industrial Interoperability Ecosystem (OIIE) as the interoperability framework for CII best practices.

I’m a believer, based upon long experience, that standards and interoperability drive industries (and society) forward. Just look, for example, at standard-gauge railroad tracks, standard shipping containers, or Internet protocols. I should also note that I worked on the development of the OIIE several years ago, though the work reached a point where I could not contribute for a while. As I wrote recently, things are coming together in this effort for interoperable data flow from engineering design through construction to operations & maintenance throughout the lifecycle of a large capital project.

Here is the latest, and very important, news.

CII (the Construction Industry Institute) and MIMOSA announce their collaboration to adopt and advance the standards for an open, vendor-neutral digital ecosystem supporting data and systems interoperability in capital projects, operations, and maintenance, enabling digital transformation of the full asset lifecycle. The MOU establishes the basis for a CII/MIMOSA Joint Working Group to develop best practices for standards-based interoperability in capital projects, leveraging the organizations’ combined strengths.

It will develop formal OIIE Use Cases for capital projects based on Industry Functional Requirements developed by CII, starting with those associated with Advanced Work Packaging (AWP). These OIIE Use Cases will be validated in the OIIE Oil and Gas Interoperability (OGI) Pilot before they are published and licensed for use on a worldwide, royalty-free basis. Once the jointly developed OIIE Use Cases are validated in the pilot, CII and MIMOSA intend to submit them to ISO TC 184/WG 6 for inclusion in future parts of ISO 18101.

The OIIE is an outgrowth of collaboration between multiple industry-level Standards Developing Organizations, where MIMOSA plays a key leadership role and has led the workstreams for digitalization and interoperability in support of asset life-cycle management. The OIIE OGI Pilot includes standard use cases for asset intensive industries, currently featuring an example oil and gas industry process unit.

Active collaboration has begun with the sharing of the existing OIIE Use Case Architecture and the asset lifecycle management OIIE Use Cases previously developed by MIMOSA and validated in the OIIE OGI Pilot. CII has shared the AWP data requirements currently under development.

Next steps will begin to incorporate CII AWP best practices into applicable OIIE Use Cases for capital projects, including jointly enhancing existing use cases and jointly developing new ones. CII and MIMOSA encourage interested organizations to join and participate in each association to fully support this important industry-led effort.

Organizations that participate have the potential to benefit in many ways including:

  • System of Systems interoperability results in less reliance on expensive, fragile, custom integration between systems, reducing IT costs while increasing agility and sustainability.
  • Education and training to a common set of industry practices and standards provides a more flexible and efficient digital-economy workforce, benefitting industry and workers alike with reduced loss of knowledge and expertise.
  • Investment in future-proofed, vendor-neutral, interoperable data enables industry to create, capture, manage, and reuse digital information as a strategic asset throughout the entire physical asset lifecycle, deriving significantly more business value from capital projects.
  • Owners have identified the opportunity to cut CAPEX spend by 15–20% through better information sharing, with improved schedules and productivity due to far less time wasted looking for information and much more time on tools.

CII, based at The University of Texas at Austin, is a consortium of more than 140 leading owner, engineering-contractor, and supplier firms from both the public and private arenas. These organizations have joined together to enhance the business effectiveness and sustainability of the capital facility life cycle through CII research, related initiatives, and industry alliances.

MIMOSA is a 501(c)(6) not-for-profit industry trade association dedicated to developing and encouraging the adoption of open, supplier-neutral IT and IM standards enabling physical asset lifecycle management spanning manufacturing, fleet and facilities environments. MIMOSA standards and collaboratively developed specifications enable Digital Twins to be defined and maintained on a supplier-neutral basis, while also using Digital Twins to provide context for Big Data (IIoT and other sensor-related data) and analytics.

HPE Edge Orchestrator Delivers Low-Latency Cloud Services At the Edge

• Enables telcos to monetize 5G networks and edge infrastructure by delivering new low latency cloud services at the edge via an app catalog

Remember two things if you fit the typical profile of a reader of this blog. First, it’s all happening at the edge, and production/manufacturing is the edge. Second, data is the gold we’re mining; the hype of the cloud is over, and the real work now is gathering, analyzing, orchestrating, and sending the data collected at the edge to some sort of cloud.

I’ve been watching developments in 5G (and, to a lesser extent, Wi-Fi 6, which complements it) for some time. It’s hard to separate the hype from reality—something that always happens at the early stage. Marketers can’t stop themselves from hyping their companies as ahead of the curve, while engineers have been quietly pushing the curve.

So I was able to watch some sessions at the Hewlett Packard Enterprise (HPE) Discover Virtual Experience on this very topic. HPE’s Aruba is a leading supplier of communication products. Following is the lead announcement from last week.

HPE Edge Orchestrator, a SaaS-based offering, enables telcos to deploy innovative new edge computing services to customers via IT infrastructure located at the edge of telco networks or on customer premises. With the HPE Edge Orchestrator solution, telcos can extend their offerings to include a catalog of edge computing applications which customers can deploy with a single click, across hundreds of locations. HPE Edge Orchestrator enables telcos to monetize the 5G network and telco cloud while bringing lower latency, increased security and enhanced end-user experiences to their customers.

Analysts expect the next decade to see the rise of edge computing, where data-intensive workloads such as AI, machine learning (ML), and augmented and virtual reality apps will be hosted at the edge. Telcos already have thousands of edge sites powering mobile and fixed networks, so they are uniquely positioned to lead the edge services market. In fact, according to a recent IDC study, 40% of enterprises trust their telco to be their main provider of edge solutions. However, until now telcos haven’t had the tools to do this themselves without relying on public cloud providers.

HPE Edge Orchestrator gives the power back to telcos. Now they can offer value-added edge services in their own right and can move from being primarily bandwidth providers to offering innovative edge computing applications, such as AI-powered video analytics, industrial automation and VR retail services. New revenue from these high-value enterprise services will also help to cover the significant cost of deploying new 5G infrastructure.

• Telcos can move from being primarily bandwidth providers to offering innovative edge computing applications

HPE Edge Orchestrator enables the deployment and configuration of customer applications, provided as virtual machines or containers, at geographically distributed edge locations owned by telcos, such as existing central offices, or on customer premises. Customers can access edge applications via a self-service app catalog for simple management and monitoring, with one-click deployment of an app to an edge device.
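
The one-click, many-sites model described above is essentially a fan-out: one catalog entry pushed to every selected location. The sketch below is entirely hypothetical; the site names, image reference, and "schedule" step stand in for whatever a real orchestrator would do against each site's container or VM runtime:

```python
# Hypothetical fan-out of one catalog app to many edge sites. Site names, the image
# reference, and the scheduling step are invented placeholders, not HPE interfaces.
def deploy_to_edges(app_name: str, image: str, sites: list[str]) -> dict[str, str]:
    """Push a single application definition to every selected edge location."""
    results = {}
    for site in sites:
        # Placeholder for scheduling a container/VM at the remote site.
        results[site] = f"{app_name} ({image}) scheduled"
    return results

sites = ["central-office-1", "central-office-2", "customer-premises-7"]
status = deploy_to_edges("video-analytics", "registry.example/video:1.2", sites)
for site, state in status.items():
    print(site, "->", state)
```

The value of centralizing this is that the application definition lives in one place while the runtime instances are scattered across hundreds of sites.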

HPE Edge Orchestrator enables enterprises to easily combine their applications with network services offered by telcos, thus creating an end-to-end flow across the edge. Today, HPE Edge Orchestrator supports Multi-access Edge Computing (MEC) with other network-as-a-service (NaaS) functions being added to the catalog over time. The MEC platform enables applications to run at the edge, while delivering network services that ensure a dynamic routing of edge traffic in 4G, 5G, and Wi-Fi environments.

To capitalize on the edge services opportunity, telcos need to bring applications from the cloud out to the edge where the data exists. With HPE Edge Orchestrator, along with HPE Edgeline and ProLiant servers, telcos can position application intelligence at the edge and unlock major business benefits for their customers:

  • Lower latency: When applications can process requests locally instead of routing them to a data center, they can deliver much better performance. This translates to a better user experience for any business application. For the new generation of ultra-low-latency use cases like augmented reality and industrial automation, short round-trip times are absolutely essential.
  • Bandwidth optimization: Positioning application intelligence out at the edge, such as doing number-crunching closer to where the numbers are generated, greatly reduces the wide-area network (WAN) bandwidth the application requires. This translates to lower WAN costs for businesses and less traffic congestion in telco core and metro networks. Applications like video analytics become much more efficient and, as a result, applicable to use cases that might not have been viable in the past.
  • Improved security and privacy: Any time businesses transmit data over a network, they’re potentially exposing it to security threats. For the most sensitive information, some businesses want to keep everything onsite. In regions with strict privacy protections like the European Union, some applications may simply not be viable unless they can process all personally identifiable information (PII) locally.
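
The bandwidth-optimization point in the list above lends itself to a back-of-the-envelope check: compare streaming raw camera video over the WAN against analyzing at the edge and sending only event metadata. The bitrates below are my own assumptions for illustration, not measured or vendor-supplied figures:

```python
# Back-of-the-envelope check on the bandwidth argument. The camera count and
# bitrates are assumed values chosen for illustration only.
cameras = 50
raw_video_mbps_per_camera = 4.0    # assumed HD video stream per camera
metadata_kbps_per_camera = 10.0    # assumed detection-event metadata per camera

wan_without_edge = cameras * raw_video_mbps_per_camera        # Mbps over the WAN
wan_with_edge = cameras * metadata_kbps_per_camera / 1000.0   # Mbps over the WAN

reduction = 1 - wan_with_edge / wan_without_edge
print(f"WAN load: {wan_without_edge:.0f} Mbps -> {wan_with_edge:.1f} Mbps "
      f"({reduction:.1%} reduction)")
```

Even with modest assumptions, moving the number-crunching next to the cameras cuts WAN load by orders of magnitude, which is why video analytics becomes viable for sites with thin uplinks.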

New edge computing offerings start with compute platforms optimized for deployment at remote operator sites (central offices, radio towers, other point of presence (POP) locations), or even directly at the customer premises. For example, HPE Edgeline Converged Edge Servers, such as the EL4000 and EL8000, have been specifically designed to run at the edge. Platforms like these host all of the components needed to manage the edge computing workloads in containers or VMs.

HPE Edge Orchestrator provides a centralized, comprehensive, hardware agnostic orchestration platform to provision, configure, and perform general management functions for all components of edge computing. HPE Edge Orchestrator is also multi-tenant by design.

Telcos can give diverse enterprise customers their own “private” interfaces to manage their workloads, sites, edge devices, and services, while their own teams manage the entire communications service provider (CSP) edge computing portfolio as a single system. HPE Edge Orchestrator can also work in conjunction with the recently announced Aruba Edge Services Platform (ESP), enabling enterprises to easily integrate both Wi-Fi-based and telco services.

HPE Emphasizing Software As Part of Pivot to As-a-Service Vision

HPE Discover Virtual Experience wrapped up last week, but I have much to think about and report. The HPE team did an excellent job pulling together a conference where we saw many different living rooms and home offices. Tough job; well done.

The release of a new software portfolio from HPE may sound of interest mainly to enterprise architects, but I have already seen demos where it also aids the convergence of OT and IT, making the production side of an enterprise more valuable to the business as a whole. This matters as a counter to recent enterprise history, in which production was a “black box” that corporate financial geniuses viewed as something that could be moved around chasing low cost.

From the blog of Kumar Sreekanti, CTO and head of software at HPE, we learn about Ezmeral, the brand name for the newly unified software portfolio.

Digital transformation is being amplified by an order of magnitude. In fact, many business leaders that I’ve spoken with are now embracing a digital-first strategy—to compete and thrive in the midst of a global pandemic. And the enterprises that use data and artificial intelligence effectively are better equipped to evolve rapidly in this dynamic environment. Now these data-driven transformation initiatives are being accelerated to enable faster time-to-market, increased innovation, and greater responsiveness to the business and their customers.

As CTO and head of software at HPE, my focus is on delivering against our edge-to-cloud strategy and vision of providing everything as a service. Software is a very critical and important component of this strategy. It’s also essential to helping our customers succeed in their data-driven digital transformation journeys, now more than ever.

We’re committed to providing a differentiated portfolio of enterprise software to help modernize your applications, unlock insights from your data, and automate your operations—from edge to cloud. Today, we announced that we’ve unified our software portfolio with a new brand: HPE Ezmeral.

The HPE Ezmeral portfolio allows you to:

  • Run containers and Kubernetes at scale to modernize apps, from edge to cloud
  • Manage your apps, data, and ops – leveraging AI and analytics for faster time-to-insights
  • Ensure control for governance, compliance, and lower costs
  • Provide enterprise-grade security and authentication to reduce risk

Business innovation relies on applications and data. The apps and data running the enterprise now live everywhere—in data centers, in colocation centers, at the edge, and in the cloud. Most of the applications running businesses today are non-cloud-native, and data is everywhere, with more and more of it being generated at the edge. Our customers are having real issues with non-cloud-native systems that will not or cannot move to the public cloud due to data gravity, latency, application dependency, and regulatory compliance reasons. Data has gravity, so our customers want to bring compute to the data, not data to the compute. And because data is exploding, it’s driving the need for AI and machine learning at enterprise scale—with the ability to harness and leverage petabytes of data.

Our customers want flexibility and openness; they want to eliminate lock-in. They want pay-per-use consumption in an as-a-service model. They want open solutions that give them the best of both worlds—with a modern cloud experience in any location, from edge to cloud. We address these needs by providing HPE GreenLake in the environment of your choice, with a consistent operating model, and with visibility and governance across all enterprise applications and data. Our software provides differentiated IP to deliver these cloud services through HPE GreenLake.

And in today’s news, we announced new cloud services from HPE GreenLake. This includes new HPE GreenLake cloud services for containers and machine learning operations—powered by our HPE Ezmeral Container Platform software to run containerized applications with open source Kubernetes, and HPE Ezmeral ML Ops software to operationalize the machine learning model lifecycle.
