HPE Announces GreenLake Modern Private Cloud and New Cloud Services

Hewlett Packard Enterprise (HPE) announced advances to HPE GreenLake, its flagship as-a-service platform, at its Discover 2022 user conference. Here is a summary of the updates:

  • A unified experience across edge to cloud
  • Deeper security
  • Extended developer tools
  • Stronger capabilities to run workloads at scale
  • A transformed, modern private cloud experience: automated, flexible, scalable, pay-as-you-go private cloud for both traditional and cloud-native workloads
  • Eight new cloud services, including backup and recovery, block storage, compute operations management, data fabric, disaster recovery, and hyperconverged infrastructure, as well as industry-vertical cloud services for customer engagement and payments

“Three years ago, at HPE Discover, HPE committed to delivering our entire portfolio as a service by 2022,” said Antonio Neri, president and CEO, HPE. “Today, I am proud to say that not only have we delivered on that commitment, we have become a new company. HPE GreenLake has emerged as the go-to destination for hybrid cloud, and our industry-leading catalog of cloud services enables organizations to drive data-first modernization for all their workloads, across edge to cloud. The innovations unveiled today further build on our vision to provide the market with an unmatched platform to spur innovation and drive transformation.”

You have to give Neri credit. I was in the crowd three years ago when he made this audacious commitment to turn the entire company in a new direction. Not only has that goal been accomplished, but customers have embraced it. The reported results have been outstanding.

In Q2 2022, HPE reported an Annualized Revenue Run-Rate (ARR) of $829 million and triple-digit as-a-service order growth for the third consecutive quarter.

Mitigate Supply Chain Disruption

I am moderating a webcast discussion on July 27 at 11 am EDT among three experts sharing ideas about using technologies we have today to help mitigate supply chain disruption. We will discuss supply chain control towers, the digital twin (yes, the digital twin concept expands to include the entire supply chain), and research on optimization through simulation.

The discussion is sponsored by Hitachi Vantara, and the experts are employed there. Even so, it’s a thoroughly non-commercial presentation, brought to you by IIoT World.

Some of the discussion involves IoT and the need for sensors to provide data, as well as developing a digital twin and using it for simulation as an aid to executive supply chain decision making. Check it out. It will be recorded and also provided as a LinkedIn Live broadcast.
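None of the webcast’s models are public, but the core idea of a supply chain digital twin used for simulation can be sketched in a few lines. The toy model below (all parameters are invented for illustration) simulates uncertain daily demand against a single stock point with a reorder policy and counts stockout days, the kind of what-if question simulation helps executives answer.

```python
import random

def simulate_inventory(days=90, start_stock=500, reorder_point=200,
                       order_qty=400, lead_time=5, seed=42):
    """Toy digital-twin sketch: simulate daily demand against one stock
    point under a reorder policy and count stockout days.
    All parameters are invented for illustration."""
    rng = random.Random(seed)
    stock, stockout_days = start_stock, 0
    pipeline = []  # outstanding orders as (arrival_day, quantity)
    for day in range(days):
        # Receive any replenishment orders arriving today
        stock += sum(qty for d, qty in pipeline if d == day)
        pipeline = [(d, qty) for d, qty in pipeline if d != day]
        demand = rng.randint(20, 80)  # uncertain daily demand
        if demand > stock:
            stockout_days += 1
            stock = 0
        else:
            stock -= demand
        # Reorder when at or below the reorder point and nothing is on order
        if stock <= reorder_point and not pipeline:
            pipeline.append((day + lead_time, order_qty))
    return stockout_days

print(f"stockout days over the horizon: {simulate_inventory()}")
```

A real twin would replace the random demand with sensor and ERP feeds and model many echelons, but the decision loop (simulate, observe, adjust the policy) is the same.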

Responsible Computing Consortium

We thought of Artificial Intelligence as something magical. Then we figured out that much of the output of AI depends upon how the application is trained. Then, duh, we discovered bias underneath the AI training. Perhaps concepts we have learned since ancient times such as ethics, morals, and responsibility are important. My adolescent self hates hearing me say things like that. But, it’s true.

Sometimes I despair at the general lack of responsibility for words and actions that I too often observe.

Mining that same vein, the Object Management Group announced on Friday, May 20, a new consortium called Responsible Computing (a trademark of IBM, by the way). The founding members are IBM and Dell, and the group’s purpose is to focus on sustainable development goals.

From the news release, “Responsible computing is a systemic approach aimed at addressing current and future challenges in computing, including sustainability, ethics, and professionalism, stemming from the belief that we need to start thinking about technology in terms of its impact on people and the planet.”

“Responsible Computing aims to shift thinking and, ultimately, behavior within the IT industry and affect real change,” said Bill Hoffman, Chairman and CEO of RC and OMG. “We’ve made our manifesto and framework freely available, and we’ve asked every RC member to implement RC principles. Our goal is that someday every IT professional will adhere to RC principles.”

The new consortium’s manifesto defines RC values to restore trust in IT by responsibly applying technology and by sharing experiences with other organizations. These values include sustainability, inclusiveness, circularity, openness, authenticity, and accountability.

The consortium’s RC framework focuses on six domains of responsible computing:

  • Data centers – designed and operated with a focus on efficiency and sustainability, including emphasizing green energy and improving the handling and disposal of chemicals, toxic materials, and rare metals.
  • More sustainable infrastructure – monitoring the energy usage of products and technologies, and running efficient, more sustainable operations, including proper disposal of products.
  • Code – choosing code that optimizes environmental, social, and economic impact over time. Optimal code includes efficient algorithms, frameworks, and tools, plus KPIs to accelerate decision-making and pinpoint areas requiring more scrutiny during software development.
  • Data usage – using data safely to drive transparency, fairness, privacy, and respect for users.
  • Systems – addressing bias and discrimination by driving equality for all, for example through transparency in the use of Artificial Intelligence (AI).
  • Impact – technologies and innovations that drive a positive impact on society at large, such as building to improve human conditions and mitigate social risk.
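The “code” domain above is more concrete than it may sound. As a minimal illustration (not part of the RC framework itself), the sketch below times two implementations of the same duplicate check; preferring the linear version over the quadratic one is exactly the kind of efficiency decision a development KPI could surface.

```python
import timeit

def contains_dup_quadratic(xs):
    # O(n^2): compare every pair of elements
    return any(x == y for i, x in enumerate(xs) for y in xs[i + 1:])

def contains_dup_linear(xs):
    # O(n): track what we have already seen in a set
    seen = set()
    for x in xs:
        if x in seen:
            return True
        seen.add(x)
    return False

data = list(range(2000))  # worst case: no duplicates at all
slow = timeit.timeit(lambda: contains_dup_quadratic(data), number=3)
fast = timeit.timeit(lambda: contains_dup_linear(data), number=3)
print(f"quadratic: {slow:.4f}s  linear: {fast:.4f}s")
```

The same answer, a fraction of the CPU cycles; at data-center scale that difference is energy and carbon.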

In interviews with over 100 CTOs, concerns were raised about developing practical actions to advance Environmental, Social, and Governance (ESG) programs. The CTOs wanted to contribute to becoming more sustainable businesses and to demonstrate progress through consistent metrics. In November 2020, IBM’s Academy of Technology (AoT) responded to these challenges and created the Responsible Computing Council, an international team of technology and computing leaders who collaborate on validating and implementing the RC framework and who lead by example in becoming responsible computing providers. Object Management Group (OMG) was an early member of the council, and shortly after that, the OMG board approved the formation of the RC consortium.

“Now is the time for companies to adopt a holistic approach that places sustainability strategy at the center of their business,” said Sheri Hinish, Global Lead, IBM Consulting Sustainability Services. “IBM is proud to be a founding member of the RC consortium. Through this collaboration, we hope to help companies establish new and innovative ways to transform their business operations through ethical, impactful ways that can help contribute to a more sustainable future.”

“Dell is proud to be a founding member of the RC consortium. We are aligned with and driven by a similar passion to help leading technology organizations realize their sustainable development goals, in line with the planet’s,” said Marc O’Regan, CTO EMEA, Dell Technologies. “In addition to being socially and environmentally responsible, we also expect that RC members will see improved go-to-market solutions, strategies and bottom-line results by following RC principles.”

An organization can become more operationally efficient and demonstrate a return on investment (ROI) when meeting sustainability goals. The ROI can potentially include:

  • Reduced power consumption
  • Waste reduction in packaging
  • Cost-effective heating and cooling solutions
  • Supply chain efficiency, and more

Unstructured Data Management

Welcome to the age of the Industrial Internet of Things and the need for unstructured data management. Analysts talked for years about the amount of data generated by manufacturing (or that could be generated, if processes were digital). We have arrived at that age. Datadobi is one company I talk with in the unstructured data management space. Here is news of an addition to its product portfolio: a product called StorageMAP.

TL;DR: New software enables enterprises to visualize, organize, and take action on their entire unstructured data landscape through a single pane of glass.

Datadobi, a leader in unstructured data management, announced the launch of StorageMAP, a solution that provides a single pane of glass for organizations to manage unstructured data across their complete data storage estate. Built upon Datadobi’s vendor-neutral unstructured data mobility engine, the software enables enterprises to visualize, organize, and act on their data in hybrid vendor and cloud environments. StorageMAP puts companies in control of their data’s cost, carbon footprint, risk, and value.

StorageMAP is the evolution from Datadobi’s field-hardened data migration and protection solutions to a fully-fledged data management platform.

Unstructured data is expected to reach 144 trillion gigabytes by 2025, according to IDC. The sheer magnitude of unstructured data, combined with the increasing complexity of today’s heterogeneous storage environments, has led 95% of organizations to cite the need to manage unstructured data as a problem for their business.

The four critical concerns that StorageMAP addresses for enterprises are:

  • Cost Control – StorageMAP helps organizations cut costs by enabling intelligent cloud adoption, moving data to less costly on-premises storage, accelerating the decommissioning of inefficient storage, and deleting Redundant, Obsolete, and Trivial (ROT) data.
  • Conformance to Environmental, Social, and Governance (ESG) Policies – StorageMAP helps organizations meet CO2 reduction targets through carbon accounting of unstructured data storage in the cloud and in the data center. Cloud adoption (which providers claim can reduce carbon by up to 80% compared with on-premises storage), data center optimization, and reduction of ROT data all contribute to a lower carbon footprint.
  • Risk Reduction – StorageMAP bolsters unstructured data protection by allowing IT leaders to understand what data they have, why they have it, where it is, and who owns it. The software then enables companies to take action to properly back up their unstructured data and delete ROT, disowned, and dangerous data.
  • Getting More Value from Data – StorageMAP allows companies to move their data to the right place, at the right time, all the time. For example, data can be quickly copied to the cloud where it can be analyzed by cloud-native apps. StorageMAP enables companies to exploit their data’s hidden value, transforming it from a liability into an asset.

StorageMAP is sold on a “pay-as-you-grow” model that allows customers to first understand their unstructured data environment before committing to any actions the business requires. IT leaders, business unit managers, and the C-suite can see reports optimized to their needs that help them make clear decisions according to their business priorities. Once they have that visibility, they can organize the data according to multiple criteria, such as ownership of the data, the role of the data, where the data belongs, the risk profile of the data, and the type of action to take on it. Only then do customers need to commit to purchasing the Action Add-ons required to meet their immediate needs.

Supply Chain Cyber Security

I’ve noted that cyber security news has been inundating my inbox. Media relations people have also identified me as a supply chain writer/analyst. It’s one of those indications of the broadening of the market I serve. This news concerns the first product of a new company: Chainguard.

We’re announcing our first product, Chainguard Enforce, a software supply chain security solution native to Kubernetes workloads. Chainguard Enforce enables you to define, observe, distribute, and enact policies that ensure only trusted container images are deployed and run in your clusters. The goals of Chainguard Enforce are to deliver a seamless developer experience with security built in, and a platform for CISOs to manage organization-wide security controls.

After speaking with over 50 organizations about their software supply chain challenges, it was clear security leaders share a similar concern: it’s impossible to be confident about the code running in production environments. There are limited options for production supply chain security policy management today, yet emerging frameworks like SLSA and NIST’s SSDF require it. 

“Insider risks are top of mind for us. The capabilities Chainguard Enforce provides are filling critical gaps across our organization,” said Jim Higgins, CISO at Block.

Component Breakdown

Chainguard Enforce consists of four main components as well as a developer-friendly CLI and UI: a Policy Agent, Build System Integrations, Continuous Verification, and an Evidence Lake. 

The read-only Policy Agent provides support for per-cluster policy and webhook configurations that can all be centrally managed and administered across multi-cluster environments. The Agent integrates with many Kubernetes platforms like EKS, AKS, and GKE today. It comes with a curated set of policy definitions based on the open-source SLSA and NIST SSDF standards, and also supports a full policy language for defining custom policies.

Chainguard Enforce includes Build System Integrations for most popular CI platforms like GitHub Actions, CircleCI, BuildKite, and GitLab to establish a record of what source code was used to build each container. In most cases, it takes less than a day for DevOps teams to install and configure these build system integrations.

Continuous Verification ensures that deployed container images stay in compliance with your defined policies and any deviations will trigger an alert.
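Chainguard has not published the mechanics of Continuous Verification, but the core idea can be illustrated with a hypothetical sketch: compare the digests of currently deployed images against a trusted allowlist and alert on any drift. All image names and digests below are invented.

```python
# Hypothetical sketch of a continuous-verification check: flag deployed
# images whose digest is not on the trusted allowlist.
# Image names and digests are invented for illustration.
TRUSTED = {
    "registry.example.com/api": "sha256:1111aaaa",
    "registry.example.com/web": "sha256:2222bbbb",
}

def verify_deployed(deployed):
    """Return (image, digest) pairs that violate the allowlist policy."""
    violations = []
    for image, digest in deployed.items():
        if TRUSTED.get(image) != digest:
            violations.append((image, digest))
    return violations

running = {
    "registry.example.com/api": "sha256:1111aaaa",  # compliant
    "registry.example.com/web": "sha256:deadbeef",  # drifted -> alert
}
for image, digest in verify_deployed(running):
    print(f"ALERT: {image} is running untrusted digest {digest}")
```

The real product evaluates richer policies (signatures, SLSA provenance) rather than a static dictionary, and it runs the check continuously against live clusters rather than a snapshot.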

Last but not least, the Evidence Lake is a real-time asset inventory that provides visibility into the security posture across an organization. The data can be used to power developer tooling, incident recovery, debugging, and audit automation. There are also integrations available for popular alerting and ticketing platforms such as Slack and Jira.

Red Canary Annual Cybersecurity Threat Detection Report

Cybersecurity is such a big topic right now. For the past few months, many companies founded to take on cyber threats have sent news of many kinds. Most seem to be doing studies and issuing annual reports. This is news of research and a report from a managed detection and response provider called Red Canary.

Red Canary’s fourth annual Threat Detection Report is based on analysis of more than 30,000 confirmed threats detected across customers’ environments in the past year, uncovering the trends, threats, and techniques of the 2021 threat landscape.

The findings reveal that ransomware dominated the threat landscape in 2021, with groups adopting new techniques such as double extortion and “as-a-service” models to evade detection and maximize their earnings. The report explores the top 10 threats impacting the majority of Red Canary customers – from adversary favorites like Cobalt Strike to new activity clusters like Rose Flamingo – and the most common techniques that adversaries use to carry out these attacks, including guidance for companies to strengthen their ability to detect these threats.

“These threats are less sensational than you might find elsewhere, but they’re the ones that will impact the majority of organizations,” said Keith McCammon at Red Canary. “This report addresses highly prevalent threats and the tried-and-true techniques that are wreaking havoc on organizations. We take it a step further to explore in depth the adversarial techniques that continue to evade preventative controls, and that can be challenging to detect. We hope that this report serves as a valuable tool for everyone from executives to practitioners, providing the information that’s needed to detect and respond to cybersecurity threats before they negatively impact organizations.”

Red Canary found that adversaries have continued to carry out attacks using legitimate tools. As security tools increase in sophistication, adversaries are finding it more difficult to develop and deploy their own malware that evades defenses. As a result, adversaries rely on administrative tools — like remote management software — and native operating system utilities out of necessity, co-opting tools that are guaranteed or likely to be installed on a device rather than introducing non-native software.

Several of the top 10 threats and techniques highlighted in the report are used by adversaries and administrators or security teams alike, including the command and control (C2) tool Cobalt Strike, the testing tool Impacket, and the open-source tool BloodHound. Cobalt Strike, in particular, has never been more popular, impacting 8% of Red Canary’s customers in 2021. Some of the most notorious ransomware operators, including Conti, Ryuk, and REvil, are known to rely heavily on Cobalt Strike. Coming in at the No. 5 ranking, Impacket is a collection of Python libraries used legitimately for testing but abused by ransomware operators. It is another favorite among adversaries because it evades detection: its malicious use is difficult to differentiate from benign use.

Ransomware took top billing in some of last year’s most destructive cyberattacks. The report describes the new tactics that ransomware groups used in 2021, such as double extortion, which applies pressure to victims in more than one way to coerce them to pay a ransom. Last year also brought the rise of the affiliate model, which made tracking malicious activity more difficult because intrusions can often result from an array of different affiliates providing access to different ransomware groups. Examples of this include the Bazar and Qbot trojans, used by adversaries to gain initial access into environments before passing off access to ransomware or other threat groups.

Download Red Canary’s full Threat Detection Report here (you’ll have to part with some personal contact information).
