Data Classification, Data Visibility, and Data Accessibility Concerns

Early in my career I reported to the head of product development, who realized the critical importance of data that originated with product engineering and design. He appointed me to lead the data function. Little did I realize then how much more critical the role would become in manufacturing (and other) enterprises today.

I recently heard from my Datadobi PR contact, who shared a copy of an announcement later made via the Datadobi blog, with thoughts on current trends and requirements for data management. The news relates to a recently published IDC report.

IDC says a Data Mobility Engine Can Serve as the Core of an Effective Data Management Strategy

Research firm IDC predicts that, over the next five years, more than 80% of the data organizations collect will be unstructured, and that unstructured data will continue to grow 40-50% per year for most enterprises.

IDC’s Research VP of Infrastructure Systems, Platforms and Technologies Group Eric Burgener authored an IDC Analyst Brief, sponsored by Datadobi, titled “The Data Mobility Engine as the Foundation for an Efficient Data Management Strategy.”

In the analyst brief, Burgener urges organizations to implement a comprehensive data management strategy to confront this increasing influx of data, noting that a data mobility engine provides the foundation for an effective data management strategy and can drive significant benefits for the hybrid multicloud enterprise.

In his analysis, Burgener outlines the five main components of an effective data mobility engine:

1) Vendor-neutral interoperability

2) Visibility into data metrics, access patterns, and usage activities

3) Orchestration and automation

4) Scan-optimize-copy capabilities

5) Integrity enforcement
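To make the last two components concrete, here is a minimal sketch of a scan-copy-verify loop with integrity enforcement. This is hypothetical illustration code, not Datadobi's implementation: it scans a source tree, copies each file, and confirms that source and destination checksums match.

```python
import hashlib
import shutil
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash a file in chunks so large files don't exhaust memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def scan_copy_verify(src_root: Path, dst_root: Path) -> dict:
    """Scan the source tree, copy each file, and enforce integrity
    by comparing source and destination checksums."""
    report = {"copied": 0, "mismatched": []}
    for src in src_root.rglob("*"):
        if not src.is_file():
            continue
        dst = dst_root / src.relative_to(src_root)
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)  # copies file data plus metadata
        if sha256_of(src) == sha256_of(dst):
            report["copied"] += 1
        else:
            report["mismatched"].append(str(src))
    return report
```

A production engine would add the other components on top of this loop: vendor-neutral protocol support, scheduling and orchestration, and reporting on access patterns.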

Datadobi notes that, over the last several years, more and more IT leaders have come to the company with concerns around data classification, data visibility, and organization-wide data accessibility, as well as how to handle aging data and the high costs that result from a fragmented data management strategy.

Finally, Burgener states in his report that “the benefits of an effective data management strategy include reduced IT costs, easier data sharing, better security, less legal exposure, and an improved ability to demonstrate governance and regulatory compliance.”

Annual Threat Detection Report Reveals Top Threats and Techniques

Threats can come suddenly from anywhere. The day after Russia invaded Ukraine, traffic from Russia to my website spiked. I get a rather steady, if low, number of page views from that country, so I'm not sure what caused the spike. But when I turned my site into part of my business rather than a hobby blog, I also signed up with a website defense company.

Cybersecurity news has become a mainstay thread for the past year. I don't know if the cause is related to the pandemic or to venture money flowing in that direction. Security vendors all do studies and reports. This one comes from Red Canary, a managed detection and response provider, which analyzed 30,000 threats in customer environments and uncovered a number of trends, threats, and techniques from the 2021 landscape.

Red Canary, the Managed Detection and Response (MDR) provider that detects threats no one else does, on March 22 launched its fourth annual Threat Detection Report, an extensive report that’s based on analysis of more than 30,000 confirmed threats detected across customers’ environments in the past year.

The findings reveal that ransomware dominated the threat landscape in 2021, with groups adopting new techniques such as double extortion and “as-a-service” models to evade detection and maximize their earnings. The report explores the top 10 threats impacting the majority of Red Canary customers – from adversary favorites like Cobalt Strike to new activity clusters like Rose Flamingo – and the most common techniques that adversaries use to carry out these attacks, including guidance for companies to strengthen their ability to detect these threats.

“These threats are less sensational than you might find elsewhere, but they’re the ones that will impact the majority of organizations,” said Keith McCammon at Red Canary. “This report addresses highly prevalent threats and the tried-and-true techniques that are wreaking havoc on organizations. We take it a step further to explore in depth the adversarial techniques that continue to evade preventative controls, and that can be challenging to detect. We hope that this report serves as a valuable tool for everyone from executives to practitioners, providing the information that’s needed to detect and respond to cybersecurity threats before they negatively impact organizations.”

Red Canary found that adversaries have continued to carry out attacks using legitimate tools. As security tools increase in sophistication, adversaries are finding it more difficult to develop and deploy their own malware that evades defenses. As a result, adversaries rely on administrative tools — like remote management software — and native operating system utilities out of necessity, co-opting tools that are guaranteed or likely to be installed on a device rather than introducing non-native software.
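One common way defenders spot this kind of tool co-opting is to alert on suspicious process lineage, where an ordinary application spawns a native admin utility. Here is a toy sketch of that idea; the rules are hypothetical examples, not Red Canary's detection logic:

```python
# Hypothetical alert rules for "living off the land" tradecraft:
# a legitimate parent process spawning a built-in admin tool.
SUSPICIOUS_PAIRS = {
    ("winword.exe", "powershell.exe"),  # Office document spawning a shell
    ("outlook.exe", "cmd.exe"),         # mail client spawning a shell
    ("w3wp.exe", "cmd.exe"),            # web server worker spawning a shell
}

def should_alert(parent: str, child: str) -> bool:
    """Flag process lineages where a benign application launches a
    native utility that adversaries commonly co-opt."""
    return (parent.lower(), child.lower()) in SUSPICIOUS_PAIRS
```

Real detection engines layer many more signals (command-line arguments, network activity, signing status) on top of lineage, precisely because the binaries themselves are legitimate.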

Several of the top 10 threats and techniques highlighted in the report are used by adversaries and by administrators or security teams alike, including the command and control (C2) tool Cobalt Strike, the testing tool Impacket, and the open source tool BloodHound. Cobalt Strike, in particular, has never been more popular, impacting 8% of Red Canary's customers in 2021. Some of the most notorious ransomware operators, including Conti, Ryuk, and REvil, are known to rely heavily on Cobalt Strike. Coming in at the No. 5 ranking, Impacket is a collection of Python libraries used legitimately for testing but abused by ransomware operators. It is another favorite among adversaries because its activity is difficult to differentiate as malicious or benign, allowing it to evade detection.

Ransomware was top billing for some of last year’s most destructive cyberattacks. The report describes the new tactics that ransomware groups used in 2021, such as double extortion, which applies pressure to victims in more than one way to coerce them to pay a ransom. Last year also brought the rise of the affiliate model, which made tracking malicious activity more difficult because intrusions can often result from an array of different affiliates providing access to different ransomware groups. Examples of this include the Bazar and Qbot trojans, used by adversaries to gain initial access into environments before passing off access to ransomware or other threat groups.

The report analyzes several new ransomware families that became more prevalent in 2021, including BlackByte, Grief, Hive, Yanluowang, Vice Society and CryptoLocker/Phoenix Locker, while also taking a look at the families that tapered off, like Egregor, REvil, BlackMatter and Doppelpaymer. Many of the emergent ransomware families were similar to those that became less active or went quiet, leading analysts to assess that known adversaries had resurfaced under new names.

The threat landscape moved toward a Software-as-a-Service (SaaS) economy in 2021, muddying the already murky waters of attribution. While Ransomware-as-a-Service (RaaS) has been widely reported for years, this model has now become the norm for adversaries. While Red Canary has been tracking some “as-a-service” models like TA551 over the years, others are just now coming into focus. In particular, Red Canary tracks multiple phishing affiliates that dropped variants of the Bazar family of malware.

This economic model lowers the technical barrier to entry, allowing operators to purchase capabilities rather than develop them. Between Phishing-as-a-Service, Access-as-a-Service, and Crypters-as-a-Service, it has never been easier to find an adversary for hire.

Download Red Canary’s full Threat Detection Report here.

HPE Expands GreenLake Edge-to-Cloud Platform Adding Services

Hewlett Packard Enterprise executives discussed the company's latest product and business advances on the morning of March 22, 2022. Antonio Neri, President and CEO, said that the "as-a-Service" business continues to be accepted by customers, with revenue growth above 130% last quarter.

Technology continues to blend for manufacturing and production enterprises as Edge-to-Cloud architectures mature. HPE moved aggressively to an as-a-Service business model. This announcement brings Aruba's networking customers, 120,000 in total, onto the GreenLake platform as part of a new Networking-as-a-Service extension. GreenLake also added 12 new cloud services, and HPE expanded its online marketplace by adding ALSO Group, Arrow Electronics, Ingram Micro Inc., and TD Synnex.

HPE appears to be outpacing rivals by offering networking, compute, and data services with as-a-service flexibility.

From the news release:

Hewlett Packard Enterprise announced significant advancements to HPE GreenLake, the company’s flagship offering that enables organizations to modernize all their applications and data, from edge to cloud. Now, HPE’s market-leading hybrid cloud platform just got stronger, with a unified operating experience, new cloud services, and availability of HPE GreenLake in the online marketplaces of several leading distributors.

“HPE was among the first to deliver a cloud platform that enables customers to manage and extract insights from their data from edge to cloud, and our continued innovation is driving growth and furthering our market leadership,” said Antonio Neri, president and CEO, HPE. “In the hybrid cloud market, HPE GreenLake is unique in its simplicity, unification, depth of cloud services, and partner network. Today, we are furthering our differentiation, boldly setting HPE GreenLake even further apart as the ideal platform for customers to drive data-first modernization.”

HPE GreenLake supports multi-cloud experiences everywhere – including clouds that live on-premises, at the edge, in a colocation facility, and in a public cloud – and continues to drive strong demand worldwide. In Q1 2022, HPE reported Annual Recurring Revenue of $798 million, and increased as-a-service orders 136 percent year-over-year.

Platform updates include converging Aruba Central, a cloud-native, AI-powered network management solution, with the GreenLake platform. They also include a new, unified operational experience that provides a simplified view of, and access to, all cloud services spanning the entire HPE portfolio, with single sign-on access, security, compliance, elasticity, and data protection.

The HPE GreenLake platform provides the foundation for more than 50 cloud services, including electronic health records, ML Ops, payments, unified analytics, and SAP HANA, as well as a wide array of cloud services from partners.

HPE also unveiled 12 new cloud services in networking, data services, high performance computing and compute operations management.

HPE GreenLake for Aruba networking. The eight new services simplify the process of procuring and deploying NaaS and allow customers to align network spend to usage needs, while ensuring that the network is always ready to support business objectives. The new services are also optimized for channel partners looking to satisfy growing customer demand for NaaS, to operate in a resale or managed service provider model.

New and enhanced services for block storage and data protection join the current HPE GreenLake data services.

HPE GreenLake for Block Storage is the industry's first block storage-as-a-service to deliver a built-in 100% data availability guarantee on a cloud operational model.

Enhanced HPE Backup and Recovery Service is backup-as-a-service built for hybrid cloud. Customers can effortlessly protect their virtual machine data, gain rapid recovery on-premises, and store long-term backups cost-effectively in the public cloud. HPE Backup and Recovery Service is now available for virtual machines deployed on heterogeneous infrastructure.

HPE is further enhancing its HPE GreenLake for High Performance Computing offerings, making it more accessible for any enterprise to adopt the technology, by adding new, purpose-built HPC capabilities. These also include lower entry points to HPC, with a smaller configuration of 10 nodes, to test workloads and scale as needed.

First introduced at HPE Discover 2021, HPE GreenLake Compute Ops Management is a cloud-native management console to access, monitor, and manage servers. Compute Ops Management automates compute lifecycle management easily and securely across a customer's compute environment.

HPE continues to invest in co-development with key distribution partners. First announced in March 2021, HPE GreenLake is now directly available in the cloud marketplaces and ecommerce platforms of ALSO Group, Arrow Electronics, Ingram Micro and TD Synnex.

Finally, HPE announced today a new global agreement with Digital Realty, the largest global provider of cloud- and carrier-neutral data center, colocation and interconnection solutions. Digital Realty allows customers to run any HPE GreenLake service with colocation across its more than 285 data centers on six continents, including sites in 50 major cities, delivering a rich ecosystem of offerings and world-class business and cloud adjacency.

Product Information Management Handles Complexity

The manufacturing company I worked for at the beginning of my career had a couple of information management problems. At their core was organizing all the product data and working with production, accounting, inventory management, costing, and other functions. I had a mentor at the time, unknown to me, who pushed the VP of Product Development to bring me over from my production position to organize the data.

It’s a long story (TL;DR), but by the end of my time there (some pretty massive layoffs with the recession of 1980 and other stuff) I had been slotted for an IT leadership post and (in today’s terms) digitized the lot of it.

So it was not out of ignorance that I interviewed the president of a product information management company newly spun off from a university group.

Since it takes more than one company to make a market, I was glad to hear about another company in the space. Viamedici was established in 1999 and has successfully completed more than 300 PIM/MDM projects since. Where I assembled a totally manual system, the current state of the computing art allows Viamedici to implement projects with upwards of a billion records and 46.6 million records of in-memory data.

Today's conversation linked me with CEO Juergen Mueller, who ran me through the company's portfolio of products and services. The key proposition is high scalability and flexibility. The company has been moving to "Data-as-a-Service" on its own cloud or on AWS or Azure.

Viamedici is privately held and has offices in the US, Germany, China and Japan, with a partner network in place to support its global client base. The company's portfolio covers all product management and marketing processes. At the center of the portfolio lies the product information management suite Viamedici EPIM, comprising product master data management, media asset management, total quality management, data governance, and cross-media publishing. These are supplemented by a high-performance ecommerce platform as well as solutions for electronic data exchange and mobile applications. Applications for translation and language management, collaboration, and marketing operations management top off the offering.

Overcoming the chaos of complexity seems to be the driving force behind these applications. As companies grow, they generate many products, many varieties within their product lines, and a huge spare-parts problem for customers and distributors. These complexity management products seek to give management a handle on the problem.

StorCentric’s Retrospect Adds Anomaly Detection to Ransomware Protection

I used to use Retrospect to back up files on my Macs. Not sure why I stopped, probably a compatibility issue with MacOS at the time. It did the job for me, though. But I was surprised to get some news from StorCentric, the company behind Retrospect, announcing an update. Something I’ll have to check out again.

Retrospect, a StorCentric company, announced the general availability (GA) of Retrospect Backup 18.5, featuring new anomaly detection, customizable filtering and thresholds, and enhanced ransomware protection to help businesses quickly detect and protect against malicious attacks. With deeper Microsoft Azure Blob integration for Immutable Backups and integrated cloud bucket creation, Retrospect Backup 18.5's anomaly detection and ransomware protection bolster StorCentric's data-centric security approach to organizations' critical infrastructure.

According to Coveware, most corporate ransomware targets are small and medium businesses: 72% of targeted businesses have fewer than 1,000 employees, and 37% have fewer than 100. Businesses are projected to have paid out $20B in 2021, continuing a 100% year-over-year increase sustained over the last four years, and it's only going to get worse with new business models like RaaS: Ransomware-as-a-Service. With Retrospect Backup 18, businesses can protect their infrastructure with immutable backups for ransomware protection.

Included in Retrospect Backup 18.5

▪ Anomaly Detection: Detect anomalies in systems based on customizable filters and thresholds tailored to individual environments.

▪ Retrospect Management Console Integration: View anomalies across a business or partner’s entire client base in a single pane of glass.

▪ Improved Microsoft Azure Blob Integration: Set individual immutable retention policies for different backup sets within the same Azure Storage Container.

▪ Streamlined Immutable Backup User Experience: Automatically create cloud buckets with immutable backups supported by default.

▪ LTO-9 Support: Includes support for LTO-9, with capacities up to 18TB (45TB compressed).
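The "customizable filters and thresholds" idea behind anomaly detection can be illustrated with a small sketch. This is a hypothetical example, not Retrospect's actual logic: flag a backup run whose changed-file count deviates too far from the recent baseline, since a sudden mass of modified files is a classic ransomware signal.

```python
from statistics import mean, stdev

def is_anomalous(history: list[int], current: int, threshold: float = 3.0) -> bool:
    """Flag the current backup's changed-file count if it deviates from
    the historical baseline by more than `threshold` standard deviations
    (a plain z-score test; real products use more subtle heuristics)."""
    if len(history) < 2:
        return False  # not enough data to establish a baseline
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return current != mu  # flat baseline: any deviation is anomalous
    return abs(current - mu) / sigma > threshold
```

With a baseline of roughly 100 changed files per run, a backup that suddenly touches 10,000 files, for example because ransomware encrypted them, would trip the threshold, while normal fluctuation would not. The threshold and the metric being filtered are the tunable parts the announcement refers to.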

End of General Purpose Wireless for IoT

Stacey Higginbotham writes in her weekly Stacey on IoT newsletter about the proliferation of IoT networks. I've been noting advances in 5G and Wi-Fi 6, and occasionally advances in Bluetooth, but she puts it all together here.

But as we have added more devices and more types of networks, wireless connectivity has become a lot more complicated. We have personal area networks for wearables and headsets that use Bluetooth. We have some devices on Wi-Fi 5 networks and others on Wi-Fi 6 or even Wi-Fi 6E. Smart homes might have Zigbee, Z-Wave, or Thread. Corporate offices might have a proprietary OT network and variations on 4G or 5G cellular.

Then it comes to the crucial point.

And someone has to manage all of this. Welcome to the end of the general purpose wireless network. Today, it’s all about special purpose connectivity.

On the one hand, corporate IT must pick up new skills.

So how will corporate IT departments manage the provisioning and acquisition of connectivity across multiple types of networks? Wi-Fi gear that handles Wi-Fi and Bluetooth that gets managed by the IT department has already come onto the market. But while the IT department might help manage the cellular network bills, with private LTE or private 5G it’s unclear how that gear and management will converge.

And, of course, changes also mean new opportunities for entrepreneurs to enter.

As general purpose wireless networks fade, enterprises will need help. The only question is: Who will win the race to provide it? 
