Back in the '70s I actually had a job at a manufacturing company whose principal function revolved around data: all the engineering, product, and cost data. Little did I know then that 45 years later data would be the new oil. (Don't really believe all the hype.)
I've been interested in DataOps, the emerging function and set of applications for data management, among other things. I've been part of the HPE VIP Community for a few years. HPE isn't talking as much about manufacturing lately, but many of its technologies are germane. I picked up this short post on data management from the Community site.
Data management complexity is everywhere in the modern enterprise, and it can feel insurmountable. In many organizations, IT infrastructure has evolved into a complex web of fragmented software and hardware, with disparate management tools, complicated procurement processes, inflexible provisioning cycles, and siloed data. Organizations run multiple storage platforms, each with its own management tools, and the problem gets progressively tougher at scale.
Complexity is a growing problem: according to a recent ESG study, 74% of IT decision makers acknowledge that it is holding back their digital transformation journey. Their data management capabilities simply can't keep pace with business demands. As a result, IT organizations are forced to spend their time managing infrastructure rather than actually leveraging the data.
Constant firefighting might sound like business as usual to many tech pros, but it can be avoided. To accelerate transformation, IT leaders must first confront and eliminate data complexity, getting data to flow where and when it needs to and allowing the business to get back to business.
Of course, HPE has a solution—GreenLake for Block Storage.