Return of the Large Trade Show

IMTS / Hannover Messe invaded Chicago this week. I drove down for a couple of days. It was huge. Booths populated all four halls. I did not see everything. Or even half.

Hannover Messe has co-located with IMTS in Chicago for the past three or four events. As in the past, the automation / Hannover Messe part encompassed a few aisles in the East hall.

I’ll have more news items in the next post.

Best of what I saw:

Nokia. What?! I was approached for an appointment. I said yes, figuring on a 5G private network discussion. I was partly right.

Let me back up for context.

  • Enterprises crave data to feed their information systems
  • Data from industrial / manufacturing operations were bottled up in isolated, siloed systems
  • Networking became robust
  • Interoperable protocols grew
  • The Internet of Things (IoT) became a thing
  • Suddenly data could go where and when it was needed

Solutions.

  • Automation vendors claimed connectivity to the enterprise, but that fell short
  • IT suppliers, supporters of the enterprise, tried to enter the market with gateways, networking, partnerships, and ecosystems to get the data
  • They couldn’t find the formula to sell to manufacturing (known as OT)
  • We have gateways, databases, and networking, but still no enterprise solution

Nokia.

  • Builds off networking technology that has progressed to 5G private networks
  • Has added edge compute devices
  • Partners with PTC (Kepware / ThingWorx) for software connectivity
  • Attacks this open market from a new perspective: both the enterprise IT side and the operations OT side
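
The edge-to-enterprise data path those bullets describe can be sketched in miniature. The code below is my own hypothetical illustration, not Nokia's or PTC's actual API: an edge-side function takes a raw OT tag reading (the kind a Kepware server exposes over OPC UA) and normalizes it into a flat, self-describing JSON record that an enterprise system can ingest.

```python
import json
from datetime import datetime, timezone

def normalize_tag(plant: str, tag: str, value: float, quality: str) -> str:
    """Wrap a raw OT tag reading in a flat JSON record for enterprise ingestion.

    The field names here are illustrative assumptions, not a vendor schema.
    """
    record = {
        "source": f"{plant}/{tag}",  # e.g. "line1/Extruder.ZoneTemp"
        "value": value,
        "quality": quality,          # OPC UA-style quality flag
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(record)

# A hard-coded sample reading stands in for an actual OPC UA read.
payload = normalize_tag("line1", "Extruder.ZoneTemp", 208.5, "Good")
print(payload)
```

The point of the pattern is that once readings are normalized at the edge, any enterprise consumer can subscribe to them without knowing anything about the PLC or protocol underneath.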

I am not predicting success. I never do. What I love about trade shows is finding this nugget of original thinking cloaked in the mundane. They have the foundation. Can they sell?

Check out this page on the Nokia site.

Identify Data Sets With Orphaned Data Enabling Appropriate Actions

Steve Leeper, VP Systems Engineering and Marketing, and Carl D’Halluin, Chief Technology Officer of Datadobi, met with me recently to discuss data, unstructured data management, data migration, the company, and those poor little orphaned files and data left in storage long after their useful life has ended.

I’ve been writing about the company’s products for about two years. Datadobi is about to celebrate its 13th birthday. The company is organically funded, and management expects it to be around for some time to come.

The occasion of this conversation was release 6.2 and an upgrade to the recently announced StorageMAP product—a multi-vendor, multi-cloud data management platform with the introduction of capabilities to discover and remediate orphaned data.

Orphaned data is data owned by employee accounts that are inactive but still enabled on an organization’s network or systems. These datasets, which can amount to a significant part of a company’s total stored data, represent risk and cost to every organization. Any enterprise with a large employee base, normal-to-high attrition rates, or merger and acquisition (M&A) activity is vulnerable to orphaned data. Orphaned data is mostly of no value to the organization and creates liability due to its unknown content. Eliminating it enables IT leaders to reduce cost, lower carbon footprint, and lower risk while maximizing the value of the relevant unstructured data used by active users.

With the new capabilities, StorageMAP can identify data sets with high amounts of orphaned data, allowing IT leaders to group and organize the data independent of the business organization. The orphaned datasets can then be migrated, archived to a lower-cost platform, or deleted by StorageMAP. The platform provides an easy-to-read, granular report on the top users with orphaned data, and for additional detail, a search function allows targeted searches across part or all of the data estate.
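
At its core, orphaned-data discovery is a join between file-ownership metadata and the identity directory. Here is a minimal sketch of that idea (my own illustration, not Datadobi's implementation): cross-reference each file's owner against the set of active users, then total the orphaned bytes per departed owner so the biggest offenders can be archived or deleted first.

```python
from collections import defaultdict

def find_orphaned(files, active_users):
    """Group orphaned storage by departed owner.

    files: iterable of (path, owner, size_bytes) tuples,
           standing in for a storage-estate metadata scan.
    active_users: set of owners still enabled in the directory.
    Returns {owner: total_orphaned_bytes} for inactive owners only.
    """
    totals = defaultdict(int)
    for path, owner, size in files:
        if owner not in active_users:  # owner has left the company
            totals[owner] += size
    return dict(totals)

# Sample metadata; "bob" is no longer in the active directory.
files = [
    ("/share/eng/spec.docx", "alice", 120_000),
    ("/share/eng/old_cad.zip", "bob", 5_000_000),
    ("/share/fin/q1.xlsx", "bob", 80_000),
]
print(find_orphaned(files, active_users={"alice"}))
# → {'bob': 5080000}
```

A real platform does this at billions-of-files scale and adds the remediation step (migrate, archive, delete), but the grouping logic is the same shape.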

The 6.2 release comes after Datadobi launched its StorageMAP platform earlier this year. StorageMAP provides a single pane of glass for organizations to manage unstructured data across their complete data storage estate, on-premises and in the cloud. This latest update enables IT leaders to take full control of orphaned data on their networks.

“Due to the scale and complexity of unstructured data in today’s heterogeneous storage environments, enterprises can easily lose track of orphaned data within networks and open themselves up to excess storage costs and risk,” said Carl D’Halluin, CTO, Datadobi. “StorageMAP’s new capabilities allow for a seamless experience identifying and taking the appropriate action on orphaned data to give IT leaders peace of mind.”

“As data proliferation continues, IT leaders are going to see more orphaned data on their networks. This is why it is so important that organizations turn to unstructured data management solutions like StorageMAP to find datasets associated with inactive users and take the appropriate action,” said Craig Ledo, IT Validation Analyst at Enterprise Strategy Group (ESG). “Combined with a high level of cross-vendor knowledge, years of real-life experience, and great customer support, enterprises can let StorageMAP do the heavy lifting when it comes to orphaned data.”

Fix Your Data Flow and Transform Your Business

Back in the 70s I actually had a job in a manufacturing company whose principal function concerned data: all the engineering, product, and cost data. Little did I know then that 45 years later data would be the new oil. (Don’t really believe all the hype.)

I’ve been interested in DataOps, the new function and class of applications for data management, among other things. I’ve been part of the HPE VIP Community for a few years. HPE isn’t talking as much about manufacturing lately, but many of its technologies are germane. I picked up this short post on data management from the Community site.

Data management complexity is everywhere in the modern enterprise, and it can feel insurmountable. IT infrastructure has evolved into a complex web of fragmented software and hardware in many organizations, with disparate management tools, complicated procurement processes, inflexible provisioning cycles, and siloed data. Organizations are running multiple storage platforms, each with their own management tools, which becomes a progressively tougher problem at scale.

Complexity is a growing problem, and according to a recent ESG study, 74% of IT decision makers acknowledge that it is holding them back in their digital transformation journey. Their data management capabilities simply can’t keep pace with business demands. As a result, IT organizations are forced to spend time managing the infrastructure, as opposed to really leveraging the data.

Constant firefighting might sound like business as usual to many tech pros, but it can be avoided. To accelerate transformation, IT leaders must confront and eliminate data complexity first, getting data to flow where and when it needs to, and allowing the business to get back to business.

Of course, HPE has a solution—GreenLake for Block Storage.

ABB and Red Hat Partner for Scalable Digital Solutions

Like some of its large industrial competitors, ABB is quickly building out industrial software solutions. A friend who is a financial analyst told me that Wall Street and other investors prize software right now. A company focused on instrumentation and automation platforms doesn’t evoke the same eyes full of longing and desire as it does when it adds software.

In this announcement, ABB and Red Hat, the open source enterprise software company, are partnering to deliver ABB automation and industrial software solutions at the intersection of information technology (IT) and operational technology (OT), equipping the industrial ecosystem with extended deployment capabilities and greater agility. This is consistent with ABB’s vision of the evolution of process automation.

  • ABB will deliver digital solutions to customers on demand and at scale using Red Hat OpenShift
  • Customers will be better able to harness the potential of data-based decisions by using applications that can be deployed flexibly from the edge to the cloud

The partnership enables virtualization and containerization of automation software with Red Hat OpenShift to provide advanced flexibility in hardware deployment, optimized according to application needs. It also provides efficient system orchestration, enabling real-time, data-based decision making at the edge and further processing in the cloud.

Red Hat OpenShift, the industry’s leading enterprise Kubernetes platform, with Red Hat Enterprise Linux as its foundation, provides ABB with a single consistent application platform, from small single-node systems to scaled-out hyperconverged clusters at the industrial edge, which simplifies development and management efforts for ABB’s customers.

“This exciting partnership with Red Hat demonstrates ABB’s commitment to meet customer needs by seeking alliances with other innovative market leaders,” said Bernhard Eschermann, Chief Technology Officer, ABB Process Automation. “The alliance with Red Hat will see ABB continue helping our customers improve their operations as they navigate a rapidly evolving digital landscape. It will give them access to the tools they need to integrate plantwide IT and OT, while reducing risks and optimizing performance.” 

Red Hat OpenShift increases the deployment flexibility and scalability of ABB Ability Edgenius, a comprehensive edge platform for industrial software applications, together with ABB Ability Genix Industrial Analytics and AI Suite, an enterprise-grade platform and applications suite that leverages industrial AI to drive Industry 4.0 digital business outcomes for customers. ABB’s Edgenius and Genix can both be scaled seamlessly and securely across multiple deployments. With this partnership, ABB will have access to capabilities like zero-touch provisioning (remote configuration of networks) which can increase manageability and consistency across plant environments. 

“Red Hat is excited to work with ABB to bring operational and information technology closer together to form the industrial edge. Together, we intend to streamline the transition from automated to autonomous operations and address current and future manufacturing needs using open-source technologies,” said Matt Hicks, executive vice president, Products and Technologies, Red Hat. “As we work to break down barriers between IT and the plant level, we look to drive limitless innovation and mark a paradigm shift in operational technology based on open source.” 

HPE Announces GreenLake Modern Private Cloud and New Cloud Services

Hewlett Packard Enterprise (HPE) announced advances to its GreenLake flagship Software-as-a-Service platform at its Discover 2022 user conference. Below are the update summaries:

  • Unified experience across edge to cloud
  • Deepened security
  • Extended developer tools
  • Strengthened capabilities to run workloads at scale
  • Transformed, modern private cloud experience with automated, flexible, scalable pay-as-you-go private cloud for traditional and cloud-native workloads
  • Eight new cloud services, including backup and recovery, block storage, compute operations management, data fabric, disaster recovery, and hyperconverged infrastructure, as well as industry-vertical cloud services for customer engagement and payments

“Three years ago, at HPE Discover, HPE committed to delivering our entire portfolio as a service by 2022,” said Antonio Neri, president and CEO, HPE. “Today, I am proud to say that not only have we delivered on that commitment, we have become a new company. HPE GreenLake has emerged as the go-to destination for hybrid cloud, and our industry-leading catalog of cloud services enables organizations to drive data-first modernization for all their workloads, across edge to cloud. The innovations unveiled today further build on our vision to provide the market with an unmatched platform to spur innovation and drive transformation.”

You have to give Neri credit. I was in the crowd three years ago when he made this audacious commitment to turn the entire company in a new direction. Not only has that goal been accomplished, but customers have also accepted it. The reported results have been outstanding.

In Q2 2022, HPE reported an Annualized Revenue Run-Rate (ARR) of $829 million and triple-digit as-a-service order growth for the third consecutive quarter.

Mitigate Supply Chain Disruption

I am moderating a Web discussion on July 27 at 11 am EDT among three experts sharing ideas about using technologies we have today to help mitigate supply chain disruption. We will discuss supply chain control towers, the digital twin (yes, the digital twin concept expands to include the entire supply chain), and research on optimization through simulation.
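
To make "optimization through simulation" concrete, here is a toy Monte Carlo sketch of my own (the numbers are invented; nothing here comes from the webinar): estimate the stockout probability for candidate reorder points under uncertain supplier lead times and daily demand, then pick the lowest reorder point that meets a service target.

```python
import random

def stockout_probability(reorder_point, mean_lead_days=5.0,
                         daily_demand=(8.0, 12.0), trials=10_000, seed=42):
    """Estimate P(stockout) for one reorder point via Monte Carlo.

    Each trial draws a supplier lead time (Gaussian, illustrative) and
    uniform daily demand over that lead time; a stockout occurs when
    demand during the lead time exceeds the reorder point.
    """
    rng = random.Random(seed)
    stockouts = 0
    for _ in range(trials):
        lead = max(1, round(rng.gauss(mean_lead_days, 1.5)))  # days until resupply
        demand = sum(rng.uniform(*daily_demand) for _ in range(lead))
        if demand > reorder_point:
            stockouts += 1
    return stockouts / trials

# Sweep candidate reorder points; a planner would pick the smallest one
# whose estimated stockout probability meets the service-level target.
for rp in (40, 50, 60, 70):
    print(rp, stockout_probability(rp))
```

A supply-chain digital twin is this same idea scaled up: a model of the real network that you can run forward thousands of times to compare decisions before committing to one.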

The discussion is sponsored by Hitachi Vantara and the experts are employed there. It’s a thoroughly non-commercial presentation brought to you by IIoT World.

Some of the discussion involves IoT and the need for sensors to provide data, as well as developing a digital twin and using it for simulation as an aid to executive supply chain decision making. Check it out. It will be recorded and provided as a LinkedIn Live broadcast as well.