The Industrial Internet of Things, along with other advances in manufacturing IT, has led to a rise in unstructured data, creating storage challenges and increasing the importance of data mobility. Continuing my data theme today, this news comes from a company new to me called Datadobi, which touts itself as a global leader in unstructured data migration software.
It released a report earlier this month by 451 Research, part of S&P Global Market Intelligence, which reveals the major impact that data growth is having on storage management, highlighting how the rise in retention of unstructured data exacerbates the storage challenges faced by organizations. The report, which was commissioned from 451 Research, features data from 451 Research’s Voice of the Enterprise: Storage, Data Management & Disaster Recovery 2021 survey. It underscores the need for businesses to plan for the disposition of aging data, implement migration and protection plans, and understand storage-consumption costs if they are to effectively manage data, whether on-premises or in the cloud.
The report identifies data growth as the number one storage management challenge in most organizations, ahead of issues such as disaster recovery requirements, cost, and migration. In addition, respondents said they expect the data they manage to grow by 26% over the next 12 months.
The growth in unstructured data is also complicating the approach organizations need to take to disaster recovery and data protection. As the report points out, this is “forcing them to look beyond traditional on-premises storage infrastructure to leverage public cloud storage or managed services to help reduce the burden on the IT staff. Hybrid IT.” This means that organizations will need to manage growing levels of unstructured data, both on-premises and in the cloud. As a result, companies will require cross-platform data migration (data mobility), protection, and management that are rooted in clear and identifiable ownership. (Business Impact Brief by 451 Research, part of S&P Global Market Intelligence)
To address these challenges, the report says: “Organizations should seek out tools and platforms that allow them to leverage a wide array of vendor and cloud offerings to preserve choice while also preserving the ability to negotiate lower prices and superior service levels from their vendors. The ability to provide deeper insight into how data is being used is invaluable given the price disparity between high performance storage tiers and low-cost archive tiers.”
Commenting on the report, Michael Jack, Datadobi CRO, said: “Data growth continues to be a top challenge in most organizations. As cloud adoption continues, efficient data mobility takes on new importance and must be automated as much as possible. Retention policies, data protection policies, and data disposition policies need to be clearly defined through partnerships between business data owners and the infrastructure owners who provide storage services to the business.”
Jack continued: “By delivering enterprise-class NAS migration software to migrate and protect data anywhere, Datadobi helps customers around the world address these challenges and remain in control of their storage management strategy.”
Download a full copy of the report.
This week’s virtual conference is Hitachi Vantara Social Innovation. I watched a few sessions yesterday. I’ll be back today for a little more between household chores and an interview. The key takeaway is data-driven culture. Showcased were the Disney Company and the American Heart Association, with executives discussing the business value of successfully employing a data-driven culture on Hitachi Vantara infrastructure.
I have been sitting on this press release for three weeks because I had a huge backlog of news. It begins by touting Hitachi Vantara’s position as a Leader in Gartner’s Magic Quadrant for Industrial IoT Platforms. It is in great company with PTC and Microsoft.
Highlights of the news include:
- Lumada Manufacturing Insights: The industry-specific solution integrates data from multiple sources – from vibration, video, lidar and audio – to detect disturbances in the supply chain and allows for greater visibility and planning.
- Smart Spaces & Lumada Video Insights: With improved team collaboration and incident response, these solutions use data sourced from trains, first responder vehicles and factories, among others, to improve quality and safety in the human-machine interaction.
- Lumada Edge Intelligence: Faster than ever before, made possible by an integration with Google Cloud, this solution accelerates data preparation and ML workflows for automated decision-making and more time to focus on the value of insights.
Hitachi Vantara, the digital infrastructure, data management, and digital solutions subsidiary of Hitachi, announced advancements to the Lumada software platform and industry solutions to accelerate the digital transformation of industrial processes.
Improving manufacturing operational outcomes involves comprehensive data analysis and integration from thousands of moving parts across remote and industrial environments. Lumada is Hitachi’s digital platform that connects data, assets, and people to fuel industry innovation. It is the software foundation for Lumada Industry Solutions, which extract data-driven insight and drive better operational and business outcomes. The updated Lumada portfolio allows customers to automate tasks and make faster decisions by training data models in the cloud and deploying them to edge devices, creating actionable insights from diverse data sets at lower infrastructure cost.
“Across the globe, industries are dealing with increasing complexity, a faster changing environment and greater competition that together are driving a need for accelerated digitalization. Supply chain disruptions, health and safety measures and operational challenges have highlighted this need for data-driven innovation,” said Radhika Krishnan, Chief Product Officer, Hitachi Vantara. “Today’s advancements allow our customers to make faster, more informed decisions so industries can thrive in our rapidly digitalizing future.”
Delivering Deeper Insights and Faster Time to Value
Hitachi Vantara is accelerating industrial digitalization with major enhancements to data-driven offerings for manufacturing, extending AI and automation from edge to core, and delivering deep real-time insights from new combinations of data and connections.
Lumada Manufacturing Insights:
- This industry solution delivers greater visibility across a customer’s supply chain subsystems, with the supply chain module’s ability to implement supply chain control tower solutions and take direct, demand-driven action.
- By integrating and correlating data from multiple sources – asset health, vibration, video, lidar and audio – to detect potential failure of a machine, manufacturers can better predict points of failure and perform preventive maintenance, reducing downtime and improving output.
- Also new is the ability to automate forms for digitization of factory floor processes – a practice that is still largely done with pen, paper and spreadsheets – to establish ‘if this, then that’ protocols across manufacturing processes.
- Lumada Manufacturing Insights is now also available on the Microsoft Azure Marketplace for easier integration with Microsoft cloud environments.
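The ‘if this, then that’ protocols mentioned above boil down to condition/action rules evaluated against incoming readings. A minimal sketch of the idea follows; the rule names, fields, and thresholds are my own illustrations, not Lumada functionality:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    condition: Callable[[dict], bool]   # the "if this" part
    action: Callable[[dict], str]       # the "then that" part

def evaluate(rules: list[Rule], reading: dict) -> list[str]:
    """Fire every rule whose condition matches the current reading."""
    return [r.action(reading) for r in rules if r.condition(reading)]

# Hypothetical factory-floor rules for illustration only.
rules = [
    Rule(lambda r: r["vibration_mm_s"] > 7.1,
         lambda r: f"schedule maintenance on {r['asset']}"),
    Rule(lambda r: r["temp_c"] > 85,
         lambda r: f"alert operator: {r['asset']} overheating"),
]

actions = evaluate(rules, {"asset": "press-3", "vibration_mm_s": 9.4, "temp_c": 60})
```

Replacing pen-and-paper checklists with rules like these is what makes the protocols auditable and automatable.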
Smart Spaces & Lumada Video Insights:
- These industry solutions leverage new workflow automation within Hitachi Visualization Suite and a mobile application for improved team collaboration and incident response.
- An expanded Hitachi Edge Gateway portfolio includes industry-tailored and “ruggedized” versions that allow for data integration from sources such as trains, industrial spaces, or first responder vehicles and equipment. It also offers higher compute power at the edge, with CPU or GPU options, to enable new outcomes and faster, more data-driven decisions.
- Sensor fusion creates the ability to co-analyze video, lidar, and other data to enable new use cases such as quality assurance and analysis of human-machine interaction, while improving accuracy.
Lumada Edge Intelligence:
- Integration between the Lumada software platform and Google Cloud allows customers to speed up data preparation by adjusting resources on demand and combining multiple data types for better insights.
- Updates to Lumada Edge Intelligence also simplify Machine Learning workflows by pushing models to edge devices for faster automated decision-making without reliance on point tools.
- New APIs for edge management and data access allow reuse of assets, gateways, and software to create integrated solutions utilizing existing infrastructure.
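The train-in-the-cloud, run-at-the-edge pattern behind these updates is worth sketching: a model is trained centrally, serialized, and shipped to edge nodes that then infer locally without a cloud round trip. This is a toy illustration of the general pattern only; none of the names below are Lumada APIs:

```python
import pickle

def train_model(samples):
    """'Train' a trivial threshold model from labeled vibration samples."""
    fail = [v for v, label in samples if label == "fail"]
    ok = [v for v, label in samples if label == "ok"]
    return {"threshold": (min(fail) + max(ok)) / 2}

def push_to_edge(model) -> bytes:
    # Stand-in for transferring the serialized model to a device.
    return pickle.dumps(model)

def edge_infer(blob: bytes, vibration: float) -> str:
    # The edge node deserializes once and runs inference locally.
    model = pickle.loads(blob)
    return "fail" if vibration > model["threshold"] else "ok"

blob = push_to_edge(train_model([(2.0, "ok"), (3.1, "ok"), (8.5, "fail")]))
verdict = edge_infer(blob, 9.0)   # no cloud connectivity needed here
```

The payoff is the decision latency: once the model lives on the device, each inference costs microseconds instead of a network round trip.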
“Meat & Livestock Australia has been collaborating with Hitachi Vantara on a number of digital projects over the past five years leading up to our latest project, the Connected Beef Supply Chain Control Tower,” says Dr. Nigel Tomkins, program manager, grassfed productivity at Meat & Livestock Australia. “Hitachi Vantara’s Lumada Manufacturing Insights has allowed our industry to integrate both sensor and system data to provide insights across the supply chain—this has led to improved productivity and quality outcomes. We look forward to leveraging the capabilities of the Supply Chain Control Tower even further—gaining insight on factors impacting supply and consumer demand.”
“Our industry is experiencing rapid digitalization and a distinct increase in the pace of business. This underscores the need for more agility and predictability in everything we do and what we deliver to our customers,” said Petra Sundström, VP & Head of Digital Offering, Sandvik Rock Processing at industrial manufacturer Sandvik. “We’re collaborating with Hitachi to innovate our business models. With Lumada Manufacturing Insights, we are now able to offer predictive maintenance as a service–delivering the outcomes our customers are looking for in this digital era.”
“For manufacturers to get real end-to-end benefits from data-driven solutions, it’s important to focus not only on the obvious areas of production. Other data sources and solutions beyond the factory floor should also be looked at. For example, use video analytics to study material flow from receiving dock to warehouse to shop floor; use lidar to monitor employee movements from a safety perspective. There are so many ways to use this technology, and the applications become more apparent as the team familiarizes itself with the sensors and analytics,” says Allen Ahlert, senior director, Engineering with Hitachi Computer Products (America), Inc., which leverages Lumada solutions at its 352,000-sq.-ft. manufacturing and supply facility in Norman, OK. “Hitachi Vantara has been able to approach this holistically beyond what point solutions can do to create comprehensive, rich insights across facilities and processes.”
I have not talked with anyone from Rockwell Automation for months. So, it was time to catch up with Keith Higgins who joined the company within the past couple of years as VP of Digital Transformation leading the software group. As we might expect, digital transformation technologies and products include the analytics portfolio, MES, and the coordination with PTC’s products including ThingWorx, Kepware, and Vuforia.
Since I was fresh from a conversation with another supplier about the edge, I brought that up in the context of analytics and ThingWorx. Higgins began to explain the power of using the PLC as an edge device. Rockwell has not talked to me about the PLC for years, but I remember that it has long been adding compute and networking capability to that platform. Time for me to get an update there, too. My wild guess is that no sufficiently enticing partnership could be hammered out with Dell Technologies or HPE using their edge compute. And Rockwell already had a powerful edge device that just needed IT-level bolstering. This will be interesting to watch.
Higgins brought up a tire plant example where having production data in context at the edge with the ability to perform predictive analytics combined for a powerful management tool.
One theme that recurs in this discussion is the necessity of solid context for data. Higgins, having brought that up with the tire plant example, continued to a discussion of a technology developed in partnership with Microsoft called SmartObjects. This is a rich data model that adds deep context to data. My feeble way of thinking of this would be something like MQTT and OPC UA on steroids (no disparagement of either of those technologies meant).
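The context problem is easy to show: a bare tag value like 72.4 tells an analytics system nothing. Here is a minimal sketch of what wrapping raw values in a contextualized object might look like; “SmartObject” below is my guess at the concept, not Rockwell’s or Microsoft’s actual data model:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class SmartObject:
    # Context fields an analytics system needs to interpret the value.
    asset_id: str
    site: str
    line: str
    units: str
    value: float
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

# A bare PLC tag value carries no meaning on its own...
raw_tag_value = 72.4

# ...but wrapped with context, downstream systems know what it measures,
# where, and in what units.
enriched = SmartObject(asset_id="extruder-7", site="plant-oh",
                       line="line-2", units="degC", value=raw_tag_value)
```

Protocols like MQTT and OPC UA already carry some of this metadata; the “on steroids” part is carrying a much richer, shared model of the plant alongside every value.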
I’ve been thinking deeply about productivity lately, so I asked about it. Rockwell views its contribution to its customers’ productivity in three buckets:
- Assets—building on predictive analytics, predictive maintenance, condition monitoring, and the like;
- Production line—improving utilization of the production assets;
- Human productivity—for example, the recent acquisition of CMMS supplier Fiix
I’m definitely interested in seeing where Rockwell’s new emphasis in software and edge goes. Many years ago, I asked then-CEO Keith Nosbusch about the software business. He said at that time it was an experiment. Higgins didn’t say that exact thing, but his remarks left no doubt that his area is primed to be a Rockwell growth vehicle.
Our schedules finally converged. I caught up with Tom Bradicich, PhD, known within Hewlett Packard Enterprise (HPE) as “Dr. Tom,” to learn the latest on the converged edge. Tom is one of the half-dozen or so people I know who can dump so much information on my brain that it takes some time to digest and organize it. He led development of the Edgeline device connecting with the Industrial Internet of Things. He is now VP and HPE Fellow leading HPE Labs developing software to come to grips with the complexities of the converged edge and “Converged Edge-as-a-Service”.
He likes to organize his thoughts in numerical groups. I’m going to discuss the converged edge below using his groupings:
4 Stages of the Edge
7 Reasons for IoT and the Edge
3 Act Play
The foundation of the converged edge is found in the 3 C’s:
- Perpetual Connectivity
- Pervasive Computing
- Precision Controls
I remember Tony Perkins following up the demise of Red Herring magazine (charting the hot startup and M&A craze of the 90s, the magazine grew so large it came in two volumes for a while) with an online group called AlwaysOn. Trouble is, back in the 90s, we weren’t “always on.” Persistent connectivity was beyond our technology back then. Now, however, things have changed. We have so much networking, with more to come, that perpetual connectivity is not only possible, but also mundane.
HPE didn’t take a personal computer and package it for the edge. It developed Edgeline with the power of its enterprise compute along with enterprise grade stacks. It is powerful.
Then we have the 4 Stages of the Edge:
- Things—sensors and actuators
- Data Capture & Controls
- Edge IT (networking, compute, storage)
- Remote Cloud or Data Center
This is where Internet of Things meets the Enterprise.
Why do we need edge compute and not just IoT-to-Cloud? 7 Reasons:
- Minimize Latency
- Reduce bandwidth
- Lower cost
- Reduce threats
- Avoid duplication
- Improve reliability
- Maintain compliance
The Converged Edge is a 3-Act Play:
- Edgeline systems & software; stack identicality
- Converged embedded PXI and OT Link
- Converged Edge-as-a-Service
At this point in time, we are faced with 12 challenges to implementation:
- Limited bandwidth
- Smaller footprint for control plane and container
- Limited to no IT skills at the edge
- Higher ratio of control systems to compute/storage nodes
- Provisioning & lifecycle management of OT systems and IoT devices
- OT applications are primarily “stateful”, cloud unfriendly
- Data from analog world & industrial protocols
- Unreliable connectivity—autonomous disconnect operation
- Higher security vulnerabilities
- Hostile and unfamiliar physical environments and locations
- Long-tail hardware and software revenue model—many sites, fewer systems
- Deep domain expertise needed for the many unique edges
Of course, we could go into each of these items. Dr. Tom does in one of his latest talks (I believe it was at Hannover). We should pause at number 12, though. This is a necessity often overlooked by AI evangelists and other would-be predictive-maintenance disrupters. When you begin messing with industrial, whether process or discrete manufacturing, it really pays to know the process deeply.
I can’t believe I summarized this in under 600 words (is that still the common university essay requirement?). It is just an outline, but it should reveal where HPE has been and where it is going. I think its power will be disruptive to industrial architectures.
The action occurs at the edge these days. At least, edge as defined by the IT groups as they reach out toward the plant or factory with networks, compute, and other technologies as IT searches for more and better data to feed their decision-making systems. This release comes from ZEDEDA, a company relatively new to me that is quickly filling a space in the system.
- Direct integration simplifies secure deployment and management of Kubernetes clusters and hardware at the distributed edge at scale, without requiring specialized IT skills
- Collaboration with SUSE enables fully automated deployment of K3s clusters in the field, directly from ZEDEDA’s built-in app marketplace
- Open foundation and additional support for native Docker containers and VMs on edge hardware prevents lock-in and enables any brownfield or greenfield application
Following are the details.
ZEDEDA has announced direct integration with Kubernetes to simplify remote deployment and management of Kubernetes clusters on edge nodes at scale.
There is a clear trend for the majority of workloads to be containerized and to increasingly leverage Kubernetes for standardization, redundancy, and scale-out. However, this presents users with a challenge because the same tools developed for orchestrating and deploying Kubernetes in centralized data centers or the public cloud do not scale down to constrained edge nodes in the field. Organizations also face the challenge of transitioning to cloud-native development principles at the edge while accommodating their legacy software investments.
ZEDEDA’s cloud-based orchestration solution has a simple and intuitive UI along with comprehensive APIs that abstract all the complexities of provisioning Kubernetes clusters at the distributed edge, automating cluster bring-up on target edge nodes within minutes. With a few clicks or API calls, administrators without specialized IT skills can deploy Kubernetes clusters in the field and remotely manage the entire lifecycle of both their Kubernetes runtime and the underlying hardware at scale.
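The “few API calls” workflow described above would look something like the sketch below. The endpoint path, payload schema, and field names here are hypothetical stand-ins for illustration; consult ZEDEDA’s actual API reference for the real interface:

```python
import json
from urllib import request

def build_cluster_request(base_url: str, token: str, node_ids: list[str]):
    """Build a POST request asking the orchestrator to bring up a
    K3s cluster on the given edge nodes (hypothetical API shape)."""
    body = json.dumps({
        "runtime": "k3s",            # Kubernetes distribution to deploy
        "nodes": node_ids,           # target edge nodes in the field
        "app_source": "marketplace", # pull runtime from the app marketplace
    }).encode()
    return request.Request(
        f"{base_url}/v1/clusters", data=body, method="POST",
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"})

req = build_cluster_request("https://api.example.com", "TOKEN",
                            ["edge-node-01", "edge-node-02"])
# request.urlopen(req) would submit it; omitted here since the
# endpoint is illustrative.
```

The point of such an API is that the person issuing the call never touches kubeadm, kubeconfig files, or the hardware itself.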
“Adding Kubernetes support on top of our already industry-leading distributed edge orchestration solution gives our customers unmatched flexibility,” said ZEDEDA founder and CEO Said Ouissal. “By collaborating with SUSE, we’ve teamed up with the leader in Kubernetes to put these types of advanced deployments within reach for anyone.”
The solution integrates with customers’ existing CI/CD workflow, features a robust and unique Zero Trust security architecture and supports any combination of virtual machines (VMs) and native Docker containers. In addition to supporting the choice of Kubernetes runtime distribution, ZEDEDA’s unique enablement of VMs on edge nodes enables deployment of any combination of legacy Windows-based applications (e.g., SCADA, HMI, Historian, VMS, POS), monolithic Linux-based images, and other popular container runtimes such as Docker/Moby, Azure IoT Edge and AWS Greengrass.
Optimized for the Unique Needs of the Distributed Edge
Orchestration solutions initially built for the data center cannot seamlessly transition to meet the unique needs of distributed edge computing in areas of available compute footprint, autonomy, security and deployment scale. ZEDEDA’s solution is purpose-built to address these needs by starting with a lowest-common-denominator foundation that scales up to create a bridge to the data center paradigm.
Distributed edge nodes under management can range from IoT gateways to managed telco and universal enterprise CPE infrastructure to small remote server clusters. The solution supports any vertical use case including IoT, edge AI, virtualized networking and security technologies in deployments spanning the factory floor, retail stores, oil rigs, wind turbines, transportation systems and beyond. The full-stack edge orchestration approach is based on an eventual consistency model to ensure that edge nodes will run autonomously in these environments, even if they periodically lose connectivity to the ZEDEDA cloud orchestrator.
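The eventual consistency model mentioned above is essentially a reconcile loop: the node holds the last desired state it received and keeps converging toward it, connected or not. A minimal sketch of that idea, assuming nothing about ZEDEDA’s actual implementation:

```python
class EdgeNode:
    """Toy model of an autonomously reconciling edge node."""

    def __init__(self):
        self.desired = {}   # last desired state received from the orchestrator
        self.actual = {}    # what is actually running on the node

    def receive(self, desired_state: dict):
        """Called whenever the orchestrator is reachable."""
        self.desired = dict(desired_state)

    def reconcile(self):
        """Converge actual toward desired; safe to run while offline."""
        for app, version in self.desired.items():
            if self.actual.get(app) != version:
                self.actual[app] = version   # stand-in for deploy/upgrade
        for app in list(self.actual):
            if app not in self.desired:
                del self.actual[app]         # stand-in for teardown

node = EdgeNode()
node.receive({"scada": "2.1", "firewall": "1.0"})
node.reconcile()   # connectivity lost after this point...
node.reconcile()   # ...the node keeps converging autonomously
```

Because reconciliation needs only the cached desired state, a node on an oil rig or wind turbine keeps running its workloads through connectivity outages and simply re-syncs when the orchestrator is reachable again.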
ZEDEDA’s Zero Trust security architecture assumes that edge nodes distributed in the field are physically accessible, in addition to not having a defined network perimeter. Features include support for silicon-based root of trust, measured boot, remote attestation, crypto-based ID (eliminating local device login), full disk encryption, remote port blocking, distributed firewall and more. Distributed firewall capability enables secure routing of data between edge applications and both on-prem and cloud resources based on network-wide policies.
Collaboration with SUSE
Working with a leading oil and gas services company, ZEDEDA and SUSE have recently enabled an edge solution that consolidates existing SCADA software with NFV capability and other functionality on compute clusters within their trucks to optimize monitoring of oil wells through wireline analytics.
SUSE’s K3s runtime is now available in ZEDEDA’s built-in app marketplace, ready for bulk deployment on edge computing clusters with a few clicks.
“Edge is about management at scale, and our customers need a solution that provides low-level hardware management and visibility as they increasingly look to deploy Kubernetes clusters outside of centralized data centers,” said Keith Basil, vice president of Cloud Native Infrastructure at SUSE. “We think it’s great that ZEDEDA has integrated with Kubernetes and that K3s is now available in their marketplace. We are excited for their contribution to the rapidly growing Kubernetes ecosystem.”
Built on an Open Foundation to Prevent Lock-in and Facilitate Ecosystem Growth
ZEDEDA’s subscription-based cloud orchestration solution leverages the bare metal EVE-OS deployed on edge nodes. EVE-OS is an open, secure and universal operating system for distributed edge computing with vendor-neutral APIs, hosted within Project EVE in the Linux Foundation’s LF Edge organization. In addition to preventing vendor lock-in, EVE-OS provides an anchor point to unify an ecosystem of edge computing hardware and software, similar to what Android does for the mobile market.
The company’s growing open edge ecosystem provides end users, solution OEMs and system integrators with choice of hardware, applications and services, thereby greatly reducing the complexity of dealing with the inherently diverse edge landscape. The app marketplace that is part of ZEDEDA’s cloud orchestrator features popular edge applications spanning industrial connectivity, edge application frameworks, security, networking, analytics, data management and cloud connectivity. Customers can also create private marketplaces with their own curated and in-house developed content.
ZEDEDA’s strategic integration with Kubernetes follows its recent announcement of seamless integration with Microsoft Azure IoT to simplify the deployment and scalability of solutions leveraging Azure IoT. Additional integrations that further simplify distributed edge computing deployments are in development.
ThinkIQ introduced itself to me just a couple of months ago, and I set aside some time to talk with an acquaintance from the industrial software market, Niels Andersen. https://themanufacturingconnection.com/2021/02/manufacturing-saas-platform-tracks-material-flow-not-assets/ The company published a couple of news items this month. One announces an extension to the product line; the second announces a partnership with CESMII for a smart manufacturing initiative in the poultry processing industry.
ThinkIQ Adds VisualOps to Suite of Products
ThinkIQ announced VisualOps, which is designed to give organizations easy access to data from a material view, new visibility, and a path toward Industry 4.0 Manufacturing.
ThinkIQ VisualOps was created as a second step for companies on the path to Industry 4.0 Manufacturing. The benefits include having data standardized and available in one location, empowering manufacturing leaders, plant managers, process and data engineers, and operators, by allowing them to explore their manufacturing and supply chain data within the context of their business. The new function can also start the process of creating alerts and notifications that may bring problems to immediate attention.
“The addition of VisualOps allows customers to start the journey of monetizing their manufacturing and supply chain data using an Industry 4.0 Platform that will help them achieve their digital transformation goals,” said Niels Andersen, CTO and CPO of ThinkIQ. “This product will help organizations obtain the benefits of Industry 4.0 and lead them on the path to Smart Manufacturing.”
Some of the additional benefits of ThinkIQ VisualOps include:
- Ability to move companies past raw data to being able to explore, compare, and be aware of the data — with standardized metrics and views to bring wide visibility and context to what is currently just digital bits.
- Allows organizations to harness the power of what are mostly disconnected existing data streams from IoT, IIoT, HMIs, PLCs, CRM, MES, digitized manual data, and partner data, all into one single location.
- Includes on-premise gateways & connectors to centralize the data and securely send this data to the cloud, and most clients don’t need to add any new hardware or software to their existing environment.
- Software includes sourcing existing data from Automation, IoT and IIoT, CRM, and other digital captures, and also includes an equipment profile library, equipment modeling, manufacturing process layout, trending, standardized dashboards, and basic limits & notifications.
ThinkIQ’s SaaS Manufacturing cloud-based platform simplifies the creation of web-based applications and leverages the strengths of the Internet of Things, Big Data, Data Science, Semantic Modeling and Machine Learning. The platform collects data across the operation (existing and IIoT sensors) and leverages AI, ML to provide actionable real time insights (e.g., identify correlations and root causes, traceability and yield issues, etc.). It creates a new level of capability beyond what independent disconnected operating environments can provide today.
CESMII and ThinkIQ To Transform Global Food Leader’s Poultry Processing
CESMII selected ThinkIQ for inclusion in its co-funded Smart Manufacturing Innovation Projects. This project aims to transform poultry processing operations at one of the world’s largest food companies and a recognized leader in protein. The project seeks to quantify the impact of variability in the supply chain and the processing of chicken by-products, and to understand the resulting impact on yield for four product streams: bone meal, feather meal, chicken meal, and blood meal.
ThinkIQ will utilize the CESMII Smart Manufacturing Innovation Platform and create Profiles that optimize yield and material utilization on the food company’s poultry processing line. This will enable decisions based on real-time constraints in material flows, manufacturing operations, and energy consumption in a protein-based food processing environment. This project will demonstrate increased operational efficiencies that can be extended to other food processing and energy-intensive industries.
“Delivering value through interoperability and scalability is essential,” says Doug Lawson, CEO of ThinkIQ. “We are leveraging the CESMII SM Innovation Platform to avoid implementing yet another information island on the plant floor, thereby reducing cost and complexity, and facilitating rapid return on investment.”
ThinkIQ’s ability to track material flow through the entire process from “farm to fork” will give the company a unique view of its entire poultry processing operation, allowing it to gain visibility into real-time variance in its manufacturing process. This enables it to quantify and reduce the impact of variance to improve yield, provide feedback in “plant time” to reduce off-spec product and waste, and present profit-enhancement opportunities throughout the project.