Back in the 70s I actually had a job in a manufacturing company with a principal function regarding data: all the engineering, product, and cost data. Little did I know then that 45 years later data would be called the new oil. (Don’t really believe all the hype.)
I’ve been interested in DataOps, a relatively new function and set of applications for data management. I’ve been part of the HPE VIP Community for a few years. HPE isn’t talking as much about manufacturing lately, but many of its technologies are germane. I picked up this short post on data management from the Community site.
Data management complexity is everywhere in the modern enterprise, and it can feel insurmountable. IT infrastructure has evolved into a complex web of fragmented software and hardware in many organizations, with disparate management tools, complicated procurement processes, inflexible provisioning cycles, and siloed data. Organizations are running multiple storage platforms, each with its own management tools, which becomes a progressively tougher problem at scale.
Complexity is a growing problem, and according to a recent ESG study, 74% of IT decision makers acknowledge that it is holding them back in their digital transformation journey. Their data management capabilities simply can’t keep pace with business demands. As a result, IT organizations are forced to spend time managing the infrastructure, as opposed to really leveraging the data.
Constant firefighting might sound like business as usual to many tech pros, but it can be avoided. To accelerate transformation, IT leaders must confront and eliminate data complexity first, getting data to flow where and when it needs to, and allowing the business to get back to business.
Of course, HPE has a solution—GreenLake for Block Storage.
What if the time has come to rethink all these specific silos and strategies that we build software solutions around?
Folk/rock group The Byrds popularized a Pete Seeger tune in the 1960s, “To everything (turn, turn, turn) there is a season (turn, turn, turn) and a time for every purpose under heaven.”
The time has come to rethink all the departmental silos manufacturing executives constructed over the years, with vendors targeting their applications to fit. This era of the Internet of Things (IoT), sensor-driven real-time data, innovative unstructured databases, powerful analytics engines, and visualization provides us with new ways of thinking about organizing manufacturing.
HMI/SCADA can become IoT-enabling software, expanding beyond its normal visualization role. MES software can break the bounds of traditional silos. Rather than just quality metrics, OEE calculators, or maintenance schedulers, what if we thought of MES as operational intelligence bringing disparate parts together? It could provide managers at all levels the kind of information needed for better, faster decision making.
I have worked with a number of maintenance and reliability media companies. They have all been embroiled in discussions of the comparative value of maintenance strategies: Reactive (run-to-failure), Preventive, Predictive, and Reliability-centered. These are presented as a continuum progressing from the Stone Age to Star Trek, and the discussions always turn to which is best.
The IT companies I have worked with fixated on predictive. They had powerful predictive analytics to combine with their database and compute capabilities and saw that as the Next Big Thing. They were wrong.
I was taught early in my career that Preventive was also known as scheduled maintenance. Management sends technicians out on rounds on a regular basis with lube equipment and meters to check out and lubricate and adjust. As often as not, these adjustments would disturb the Force and something would break down.
What if? What if we sent all the sensor data from equipment to a powerful database in the cloud? What if we used that data to intelligently dispatch technicians to the necessary equipment with the appropriate tools, fixing things before they break and at a time coordinated with production?
I was recently introduced to a company called Matics through a long-time marketing contact. The company wanted to talk about a second definition of preventive maintenance: not just scheduled rounds, but using sensor-driven, or IoT, data to feed its Central Data Repository with the goal of providing Real-time Operational Intelligence (RtOI) to its customers.
According to Matics, its RtOI system has provided customers with:
- 25% increased machine availability
- 30% decrease in rejects
- 10% reduction in energy consumption
Smarter preventive maintenance leverages continuous condition monitoring to target as-needed maintenance, resulting in fewer unnecessary checks and less machine stoppage for repair.
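The dispatch-on-condition idea can be sketched in a few lines. This is a minimal illustration, not Matics' actual logic; the sensor names and alarm limits here are hypothetical.

```python
# Hypothetical alarm limits for a couple of common condition signals.
SENSOR_LIMITS = {"vibration_mm_s": 7.1, "bearing_temp_c": 85.0}

def machines_needing_service(latest_readings):
    """Return machines whose latest readings breach an alarm limit."""
    flagged = {}
    for machine, readings in latest_readings.items():
        breaches = {name: value for name, value in readings.items()
                    if name in SENSOR_LIMITS and value > SENSOR_LIMITS[name]}
        if breaches:
            flagged[machine] = breaches
    return flagged

# Illustrative readings: press_01 is vibrating above its limit.
latest = {
    "press_01": {"vibration_mm_s": 8.3, "bearing_temp_c": 71.0},
    "press_02": {"vibration_mm_s": 3.2, "bearing_temp_c": 64.0},
}
to_dispatch = machines_needing_service(latest)
```

Only the machines that actually need attention generate a work order, which is the whole point: rounds driven by condition rather than by the calendar.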
I am not trying to write a marketing piece for Matics, although the company does compensate me for some content development. But their software provides me a way to riff into a new way of thinking.
Usually product engineers and marketing people will show me a new product. I’ll become enthused. “Wow, this is cool. Now if you could just do this and this…” I drive product people crazy in those meetings. I think the same here. I like the approach. Now, if customers can take the ball and run with it thinking about manufacturing in a new way, that would be cool—and beneficial and profitable. I think innovative managers and engineers could find new ways to bring engineering, production, and maintenance together in a more collaborative way around real-time information.
I’ve felt DataOps was destined to be an important data management tool since I was introduced to it a few years back. Hitachi Vantara is one of two companies I follow specifically bringing this technology to industrial applications. Here it introduces a new portfolio bringing IIoT specifically into the core, working with digital twins, machine learning (ML), and user interfaces.
Background. Ultimately, most industrial IoT difficulties are rooted in data management shortcomings. However, these challenges are not the same as those faced in a purely IT setting. For example, operational technology (OT) data is high-velocity time series and event information that often lacks the detailed metadata descriptors and features needed to leverage it outside of the operations organization. In comparison, business IT data arrives in batches or transaction records with different metadata descriptors, where time-stamp references are not always correlated. Merging these datasets, in context, is not trivial work, but, if done right, it yields new operational insights that can provide a competitive advantage.
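The timestamp-correlation problem is concrete: high-rate OT samples must be matched to the most recent IT record in effect at each moment (an "as-of" join). A minimal sketch with made-up data:

```python
import bisect
from datetime import datetime, timedelta

# Made-up data: OT sensor samples every 10 seconds, plus two sparse
# IT batch records (hypothetical names and times).
base = datetime(2022, 6, 1, 8, 0, 0)
ot_samples = [(base + timedelta(seconds=10 * i), 100.0 + i) for i in range(60)]
batches = [(base + timedelta(minutes=2), "BATCH-A"),
           (base + timedelta(minutes=7), "BATCH-B")]

# "As-of" join: tag each sensor sample with the most recent batch
# record whose timestamp is at or before the sample's timestamp.
batch_times = [t for t, _ in batches]

def batch_for(ts):
    i = bisect.bisect_right(batch_times, ts) - 1
    return batches[i][1] if i >= 0 else None

merged = [(ts, value, batch_for(ts)) for ts, value in ot_samples]
```

Samples taken before the first batch record get no context, which is exactly the kind of gap a DataOps layer has to detect and handle.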
Lumada Industrial DataOps automates the process of abstracting, tagging, and rationalizing IT and OT data and organizes it in the data lake or data warehouse so it is usable for analysis and building AI and ML solutions. Data pipelines are established, and multiple transformations and inferences can be calculated and orchestrated as part of the workflow. Industrial process engineers can work with data scientists, analysts, and applications consultants to unlock the combined value and make major operations improvements.
But reality hasn’t lived up to the promise, and industrial operations have had a mixed relationship with IoT technologies. While there has been considerable success at the project level, broad IIoT deployments and the resulting analytics capabilities have progressed in fits and starts. Enterprises will need to leverage IIoT as well as AI and ML technology across far more use cases to better support their existing workforce and overcome supply chain issues. It turns out that it is more complicated than developers anticipated to scale their IIoT proofs of concept to stretch across a company.
The Lumada Industrial DataOps portfolio adds IIoT Core software with IIoT platform framework capabilities. The new toolkit is delivered as IIoT Analytics to accelerate the convergence of traditional IT with expanding IIoT data sources and bring powerful new software-based capabilities to life. IIoT Analytics offers prepackaged modules that provide data integration and preconfigured functions that give you a faster start on your application so you can focus on fine-tuning it for your specific requirements. A typical IIoT Analytics toolkit includes:
- Digital twins for data and asset organization
- ML models for faster assembly
- Simulation software interfaces for greater accuracy
- ML services framework for deploying AI/ML applications
Lumada Industrial DataOps directly addresses the four key challenges that hinder the enterprise-wide expansion of IIoT applications.
Challenge 1: The Need for High-Level Data Management Organizations need solutions that make it possible to access data in motion and at rest from the widest array of sources, integrate all that data, transform the data, and perform analysis. While all that happens, data security must be maintained and policies enforced to adhere to compliance and governance requirements.
Challenge 2: Automating Data Organization To create an efficient production pipeline for AI models, data scientists and analysts need an environment within which they can organize data and build models to detect events. This requires a system that automates the data analysis function, rejecting noise and providing people with a rich data signal that can be predictive or prescriptive in context.
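Rejecting noise while surfacing a rich event signal can be as simple as flagging deviations from a trailing baseline. This toy rolling z-score detector is purely illustrative, not Lumada's method:

```python
import statistics

def detect_events(values, window=20, threshold=3.0):
    """Flag indices whose value deviates sharply from the trailing window."""
    events = []
    for i in range(window, len(values)):
        history = values[i - window:i]
        mean = statistics.fmean(history)
        spread = statistics.pstdev(history) or 1e-9  # guard flat signals
        if abs(values[i] - mean) / spread > threshold:
            events.append(i)
    return events

# A repetitive machine signal with one injected fault at index 60.
signal = [10.0 + 0.1 * (i % 5) for i in range(100)]
signal[60] = 25.0
```

The ordinary machine cycle never trips the threshold; only the injected fault does, which is the "rich data signal" idea in miniature.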
Challenge 3: Accelerate the Training of AI Models Starting every model from scratch is not practical, as this approach may introduce delays and costs that get in the way of meeting business objectives. Data science personnel instead need templates that provide a proven foundation that they can then refine and adapt to meet specific requirements in a timely manner.
Challenge 4: Shorten Application Delivery Time Engineers and developers also need ready-made application components that provide a starting point.
Using Lumada Industrial DataOps, organizations can accelerate their development of digital twins, which can be further combined with new AI and ML analytic templates that address a variety of critical industrial activities. These analytics include anomaly detection and prediction capabilities for maintenance and operations effectiveness. These data management and application building blocks support the many industry-specific solutions offered by Hitachi to speed cooperative deployment efforts for Hitachi clients and partner organizations.
Lumada Industrial DataOps embraces the synergistic relationships between data management, AI applications, and the next-level decision-making required in modern industrial environments. With Lumada Industrial DataOps, Hitachi empowers industrial enterprises to move their IIoT-driven AI applications out of the endless pilot phase and more quickly develop and scale for enterprise-wide deployment.
This news is a bit old, dating from the first of May, but it remains relevant: another look at the major IT companies looking for a market in manufacturing. It holds personal interest in that once again I am not invited back to an IT company user conference because the company tried a manufacturing vertical without success. (I could have told them, but that story will hold for another place and time.)
Google Cloud has co-developed Manufacturing Data Engine and Manufacturing Connect. These solutions are said to enable manufacturers to connect historically siloed assets, process and standardize data, and improve visibility from the factory floor to the cloud. Once data is harmonized, the solutions enable three critical AI- and analytics-based use cases: manufacturing analytics & insights, predictive maintenance, and machine-level anomaly detection.
- Ford, Kyocera, and Phononic among early customers to enhance data transparency and optimize production with new manufacturing-specific solutions
- Cognizant, C3 AI, GFT, Intel, Litmus, Quantiphi, SoftServe, Sotec, and Splunk are among the partners supporting the new solutions
Manufacturing Data Engine and Manufacturing Connect, available today, help manufacturers unify their data and empower their workforce with easy-to-use analytics and AI solutions based on cloud infrastructure.
This continues yesterday’s discussion about DataOps. The rapid move to data organizations and technology in manufacturing continues to amaze me.
Manufacturing Data Engine is an end-to-end solution that processes, contextualizes, and stores factory data on Google Cloud’s data platform. It provides a configurable and customizable blueprint for the ingestion, transformation, storage, and access to factory data. It integrates key Google Cloud products, including Cloud Dataflow, PubSub, BigQuery, Cloud Storage, Looker, Vertex AI, Apigee, and more, into a manufacturing-specific solution.
Manufacturing Connect is a factory edge platform co-developed with Litmus Automation that quickly connects to, and streams data from, nearly any manufacturing asset and industrial system to Google Cloud, based on an extensive library of more than 250 machine protocols. Deep integration with the Manufacturing Data Engine unlocks rapid data intake into Google Cloud for processing machine and sensor data. The ability to deploy containerized applications and ML models to the edge enables new dimensions of use cases.
Once data is centralized and harmonized by the Manufacturing Data Engine and Manufacturing Connect, it can then be used to address a growing set of industry-specific use cases.
Data holds no value unless it can be analyzed and visualized.
Manufacturing analytics & insights helps manufacturers quickly create custom dashboards to visualize key data, from factory KPIs such as Overall Equipment Effectiveness (OEE) to individual machine sensor data. Because it is integrated with the Manufacturing Data Engine, engineers and plant managers can automatically set up new machines and factories, enabling standardized dashboards, KPIs, and on-demand drill-downs into the data to uncover new insights throughout the factory. These can then be shared easily across the enterprise and with partners.
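OEE itself is a standard calculation: the product of availability, performance, and quality. A quick sketch with illustrative shift numbers (not Google's implementation):

```python
def oee(planned_minutes, run_minutes, ideal_cycle_minutes,
        total_count, good_count):
    """Overall Equipment Effectiveness = availability x performance x quality."""
    availability = run_minutes / planned_minutes            # uptime share
    performance = (ideal_cycle_minutes * total_count) / run_minutes
    quality = good_count / total_count                      # first-pass yield
    return availability * performance * quality

# Illustrative shift: 480 planned minutes, 432 minutes running,
# 1-minute ideal cycle, 400 parts produced, 392 good.
shift_oee = oee(480, 432, 1.0, 400, 392)
```

With these numbers availability is 90%, performance about 92.6%, and quality 98%, for an OEE of roughly 82%, the kind of figure such dashboards roll up per line and per plant.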
According to ESG’s April 2022 The State of DataOps survey sponsored by HighByte, 97% of organizations face data integration challenges. One of the most common integration challenges is the inability to apply automation to processes or workflows, as reported by 28% of respondents.
Industrial software company HighByte addressed this problem by adding templates and expanding connectivity to leading data lakes and warehouses with HighByte Intelligence Hub version 2.4.
I’ve been bullish on the utility of DataOps and HighByte’s approach to solving the problems of data. This news relates to the latest version of its Intelligence Hub—version 2.4.
This version enables users to scale data operations projects faster with less effort across a wider range of connections. The release introduces several new features, including instance and input templates, custom conditions, global functions, and parameters that enable users to define common, reusable components within the Intelligence Hub to improve its speed of deployment and maintainability. The latest release also includes new native connections to leading data lakes, warehouses, and sources, including Amazon S3, Amazon Redshift, Microsoft Azure Blob Storage, and Modbus.
“The Intelligence Hub will enable us to simplify our data architecture, making our data both easier to manage and more accessible to a wider audience through a unified namespace,” said Hayden Lovett, Global CI Engineer Specialist at Scholle IPN. Now part of SIG, Scholle IPN is a global leader in innovative sustainable packaging solutions. “With the new templating capabilities and custom conditions in the Intelligence Hub, we can rapidly create and manage modeled performance and OEE data for our machines. The ability to merge data from many sources will make new use cases possible, like precisely measuring how quickly machine issues are resolved and who performed them. It’s a powerful tool.”
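The value of templating is that one model definition can be stamped out per machine by filling in parameters. This sketch is a generic illustration, not the Intelligence Hub's actual configuration format; the model name and tag paths are hypothetical.

```python
# One shared model definition with parameter placeholders in its tag paths.
TEMPLATE = {
    "model": "OEE",
    "inputs": {
        "run_time": "{plc}/status/run_time",
        "good_count": "{plc}/counters/good",
        "total_count": "{plc}/counters/total",
    },
}

def instantiate(template, **params):
    """Fill the template's tag-path placeholders with concrete values."""
    inputs = {name: path.format(**params)
              for name, path in template["inputs"].items()}
    return {"model": template["model"], "inputs": inputs}

# Stamp out an instance of the model for one machine.
line1 = instantiate(TEMPLATE, plc="plc-line1")
```

Adding a machine then means supplying one parameter, not rebuilding the whole model, which is what makes projects scale with less effort.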
To learn more about the release and see a live demonstration of the software, please register for the webinar, “More Data, Less Clicks: Automate Your Industrial Data Operations with Templates and Event Flows,” on June 29, 2022 at 11:00 AM ET.
GE Digital held multiple meetings at the recent ARC Forum discussing a number of advancements. Executives talked frankly about the coming breakup of GE corporate, with a positive spin on Digital landing with the Energy group. I’m not a financial analyst, but this restructuring seems to meet with widespread approval. Perhaps with a new CEO at Digital and a new organization we’ll see even more from it. Following are summaries of four announcements from last week.
Proficy Plant Applications Upgrades
GE Digital announced the latest version of its Manufacturing Execution System (MES) Proficy Plant Applications, part of the Proficy Smart Factory portfolio. New features in the 2022 version include expanded discrete manufacturing capabilities developed through co-innovation with existing customers as well as a better user experience for the modern connected worker.
Customer results with Proficy Plant Applications have included saving 45 minutes per shift, per line, per business unit at one of the world’s largest consumer packaged goods (CPG) companies; 3X increase in throughput for an automotive battery manufacturer; and 80% reduction in downtime at a pharmaceutical company.
The new discrete features include extending support for serialized and lot-based discrete manufacturing such as job shops and complex discrete environments. These extensions include the ability to support the execution of very large lots plus the flow of complex routes that include optional, parallel, and mutually exclusive operations. Proficy Plant Applications has also expanded the scope of discrete to include complex receiving inspection rules as well as linking them to non-conformance.
Implementation speed of the Proficy Plant Applications solution has been improved with a new low-code Data Flow Editor, easing integration with other systems and the modeling of complex logic. These capabilities allow faster installs and upgrades, including seamless ERP integration with approximately 20 out-of-the-box messages for MES-ERP communication.
Proficy Plant Applications is also designed to empower the modern Connected Worker and operating teams with easy, flexible access to their manufacturing data for better decision making. The software provides a contemporary Web User Interface (UI) with widgets embedded in out-of-the-box screens and available for custom screen development in the software’s no-code development environment.
Available on-premises and in the cloud, the software includes four modules: Efficiency Management, Production Management and Tracking, Quality Management, and Batch Analysis. Proficy Plant Applications is used at thousands of diverse manufacturing sites around the world including top food and beverage, non-food consumer packaged goods, automotive, pharmaceutical, aerospace, chemicals, heavy equipment, and other manufacturers.
GE Digital Partners with Visionaize to Provide 3D Visualization Capabilities to APM
GE Digital announced a technology partnership with Visionaize, a provider of world-class operational 3D Digital Twin solutions. This technology is designed to help empower data-driven businesses with high accuracy visualization capabilities for operational use cases that can help to save time, reduce cost, and improve safety. Visionaize’s V-Suite Starter can be incorporated into GE Digital’s APM Mechanical Integrity software designed to help streamline maintenance activities and improve worker efficiency across asset intensive industries.
While industrial companies can choose to review data and information in any number of formats including plant blueprints, Piping and Instrumentation Diagrams (P&ID), and Isometrics through various means of CAD/CAM software solutions, the implementation of V-Suite Starter is designed to allow users to view real-time and historical asset data on the 3D model to deliver timely and valuable insights at the point of action. The 3D visualizations of key asset information combine “see it, model it, and understand future behavior” to help asset-intensive companies solve key challenges associated with asset downtime, quicker time to resolution, maintenance costs, and workforce productivity.
Visionaize integrates 3D plant models with APM Mechanical Integrity to provide data-driven visualization of key APM Integrity attributes by color coding the model to filter out noise and focus on what matters most. The ability to dynamically visualize and contextualize data for operations and maintenance activities helps make inspectors, planners, maintenance technicians, corrosion analysts, and Risk Based Inspection (RBI) analysts more efficient and effective in their daily jobs.
Utilizing 3D visualization in GE Digital’s APM solution can help improve worker safety and productivity with situational awareness and less time spent in hazardous environments. An accurate 3D model is designed to allow recovery planning even before workers can access the site itself to facilitate faster time to resolution. And it can help to optimize maintenance programs with more efficient scheduling to accelerate planned and unplanned downtime activities.
GE Digital Launches Accelerator Tools to Help Energy Companies Accelerate Digital Transformation
GE Digital announced its new Accelerator product line adding best-in-class software tools to help empower companies in asset-intensive energy industries to quickly configure its Asset Performance Management (APM) and other offerings. GE Digital Accelerators are designed to enable faster time to value, scale APM to a wider range of assets, and help enterprises gain financial value at both their facilities and across the enterprise.
GE Digital’s first set of Accelerators includes an extensive library of predictive analytics, asset management strategies, and standard health and reliability processes. These can be applied to the entire energy value chain, from oil & gas to power generation — ranging from renewables like wind and solar to more traditional gas power and nuclear production assets.
Energy companies have used asset management practices for decades on the highest criticality equipment, ensuring high availability to meet the demands of their customers. The increasing complexity and scale of industrial facilities, along with the need for market flexibility, requires applying asset performance practices to a much wider array of equipment, using more advanced digital technologies. Accelerators will help energy companies expand asset management processes without consuming more time and resources building custom configurations and health monitoring techniques.
GE Digital Achieves AWS Energy Competency Status
GE Digital announced that it has achieved Amazon Web Services (AWS) Energy Competency status. This designation recognizes that GE Digital has demonstrated deep expertise leveraging AWS to build, implement, and integrate technology that transforms complex business and operational systems to help accelerate the energy transition.
GE Digital’s Asset Performance Management (APM) solutions for both Power Generation and Oil & Gas industries were the basis of receiving the AWS Energy Competency status. These software solutions are designed to help optimize the performance of industrial assets to increase reliability and availability, to minimize costs, and reduce operational risks. For power generation, this is an imperative as plants must be able to reliably respond to more dynamic operating models as more and more renewables are added.
To receive the AWS Energy Competency Partner status, AWS Partners undergo a rigorous technical validation process, including a customer reference audit. The AWS Energy Competency provides energy customers the ability to more easily select skilled partners to help accelerate their digital transformation with confidence.