Looks like we have a serious new entrant for cloud services for manufacturing software—Google Cloud. For the past many months, AWS and Azure have been the cloud infrastructure partners of choice. Google Cloud has begun making some inroads. Any thoughts on how this will shake out?
I thought I was getting a press release. This looks to be more like a blog post by Dominik Wee, Managing Director Manufacturing and Industrial of Google Cloud.
Across the industrial sector, many digital enablers and disruptive technologies are transforming businesses to be more efficient, profitable, nimble, and secure: smart factories with connected machines, the rise of the industrial internet of things (IIoT), proliferation of sensors and data, and new cloud strategies, to name a few.
Our work with the PI System is designed to help industrial companies modernize their data and take it beyond the operational space to Google Cloud, deriving more insights and business value. Thanks to our partnership with AMD and OSIsoft, now part of AVEVA, customers can effectively, safely, and easily deploy a fully functioning PI System to Google Cloud. Together, we assist customers along that transformational journey through the launch of GCP deployment scripts for PI Core.
The PI System is the market-leading data management platform for industrial operations in essential industries such as energy, mining, oil and gas, utilities, pharmaceutical, facilities, manufacturing, and more. The PI System automatically captures sensor data from every relevant source, refines and contextualizes the data, and delivers it to a broad ecosystem of people, tools, and applications for analysis and strategic planning. It also serves as a single source of truth for consistent decision making.
With the PI System, industrial companies can generate insights from their data, including:
Analyzing seasonal trends
Determining if utilities are meeting the demands of production
Comparing the performance of different lots of raw material
Determining when maintenance is required on equipment
Optimizing the utilization or performance of a production line
The new deployment scripts are a natural addition to PI Integrator for Business Analytics, which integrates Google Cloud with PI Core, the on-premises portion of the PI System. PI Integrator for Business Analytics can deliver critical operational data directly to Google Cloud destinations like Cloud Storage, BigQuery, and Pub/Sub. It can also be deployed either on-prem or on Google Cloud.
The solution allows customers to take time-series data from one or many of their PI Core deployments and move it onto BigQuery, Google Cloud’s enterprise data warehouse for analytics. Customers can take advantage of exabyte-scale storage and petabyte-scale SQL queries in this high-speed, in-memory BI engine, enabling cloud-based artificial intelligence (AI) and machine learning (ML) tools that provide deep insights like anomaly detection and predictive maintenance from that data.
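To make the cloud-side analysis concrete, here is a minimal, hypothetical sketch of the kind of anomaly detection such a pipeline enables once time-series sensor data has landed in BigQuery. The rolling z-score approach, sample data, and threshold are my own illustration, not part of the OSIsoft or Google tooling:

```python
from statistics import mean, stdev

def zscore_anomalies(readings, window=5, threshold=3.0):
    """Flag indices where a reading deviates from the trailing
    window mean by more than `threshold` standard deviations.
    A toy stand-in for the cloud-side AI/ML analysis described above."""
    anomalies = []
    for i in range(window, len(readings)):
        hist = readings[i - window:i]
        mu, sigma = mean(hist), stdev(hist)
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# Ten steady boiler-temperature readings with one spike at index 7.
temps = [70.1, 70.0, 70.2, 69.9, 70.1, 70.0, 70.2, 95.0, 70.1, 70.0]
print(zscore_anomalies(temps))  # → [7]
```

A production pipeline would run this class of model over BigQuery tables with Google Cloud's AI/ML services rather than in-process Python; the sketch only shows the shape of the analysis.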
Through our partnership, PI Core-based solutions for industrial customers are further protected by Google Cloud’s robust security features, such as default encryption of cloud data at rest, that keep data safe and secure. What’s more, PI Core is aligned with real-time mission-critical industrial applications, avoiding delays in time-sensitive process automation or discrete manufacturing processes.
Serving a new slice of PI (System) on Google Cloud
GCP deployment scripts for PI Core are the latest achievement in our partnership. We developed these deployment scripts for PI Core on Google Cloud using Infrastructure as Code (IaC) with PowerShell and Terraform technologies. With PI Core on Google Cloud, customers can generate business insights through the power of Google Cloud’s Smart Data Platform with AI/ML combined with other contextualized data. The scripts accelerate an industrial customer’s digital transformation and support quick and iterative testing and prototyping. They’re also designed for customers who are considering moving their on-prem system to Google Cloud, or customers who would like to deploy PI Core as an aggregation system to bring diverse on-prem data from multiple PI Systems into one place.
These deployment scripts automate the provisioning of the PI Core components running on Google Cloud N2D VM instances powered by hardware such as 2nd Gen AMD EPYC processors. GCP deployment scripts for PI Core include recommended practices for both platforms by delivering automation to deploy test and development environments.
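Since the post names Terraform as one of the IaC technologies, a fragment of such a deployment script might look roughly like the following. This is a hypothetical sketch only: the project ID, image, and machine-type values are placeholders, and the actual scripts on Google's GitHub differ.

```hcl
# Hypothetical sketch only; the published OSIsoft/Google scripts differ.
provider "google" {
  project = "my-industrial-project"   # placeholder project ID
  region  = "us-central1"
}

# An N2D instance (2nd Gen AMD EPYC) to host a PI Core test component.
resource "google_compute_instance" "pi_core_test" {
  name         = "pi-core-test"
  machine_type = "n2d-standard-8"
  zone         = "us-central1-a"

  boot_disk {
    initialize_params {
      # PI Core runs on Windows Server; image name is illustrative.
      image = "windows-server-2019-dc-v20210608"
    }
  }

  network_interface {
    network = "default"
  }
}
```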
The scripts are offered in non-high availability (non-HA) configuration, with an HA version providing failover capabilities for PI Core coming soon. These scripts are open-sourced, and the non-HA version is now available on Google’s GitHub.
By running PI Core on Google Cloud, the industrial sector has an easy and efficient path to business insights, generating more value from its collected data in a safe and managed environment. Enabling PI Core deployment on Google Cloud through scripts is just one of the solutions that we’re building together. Stay tuned for more updates.
The team that started HighByte picked an open spot called DataOps in the OT/IT pantheon that I felt was today’s sweet spot. The IT companies introduced me to DataOps (the next thing past DevOps, I guess). The application looked, shall we say, useful. HighByte is now up to version 1.4, just released, and already has many applications in the wild. It is well worth your time to check out.
HighByte announced the release of HighByte Intelligence Hub version 1.4. HighByte Intelligence Hub is an Industrial DataOps solution enabling manufacturers to streamline their data architecture and reduce time to deploy new systems. The software supports open standards and provides Operations Technology (OT) teams with the flexibility to merge, prepare, and deliver industrial information to and from IT systems without writing code. With its latest release, HighByte Intelligence Hub now supports complex, multi-value sets of data, a broader range of disparate data sources, flow consolidation, and additional security features.
“HighByte Intelligence Hub enables me to easily model data—adding context, segmenting, and publishing data to the unified namespace,” said Tim Burris, a software architect at Ectobox. Ectobox is an industrial intelligence solutions company specializing in data-driven manufacturing, ISA-95, and software consulting and development for their clients. “The time HighByte Intelligence Hub has saved me is incredible. I’m using the flow capabilities, updated SQL connector, and REST input to reduce custom code and deliver a maintainable integration solution to our clients. It’s a game changer.”
HighByte works with a network of system integrators and international distributors to take HighByte Intelligence Hub to market, serving customers in a wide variety of vertical markets including biotechnology, packaging, industrial products, and academic labs. “This release focused on meeting the needs of our customers and partners with a more sophisticated solution that can handle complex data,” said HighByte CEO Tony Paine. “HighByte Intelligence Hub is ready to support Edge-to-Cloud and Cloud-to-Edge use cases.”
HighByte Intelligence Hub is an Industrial DataOps application designed for scale. The latest release includes support for complex data sets that can be processed through HighByte Intelligence Hub’s modeling engine, transforming and contextualizing information at high volumes. The release also includes new codeless connections for CSV files and JDBC drivers, significant improvements to existing SQL, MQTT, and OPC UA connectors, multiple flow inputs and outputs, array handling, and REST templates. Security enhancements include secure connections over MQTT with SSL encryption as well as payload encryption and usernames and passwords for OPC UA connections.
I copied the information below from the HighByte blog written by Aron Semle to add more context and user ideas to the product news: https://www.highbyte.com/blog/new-use-cases-highbyte-intelligence-hub-version-1-4-is-here
Highlights & Use Cases
Use Multiple Flow Inputs and Outputs. Let’s say you’ve created a “boiler” model to model the ten boilers you have at your facility. Now you can create a single flow that sends data for all ten of the boilers to an AWS IoT Core topic or all the boiler data to multiple outputs, like a REST endpoint for debugging and Azure IoT Hub for production.
See Your Data. Now you can easily create and test any input in the UI, allowing you to see what data and schema the input returns. Not sure where that “amps” attribute was in that JSON payload? Simply expand the input in the expression builder and see all the data and schema right in the UI.
Work With Arrays. Arrays are everywhere, from OPC UA arrays representing set points in the PLC, to an array of JSON objects representing motor running hours from a maintenance system. HighByte Intelligence Hub now supports inputting and outputting simple and complex array types. You can index arrays in expressions, slice them, dice them, and even build arrays out of primitive OPC UA tags.
Read CSV Files. Many older industrial devices, especially testing equipment, write their data to a CSV file. This new input makes it easy to read one or more CSV files from a directory. Files can be static, or the input can index files, reading them in chunks and moving them to a completion directory when done.
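The file-indexing pattern described here can be sketched in a few lines of Python (the directory layout and column names are invented for illustration; the Intelligence Hub itself does this without code):

```python
import csv
import os
import shutil
import tempfile

def ingest_csv_dir(src_dir, done_dir):
    """Read every CSV in src_dir, collect its rows as dicts, then move
    each file to a completion directory — the pattern described above."""
    os.makedirs(done_dir, exist_ok=True)
    rows = []
    for name in sorted(os.listdir(src_dir)):
        if not name.endswith(".csv"):
            continue
        path = os.path.join(src_dir, name)
        with open(path, newline="") as f:
            rows.extend(dict(r) for r in csv.DictReader(f))
        shutil.move(path, os.path.join(done_dir, name))
    return rows

# Demo with a temporary directory and one fake test-equipment file.
src = tempfile.mkdtemp()
done = os.path.join(src, "complete")
with open(os.path.join(src, "test01.csv"), "w") as f:
    f.write("serial,torque\nA100,12.5\nA101,12.7\n")
records = ingest_csv_dir(src, done)
print(records)
```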
Get More From Your OPC UA Connections. OPC UA now supports encryption and username and password. The connector also supports outputting a model to a tag group (i.e., multiple tags), making getting modeled data into an OPC UA server even easier.
Move SQL data Into and Out of Your UNS. You can now read multiple rows from a SQL table, run them through a model to do things like change the column name or add in new data, and output them to MQTT as an array of JSON objects representing each row. This makes getting SQL data into your UNS easy. The SQL connector output also now supports table creation, so you can connect an MQTT input into the UNS, subscribe to a topic, and output the data model directly to a SQL table to review data locally. SQL outputs can also call stored procedures or log data as JSON in cases where the model schema isn’t finalized.
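That SQL-to-UNS flow can be sketched with Python's built-in sqlite3 as a stand-in for a plant database (the table, column, and model names are invented; the Hub performs this step codelessly):

```python
import json
import sqlite3

# In-memory table standing in for a plant SQL database.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE readings (machine TEXT, temp_c REAL)")
db.executemany("INSERT INTO readings VALUES (?, ?)",
               [("press-1", 81.2), ("press-2", 79.8)])

# Read rows, remodel the column names, and emit a JSON array — the
# payload that would be published to an MQTT topic in the UNS.
rows = db.execute("SELECT machine, temp_c FROM readings").fetchall()
payload = json.dumps(
    [{"assetId": m, "temperature": t} for m, t in rows]
)
print(payload)
```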
Leverage REST Templates. Do you need to get data into InfluxDB using their line protocol? Or Elastic Search? The REST output now allows you to transform the default JSON output to any format you need, including XML. This makes integration with REST based interfaces easy and flexible.
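As an illustration, a template that reshapes a record into InfluxDB line protocol behaves roughly like this sketch (the measurement and field names are invented; real line protocol also requires escaping and string-field quoting, omitted here):

```python
def to_line_protocol(measurement, tags, fields, ts_ns):
    """Render one record in InfluxDB line protocol:
    measurement,tag=val field=val timestamp(ns)."""
    tag_str = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    field_str = ",".join(f"{k}={v}" for k, v in sorted(fields.items()))
    return f"{measurement},{tag_str} {field_str} {ts_ns}"

line = to_line_protocol(
    "boiler", {"site": "plant1"}, {"temp": 88.5}, 1622547800000000000
)
print(line)  # → boiler,site=plant1 temp=88.5 1622547800000000000
```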
Enable Cloud-to-Edge with MQTT. MQTT now supports inputs and secure (TLS) connections, allowing you to connect to MQTT brokers like AWS IoT Core. This enables “Cloud-to-Edge” use cases, where machine learning alerts generated in AWS can be sent back down to HighByte Intelligence Hub, transformed via modeling, and then output to protocols like Sparkplug, making the data immediately available in any Sparkplug-enabled SCADA system like Ignition. Another common use case with a similar data flow is streaming sensor data to AWS IoT Core from third-party sensor providers (e.g., LoRa sensors). This data can be pulled in, modeled, and output to the SCADA system using HighByte Intelligence Hub.
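Schematically, the remodeling step in that cloud-to-edge flow looks like the following sketch. The alert schema and metric layout are invented simplifications; real Sparkplug B payloads are protobuf-encoded over MQTT, and the Hub does this transformation without code:

```python
import time

def alert_to_sparkplug_metric(alert):
    """Remodel a cloud ML alert (invented schema) into a simplified
    Sparkplug-style metric dict. This sketch shows only the reshaping
    step between the cloud broker and the SCADA-facing output."""
    return {
        "name": f"{alert['asset']}/AnomalyScore",
        "timestamp": alert.get("ts", int(time.time() * 1000)),
        "dataType": "Float",
        "value": alert["score"],
    }

metric = alert_to_sparkplug_metric(
    {"asset": "Pump7", "score": 0.93, "ts": 1622547800000}
)
print(metric)
```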
The rush to improve bottom lines and build closer relationships to customers through adding a valuable services portfolio has progressed from novel to commonplace. I’ve seen major IT companies pivot to a services-first strategy. This announcement of new services from GE Digital is a nice step forward.
GE Digital announced the availability of new Professional Services offerings. Quick Start for Industrial Advanced Analytics, Quick Start for Proficy Operations Hub, and Quick Start for OEE allow industrial companies to partner with GE Digital to deploy the company’s software with action plans for rapid digital transformation and continuous improvement.
“The market is moving fast, driving our customers to manage more SKUs, shorter production runs, and more line changeovers. Our solutions can scale from plant to enterprise-wide and then beyond the four walls, from Plant Operations to Enterprise execution across many industries,” said Richard Kenedi, General Manager of GE Digital’s Manufacturing and Digital Plant business. “This requires accelerated pilot projects based on more than a million deployment hours of experience that provide faster results and a template for global visibility, orchestration, and optimization.”
The Quick Start for Industrial Advanced Analytics enables an industrial organization’s process engineers in AI and machine learning to accelerate problem detection, reduce variability, and optimize equipment and processes so that factors that are impacting process variability are identified quickly. GE Digital’s Professional Services team deploys Proficy CSense and provides a path for root cause analysis, early warning monitoring, predictive and proactive actions, process Digital Twin simulations, and process setpoint control and optimization.
With the Quick Start for Industrial Advanced Analytics, organizations can unlock the hidden potential within data to significantly improve operations in a way that simple trending or traditional analytics cannot. GE Digital’s Professional Services team develops behavioral models to identify abnormal conditions that might indicate problems such as faulty or inefficient uses of assets or systems, causes for variation in production rate or quality based on external factors or equipment, or root causes that could be resolved using advanced process control. Modeling the behavior of existing processes can enable new awareness and drive predictive behaviors for improvement in quality, maintenance, and production planning.
The Quick Start for Proficy Operations Hub helps industrial organizations to not only modernize their UI but also to merge data coming from different systems related to production KPIs, environmental health and safety, maintenance, and more. Teams across all levels in an organization can visualize their processes and make better decisions based on real-time and historical plant-wide application data, including historian, MES, and third-party solutions.
With the Quick Start for Proficy Operations Hub, users can monitor, control, provide data entry, and perform analysis all through Proficy Operations Hub using a desktop, laptop, or any mobile device. The Quick Start service provides a path for modernizing visualization and transitioning reports and dashboards to a thin client based on the latest Web-based UI interactions. With this Quick Start, companies can economically transition to a modern front-end without back-end or production changes, as part of a critical step toward digital transformation.
The Quick Start for OEE allows plant production managers, shift leaders, and line operators to rapidly deploy a targeted MES solution for one production line and measure benefit realization in weeks. OEE, a widely acknowledged best practice KPI, measures manufacturing productivity according to three standard perspectives: Quality (good production), Performance (fast production), and Availability (no downtime). Manufacturing teams can increase the effectiveness of their continuous improvement approach through the application of OEE via automatic data collection from automation systems, scaling by first focusing on a pilot line to quickly assess benefit realization, followed by defining a path to full enterprise visibility. This Quick Start helps manufacturers track losses with real-time integration, perform root-cause analysis, and solve the most impactful issues by identifying and implementing countermeasures.
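The three factors multiply into a single OEE score; a quick illustrative calculation (the numbers are invented):

```python
def oee(availability, performance, quality):
    """OEE = Availability x Performance x Quality, each as a fraction."""
    return availability * performance * quality

# A line that ran 90% of planned time, at 95% of its ideal rate,
# producing 98% good parts:
score = oee(0.90, 0.95, 0.98)
print(f"{score:.1%}")  # → 83.8%
```

World-class OEE is often quoted around 85%, which is why even a pilot line measured this way usually reveals concrete losses to chase.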
ABB has moved its media relations, at least the people working with me, to England. Two other major European industrial technology providers also have moved marketing and media relations from the US to England. I have no idea whether this is a statement about the state of the US market or the state of their cost structure. Maybe both.
ABB changed CEOs several years ago and has just changed again. These transitions precipitated changes in corporate emphasis: divesting several large businesses and adding others. The variety in these three announcements certainly reflects the new diversity of the corporate portfolio.
ABB drone-based leak detection and greenhouse gas measuring system
HoverGuard can help operators of the USA’s three million miles of pipeline infrastructure increase their safety and environmental capabilities in line with the US PIPES Act of 2020.
According to the US Energy Information Administration (EIA), the majority of gas shipments in the US take place using the millions of miles of the nation’s pipeline infrastructure. On December 27, 2020, the Protecting Our Infrastructure of Pipelines and Enhancing Safety (PIPES) Act was signed into law in the United States. The Act directs gas pipeline operators to use advanced leak detection technologies to protect the environment and pipeline safety.
Detection of odorless and invisible gas leaks can be challenging and expensive. ABB’s latest addition to its ABB Ability Mobile Gas Leak Detection System, HoverGuard, provides the solution by finding leaks faster and more reliably than ever before.
HoverGuard is an Unmanned Aerial Vehicle (UAV)-based system that detects, quantifies and maps leaks up to 300 ft from natural gas distribution and transmission pipelines, gathering lines, storage facilities, and other potential sources quickly, safely and reliably. It automatically generates comprehensive digital reports that summarize results and can be shared in minutes after a survey.
The cloud-connected, multi-gas solution is also the first of its kind to continuously quantify the three most important greenhouse gases: methane, carbon dioxide, and water vapor, while flying. Each greenhouse gas affects the environment differently and is present in the air in different amounts. Sourcing individual gases also provides important information to scientists and researchers when studying the complex environmental processes affecting climate and pollution.
Patented cavity enhanced laser absorption spectroscopy detects methane with a sensitivity more than 1,000 times higher than conventional leak detection tools. This sensitivity and speed allow HoverGuard to detect leaks while flying at altitudes of 130 ft or higher, and at speeds greater than 55 mph. It can cover 10-15 times more land area per minute by operating on low-cost commercial UAVs capable of carrying a payload of 6.6 lbs.
ABB calls for more collaboration with OEMs
ABB has signed a Memorandum of Understanding (MoU) with Hitachi Construction Machinery to share their expertise and collaborate in bringing solutions to market that will reduce the greenhouse gas (GHG) emissions associated with heavy machinery in mining.
The companies will explore possibilities to apply ABB’s electrification, automation, and digital solutions to mining trucks and excavators provided by Hitachi Construction Machinery as part of wider efforts with mine operators to electrify all processes from pit to port. Original equipment manufacturer (OEM) Hitachi Construction Machinery also brings expertise in driverless operation and labor-saving technologies. The aim of the combined solutions is to enhance the efficiency and flexibility of customer businesses, contributing to the reduction of CO₂ and the realization of a sustainable society.
The collaboration with Hitachi Construction Machinery is one of many that ABB is looking to develop with OEMs to accelerate the transition to all-electric mines.
“ABB is calling for more collaboration between OEMs and technology companies to fast-track the development of new emissions-reducing systems with electrification and automation of the whole mining operation the goal. We are ready to work more with OEMs to establish a common approach for the market, and through strategic collaboration provide solutions that can help enable a low-carbon society and make mining operations more responsible,” said Joachim Braun, Division President, Process Industries, ABB. “New emissions-reducing technologies can transform the energy-intensive mining industry to achieve an even more productive, but also sustainable future.”
“Today, the challenge of our customers is on electrification of trucks and the time to change is now. But nobody can achieve this transformation alone. Co-creation of solutions with OEMs and mining companies is needed to successfully integrate electrification in mines,” said Max Luedtke, ABB’s Global Head of Mining.
ABB Ability Performance Optimization Service for cold rolling mills
ABB has launched its new ABB Ability Performance Optimization Service for cold rolling mills, offering steel, aluminum, and other metals manufacturers opportunities to reach new levels of operational performance through technology, boosting their processes and profitability.
The new service – part of ABB’s metals digital portfolio and Collaborative Operations for Metals suite – combines continuous performance monitoring using ABB Ability Data Analytics for cold rolling mills, with real time support from ABB experts. ABB will work alongside customers with the vision of continuing to transform the metals industry.
The data analytics component uses process-specific algorithms based on a century of metals domain expertise to collect high frequency data from mill control systems and discover trends, benchmarks and other performance factors, sending alerts to operators and maintenance when opportunities to optimize performance are identified.
Alongside this, ABB experts are available to provide onsite or offsite support, recommending actions to ensure the mill maintains its performance targets against key performance indicators (KPIs) for productivity, quality and yield. Leveraging the collective strengths of metals producers and ABB experts, access to dashboards is shared, enabling all parties to drill down to individual coil level.
In addition, ABB experts can provide customers with detailed reports at regular intervals describing areas for improvement, identified trends, or problem areas found in historical data, allowing for continuous improvement over time.
Key benefits include continuous collaboration and access to experts; increased productivity through improved asset performance and reduced downtime; higher yield and quality resulting from immediate corrective action when problems occur; reduced risk of equipment failure; the ability to leverage insights across the enterprise and reductions in wastage, energy and other costs.
CESMII – the Smart Manufacturing Institute constitutes the USA equivalent of manufacturing initiatives found in Germany, Italy, China, and many other countries. I’m glad to see CEO John Dyck popping up in panels with Industrie 4.0 and other international conferences. We are in a strange geopolitical/business vortex right now where some countries have realized (in the US case—belatedly realized) that manufacturing is strategic for national survival, yet businesses (as they have from the beginnings of civilization) have realized that they must engage in international trade for their survival.
While the world’s peoples sort all of this out, rest assured that there exists a US national initiative to promote smart manufacturing. Below is some new information. Perhaps you, too, can submit an RFP.
RFP to Create Industry-Vertical / Technology-Horizontal Smart Manufacturing Innovation Centers
CESMII – the Smart Manufacturing Institute has announced a new Request for Proposals (RFP) to expand the network of Smart Manufacturing Innovation Centers (“SMICs”). These are a collection of visible, credible, high-value, regional extensions of CESMII. Each SMIC works as a partner organization with CESMII, with strong industry engagement and core competencies related to a key industry vertical (e.g., steel, automotive, chemical processing, discrete manufacturing) and/or a technology horizontal (e.g., artificial intelligence and machine learning, new sensing and wireless data communications, process control algorithms and predictive models).
Through this RFP, CESMII seeks to strategically expand its SMIC network through the addition of new SMIC partner organizations. These new SMIC organizations will benefit from their CESMII partnership in many ways:
Increased national and global leadership as a Smart Manufacturing Innovation Center.
Expanded portfolio and capabilities with access to industry-leading Smart Manufacturing (SM) technology, knowledge, and thought leadership.
CESMII-funded SMIC operating budget to implement SM technologies around existing assets.
CESMII-provided SMIC Operating Kit to increase the return on investment of industry engagements and deepen exposure to the manufacturing ecosystem.
A dedicated instance of the SM Innovation Platform (SMIP) and relevant software applications to showcase the SMIC organization’s manufacturing assets and competencies and demonstrate its integration with SMIC manufacturing assets to create testbeds. This will further increase the SMIC’s visibility and engagement with industry.
Potential to host membership Application Projects (self-funded) and lead working groups aligned to SMIC’s capabilities.
Additional opportunities for publication, promotion, and outreach.
Expanded Education and Workforce Development and membership-engagement opportunities.
“Smart Manufacturing Innovation Centers are one of several strategic initiatives to highlight the state-of-the-art in SM’s impact on energy productivity, quality, throughput, costs/profitability, safety, and asset reliability, in manufacturing environments,” explained John Dyck, CEO of CESMII. “CESMII is excited to establish this ecosystem to exemplify and espouse the characteristics, capabilities and behaviors consistent with the 4th Industrial Revolution. We are encouraging Universities, Colleges, Vocational Schools and industry Centers of Excellence to apply for our SMIC funding and become part of this exciting new ecosystem.”
Up to $1M in federal funding is being made available through this RFP. Selected SMIC candidates will need to provide matching funds to bring the total RFP value up to $2M. CESMII anticipates selecting 3 to 10 SMIC candidates with a performance period of 16 months.
CESMII is the United States’ national institute on Smart Manufacturing, driving cultural and technological transformation and secure industrial technologies as national imperatives. By enabling frictionless movement of information – raw and contextualized data – between real-time Operations and the people and systems that create value in and across Manufacturing organizations, CESMII is ensuring the power of information and innovation is at the fingertips of everyone who touches manufacturing.
Founded in 2016, in partnership with Department of Energy’s Office of Energy Efficiency and Renewable Energy (EERE), CESMII is the third institute funded by EERE’s Advanced Manufacturing Office. The Institute is accelerating Smart Manufacturing (SM) adoption through the integration of advanced sensors, data (ingestion – contextualization – modeling – analytics), platforms and controls to radically impact manufacturing performance, through measurable improvements in areas such as: quality, throughput, costs/profitability, safety, asset reliability and energy productivity. CESMII’s program and administrative home is with the University of California Los Angeles (UCLA).