Podcast 194 Beware Hype

Podcast 194 of my long-running series—Beware Hype of OT and IT

Platforms come and go, sometimes quickly as technology turns. IoT platforms were all the rage, just like IT/OT convergence and other hyped tech. But engineers are quietly working together to apply the technologies to solve business and industrial problems. Don't watch the hype; notice when everyone is actually using the technology.

This podcast is sponsored by Ignition from Inductive Automation.

OSIsoft Discusses Digital Twin

The concept of digital twins was born from the marriage of the physical and the digital known as cyber-physical systems. The cyber representation of a product or process was often held digitally within CAD/CAM or PLM systems. These became linked to the physical object through a feedback loop that kept the two in sync.

Digital Twin has moved from the esoteric to the mainstream within industrial culture. And the digital side is no longer consigned to drawing databases, as my recent conversation with Michael Kanellos and Perry Zalvesey of OSIsoft reveals.

They described the process this way, “From devices all the way to buildings and factories, we’re now living in a world where everything is connected. And as these operations become more connected, it’s increasingly important to identify the strongest solution to monitor them. With the introduction of IoT, sensor and even AI technology to industrial operators, there’s been a surge of unfamiliar digital strategies – the latest being digital twins.”

OSIsoft prefers to consider digital twin as a loose term, as it can be either a complete network doppelganger or just a copy of key data streams to narrow in on specific issues. Everyone has their own preference and iteration.

OSIsoft named its digital twin technology the Asset Framework, which allows companies to take a project-by-project approach, creating solutions for each need on a rolling basis.

When one of its customers, DCP Midstream, began deploying OSIsoft's AF tool, it rolled out 12 AF-based applications in two months, experiencing a $20-$25 million one-year return.

Application of OSIsoft's Asset Framework has been strong in the water industry. Zalvesey says that his first work in the area was with modeling processes that produced only static models. Today's digital twins are dynamic. Designers can model the facility and the objects within it. Each object has attributes with which data streams are then associated. Originally, each object was defined individually: say we define a pump object "Pump 12" and associate data such as temperature and pressure with it. Now, with Asset Framework, designers can create a template class "pump" and replicate it for as many pumps as a facility contains.
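
To make the template idea concrete, here is a minimal illustrative sketch in Python. It is not OSIsoft's actual AF API (AF is configured through PI System tooling, not code like this); it only shows the pattern of defining a "pump" class once and stamping out an instance per physical asset.

```python
# Illustrative only: OSIsoft's Asset Framework is configured through PI System
# tooling, not this code. This toy model just shows the templating pattern.
from dataclasses import dataclass


@dataclass
class AssetTemplate:
    """A reusable asset definition, e.g. 'pump' with its standard attributes."""
    name: str
    attributes: list

    def instantiate(self, asset_name):
        # Each instance gets the template's attribute slots, ready to be
        # bound to live data streams (tags) later.
        return AssetInstance(asset_name, {attr: None for attr in self.attributes})


@dataclass
class AssetInstance:
    name: str
    streams: dict  # attribute name -> data stream reference (here, a tag name)

    def bind(self, attribute, tag):
        self.streams[attribute] = tag


# Define the template once...
pump = AssetTemplate("pump", ["temperature", "pressure", "vibration"])

# ...then replicate it for every pump in the facility.
pump12 = pump.instantiate("Pump 12")
pump12.bind("temperature", "PLANT1.P12.TEMP")
pump12.bind("pressure", "PLANT1.P12.PRESS")
print(pump12)
```

The payoff is the same one OSIsoft describes: adding the thirteenth pump means instantiating the template again, not rebuilding the model.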

1. Asset Framework is the core digital twin offering. It's a relational layer on top of PI that combines all the data streams (temperature, pressure, vibration) of an asset into one screen. A lot of people get fancy with the digital twin term, but to us it's a simulation combined with live data.

2. A simple AF template for a pump probably takes half an hour to build. It can then be replicated ad infinitum. It's a drag-and-drop process. AF is part of PI Server (it was a separate product years ago but was combined into it). Complex templates can take months. Element, a company that OSIsoft helped incubate (and which has since attracted investment from Kleiner Perkins, GE and others), has built a service called AF Accelerator. Basically, they parachute in a team of data scientists to study your large assets and then develop automated ways to build AF templates for complete mines or offshore oil platforms. It still takes two months or so, but they can streamline a lot of the coding tasks. BP used them.

3. Examples:

  • DCP. In 2017, the company launched an effort to digitize operations. One of the first steps was using PI to collect the data and AF to create simple and complex digital twins. DCP has 61 gas plants, for instance, and each one has been modeled with AF. Plant managers are shown a live feed of current production, idealized production, and the differential in terms of gas produced and revenue. DCP discovered that it could increase production per plant by an average of $2,000-$5,000 per day, or millions a year, by giving the plant managers better visibility into current production and market pricing. In year one, it saved $20-$25 million, paying off the entire project (including the cost of building a centralized control center in Colorado and staffing it). The next year (2018) it saved another $20 million.
  • MOL. One of the largest uses of AF. MOL tracks 400,000 data streams and has 21,000+ AF instances based on 300 templates (a single template can be replicated several times). MOL says that it has added $1 billion in EBITDA since 2010 by using its data better. With AF, for instance, they figured out why hydrogen corrosion was exceeding the norm. In some instances, they've used advanced analytics (an experiment to see if MOL could use high-sulfur crudes required deep analytics), but most of the time MOL has made its improvements by creating AF templates, studying the phenomena, and taking action.
  • Colorado Springs. At the complete opposite end of the size spectrum: a small, regional utility.
  • Heineken uses AF to model its plants to reduce energy use. Aurelian Metals used it to boost gold extraction from ore from 75% to 89%. Michelin saved $4 million because AF let it recover more quickly from an outage. Deschutes Brewery, meanwhile, boosted production by $450K and delayed building a new plant (per our 2018 meeting).

Navigating a New Industrial Infrastructure

The Manufacturing Connection was conceived in 2013, when I decided to go it alone in the world, built on the ideas of a new industrial infrastructure and enhanced connectivity. I had even worked out a cool mind map to figure it out.

Last week I was on vacation, spending some time at the beach reading, thinking, and catching up on some long-neglected things. Next week I am off to Las Vegas for the Hewlett Packard Enterprise "Discover" conference, where I'll be inundated with learning about new ideas in infrastructure.

Meanwhile, I'll share something I picked up from MIT Sloan Management Review. This article was developed from a blog post by Jason Killmeyer, enterprise operations manager in the Government and Public Sector practice of Deloitte Consulting LLP, and Brenna Sniderman, senior manager in Deloitte Services LP.

They approach things from a much higher level in the organization than I usually do. They recognize what I've often stated about business executives reading about all these new technologies, such as cloud computing, internet of things, AI, blockchain, and others. "The potential resulting haste to adopt new technology and harness transformative change can lead organizations to treat these emerging technologies in the same manner as other, more traditional IT investments — as something explored in isolation and disconnected from the broader technological needs of the organization. In the end, those projects can eventually stall or be written off, leaving in their wake skepticism about the usefulness of emerging technologies."

This analysis correctly identifies the organizational challenges that arise when leaders read about these technologies or hear other executives at the club talk about them.

The good news, according to the authors: “These new technologies are beginning to converge, and this convergence enables them to yield a much greater value. Moreover, once converged, these technologies form a new industrial infrastructure, transforming how and where organizations can operate and the ways in which they compete. Augmenting these trends is a third factor: the blending of the cyber and the physical into a connected ecosystem, which marks a major shift that could enable organizations to generate more information about their processes and drive more informed decisions.”

They identify three capabilities and three important technologies that make them possible:

Connect: Wi-Fi and other connectivity enablers. Wi-Fi and related technologies, such as low-power wide-area networks (LPWAN), allow for cable-free connection to the internet almost anywhere. Wi-Fi and other connectivity and communications technologies (such as 5G) and standards connect a wide range of devices, from laptops to IoT sensors, across locations and pave the way for the extension of a digital-physical layer across a broader range of physical locations. This proliferation of connectivity allows organizations to expand their connectivity to new markets and geographies more easily.

Store, analyze, and manage: cloud computing. The cloud has revolutionized how many organizations distribute critical storage and computing functions. Just as Wi-Fi can free users’ access to the internet across geographies, the cloud can free individuals and organizations from relying on nearby physical servers. The virtualization inherent in cloud, supplemented by closer-to-the-source edge computing, can serve as a key element of the next wave of technologies blending the digital and physical.

Exchange and transact: blockchain. If cloud allows for nonlocal storage and computing of data — and thus the addition or extraction of value via the leveraging of that data — blockchain supports the exchange of that value (typically via relevant metadata markers). As a mechanism for value or asset exchange that executes in both a virtualized and distributed environment, blockchain allows for the secure transacting of valuable data anywhere in the world a node or other transactor is located. Blockchain appears poised to become an industrial and commercial transaction fabric, uniting sensor data, stakeholders, and systems.
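
For readers who want the mechanics behind that claim, here is a toy Python sketch of the data structure at blockchain's core: a chain of records in which each block commits to the hash of its predecessor, so tampering with any past entry is detectable. It omits everything a real blockchain adds on top (consensus, signatures, distribution across nodes).

```python
# Toy sketch of the core blockchain data structure: each block commits to
# its predecessor's hash, so altering any past record breaks the chain.
import hashlib
import json


def block_hash(block):
    # Hash a canonical JSON encoding of the block's contents.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()


def append_block(chain, payload):
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "payload": payload})


chain = []
append_block(chain, {"asset": "Pump 12", "reading": 72.4})
append_block(chain, {"asset": "Pump 12", "reading": 73.1})

# Verify integrity: every block must reference the hash of the one before it.
ok = all(chain[i]["prev_hash"] == block_hash(chain[i - 1]) for i in range(1, len(chain)))
print("chain valid:", ok)
```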

My final thought about infrastructure: the authors made it a nice round number, namely three. However, I'd add another piece, especially to the IT hardware part: the edge. Right now it is all happening at the edge. I bet I will have a lot to say and tweet about that next week.

HPE Unveils Converged Edge Systems To Bridge OT and IT

Hewlett Packard Enterprise (HPE) announced new HPE Edgeline Converged Edge System solutions that speed the deployment and simplify the management of edge applications, enabling customers to act on the vast amounts of data generated by machines, assets and sensors from edge to cloud.

I think this is another significant advance reflecting the utility of enterprise compute capability brought ever closer to the plant itself. If you are looking to be disruptive in your industry or are on a corporate engineering staff looking for OT alternatives, I’d suggest taking a long look at these technologies and then letting your imagination do its work.

The new solutions include:

  • HPE Edgeline OT Link Platform, an open platform that automates the interplay between diverse operational technologies (OT) and standard IT-based applications at the edge to enable intelligent and autonomous decision making;
  • HPE Edgeline systems management, the industry’s first systems management solutions designed specifically for the edge to ensure enterprise-grade reliability, connectivity and security;
  • HPE Edgeline EL300 Converged Edge System, featuring OT Link and HPE Edgeline systems management, providing superior resilience against harsh edge environments for a broad range of industrial deployments; and
  • HPE Edgeline Field Application Engineering Services, available from HPE Pointnext to help customers plan, build, and customize OT Link-based Internet of Things (IoT) and cyber-physical systems.

To turn edge data into insight for real-time action, organizations must process it close to its source, avoiding the latency, bandwidth, and cost issues of sending the data to a remote data center. However, this opportunity comes with a set of unique challenges, including management of remote infrastructure and the need to seamlessly connect sensors and industrial assets with IT applications at the edge.

“Deploying IoT, edge, and cyber-physical systems is a challenge requiring a fresh look at uniting the physical and digital worlds,” said Dr. Tom Bradicich, Vice President and General Manager, Converged Servers, Edge and IoT Systems, HPE. “With today’s announcements, we enable our customers to accelerate the delivery of applications that capitalize on edge data, safeguarded by enterprise-class management. And we lay the groundwork for a new ecosystem of intelligent edge solutions to drive innovation and growth across industries.”

Simplifying deployment of edge-to-cloud IoT and cyber-physical systems

Today, setting up an IoT or cyber-physical system is a laborious undertaking. It requires custom coding to orchestrate OT networks, control systems, and data flows with drivers, middleware, and applications running on IT systems. HPE Edgeline OT Link Platform is an open platform that significantly simplifies this process, reducing cost and time to market.

The solution includes:

HPE Edgeline OT Link Platform software, an open workflow engine and application catalogue, allowing customers to orchestrate components, data, and applications via a graphical drag-and-drop user interface. The HPE Edgeline OT Link Platform integrates an ecosystem of third-party applications running from edge to cloud – including AWS, Google, Microsoft, SAP, PTC, GE, and more – to make insights from the edge available across the enterprise and supply chain.

HPE Edgeline OT Link certified modules, HPE-developed adapters that connect to a broad range of OT systems, enabling bi-directional, time-sensitive, and deterministic control and communication, including high-speed digital input/output, CAN bus, Modbus, or Profinet. APIs and SDKs for these adapters are made available to the industry to facilitate third-party designs of OT Link modules. OT Link will also integrate FPGA modules to give customers maximal flexibility to connect to any industrial input/output device.
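
To appreciate what those certified modules abstract away, consider what talking to even one of these protocols by hand involves. The sketch below builds a raw Modbus TCP "read holding registers" request directly from the protocol spec; the PLC address, unit ID, and register numbers are hypothetical, and error responses are not handled.

```python
# A taste of connecting one OT protocol "by hand": a minimal Modbus TCP
# "read holding registers" exchange built straight from the spec
# (MBAP header + PDU). Host, unit ID, and registers are hypothetical.
import socket
import struct

HOST, PORT = "192.168.0.10", 502   # hypothetical PLC; 502 is Modbus TCP's standard port
UNIT_ID, START_REG, COUNT = 1, 0, 2

# MBAP header: transaction id, protocol id (always 0), remaining byte count,
# unit id. PDU: function 0x03 (read holding registers), start address, quantity.
request = struct.pack(">HHHBBHH", 1, 0, 6, UNIT_ID, 3, START_REG, COUNT)

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    sock.sendall(request)
    header = sock.recv(7)                             # response MBAP header
    _, _, length, _ = struct.unpack(">HHHB", header)
    body = sock.recv(length - 1)                      # function code + byte count + data
    byte_count = body[1]
    registers = struct.unpack(">%dH" % (byte_count // 2), body[2:2 + byte_count])
    print("registers:", registers)
```

Multiply that by CAN bus, Profinet, digital I/O, and every vendor quirk, and the value of pre-built, certified adapters becomes obvious.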

Enterprise-grade manageability and security at the edge

HPE also announced the industry’s first systems management solutions specifically designed to simplify the provisioning and management of edge infrastructure and applications, providing enterprise-grade manageability and security for remote systems with limited connectivity and IT expertise.

HPE Edgeline Integrated System Manager is embedded into HPE Edgeline Converged Edge Systems and features one-click provisioning, ongoing system health management, remote updates, and management even with intermittent wired and wireless connections. It also supports advanced security functions like preventing system boot file changes and remote system disablement during a security event. HPE Edgeline Infrastructure Manager software can remotely manage thousands of Edgeline Converged Edge Systems.

The HPE Edgeline Workload Orchestrator hosts a central repository for containerized analytics, AI, business, and IoT applications that can be pushed to HPE Edgeline Converged Edge Systems at the edge.
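
HPE has not published this workflow as code, but the pattern the Workload Orchestrator describes (a central repository of containerized workloads pulled down and run on edge nodes) looks roughly like the following generic analogue, assuming the `docker` Python SDK; the registry, image, and configuration values are hypothetical.

```python
# Generic analogue only, not HPE's product: a central registry hosts
# containerized workloads, and each edge node pulls and runs its own.
# Assumes the `docker` Python SDK; names and config are hypothetical.
import docker

client = docker.from_env()  # connects to the Docker daemon on this edge node

image_ref = "registry.example.com/edge/vibration-analytics:1.2"  # hypothetical
client.images.pull(image_ref)

container = client.containers.run(
    image_ref,
    detach=True,
    restart_policy={"Name": "always"},                    # survive reboots
    environment={"SENSOR_BUS": "modbus://192.168.0.10"},  # hypothetical config
)
print("workload running:", container.short_id)
```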

Unparalleled convergence of OT and IT

The HPE Edgeline EL300 is a fan-less, low-energy system equipped with Intel Core i5 processors, up to 32GB of memory and 3TB of storage. It will also support Intel Movidius Myriad X vision processing units to enable video analytics and AI inference at the edge. The HPE Edgeline EL300 provides enhanced resiliency against shock, vibration, humidity, and dust, including IP50 and MIL-SPEC certifications, and can operate from -30 to +70 degrees Celsius. These features make the HPE Edgeline EL300 suitable to be deployed as an embedded system – for example, in production machines or in building infrastructure.

Expertise to accelerate deployment and create competitive advantage

To support these new offerings, HPE Pointnext, the services organization of Hewlett Packard Enterprise, provides HPE Edgeline Field Application Services, which help customers plan, design, build, and run IoT, edge and cyber-physical systems to accelerate deployment and ensure reliable and secure operation. These services include the evaluation of use cases, proof of value, solution deployment, and management of ongoing operations – helping customers get the most from OT/IT integrations.

Moreover, HPE Pointnext can help customers develop their own data acquisition, industrial network, and control components for HPE Edgeline OT Link Platform to create custom solutions and competitive advantage. HPE Edgeline OT Link Platform based solutions can be delivered on-premises with a turnkey deployment service, operated by HPE Pointnext.

Finally, HPE Edgeline EL300 Converged Edge System will be added to HPE GreenLake Flex Capacity, to deliver a consumption-based experience with usage-based payment, capacity metering, and tailored support, for customers who need a cloud-like experience for systems at the edge.

OSIsoft Brings PI System Software to Amazon Web Services in the Cloud

I'm still deep in cybersecurity meetings in Germany. A pause here for software and cloud news from the west coast of America: OSIsoft and Amazon Web Services. Since PI is used by many industrial companies, these announcements reveal the deep acceptance of cloud technologies.

In short, here are three bullets:

  • AWS Quick Starts for PI System: enables industrial customers to quickly deploy and manage the PI System on AWS.
  • PI Integrator for Business Analytics: optimized for AWS to reduce time and cost of bringing operational and IoT data to AWS for sharing or advanced analytics.
  • Enhanced connectivity and data sharing to accelerate digital transformation and shrink the OT-IT gap.

OSIsoft launched a suite of products today designed to enable manufacturers, utilities, and other industrial customers to run the OSIsoft PI System on Amazon Web Services.

AWS Quick Starts for the PI System consists of AWS CloudFormation templates, scripts, and reference architectures for quickly spinning up and managing a fully functioning PI System on AWS. Customers will use the PI System Quick Starts for moving PI System workloads to the AWS cloud or for providing an aggregate PI System across an enterprise, monitoring remote or isolated assets and enabling data science efforts.
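
Since Quick Starts are standard CloudFormation templates, a deployment can be scripted with ordinary AWS tooling. A rough sketch with boto3 follows; the stack name, template URL, and parameters are placeholders, not values from OSIsoft's actual Quick Start.

```python
# Sketch of driving a CloudFormation-based Quick Start with boto3. Stack name,
# template URL, and parameters are placeholders, not OSIsoft's published values.
import boto3

cfn = boto3.client("cloudformation", region_name="us-west-2")

cfn.create_stack(
    StackName="pi-system-quickstart",  # placeholder
    TemplateURL="https://example-bucket.s3.amazonaws.com/pi-system.template",  # placeholder
    Parameters=[
        {"ParameterKey": "KeyPairName", "ParameterValue": "my-keypair"},  # hypothetical
    ],
    Capabilities=["CAPABILITY_IAM"],  # Quick Starts commonly create IAM roles
)

# Wait for the stack to finish building before using the deployed PI System.
cfn.get_waiter("stack_create_complete").wait(StackName="pi-system-quickstart")
print("stack ready")
```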

The PI Integrator for Business Analytics, meanwhile, has been optimized to extract, clean and transmit data from PI Systems and reduce data preparation tasks that bog down big data and data science initiatives. Some customers have successfully used PI Integrator technology to reduce the time consumed by data preparation in advanced analytics projects by over 90%.

AWS Quick Starts will be available in 2019. PI Integrator for Business Analytics, previewed at Hannover Messe earlier this year, is available this month.

Under the Hood

Quick Starts are built by AWS solutions architects and partners to help deploy solutions on AWS, based on AWS best practices for security and high availability. These reference deployments implement key technologies automatically on the AWS Cloud, often with a single click and in less than an hour. You can build your test or production environment in a few steps, and start using it immediately.

The PI Integrator for Business Analytics can integrate to Amazon S3, Amazon Redshift, and Amazon Kinesis Data Streams, enabling industrial customers to speed up their data science experiments, combine disparate data sets for business intelligence, and operationalize the outcomes of advanced analytics that augment decision making.
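
On the consuming side, data landed in Amazon Kinesis Data Streams is read like any other Kinesis stream. A minimal boto3 sketch follows; the stream name and the assumption of JSON-encoded payloads are mine, not details of the PI Integrator's actual output format.

```python
# Minimal read of events landed in a Kinesis stream. Stream name and the
# JSON payload assumption are hypothetical.
import json
import boto3

kinesis = boto3.client("kinesis", region_name="us-west-2")
stream = "pi-operational-data"  # hypothetical stream name

shard_id = kinesis.describe_stream(StreamName=stream)["StreamDescription"]["Shards"][0]["ShardId"]
iterator = kinesis.get_shard_iterator(
    StreamName=stream,
    ShardId=shard_id,
    ShardIteratorType="TRIM_HORIZON",  # start from the oldest available record
)["ShardIterator"]

batch = kinesis.get_records(ShardIterator=iterator, Limit=100)
for record in batch["Records"]:
    event = json.loads(record["Data"])  # assumes JSON-encoded payloads
    print(event)
```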

The Life of the PI System

OSIsoft’s PI System transforms the vast number of operational data streams from sensors, devices and industrial processes into rich, real-time insights to help people save money, increase productivity or create connected products and services.

The PI System can be found inside thousands of companies and complex industrial sites around the globe. OSIsoft customers have used PI System technology to predict wind turbine failures, increase output at an iron mine by $120 million in a single year by fine-tuning logistics, reduce the power consumed by a supercomputer center at a national laboratory, deliver water services to millions of new customers in a major metropolitan city, transform how medicines are produced, and reduce the time and expense and improve the quality and consistency of beer. Over 1,000 leading utilities, 90% of the world's largest oil and gas companies, and 65% of the Fortune 500 industrial companies rely on the PI System in their operations.

Worldwide, over 2 billion sensor-based data streams are managed by the PI System, with some customers monitoring over 25 million data streams.

“Data from operations—the information being generated by chemical reactors, transformers and other industrial devices—is incredibly valuable. Operations data will be the most valuable asset companies have for moving ahead of the competition in the future. Until recently, this data has been mostly confined to the factory floor or production line in part because of the size, scope and complexity of the data generated by operations,” said John Baier, Director of Integration Technologies, Cloud Analytics Practice at OSIsoft. “Working with Amazon Web Services, we want to unlock the value of operations data by eliminating barriers and bringing it to as many people as possible.”

www.osisoft.com
