How To Avoid Pilot Purgatory For Your Projects

This is still more follow-up from the Emerson Global Users Exchange, relating to sessions on pilot purgatory for projects. I thought I had already written this, but just discovered it languishing in my drafts folder. While in Nashville, I ran into Jonas Berge, senior director, applied technology for Plantweb at Emerson Automation. He has been a source of technology updates for years. We followed up a brief conversation with a flurry of emails in which he updated me on some presentations.

One important topic centered on IoT projects, though the advice applies to other types of projects as well. He told me the secret sauce is to start small. “A World Economic Forum white paper on the fourth industrial revolution, in collaboration with McKinsey, suggests that to avoid getting stuck in prolonged ‘pilot purgatory’ plants shall start small with multiple projects – just like we spoke about at EGUE and just like Denka and Chevron Oronite and others have done,” he told me.

“I personally believe the problem is when plants get advice to take a ‘big bang’ approach starting by spending years and millions on an additional ‘single software platform’ or data lake and hiring a data science team even before the first use case is tackled,” said Berge. “My blog post explains this approach to avoiding pilot purgatory in greater detail.”

I recommend visiting Berge’s blog for more detail, but I’ll provide some teaser ideas here.

First, he recommends:

  • Think Big
  • Start Small
  • Scale Fast

Scale Fast

Plants must scale digital transformation across the entire site to fully enjoy the safety benefits, such as fewer incidents, faster incident response time, and reduced instances of non-compliance, as well as reliability benefits such as greater availability, reduced maintenance cost, extended equipment life, greater integrity (fewer instances of loss of containment), shorter turnarounds, and longer intervals between turnarounds. The same holds true for energy benefits, like lower energy consumption and cost and reduced emissions and carbon footprint, and for production benefits, like reduced off-spec product (higher quality/yield), greater throughput, greater flexibility (in feedstock use and products/grades), reduced operations cost, and shorter lead times.

Start Small

The organization can only absorb so much change at any one time. If too many changes are introduced in one go, the digitalization will stall:

  • Too many technologies at once
  • Too many data aggregation layers
  • Too many custom applications
  • Too many new roles
  • Too many vendors

Multiple Phased Projects

McKinsey research shows that plants successfully scaling digital transformation instead run smaller digitalization projects: multiple small projects across the functional areas. This matches what I have personally seen in projects I have worked on.

From what I can tell, it is the plants that attempt a big-bang approach with many digital technologies at once that struggle to scale. There are forces that encourage companies to attempt sweeping changes in going digital, which can lead to counterproductive overreaching.

The Boston Consulting Group (BCG) suggests a disciplined phased approach rather than attempting to boil the ocean. I have seen plants focus on a technology that can digitally transform and help multiple functional areas with common infrastructure. A good example is wireless sensor networks. Deploying wireless sensor networks in turn enables many small projects that help many departments digitally transform the way they work. The infrastructure for one technology can be deployed relatively quickly after which many small projects are executed in phases.

Small projects are low-risk. A small trial of a solution in one plant unit finishes fast. After a quick success, the team scales it to the full plant area, then to the entire plant, and then moves on to start the next pilot project. This way plants move from proof of concept (PoC) to full-scale, plant-wide implementation at speed. In large organizations with multiple plants, innovations often emerge at an individual plant, then get replicated at other sites and rolled out nationwide and globally.

Use Existing Platform

I have also seen the big-bang approach, where a plant pours a lot of money and resources into an additional “single software platform” layer for data aggregation before the first use case even gets started. This new data aggregation platform layer is added above the ERP with the intention of collecting data from the ERP and the plant historian before making it available to analytics through a proprietary API that requires custom programming.

Instead, successful plants start small projects using the existing data aggregation platform: the plant historian. The historian can be scaled with additional tags as needed. This way a project can be implemented within two weeks, with the pilot running an additional three months, at low risk.
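
To make concrete how small such a pilot can be, here is a minimal sketch of reading one newly added tag over OPC UA, one common way to get values into and out of an existing historian. The server URL, tag name, and the use of the asyncua library are my own illustrative assumptions, not anything Berge specified.

    import asyncio

    from asyncua import Client  # pip install asyncua

    # Hypothetical endpoint and tag; substitute your historian's
    # OPC UA server address and node identifier.
    SERVER_URL = "opc.tcp://historian.example.com:4840"
    TAG_NODE_ID = "ns=2;s=Unit1.Pump01.BearingTemp"

    async def read_pilot_tag():
        async with Client(url=SERVER_URL) as client:
            node = client.get_node(TAG_NODE_ID)
            value = await node.read_value()
            print(f"{TAG_NODE_ID} = {value}")

    if __name__ == "__main__":
        asyncio.run(read_pilot_tag())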

Think Big

I personally like to add that you must also think of the bigger vision. A plant cannot run multiple small projects in isolation, which results in siloed solutions. Plants successful with digital transformation establish, early on, a vision of what the end goal looks like. Based on this they can select the technologies and architecture to build the infrastructure that supports this end goal.

NAMUR Open Architecture (NOA)

The system architecture for the digital operational infrastructure (DOI) is important. The wrong architecture leads to delays and an inability to scale. NAMUR (User Association of Automation Technology in Process Industries) has defined the NAMUR Open Architecture (NOA) to enable Industry 4.0. I have found that plants that have deployed a DOI modeled on the same principles as NOA are able to pilot and scale very fast.

Flying Start

The I&C department in plants can accelerate digital transformation to achieve operational excellence and top-quartile performance by remembering Think Big, Start Small, Scale Fast. These translate into a few simple design principles:

  • Phased approach
  • Architecture modeled on the NAMUR Open Architecture
  • Ready-made apps
  • Easy-to-use software
  • Digital ecosystem

Improve IIoT Deployment

The Industrial Internet of Things is, by definition, all about connections. Connecting hundreds of devices, which often have differing protocols, is a huge challenge. In an attempt to facilitate IIoT deployments, ioTium has announced an alliance with Telit. The agreement makes Telit deviceWISE gateway technology available on the ioTium Edge App Store for single-click deployment.

After wading through a couple of paragraphs of marketing generalities, I found the best explanation in this quote. “With the cooperation of Telit, customers can now rapidly connect different communications protocols like BACnet, OPC, Modbus or even proprietary protocols to various IoT cloud offerings such as Azure IoT, Siemens MindSphere or private cloud end points,” said Sri Rajagopal, CTO, ioTium. “All commissioning, data mapping, and contextualization can now be done remotely, dramatically reducing the time and cost of flying technicians and data scientists to the site to remediate in person.”
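
To illustrate the kind of translation such a gateway performs, here is a rough sketch of polling Modbus holding registers and repackaging them as JSON for a cloud endpoint. This is not the deviceWISE implementation; the PLC address, register map, and publish step are hypothetical placeholders.

    import json
    import time

    from pymodbus.client import ModbusTcpClient  # pip install pymodbus

    def publish_to_cloud(payload: str) -> None:
        """Hypothetical stand-in for an MQTT or HTTPS publish to
        Azure IoT, MindSphere, or a private cloud endpoint."""
        print("would publish:", payload)

    client = ModbusTcpClient("192.168.1.50")  # hypothetical PLC address
    client.connect()
    result = client.read_holding_registers(address=0, count=2)
    if not result.isError():
        payload = json.dumps({
            "timestamp": time.time(),
            "flow_rate": result.registers[0],  # hypothetical register map
            "pressure": result.registers[1],
        })
        publish_to_cloud(payload)
    client.close()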

Then the obligatory quote from the partner. I’ve talked with Fred Yentz for many years about connecting data. Here’s his thought on this announcement. “Our alliance with ioTium establishes a best-in-class approach for digital connectivity in the industrial world,” said Fred Yentz, president of Strategic Partnerships, Telit. “Together, we are providing industrial enterprise customers a secure, plug-and-play way to connect any machine to cloud-based applications to capitalize on the benefits of Industry 4.0.”

Solving this problem is mainly what the various platforms are attempting. I would be interested in hearing what is actually working out in the field. Comment or send me an email. Something is working, because engineers are doing this.

Data Protection Best Practices White Paper

Standards are useful, sometimes even essential. Standard sizes of shipping containers enable optimum ship loading/unloading. Standard railroad gauges and cars enable standard shipping containers to move from ship to train, and eventually even to tractor/trailer rigs to get products to consumers. 

Designing and producing to standards can be challenging. Hence the value of best practices.

Taking this to the realm of the Industrial Internet of Things, where data security, privacy, and trustworthiness are essential, the Industrial Internet Consortium (IIC) has published the Data Protection Best Practices White Paper. I very much like these collaborative initiatives that help engineers solve real-world problems.

Designed for stakeholders involved in cybersecurity, privacy and IIoT trustworthiness, the paper describes best practices that can be applied to protect various types of IIoT data and systems. The 33-page paper covers multiple adjacent and overlapping data protection domains, for example data security, data integrity, data privacy, and data residency.

I spoke with the lead authors and came away with a sense of the work involved. Following are some highlights.

Failure to apply appropriate data protection measures can lead to serious consequences for IIoT systems, such as service disruptions that affect the bottom line, serious industrial accidents, and data leaks that can result in significant losses, heavy regulatory fines, loss of IP, and negative impact on brand reputation.

“Protecting IIoT data during the lifecycle of systems is one of the critical foundations of trustworthy systems,” said Bassam Zarkout, Executive Vice President, IGnPower and one of the paper’s authors. “To be trustworthy, a system and its characteristics, namely security, safety, reliability, resiliency and privacy, must operate in conformance with business and legal requirements. Data protection is a key enabler for compliance with these requirements, especially when facing environmental disturbances, human errors, system faults and attacks.”

Categories of Data to be Protected

Data protection touches on all data and information in an organization. In a complex IIoT system, this includes operational data from things like sensors at a field site; system and configuration data like data exchanged with an IoT device; personal data that identifies individuals; and audit data that chronologically records system activities.

Different data protection mechanisms and approaches may be needed for data at rest (data stored at various times during its lifecycle), data in motion (data being shared or transmitted from one location to another), or data in use (data being processed).

Data Security

“Security is the cornerstone of data protection. Securing an IIoT infrastructure requires a rigorous in-depth security strategy that protects data in the cloud, over the internet, and on devices,” said Niheer Patel, Product Manager, Real-Time Innovations (RTI) and one of the paper’s authors. “It also requires a team approach from manufacturing, to development, to deployment and operation of both IoT devices and infrastructure. This white paper covers the best practices for various data security mechanisms, such as authenticated encryption, key management, root of trust, access control, and audit and monitoring.”
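
As one concrete instance of the mechanisms Patel lists, here is a minimal sketch of authenticated encryption using AES-GCM from the Python cryptography package. The sensor payload is invented, and key management and root of trust are assumed to be handled elsewhere; the white paper itself does not prescribe this particular code.

    import os

    from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

    key = AESGCM.generate_key(bit_length=256)  # in practice, obtained from a key-management system
    aesgcm = AESGCM(key)
    nonce = os.urandom(12)  # never reuse a nonce with the same key
    plaintext = b"pump01 vibration: 4.2 mm/s"  # hypothetical sensor reading
    associated_data = b"site=plant-a,stream=vibration"  # authenticated but not encrypted

    ciphertext = aesgcm.encrypt(nonce, plaintext, associated_data)
    # Decryption raises InvalidTag if the ciphertext or associated data was altered.
    recovered = aesgcm.decrypt(nonce, ciphertext, associated_data)
    assert recovered == plaintext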

Data Integrity

“Data integrity is crucial in maintaining physical equipment protection, preventing safety incidents, and enabling operations data analysis. Data integrity can be violated intentionally by malicious actors or unintentionally due to corruption during communication or storage. Data integrity assurance is enforced via security mechanisms such as cryptographic controls for detection and prevention of integrity violations,” said Apurva Mohan, Industrial IoT Security Lead, Schlumberger and one of the paper’s authors.

Data integrity should be maintained for the entire lifecycle of the data from when it is generated, to its final destruction or archival. Actual data integrity protection mechanisms depend on the lifecycle phase of the data.
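
As a minimal sketch of one such cryptographic control, the Python standard library's HMAC support can detect tampering in a stored or transmitted record. The record contents are invented, and the shared key is assumed to come from a key-management system.

    import hashlib
    import hmac

    key = b"obtained-from-key-management"  # placeholder, not a real key
    record = b'{"tag": "TI-101", "value": 87.4}'  # hypothetical historian record

    # The producer attaches a MAC when the record is written or sent.
    mac = hmac.new(key, record, hashlib.sha256).hexdigest()

    # The consumer recomputes the MAC and compares in constant time.
    expected = hmac.new(key, record, hashlib.sha256).hexdigest()
    if hmac.compare_digest(mac, expected):
        print("integrity verified")
    else:
        print("integrity violation detected")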

Data Privacy

As a prime example of data privacy requirements, the paper focuses on the EU General Data Protection Regulation (GDPR), which grants data subjects a wide range of rights over their personal data. The paper describes how IIoT solutions can leverage data security best practices in key management, authentication, and access control to empower GDPR-centric privacy processes.
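
One illustrative pattern in this spirit, not taken from the paper itself, is pseudonymizing personal identifiers with a keyed hash so operational analytics can proceed without exposing raw identities. The key is assumed to be held separately in a key-management system.

    import hashlib
    import hmac

    PSEUDONYM_KEY = b"held-in-key-management"  # placeholder secret

    def pseudonymize(identifier: str) -> str:
        """Replace a personal identifier with a stable pseudonym.
        Without the key, records cannot be re-identified by anyone
        who sees only the pseudonymized data."""
        return hmac.new(PSEUDONYM_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

    print(pseudonymize("operator.jane.doe@example.com"))  # hypothetical identifier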

The Data Protection Best Practices White Paper complements the IoT Security Maturity Model Practitioner’s Guide and builds on the concepts of the Industrial Internet Reference Architecture and Industrial Internet Security Framework.

The Data Protection Best Practices White Paper and a list of the IIC members who contributed to it can be found on the IIC website.

Cloud-based Platform Complements PI System

Digitalization requires digital data, which in turn requires a place to robustly store that data. These days, that place is most often the cloud. OSIsoft’s PI System must be the most widely used industrial database. The company has released OSIsoft Cloud Services (OCS), a cloud-native, real-time data management system for unifying and augmenting critical operations data from across an organization to accelerate industrial analytics, data science projects, data sharing, and other digital transformation initiatives.

OCS highlights and capabilities:

  • Data sharing – partner companies can access a shared data stream to remotely monitor technology
  • Functionality – seamless crossover between the PI System and OCS to compare facilities, perform root cause analysis and run hypotheticals
  • Scalability – tests proved OCS can simultaneously manage over two billion data streams, and safely share information with partners
  • Petuum uses OCS to stream historical and live data on production, temperature, and variability to its AI platform to help Cemex, a global cement manufacturer, improve yield and energy performance from 2% to 7%.
  • DERNetSoft uses OCS to aggregate data in one place, giving users access to useful analytics for ways to reduce power consumption and save money.
  • Pharma companies will use OCS to give regulators access to anonymized drug testing or production data without risking unauthorized access to the manufacturing networks.

With OCS, an engineer at a chemical producer, for example, could combine maintenance and energy data from multiple facilities into a live superset of information to boost production in real time, while planning analysts could merge several years’ worth of output and yield data to create a ‘perfect plant’ model for capital forecasts.
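
The announcement does not spell out the OCS API, so the following is only a hedged sketch of what pulling a window of stream data from a cloud REST service like OCS might look like; the endpoint shape, tenant and namespace names, stream ID, and token are all hypothetical placeholders.

    import requests  # pip install requests

    # All values below are hypothetical placeholders.
    BASE = "https://example-ocs-endpoint.com/api/v1"
    TENANT, NAMESPACE, STREAM = "my-tenant", "plant-a", "unit1.energy"
    HEADERS = {"Authorization": "Bearer <token>"}

    url = f"{BASE}/Tenants/{TENANT}/Namespaces/{NAMESPACE}/Streams/{STREAM}/Data"
    params = {"startIndex": "2019-01-01T00:00:00Z", "endIndex": "2019-01-02T00:00:00Z"}

    response = requests.get(url, headers=HEADERS, params=params)
    response.raise_for_status()
    for event in response.json():  # one JSON object per stored event
        print(event)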

OCS can also be leveraged by software developers and system integrators to build new applications and services or to link remote assets.

“OSIsoft Cloud Services is a fundamental part of our mission to help people get the most out of the data that is at the foundation of their business. We want their cost of curiosity to be as close to zero as possible,” said Gregg Le Blanc, Vice President of Product at OSIsoft. “OCS is designed to complement the PI System by giving customers a way to uncover new operational insights and use their data to solve new problems that would have been impractical or impossible before.”

The Data Dilemma

Critical operations data—i.e. data generated by production lines, safety equipment, grids, and other systems essential to a company’s survival—is part of one of the fastest growing segments in the data universe. IDC and Seagate estimate in “Data Age 2025: The Evolution of Data to Life Critical” that “hypercritical” data for applications such as distributed control systems is growing by 54% a year and will constitute 10% of all data by 2025 while real-time data will nearly double to more than 25% of all data.

Critical operations data, however, can be extremely difficult to manage or use.

Data scientists spend 50 percent or more of their time curating large data sets instead of conducting analytics. IT teams get bogged down in managing VPNs for third parties or writing code for basic administrative tasks. Data becomes inaccessible and locked in silos. Over 1,000 utilities, 80% of the largest oil and gas companies, and 65% of the Fortune 500 industrial companies already use the PI System to harness critical operations data, turning it into an asset for improving productivity, saving money, and developing new services.

Natively compatible with the PI System, OCS extends the range of possible applications and use cases of OSIsoft’s data infrastructure while eliminating the challenges of capturing, managing, enhancing, and delivering operations data across an organization. Within a few hours, thousands of data streams containing years of historical data can be transferred to OCS, allowing customers to explore, experiment, and share large data sets the same day.

Two Billion Data Streams

The core of OCS is a highly scalable sequential data store optimized for time series data, depth measurements, temperature readings, and similar data. OSIsoft has also embedded numerous usability features for connecting devices, managing users, searching, transferring data from the PI System to OCS, and other functions. OCS can also accept data from devices outside of traditional control networks or other sources.

“The scale and scope of data that will be generated over the coming decades is unprecedented, but our mission remains the same,” said Dr. J. Patrick Kennedy, CEO and Founder of OSIsoft. “OSIsoft Cloud Services represent the latest step in a nearly 40 year journey and there’s more to come.”

To test the scalability and stability of OCS, OSIsoft created a deployment that contained the equivalent of the data generated by all of the smart meters in the U.S. over the last two years, or two billion data streams (100 million meters with 20 data streams each). OCS successfully stored up to 1.2 billion data points per hour and was managing all two billion streams simultaneously within 48 hours.

PaaS for OSIsoft Marketplace Partners

Software developers are already creating services based around OCS. DERNetSoft is creating a secure marketplace for sharing utility and electric power data to improve energy forecasts and peak shaving strategies. Meanwhile, others are collaborating with customers on efforts to bolster well integrity at oil drilling sites, pinpoint tank leakage, predict maintenance problems, and reduce energy consumption with OCS. OSIsoft partners developing OCS services include Petuum, Seeq, Toumetis, Transpara, Aperio, and TrendMiner. These services will be available from the OSIsoft Marketplace as they are released.

“Digital transformation requires the ability to compare data and outcomes across multiple plants and data sources,” says Michael Risse, VP/CMO at Seeq. “OCS is a unified solution for process manufacturing customers to enable this type of analysis, generating predictive insights on thousands of assets across company operations to improve production outcomes.”

Pricing and Availability

OCS is a subscription service currently available to customers and partners for use in facilities in North America. OCS will be extended to Europe and to other regions in the near future.

Pricing is based on the average number of data streams accessed, rather than the unique data streams stored, giving customers the freedom to experiment with their data without incurring added costs.

Gaining Trust In Your Data Systems

Digitalization breeds the need for data and connected devices. Trusted connections and data are required for success. Siemens invited a diverse group of press, analysts, podcasters, and bloggers to Munich this week (November 26-28) to discuss cybersecurity and the Charter of Trust.

I will use the words of Siemens below to discuss the rationale for the Charter of Trust. However, the idea is that if users cannot trust their data and connections, they will never go further into digitalization and therefore will not realize the anticipated benefits.

Some of the analysts and others at the conference had trouble understanding how something seemingly vague and not specifically standards-based would work. I think they missed the point. Standards are good, but they take a long time to develop, and another new standard was not what was needed. What is needed is for many companies to agree to a set of principles and then work toward them together for the mutual benefit of the industry, users, and society.

Eva Schulz-Kamm, Global Head of Government Affairs at Siemens AG, and Rainer Zahner, Global Head of Cybersecurity Governance at Siemens, told us the digital world is changing everything. Billions of devices are connected by the Internet of Things. That holds great potential for everyone, but also great risk: the risk of exposure to cyber-attacks, and the risk of losing control over the systems that run our infrastructures. Cybersecurity is therefore crucial to the success of our digital economy, because only if the security of data and networked systems is guaranteed will people actively support the digital transformation. They then explained why Siemens initiated the Charter of Trust.

Siemens’ 171 years of experience have also shown that the best way to make a lasting difference isn’t as one company, but as an industry – not only as one nation, but as part of a global community. In modern history, competitor businesses have forged standards together that have carried the world from one industrial revolution to the next – including the unfolding digital transformation of industry. Countries without clear-cut geopolitical alliances have come together to forge cross-border agreements that grow trade and advance peace.

It’s in this spirit that Siemens launched the Charter of Trust earlier this year at the Munich Security Conference, a longstanding forum for business and government leaders to discuss geopolitical issues. Since then, several more global companies have seen the value of the Charter of Trust and signed on. These companies committed to create the first-of-its-kind global alliance focused on answering a very important question: How do we secure critical infrastructure – from our factories to our power grids – in the digital age?

We also are carrying an important message together: that when we talk about security today, it isn’t just about diplomacy and resolving military conflicts – it is increasingly about cyber attacks that seek to undermine our democratic and economic values.

The Charter of Trust begins with these three goals:

  • protecting the data and assets of individuals and businesses;
  • preventing damage to people, businesses, and infrastructures;
  • building a reliable basis for trust in a connected and digital world.

“We know at the outset that a one-size fits all approach won’t work. We have instead agreed to 10 principles – from ensuring the highest levels of responsibility for cybersecurity within every company, to securing supply chains, products, and working with governments. Together, we will develop and continuously improve coordinated strategies and shared standards to protect critical infrastructures, public facilities and private companies.”

Charter of Trust members: The AES Corporation, Airbus, Allianz, Atos, Cisco, Dell Technologies, Enel, IBM, Munich Security Conference, NXP Semiconductors, SGS, Deutsche Telekom, Total, and TÜV SÜD.
