Report Identifies 4 Changes CEOs Must Implement To Maximize Digitization

Digitization is on everyone’s lips these days. If you have not taken steps to implement and improve digital data flow, you are probably already behind. I receive information regularly from PwC, and here is a new report on how digitization is reshaping the manufacturing industry. The report looks at eight companies and showcases how they improved their efficiency, productivity, and customer experience by making the right capabilities central to their operating models and matching them with strong skill sets in analytics and IT.

Pressure from consumers, new regulations, and advances in information technology are all pushing manufacturing organizations to digitize so they can avoid falling behind a new breed of market-leading ‘digital champions.’ The report identifies four significant changes CEOs must implement to maximize the benefits of digitization.

1. Drive organizational changes that address new digital capabilities and digitalized processes – e.g., product and process design and engineering, end-to-end procurement, supply chain/distribution and after-sales – right from the top, because these are so new and different

2. Hire more software and Internet of Things (IoT) engineers and data scientists, while training the wider workforce in digital skills

3. Learn from software businesses, which have the ability to develop use cases rapidly and turn them into software products

4. Extend digitalization beyond IT to include significant operational technologies (OT) such as track and trace solutions and digital twinning

From the report: “Already, digitally ‘smart’ manufacturers are gaining a competitive advantage by exploiting emerging technologies and trends such as digital twinning, predictive maintenance, track and trace, and modular design. These companies have dramatically improved their efficiency, productivity, and customer experience by ensuring these capabilities are central to their operating models and by matching them with strong skill sets in analytics and IT.”

During 2018 and early 2019, PwC conducted in-depth digitisation case studies of eight industrial and manufacturing organisations in Germany, the US, India, Japan and the Middle East. Drawing on discussions and interviews with CEOs and division heads, we explored the key triggers for change these companies faced, assessed how digital solutions are being implemented, and examined how digitisation is affecting key aspects of their operating models. We also compared our eight organisations with other publicly cited digitisation case studies, and leveraged PwC’s 2018 study Digital Champions: How industry leaders build integrated operations ecosystems to deliver end-to-end customer solutions and other ongoing PwC research.

This paper is the result of ongoing collaboration between PwC and the Global Manufacturing and Industrialisation Summit (GMIS). GMIS provides a forum for industry leaders to interact with governments, technologists and academia in order to navigate the challenges and opportunities brought about by the digital technologies of the Fourth Industrial Revolution. PwC has been a knowledge partner with GMIS since 2016.

The eight case studies in this report make clear how far the role of digital technology goes beyond traditional IT systems. It also encompasses OT and data and analytics technologies. Full integration and linkage among these different technologies, and the ecosystems they are part of, are essential to a successful digital transformation. Yet success is impossible without a digitally smart workforce that is familiar with Industry 4.0 skills and tools.

These challenges are the subject of the second part of the report Digital Champions: How industry leaders build integrated operations ecosystems to deliver end-to-end customer solutions, which will be published in January 2020.

The report will elaborate further on the emerging theory of digital manufacturing and operations, in which successful, digitised industrial organisations will increasingly have to act like software companies in response to four key factors:

  • The connected customer seeks a batch size of one, necessitating greater customisation of products and delivery time, improved customer experience, use of online channels and outcome-based business models.
  • Digital operations require both engineering and software abilities to enable extensive data analysis and IoT-based integration, as well as digitisation of products and services.
  • Organisations need augmented automation, in which machines become part of the organisation via closely connected machine–worker tasks and integrated IT and OT.
  • Future employees will be ‘system-savvy craftspeople’ with the skills to use sensors in order to collect and analyse accurate data, as well as design and manage connected processes.

About the authors

Anil Khurana is PwC’s global industrial, manufacturing and automotive industry leader. He is a principal with PwC US.

Reinhard Geissbauer is a partner with PwC Germany based in Munich. He is the global lead for PwC’s Digital Operations Impact Center.

Steve Pillsbury is a principal with PwC US and the US lead for PwC’s Digital Operations Impact Center.

ABB Updated MOM

Suppliers of manufacturing software, some from surprising places, are putting sizable investments into products that will help customers reap the rewards of digitalization. Today, I’m looking at both ABB and Emerson Automation Solutions. Previously I checked out GE Digital and Rockwell Automation. Each has taken a slightly different course toward the goal, but notice the common thread of enhancing software products to help customers prosper.

ABB enhances manufacturing management technology

The new version of ABB Ability Manufacturing Operations Management will offer new features including:

  • Enhanced user experience based on a new HTML5 web client;
  • A new smart interactive dashboard application that provides greater visibility and collaboration;
  • A new statistical process control (SPC) application, to determine if each process is in a state of control;
  • A new Batch Compare application – for advanced batch analysis.

“ABB Ability Manufacturing Operations Management is a comprehensive, scalable and modular software suite that optimizes visibility, knowledge and control throughout the operations domain,” said Narasimham Parimi, Head of Digital Products – Product Management, Process Control Platform. “This release provides a range of rich new functionality and a new enhanced user experience that enables operations to become more productive and responsive.”

ABB Ability Manufacturing Operations Management is designed to simplify production management by enabling performance monitoring, downtime management, and maintenance support, as well as providing statistical production analysis tools. It provides solutions and tools to facilitate the collection, consolidation and distribution of production, quality and energy information via the plant’s web-based reports, trends, and graphs.

A new, self-service dashboard application promotes increased collaboration, providing visibility from shop floor to top floor and spanning IT and OT environments. It increases data connectivity to all apps and modules within the MOM suite, combining historical and manufacturing data and providing the user with improved customization capabilities. Dashboards can be shared amongst users, further promoting collaboration between teams. Trends and events are displayed together, which enables customers to identify issues and opportunities and make informed, timely decisions.

The new common services platform features an HTML5 web platform that runs across all suites, ensuring customers have a seamless user experience and that applications can be viewed on different devices, down to a 10-inch tablet.

Statistical process control (SPC) is used in manufacturing to determine whether each process is in a state of control. The new SPC application works across all the different apps and modules and helps the user improve quality- and production-related performance.
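
For readers unfamiliar with the technique, here is a minimal sketch of the core calculation an SPC application automates: an individuals control chart with 3-sigma limits. This is simplified, generic textbook SPC (sigma estimated from the sample standard deviation), not ABB’s implementation.

```python
# Simplified SPC sketch: flag readings outside 3-sigma control limits
# derived from a stable baseline. Generic math, not ABB's product logic.
from statistics import mean, stdev

def control_limits(samples):
    """Compute the center line and 3-sigma control limits from baseline data."""
    center = mean(samples)
    sigma = stdev(samples)
    return center - 3 * sigma, center, center + 3 * sigma

def out_of_control(value, lcl, ucl):
    """A point outside the control limits signals a special cause."""
    return value < lcl or value > ucl

# Baseline measurements from a process known to be stable
baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 10.0, 9.9, 10.1, 10.0]
lcl, center, ucl = control_limits(baseline)

for reading in [10.05, 9.95, 10.8]:  # 10.8 breaches the upper limit
    if out_of_control(reading, lcl, ucl):
        print(f"{reading}: out of control (limits {lcl:.2f}..{ucl:.2f})")
```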

In addition to the existing Batch View and Batch Investigate features, a comparison option has been added to the platform’s batch analysis applications, allowing different types of comparison.

Cyber security remains one of the key issues in the advancement of Industry 4.0, and the new features in MOM include enhanced security.

Emerson Expands Analytics Platform

Plantweb Insight platform adds two new Pervasive Sensing applications that manage wireless networks more efficiently with a singular interface to the enterprise.

Emerson has added two new IIoT solutions to its Plantweb Insight data analytics platform that will enable industrial facilities to transform the way they manage their enterprise-level wireless network infrastructure.

As digitalization and wireless technology adoption continue to rapidly expand in industrial facilities throughout the world, the need for greater visibility of network infrastructure performance is key. These new Plantweb Insight applications provide a quick-to-implement, scalable IIoT solution that helps customers advance their digital transformation strategies and achieve greater operational efficiencies.

The new Plantweb Insight Network Management application provides continuous, centralized monitoring of WirelessHART networks. This first-of-its-kind application provides a singular, consolidated view of the status of all wireless networks in a facility, with embedded expertise and guidance for advanced network management.

A key feature of the Plantweb Insight Network Management application is a configurable mesh network diagram, providing visualization of network design and connections along with device-specific information. It also provides an exportable record of syslog alerts, network details outlining conformance to network best practices and more.

While the new network management application provides a holistic look at wireless networks, the Plantweb Insight Power Module Management application drills down to the device level, allowing facilities to keep their wireless devices appropriately powered so they can continuously transmit key monitoring data. By aggregating power module statuses, users can evolve traditional maintenance planning and implement more efficient and cost-effective practices.
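
As a rough illustration of the kind of roll-up described above, here is a hypothetical sketch that aggregates per-device power-module readings into a prioritized replacement list. The data structure, field names, and thresholds are invented for illustration; this is not Emerson’s API.

```python
# Hypothetical roll-up of wireless device power-module status into a
# maintenance work list. Fields and thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class DeviceStatus:
    tag: str                # device tag name
    network: str            # WirelessHART network the device belongs to
    supply_voltage: float   # power module output in volts
    days_remaining: int     # estimated power module life

def plan_replacements(devices, horizon_days=90):
    """Return devices whose power modules are due for replacement within
    the planning horizon, sorted by urgency."""
    due = [d for d in devices if d.days_remaining <= horizon_days]
    return sorted(due, key=lambda d: d.days_remaining)

fleet = [
    DeviceStatus("PT-1041", "net-A", 7.1, 420),
    DeviceStatus("TT-2208", "net-A", 6.4, 35),
    DeviceStatus("LT-3310", "net-B", 6.9, 80),
]

for d in plan_replacements(fleet):
    print(f"{d.tag} on {d.network}: ~{d.days_remaining} days of power left")
```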

“We were able to infuse a decade of experience with wireless technology into these new offerings,” said Brian Joe, wireless product manager with Emerson’s Automation Solutions business. “Our customers will now be able to manage and improve hundreds of networks through a singular interface, realizing significant efficiencies in individual network and wireless device management and maintenance.”

These new applications further enhance the Plantweb Insight platform, a set of pre-built analytics primarily focusing on monitoring key asset health. Other applications in the platform include pressure relief valve monitoring, heat exchanger monitoring and steam trap monitoring.

Ecosystem Collaboration on 5G Distributed Edge Solution for Service Providers

This announcement hits many trends and things you will eventually grow tired of hearing—partnerships, collaboration among companies, ecosystems, Kubernetes, containers, and, yes, 5G. The latter is coming. We just don’t know when and how, yet.

Wind River, a leader in delivering software for the intelligent edge, announced that it is collaborating with Dell EMC as a key hardware partner for distributed edge solutions. A combined software and hardware platform would integrate Wind River Cloud Platform, a Kubernetes-based software offering for managing edge cloud infrastructure, with Dell EMC PowerEdge server hardware. The initial target use case will be virtual RAN (vRAN) infrastructure for 5G networks.

“As telecom infrastructure continues to evolve, service providers are facing daunting challenges around deploying and managing a physically distributed, cloud native vRAN infrastructure,” said Paul Miller, vice president of Telecommunications at Wind River. “By working with Dell EMC to pre-integrate our technologies into a reference distributed cloud solution, we can cost-effectively deliver carrier grade performance, massive scalability, and rapid service instantiation to service providers as their foundation for 5G networks.”

“In a 5G world, new services and applications will not be driven by massively scaled, centralized data centers but by intelligently distributed systems built at the network edge,” said Kevin Shatzkamer, vice president of Enterprise and Service Provider Strategy and Solutions at Dell EMC. “The combination of Dell EMC and Wind River technology creates a foundation for a complete, pre-integrated distributed cloud solution that delivers unrivaled reliability and performance, massive scalability, and significant cost savings compared to conventional RAN architectures. The solution will provide CSPs with what they need to migrate to 5G vRAN and better realize a cloud computing future.”

Wind River Cloud Platform combines a fully cloud-native, Kubernetes- and container-based architecture with the ability to manage a truly physically and geographically separated infrastructure for vRAN and core data center sites. Cloud Platform delivers single-pane-of-glass, zero-touch automated management of thousands of nodes.

Dell EMC hardware delivers potent compute power along with high-performance, high-capacity memory well suited to low-latency applications.

A commercial implementation of the open source project StarlingX, Cloud Platform scales from a single compute node at the network edge up to thousands of nodes in the core to meet the needs of high-value applications. With the deterministic low latency edge applications require, and tools that make the distributed edge manageable, Cloud Platform provides a production-ready, container-based infrastructure for scalable edge deployments.
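
To make “single pane of glass” management concrete, here is a hedged sketch of polling node health across several geographically separated Kubernetes clusters using the official kubernetes Python client. It assumes one kubeconfig context per site; the context names are hypothetical, and this is not Wind River’s management API.

```python
# Sketch: report node readiness across multiple edge clusters from one
# script, assuming a kubeconfig with one context per site (names invented).
from kubernetes import client, config

EDGE_CONTEXTS = ["edge-site-01", "edge-site-02", "core-dc"]  # hypothetical

def site_node_health(context_name):
    """Yield (node_name, ready) for every node in one cluster."""
    api_client = config.new_client_from_config(context=context_name)
    v1 = client.CoreV1Api(api_client)
    for node in v1.list_node().items:
        ready = any(
            c.type == "Ready" and c.status == "True"
            for c in node.status.conditions
        )
        yield node.metadata.name, ready

for ctx in EDGE_CONTEXTS:
    for name, ready in site_node_health(ctx):
        print(f"[{ctx}] {name}: {'Ready' if ready else 'NotReady'}")
```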

Project Alvarium from Linux Foundation for Trusted Data

The IoT group that I’ve been working with for the past few years has been absorbed into the OEM group, which is carrying on an expanded function. This blog post from Steve Todd, Dell Technologies Fellow, details the development of the data confidence work that has been contributed as open source to the Linux Foundation to seed Project Alvarium.

Following is a quick summary. Go to the blog for additional information about trusted data work.

A team of Dell Technologies specialists finished building the first-ever Data Confidence Fabric (DCF for short). The prototype code will be contributed to the Linux Foundation to seed Project Alvarium.

For several years, the CTO of the Dell Technologies Edge and IoT business unit has been touting a vision of data monetization. However, it’s hard to monetize untrusted Edge and IoT data. As he likes to say, “It’s midnight. Do you know where your data has been?” 

Enterprise storage systems have delivered trusted data to applications for a long time. We started our initial investigation wondering if these same trust principles could be applied to Edge and IoT ecosystems. Recent developments in data valuation, distributed ledgers, and data marketplaces helped everything come together.

Five Levels of Trust

We started with Trevor Conn, chair of the EdgeX Foundry Core Working Group. Trevor wrote the first-ever Data Confidence Fabric software in Go, the same programming language EdgeX is written in. His Data Confidence Fabric software registered with EdgeX as a client and began processing simulated device data. The initial confidence score for this data was “0” (no trust had been inserted).

Dell Technologies then hired three computer science interns from Texas A&M to deploy EdgeX and the Data Confidence Fabric software on a Dell Gateway 3000 with a Trusted Platform Module (TPM) chip.

EdgeX was then adjusted to support N-S-E-W authentication by using VMware’s open-source Lightwave technology.

Dell Boomi software was invoked by the Data Confidence Fabric software to gather provenance and append this metadata to the sensor reading.

The Data Confidence Fabric software then stored the data locally using IPFS (an immutable, open-source storage system). This fourth level of trust insertion gives an application confidence that the data/provenance has not been tampered with. It also has the additional benefit of enabling analytics to access data closer to the source.

The Data Confidence Fabric software then registered the data into VMware’s blockchain (based on the open-source Project Concord consensus algorithm).
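
To make the levels of trust concrete, here is an illustrative sketch of the Data Confidence Fabric idea: each layer that handles a reading appends trust metadata and raises a confidence score. The scoring scheme and field names are hypothetical, not the actual Project Alvarium code.

```python
# Illustrative Data Confidence Fabric sketch: annotate a sensor reading
# with trust insertions and a running confidence score. The scheme below
# is invented for illustration, not Project Alvarium's implementation.
import json
import time

def new_reading(device_id, value):
    """Create an untrusted reading with a confidence score of 0."""
    return {"device": device_id, "value": value,
            "timestamp": time.time(), "trust": [], "confidence": 0}

def insert_trust(reading, level, detail):
    """Record one trust insertion (TPM attestation, authentication,
    provenance, immutable storage, ledger registration, ...)."""
    reading["trust"].append({"level": level, "detail": detail})
    reading["confidence"] = len(reading["trust"])
    return reading

r = new_reading("sensor-42", 21.7)
insert_trust(r, "device", "TPM-attested gateway")
insert_trust(r, "network", "mutually authenticated (N-S-E-W)")
insert_trust(r, "provenance", "route metadata appended")
insert_trust(r, "storage", "stored in immutable store (IPFS)")
insert_trust(r, "ledger", "hash registered on blockchain")
print(json.dumps(r, indent=2))  # confidence is now 5 of 5
```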

Taking a Digital Journey

Keynoters have a tough time with originality these Digital Days, with everyone emphasizing Digital Transformation. Steve Lomholt-Thomson, chief revenue officer of AVEVA, took us on a Digital Journey this morning, setting the tone for the three days of AVEVA World Congress (North America edition).

Three technology trends to watch: an IoT boom; cloud/empowered edge; and AI/ML. The theme is digital. The Digital Organization discovers its Digital DNA, figures out how to build that Digital DNA through people who challenge the status quo, and then figures out how to track talent flow.

Which all starts us on our Digital Journey. On this journey, we unify end-to-end data, connect data silos to take a holistic view of the business, and then visualize our assets and supply chain. Implied in all this, I believe, is the company’s AVEVA System Platform product. The company touted six customer stories, with at least five of them (and probably the sixth) leveraging System Platform.

Oh, and the only time the “W” word was used, it was in the past tense.

Other areas of the company were highlighted:

Focus on assets–asset performance management, including how to use machine learning (ML) and artificial intelligence (AI) for predictive analytics (predictive maintenance).

How to combine it all into a Digital Twin–bringing the design lifecycle and physical lifecycle into congruence.

The recently hired head of the North America business, Christine Harding, interviewed customers from Campbell’s (soup/snacks), Quantum Solutions (an integration project at St. Louis Lambert airport), and Suncor (Canadian oil sands).

I have the rest of today and then tomorrow to take deeper dives into many of these topics. If there is anything you want me to ask, send a note.

HighByte – A New Company Unveiled for DataOps

DataOps—a phrase I had not heard before. Now I know. Last week while I was in California I ran into John Harrington, who, along with fellow former Kepware leaders Tony Paine and Torey Penrod-Cambra, had left Kepware following its acquisition by PTC to found a new company in the DataOps for Industry market. The news he told me about went live yesterday: HighByte announced that its beta program for HighByte Intelligence Hub is now live. More than a dozen manufacturers, distributors, and system integrators from the United States, Europe, and Asia have already been accepted into the program and granted early access to the software in exchange for their feedback.

Intelligence Hub

HighByte Intelligence Hub will be the company’s first product to market since incorporating in August 2018. HighByte launched the beta program as part of its Agile approach to software design and development. The aim of the program is to improve performance, features, functionality, and user experience of the product prior to its commercial launch later this year.

HighByte Intelligence Hub belongs to a new classification of software in the industrial market known as DataOps solutions. HighByte Intelligence Hub was developed to solve data integration and security problems for industrial businesses. It is the only solution on the market that combines edge operations, advanced data contextualization, and the ability to deliver secure, application-specific information. Other approaches are highly customized and require extensive scripting and manual manipulation, which cannot scale beyond initial requirements and are not viable solutions for long-term digital transformation.
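
To picture what data contextualization means in practice, here is a hypothetical sketch of the step a DataOps hub performs: merging raw tag values with a reusable asset model so a consuming application receives labeled information instead of bare data points. The names and structure are invented for illustration; this is not HighByte’s API.

```python
# Hypothetical data-contextualization sketch: map raw tag values onto a
# reusable asset model to produce an application-ready payload.
RAW_TAGS = {            # as they might arrive from an OPC UA or MQTT source
    "plc1.t_101": 78.3,
    "plc1.p_204": 2.41,
}

PUMP_MODEL = {          # reusable model mapping raw tags to meaning
    "asset": "Pump-07",
    "attributes": {
        "temperature_C": "plc1.t_101",
        "pressure_bar": "plc1.p_204",
    },
}

def contextualize(model, tags):
    """Produce a labeled, application-ready payload from raw tag values."""
    return {
        "asset": model["asset"],
        **{name: tags[tag] for name, tag in model["attributes"].items()},
    }

print(contextualize(PUMP_MODEL, RAW_TAGS))
# {'asset': 'Pump-07', 'temperature_C': 78.3, 'pressure_bar': 2.41}
```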

“We recognized a major problem in the market,” said Tony Paine, Co-Founder & CEO of HighByte. “Industrial companies are drowning in data, but they are unable to use it. The data is in the wrong place; it is in the wrong format; it has no context; and it lacks consistency. We are looking to solve this problem with HighByte Intelligence Hub.”

The company’s R&D efforts have been fueled by two non-equity grants awarded by the Maine Technology Institute (MTI) in 2019. “We are excited to join HighByte on their journey to building a great product and a great company here in Maine,” said Lou Simms, Investment Officer at MTI. “HighByte was awarded these grants because of the experience and track record of their founding team, large addressable market, and ability to meet business and product milestones.”

To further accelerate product development and go-to-market activities, HighByte is actively raising a seed investment round. For more information, please contact [email protected]

Learn more about the HighByte founding team—all people I’ve known for many years in the data connectivity business.

Background

From Wikipedia: DataOps is an automated, process-oriented methodology, used by analytic and data teams, to improve the quality and reduce the cycle time of data analytics. While DataOps began as a set of best practices, it has now matured to become a new and independent approach to data analytics. DataOps applies to the entire data lifecycle from data preparation to reporting, and recognizes the interconnected nature of the data analytics team and information technology operations.

DataOps incorporates the Agile methodology to shorten the cycle time of analytics development in alignment with business goals.

DataOps is not tied to a particular technology, architecture, tool, language or framework. Tools that support DataOps promote collaboration, orchestration, quality, security, access and ease of use.

From Oracle: DataOps, or data operations, is the latest agile operations methodology to spring from the collective consciousness of IT and big data professionals. It focuses on cultivating data management practices and processes that improve the speed and accuracy of analytics, including data access, quality control, automation, integration, and, ultimately, model deployment and management.

At its core, DataOps is about aligning the way you manage your data with the goals you have for that data. If you want to, say, reduce your customer churn rate, you could leverage your customer data to build a recommendation engine that surfaces products relevant to your customers, which would keep them buying longer. But that’s only possible if your data science team has access to the data they need to build that system and the tools to deploy it, and can integrate it with your website, continually feed it new data, monitor performance, and so on. That is an ongoing process that will likely include input from your engineering, IT, and business teams.

Conclusion

As we move further along the Digital Transformation path of leveraging digital data to its utmost, this looks to be a good tool in the utility belt.
