Whirlpool Migrates SAP Systems to Google Cloud for Sustainable Growth

Are you using a cloud service yet? Competition among the big-company cloud services courting industrial companies is becoming fierce. Here is a win for Google Cloud, a provider I’ve only recently seen active in this space. Who would have thought these would grow into such large businesses?

Highlights of this announcement include: 

  • Whirlpool Corp. announced the expansion of its strategic collaboration with Google Cloud to deliver critical business systems and applications on Google Cloud. 
  • As part of this expanded collaboration, Whirlpool is deploying its enterprise-wide SAP environment and applications on Google Cloud, providing its global teams with low-latency, secure access to SAP systems and data. Bringing these systems onto Google Cloud gives Whirlpool an environment that ensures maximum uptime, provides global access to applications with very low latency, and empowers the company’s teams of data analysts to derive maximum value from its business data.
  • Google Cloud is also providing Whirlpool with an elastic cloud infrastructure that can scale as needed and provides access to Google Cloud’s next-generation capabilities in AI, ML, and analytics. 


Whirlpool Corp. announced June 8 that it has expanded its strategic collaboration with Google Cloud to deliver critical business systems and applications on Google Cloud’s secure, reliable, and sustainable infrastructure. The company, which rolled out Google Workspace to its employees in 2014, has now deployed its enterprise-wide SAP environment and applications on Google Cloud, providing its global teams with low-latency, secure access to SAP systems and data.

Whirlpool relies on SAP for many aspects of its business, including supply chain management, manufacturing planning and IoT, enterprise resource planning (ERP), finance, customer relationship management (CRM), and more. Bringing these business-critical systems onto Google Cloud provides the company with an environment that ensures maximum uptime, provides global access to applications with very low latency, and empowers the company’s teams of data analysts to derive maximum value from its business data, such as data on its supply chain, financial systems, IoT, and more.

Google Cloud is also providing Whirlpool with a platform for growth, with elastic cloud infrastructure that can scale up or down as needed, and with access to Google’s next-generation capabilities in artificial intelligence, machine learning, and analytics that are increasingly significant drivers of digital transformation.

With this announcement, Whirlpool is also expanding its use of the cleanest cloud in the industry, ensuring that its business systems are run sustainably and responsibly. Google Cloud has matched 100 percent of its global electricity use with purchases of renewable energy every year since 2017, and is now building on that progress with a new goal of running entirely on carbon-free energy at all times by 2030.

“Whirlpool Corporation is committed to reaching zero emissions by 2030 and turning to Google Cloud’s clean infrastructure for our global business systems and applications is a step forward toward that goal,” said Dani Brown, senior vice president and CIO at Whirlpool Corporation. “We are excited to strengthen our strategic relationship with Google Cloud to empower our employees with cloud productivity solutions, and to ensure that our most critical business systems and applications are delivered securely, efficiently, and sustainably.”

“Whirlpool Corp. is creating a foundation for future growth with a forward-looking, cloud-first approach to its critical SAP systems, while maintaining a strong commitment to sustainability,” said Rob Enslin, President at Google Cloud. “We’re proud to expand our strategic collaboration with Whirlpool and will continue to support the company’s digital transformation across all of its global operations.”

Financial Risks When Delaying PLM Upgrades

Senior management has always been reluctant to invest in technology, and especially in upgrades once a technology is in place. I have seen instances where management lays off the senior engineers who implemented something like Advanced Process Control or Manufacturing Execution Systems, keeping only a recent graduate engineer to maintain the system, if even that. Management sees only a large reduction in salary cost. Rarely is maintaining momentum treated as a virtue.

I have been in far too many of these discussions in my career, and I’ve seen the results go one way or another. In some instances, the company had to hire back the laid-off engineer at higher consulting rates to get the system back up and running properly.

So, this report from CIMdata detailing research on PLM software upgrading was hardly surprising. Disturbing, perhaps, but not surprising.

Digital transformation is a popular topic, and CIMdata has written much about it. While many still wonder whether digital transformation is real or just the latest buzzword, many industrial companies are taking its promise very seriously.

While it is clear to everyone within the PLM community that PLM is foundational to a meaningful digitalization program (or digital transformation strategy), this truth is not always understood by senior leadership within companies. While CIMdata believes that the level of investment in digital transformation is appropriate, its research and experience show that executive awareness of digital transformation’s dependency on PLM is lacking. This lack of understanding of how PLM-related investment and sustainability affect business performance and benefits puts many digital transformation programs at risk of becoming yet another program of the month.

This research on obsolescence identified areas that increase the cost of a technology refresh and found heavy customization at the top of the list. This aligns with CIMdata’s experience in the field and is why companies strive to stay closer to out-of-the-box with their PLM implementations. CIMdata’s view is that customization can add significant value to a PLM implementation, but it needs to be business- or cost-justified and deliver an appropriate return on investment over the long term (i.e., even through subsequent solution upgrades).

A new study from CIMdata exposes the financial risk many organizations face when they take PLM upgrades for granted. According to the study, the cost of upgrades with legacy PLM vendors can average between $732,000 and $1.25 million. The study – which compares industry heavyweights such as Dassault, PTC, and Siemens – finds the Aras PLM platform is easiest to keep current. Aras users upgrade more frequently, over a shorter duration, and at less cost than other leaders in the space. 

What’s behind PLM obsolescence? According to CIMdata, “A sustainable PLM solution is one that can meet current and future business requirements with an acceptable return on investment (ROI) via incremental enhancements and upgrades.” But as clearly shown in the research, many companies using PLM software are not staying current. The five reasons are: 


1. Technically Impossible. Typically, after an arduous deployment and the customization necessary to meet the business’s current needs, the software can no longer be upgraded.
2. No ROI. If an upgrade takes a year and costs close to a million dollars, the cost and impact to the business are so outrageous they can’t be justified.
3. No Budget. Not having the budget is a real concern, but often the lack of budget is a mistake—a mis-prioritization of what’s important to your organization’s future growth, often combined with a high percentage of the overall budget being consumed by technical debt.
4. Companies Overinvest and Therefore Are Committed. The only thing worse than spending large amounts of money on the wrong thing is doubling down and spending more, expecting a better experience. The pandemic has accelerated the need to change, to expect transformation with less risk, less cost, and greater ROI that will lead to greater business resiliency. Throwing good money after bad is no longer being tolerated—there is more of a focus on the bottom line and doing more with less.
5. Leadership Doesn’t Understand Dependency of Digital Transformation on PLM. If your PLM system hasn’t been upgraded in years and isn’t the foundation for continuous digital transformation efforts, there is an absolute lack of understanding of how PLM can transform a business.

The Converged Edge Explained

Our schedules finally converged. I caught up with Tom Bradicich, PhD, known within Hewlett Packard Enterprise (HPE) as “Dr. Tom,” to learn the latest on the converged edge. Tom is one of the half-dozen or so people I know who can dump so much information on my brain that it takes some time to digest and organize it. He led development of the Edgeline devices that connect to the Industrial Internet of Things. He is now VP and HPE Fellow, leading HPE Labs in developing software to come to grips with the complexities of the converged edge and “Converged Edge-as-a-Service.”

He likes to organize his thoughts in numbered groups. I’m going to discuss the converged edge below using his groupings:

  • 3 C’s
  • 4 Stages of the Edge
  • 7 Reasons for IoT and the Edge
  • 3-Act Play
  • 12 Challenges

The foundation of the converged edge is found in the 3 C’s:

  1. Perpetual Connectivity
  2. Pervasive Computing
  3. Precision Controls

I remember Tony Perkins following up the demise of Red Herring magazine (charting the hot startup and M&A craze of the 90s, the magazine grew so large it came in two volumes for a while) with an online group called AlwaysOn. Trouble is, back in the 90s, we weren’t “always on.” Persistent connectivity was beyond our technology back then. Now, however, things have changed. We have so much networking, with more to come, that perpetual connectivity is not only possible, but also mundane.

HPE didn’t take a personal computer and package it for the edge. It developed Edgeline with the power of its enterprise compute and enterprise-grade software stacks. It is powerful.

Then we have the 4 Stages of the Edge:

  1. Things—sensors and actuators
  2. Data Capture & Controls
  3. Edge IT (networking, compute, storage)
  4. Remote Cloud or Data Center

This is where Internet of Things meets the Enterprise.

Why do we need edge compute and not just IoT-to-Cloud? 7 Reasons (a brief illustrative sketch follows the list):

  1. Minimize Latency
  2. Reduce bandwidth
  3. Lower cost
  4. Reduce threats
  5. Avoid duplication
  6. Improve reliability
  7. Maintain compliance
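
To make the first few of those reasons concrete, here is a minimal, hypothetical Python sketch (my own illustration, not HPE Edgeline code) of an edge node that samples a sensor locally, reacts to anomalies without a cloud round trip, and uploads only a small summary. The sensor read and cloud upload are placeholder functions.

```python
# Illustrative sketch only: aggregate raw readings at the edge and forward
# summaries to the cloud. All function names and values are hypothetical.
import statistics
import time

def read_vibration_sensor() -> float:
    """Placeholder for a real sensor read (e.g., over an industrial protocol)."""
    return 0.42

def send_to_cloud(summary: dict) -> None:
    """Placeholder for an MQTT/HTTPS publish to a remote data center or cloud."""
    print(f"uploading summary: {summary}")

def edge_loop(window_seconds: int = 60, sample_hz: int = 100) -> None:
    """Sample locally at high rate, act on anomalies immediately, upload only aggregates."""
    samples = []
    for _ in range(window_seconds * sample_hz):
        value = read_vibration_sensor()
        samples.append(value)
        if value > 0.9:  # local threshold: react without waiting on a cloud round trip
            print("local alarm: vibration spike")
        time.sleep(1 / sample_hz)
    # One small summary replaces window_seconds * sample_hz raw readings.
    send_to_cloud({
        "mean": statistics.mean(samples),
        "max": max(samples),
        "stdev": statistics.pstdev(samples),
        "count": len(samples),
    })

if __name__ == "__main__":
    edge_loop(window_seconds=1)  # short window for demonstration
```

In this toy example, one summary record replaces thousands of raw samples per window; multiply that across a plant and the latency, bandwidth, and cost arguments largely make themselves.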

The Converged Edge is a 3-Act Play:

  1. Edgeline systems & software; stack identicality
  2. Converged embedded PXI and OT Link
  3. Converged Edge-as-a-Service

At this point in time, we are faced with 12 challenges to implementation:

  1. Limited bandwidth
  2. Smaller footprint for control plane and container
  3. Limited to no IT skills at the edge
  4. Higher ratio of control systems to compute/storage nodes
  5. Provisioning & lifecycle management of OT systems and IoT devices
  6. OT applications are primarily “stateful”, cloud unfriendly
  7. Data from analog world & industrial protocols
  8. Unreliable connectivity—autonomous disconnect operation
  9. Higher security vulnerabilities
  10. Hostile and unfamiliar physical environments and locations
  11. Long-tail hardware and software revenue model—many sites, fewer systems
  12. Deep domain expertise needed for the many unique edges

Of course, we could go into each of these items; Dr. Tom does in one of his latest talks (I believe it was at Hannover). We should pause at number 12, though. It is a necessity often overlooked by AI evangelists and other would-be predictive maintenance disrupters. When you begin messing with industrial operations, whether process or discrete manufacturing, it really pays to know the process deeply.

I can’t believe I summarized all this in under 600 words (is that still the common university essay requirement?). It is just an outline, but it should reveal where HPE has been and where it is going. I think its power will be disruptive to industrial architectures.

Collaboration to Drive Digital Transformation Initiatives for Oracle Customers

We’ve seen news about the clouds of many IT companies. Here is news about Oracle. Two companies are collaborating to bring Oracle customers to the Oracle Cloud. In so doing, they wish to help the customers achieve “digital transformation.” Someday we may have a definitive explanation of just what this digital transformation is. It isn’t in this news. But I believe that this instance means making a company’s digital data available.

Kalypso, a professional services firm, announced its partnership with AVATA, a services provider for Oracle Cloud, enterprise resource planning (ERP), supply chain management (SCM), and enterprise performance management (EPM) solutions. The partnership will help clients accelerate digital transformation initiatives with Oracle’s suite of Cloud applications. Both Kalypso and AVATA have long-established and successful Oracle practices; the partnership extends each firm’s expertise and reach to deliver more comprehensive SCM, ERP, product lifecycle management (PLM), and enterprise data management (EDM) solutions through Oracle’s Cloud technology.

The two firms complement each other’s Oracle Cloud application implementation capabilities. AVATA delivers strategic and implementation services related to Oracle’s SCM, ERP, and EPM solutions, and Kalypso provides services related to Oracle’s PLM, EDM, and emerging applications, such as IoT, blockchain and data science. With this combined expertise, the partnership can provide comprehensive coverage across Oracle’s Cloud applications portfolio to accelerate digital innovation and Industry 4.0 initiatives.

“We’re excited to combine forces with AVATA to deliver increased value for our rapidly expanding Cloud client base,” said Nigel Hallett, Oracle Practice Managing Director, Kalypso. “As digital transformation and Cloud have become even more critical to business success as a result of the COVID-19 pandemic, this partnership will enable us to better support our clients by helping them leverage the power of Oracle’s comprehensive, integrated Cloud solutions.”

Leveraging both companies’ experience from hundreds of Oracle implementations, the combined solutions portfolio and services expertise will help Oracle customers:

  • Migrate legacy/on-premises Oracle applications to the Cloud
  • Develop value-based solution roadmaps for digital transformation
  • Enable the digital thread with end-to-end Oracle Cloud enterprise solutions
  • Accelerate innovation by increasing connectivity and insights in the enterprise value chain

“We’re honored to partner with Kalypso to help our clients maximize the value of Oracle’s Cloud applications,” said Kevin Martin, Vice President, Sales and Solutions, AVATA. “Kalypso’s strategic expertise complements ours well, and we look forward to joining forces to help our clients expand and make the most of their digital investments.”

Kalypso, a Rockwell Automation company, is a professional services firm helping clients discover, create, make and sell better products with digital. The firm provides consulting, digital, technology, business process management, and managed services across the innovation value chain. For more information, visit http://kalypso.com

AVATA is a leading strategic partner of Oracle and is recognized for its global capabilities in helping companies solve critical business challenges through people, processes, and technology. We offer the unique blend of real-world industry experience, best practices, and software expertise that sets us apart from pure system integrators. Leveraging Oracle SCM and ERP Cloud solutions, we provide our clients with a strategy that fits their organization and competitive processes that differentiate them in their respective markets and successfully deliver rapid improvements impacting bottom-line performance. AVATA is headquartered in the US with resources throughout the USA, Australia, India, and Europe. Follow AVATA on Twitter and LinkedIn. For more info, visit www.avata.com.

New Research Reveals Enterprise “Digital Divide” Has Increased as a Result of the Pandemic

OK, we probably already know that companies that had invested in digital technology before the pandemic were prepared to ramp it up and survive—both commercially and through the sudden work-from-home necessity. This research report bears that out. It was also paid for by a mobility provider, so research results showing potential growth in that market are hardly surprising.

However, if you are on the fence about technology investments, consider this research from Stratix Corporation and VDC Research, which surveyed North America-based IT stakeholders to better understand how they adapted their technology programs to the rapid disruption caused by the pandemic, and to gauge the impact of these changes on their investments and larger strategic initiatives for their mobile and frontline workforces.

The survey’s findings establish that only those already ahead of the curve on IT investments were able to continue operating normally. Consequently, those same innovative organizations are now most aggressively expanding their mobile technology investments and capabilities in response to COVID-19. This leaves the trailing majority of businesses that were most impacted by the pandemic in dire need of establishing their own “new normal” of baseline mobile technologies to empower them with the flexibility necessary to remain productive and competitive.

Additional findings from the report include:

  • Organizations using MMS (managed mobility services) were 36% more likely to continue operating normally with no revenue impact from COVID-19 than those that did not use MMS.
  • 55.4% of respondents invested in additional mobile technologies in response to the pandemic, and 47.7% added new mobile programs.
  • More than 75% of respondents either agree or strongly agree that COVID-related disruptions have transformed the ways in which their organizations engage and do business with their customers.
  • 33.2% say COVID-19 has aggressively accelerated the pace at which they pursue and roll out new IT/mobile technology projects.
  • Amongst those who are aggressively accelerating IT investment, 61% are most likely to move to a 1:1 model of device deployment, a significant paradigm shift away from corporate-owned shared devices.

“This paper and the research done to support it both highlight the organizational characteristics that enabled successful pandemic pivoting – factors such as a flexible IT infrastructure and mature technology vendor partnerships proved critical,” said Pat Nolan, Analyst, Enterprise Mobility & the Connected Worker with VDC Research. “Moreover, this research leverages firsthand insights to provide guidance regarding the cross-industry accelerated focus on disruption readiness that will be grounded in increased mobile technology investments and partnerships.”

“The VDC findings showcase the need for flexible IT infrastructure and the substantial value MMS provides the enterprise,” said Stratix Senior Vice President of Marketing Elizabeth Klingseisen. “It’s clear that a combination of innovative planning and expert IT partners well-versed in technology deployments and support for mobile workforces is a must-have to successfully navigate disruption.”

The full survey results and findings can be found in the Enterprise Mobility Outlook: Thriving Amidst Disruption Through Mobile First Strategies research report.

NI Joins Open Manufacturing Platform Organization

I first heard about the Open Manufacturing Platform during my last trip to Germany (well, my last business trip anywhere) last February. I wrote about it here: Open Manufacturing Platform Expands. This effort, led by Microsoft and BMW and joined by ZF, Bosch, and ABInBev, “helps manufacturers leverage advanced technologies to gain greater operational efficiencies, factory output, customer loyalty, and net profits.” That’s a tall order. These are companies that I’ve seen leverage technology for improvements over the years. This should be an advancement.

Two news items relating to OMP came out this month: NI, through its recent acquisition OptimalPlus, has joined the organization, and the OMP’s working group has published a new deliverable.

NI says that it has joined OMP “with the goal of establishing an architecture and standards for auto manufacturers to better leverage and automate analytics to improve quality, reliability and safety.”

I had an opportunity to interview Michael Schuldenfrei, NI Fellow and OptimalPlus CTO, about smart manufacturing, what OptimalPlus adds to NI, and OMP. The roots of OptimalPlus lie in enterprise software for semiconductor manufacturing. An early customer was Qualcomm, which used the software to collect and analyze data from its numerous manufacturing plants. The company branched out into assemblies, with a new customer in Nvidia, and later added mechatronics to its portfolio. That made for a good tie-in with NI.

Rather than become just another smart manufacturing application focusing on machines, OptimalPlus brings its focus to the product being manufactured. Given NI’s strength in test and measurement, this was a definite synergy. As I have written before here and here, this enterprise software addition to NI’s portfolio is just what the company needs to advance a level.

Michael told me he was an early advocate for OMP after seeing how his technology worked with Tier 1 automotive suppliers to drive their digital transformation processes.

NI announced that its latest acquisition, OptimalPlus, has joined the Open Manufacturing Platform (OMP), a consortium led by BMW, Microsoft, ZF, Bosch and ABInBev that helps manufacturers leverage advanced technologies to gain greater operational efficiencies, factory output, customer loyalty, and net profits.

The OMP’s goals include creating a “Manufacturing Reference Architecture” for platform-agnostic, cloud-based data collection, management, analytics and other applications. This framework will provide a standard way to connect to IoT devices on equipment and define a semantic layer that unifies data across disparate data sources. All in all, this has the potential to create a rich, open-source ecosystem that enables faster and easier adoption of smart manufacturing technologies.
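
As a rough illustration of the semantic-layer idea, here is a small Python sketch of my own; it is not the OMP Manufacturing Reference Architecture, and the source payload formats and field names are entirely hypothetical. The point is simply that differently shaped payloads from disparate systems end up as one common record that downstream analytics can consume.

```python
# Illustrative sketch of a "semantic layer": map payloads from disparate machine
# data sources into one common schema. All formats and field names are hypothetical.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class MachineReading:
    """Unified record used by downstream analytics, regardless of the source system."""
    machine_id: str
    timestamp: datetime
    metric: str
    value: float
    unit: str

def from_plc_payload(raw: dict) -> MachineReading:
    # Hypothetical PLC gateway format: {"dev": "press-07", "ts": 1718000000, "temp_c": 71.3}
    return MachineReading(
        machine_id=raw["dev"],
        timestamp=datetime.fromtimestamp(raw["ts"], tz=timezone.utc),
        metric="temperature",
        value=float(raw["temp_c"]),
        unit="C",
    )

def from_mes_payload(raw: dict) -> MachineReading:
    # Hypothetical MES export format:
    # {"asset": "PRESS_07", "time": "2024-06-10T08:00:00+00:00", "temperatureF": 160.3}
    return MachineReading(
        machine_id=raw["asset"].lower().replace("_", "-"),
        timestamp=datetime.fromisoformat(raw["time"]),
        metric="temperature",
        value=(float(raw["temperatureF"]) - 32) * 5 / 9,  # normalize to Celsius
        unit="C",
    )
```

Both converters yield the same record shape, so analytics written once can run against data from either source.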

In the same way that interpreters at the United Nations help delegates communicate and make new policies, standardized data formats accelerate the adoption of big data and machine learning, creating a universal translator between multiple machine and process types. OptimalPlus, now part of NI, will bring to OMP its vast domain expertise in automotive manufacturing processes and provide leading production companies with actionable insights and adaptive methods from its big data analytics platform.

“We’re honored to be invited to join the prestigious Open Manufacturing Platform, which plays a key role in helping manufacturers all over the world innovate,” said Uzi Baruch, VP of NI’s Transportation business unit. “With pressure mounting to ensure quality and prevent faulty parts from shipping, it’s important that manufacturers have access to the transformative powers of AI, machine learning and big data analytics. We’re excited to collaborate with industry leaders in the OMP consortium to help manufacturers evolve and optimize their processes.”

AI and advanced analytics help to streamline manufacturing, reduce costs and improve quality, reliability and safety. OMP makes it easier for manufacturers to deploy this technology across their operations and fulfill the promise of smart manufacturing.

White Paper: Insights Into Connecting Industrial IoT Assets

The second bit of news describes a first deliverable from the OMP as it progresses toward its objective.

OMP announced delivery of a critical milestone with the publication of its first white paper. The IoT Connectivity Working Group, chaired by Sebastian Buckel and co-chaired by Dr. Veit Hammerstingl of the BMW Group, authored Insights Into Connecting Industrial IoT Assets. Contributions from member companies Capgemini, Cognizant, Microsoft, Red Hat, and ZF present a consensus view of the connectivity challenges and best practices in IIoT as the fourth industrial revolution unfolds. The paper is the initial publication laying out an approach to solving connectivity challenges while providing a roadmap for future OMP work.

Manufacturing at an Inflection Point

The intersection of information technology (IT) and operational technology (OT), along with the advent of the Internet of Things (IoT), presents opportunities and threats to the entire manufacturing sector. In manufacturing, multiple challenges complicate connecting sensors, actuators, and machines to a central data center. The lack of common standards and the prevalence of proprietary interfaces leave each engineer solving similar problems, introducing inefficiencies and forcing the same learning curve to be climbed over and over. The long renewal cycles of shop floor equipment, software, and processes leave gaps in modern technology adoption and encourage a general avoidance of significant institutional change. This initial publication begins to tackle these problems and lays the groundwork for future, more detailed work.

IT/OT Convergence

Each connectivity challenge has a range of diverse constituents, and the paper addresses issues faced by individuals and teams across job functions. Operational technology (OT) professionals are responsible for the commissioning, operation, and maintenance of shop floor equipment. Information technology (IT) personnel look after overall data processing, the hardware and software infrastructure, and enterprise-wide IT strategy. General managers and logistics teams are typically aligned at a corporate level, coordinating processes across a network of plants. Each of these functions spans roles from hands-on operational to strategic and managerial. The unique demands of each role require connectivity solutions that are forward-thinking and value-accretive while remaining practical to implement with minimal incremental investment.

Industrial IoT Challenges

Also explored in the paper are IIoT devices’ critical real-time needs for repeatability and high availability. An example is an AI model that optimizes the parameters of a bending machine based on the current air temperature and humidity. Connection failures or high latencies can lead to stopped or interrupted processes, or to products of insufficient quality.
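
As a minimal sketch of how an edge application might tolerate such failures (my own illustration, not code from the OMP paper, with a hypothetical endpoint and made-up parameter names), the pattern is to query the remote model with a short timeout and fall back to the last known-good parameters so the process keeps running.

```python
# Illustrative sketch: use remote, AI-optimized machine parameters when available,
# but fall back to cached values on connection failure or high latency.
# The URL, parameter names, and values are hypothetical.
import json
import urllib.request

DEFAULT_PARAMS = {"bend_angle_deg": 90.0, "press_force_kn": 120.0}
last_good_params = dict(DEFAULT_PARAMS)

def fetch_optimized_params(temperature_c: float, humidity_pct: float, timeout_s: float = 0.5) -> dict:
    """Query a (hypothetical) remote model service; raises on failure or slow response."""
    url = "https://example.invalid/bending-model"  # placeholder endpoint
    body = json.dumps({"temperature_c": temperature_c, "humidity_pct": humidity_pct}).encode()
    req = urllib.request.Request(url, data=body, headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=timeout_s) as resp:
        return json.loads(resp.read())

def get_params(temperature_c: float, humidity_pct: float) -> dict:
    """Prefer fresh optimized parameters; keep the line running on cached values otherwise."""
    global last_good_params
    try:
        params = fetch_optimized_params(temperature_c, humidity_pct)
        last_good_params = params  # cache for autonomous operation while disconnected
        return params
    except Exception:
        # Connection failure or high latency: continue with last known-good parameters
        # rather than stopping or interrupting the process.
        return last_good_params
```

The process continues, if suboptimally, when the network does not cooperate, and downstream quality checks can flag anything produced under fallback parameters.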

Manufacturing throughput requirements vary from the low bandwidth needed by simple sensors sending small packets to the much higher bandwidth required for streaming data for video analytics, vibration sensors, or AR/VR visualization. A holistic connectivity solution can address this complexity, spanning from individual devices on the shop floor up through edge gateways and servers to the central data center or to cloud resources such as compute and storage.

Network Levels

Networks are usually customized to their precise environment and the desired function, and therefore can be very complex.

In the white paper, we discuss the functions of each of the network levels, their benefits and limitations, and security considerations. Additional sections of the document cover common challenges in IIoT, connectivity levels, basic principles for successful connectivity solutions, communication types, and best practices for program implementation.