Financial Risks When Delaying PLM Upgrades

Senior management has always been reluctant to invest in technology, and especially in upgrades once a technology is in place. I have seen instances where management lays off the senior engineers who implemented something like Advanced Process Control or Manufacturing Execution Systems, keeping a recent graduate engineer to maintain the system, if even that. Management sees only a large salary cost reduction. Rarely is maintaining momentum treated as a virtue.

I have been in way too many of these discussions in my career, and I’ve seen the results go one way or another. There have been instances where a company had to hire back the laid-off engineer, at higher consulting rates, to get the system back up and running properly.

So, this report from CIMdata detailing research on PLM software upgrading was hardly surprising. Disturbing, perhaps, but not surprising.

Digital transformation is a popular topic, and CIMdata has written much about it. While some still wonder whether digital transformation is real or just the latest buzzword, many industrial companies are taking its promise very seriously.

While it is clear to all within the PLM community that PLM is foundational to a meaningful digitalization program (or digital transformation strategy), this truth is not always understood by senior leadership within companies. While CIMdata believes that the level of investment in digital transformation is appropriate, based on our research and experience we find that executive awareness of the dependency of digital transformation on PLM is lacking. This lack of understanding of its association to PLM-related investment, sustainability and impacts on business performance and benefits puts many digital transformation programs at risk of becoming yet another program of the month.

This research on obsolescence identified areas that increased the cost of technology refresh and found that heavy customization was at the top of the list. This aligns with CIMdata’s experience in the field and is why companies strive to be more out-of-the-box with their PLM implementations. CIMdata’s view is that customization can add significant value to a PLM implementation, but it needs to be either business or cost justified and deliver an appropriate return on investment over the long-term (i.e., even through subsequent solution upgrades).

A new study from CIMdata exposes the financial risk many organizations face when they take PLM upgrades for granted. According to the study, the cost of upgrades with legacy PLM vendors can average between $732,000 and $1.25 million. The study, which compares industry heavyweights such as Dassault, PTC, and Siemens, finds that the Aras PLM platform is the easiest to keep current. Aras users upgrade more frequently, over shorter durations, and at lower cost than other leaders in the space.

What’s behind PLM obsolescence? According to CIMdata, “A sustainable PLM solution is one that can meet current and future business requirements with an acceptable return on investment (ROI) via incremental enhancements and upgrades.” But as clearly shown in the research, many companies using PLM software are not staying current. The five reasons are: 


1. Technically Impossible. Typically, after an arduous deployment and the customization necessary to meet the business’s current needs, the software can no longer be upgraded.
2. No ROI. If an upgrade takes a year and costs close to a million dollars, the cost and impact to the business are so outrageous they can’t be justified.

3. No Budget. Not having the budget is a real concern, but often the lack of budget is a mistake—a mis-prioritization of what’s important to your organization’s future growth, often combined with a high percentage of the overall budget being consumed by technical debt. 
4. Companies overinvest and therefore are committed. The only thing worse than spending large amounts of money on the wrong thing is doubling down and spending more while expecting a better experience. The pandemic has accelerated the need to change, to expect transformation with less risk, less cost, and greater ROI that will lead to greater business resiliency. Throwing good money after bad is no longer tolerated—there is more of a focus on the bottom line and doing more with less.
5. Leadership Doesn’t Understand Dependency of Digital Transformation on PLM. If your PLM system hasn’t been upgraded in years and isn’t the foundation for continuous digital transformation efforts, there is an absolute lack of understanding of how PLM can transform a business.

Datadobi Points to Research Highlighting the Impact of Data Growth on Storage Management

The Industrial Internet of Things, along with other manufacturing IT advances, has led to a rise in unstructured data, causing storage challenges and increasing the importance of data mobility. Continuing my theme today of data, this is news from a company new to me called Datadobi, which touts itself as a global leader in unstructured data migration software.

It released a report earlier this month by 451 Research, part of S&P Global Market Intelligence, which reveals the major impact that data growth is having on storage management, highlighting how the rise in retention of unstructured data exacerbates the storage challenges faced by organizations. The report, commissioned from 451 Research, features data from 451 Research’s Voice of the Enterprise: Storage, Data Management & Disaster Recovery 2021 survey. It underscores the need for businesses to plan for the disposition of aging data, implement migration and protection plans, and understand storage-consumption costs if they are to effectively manage data, whether on-premises or in the cloud.

The report identifies data growth as the number one storage management challenge in most organizations, ahead of a range of issues such as disaster recovery requirements, cost, and migration. In addition, respondents said they expect the data they manage to grow by 26% over the next 12 months.

The growth in unstructured data is also complicating the approach organizations need to take to disaster recovery and data protection. As the report points out, this is “forcing them to look beyond traditional on-premises storage infrastructure to leverage public cloud storage or managed services to help reduce the burden on the IT staff. Hybrid IT.” This means that organizations will need to manage growing levels of unstructured data, both on-premises and in the cloud. As a result, companies will require cross-platform data migration (data mobility), protection, and management that are rooted in clear and identifiable ownership. (Business Impact Brief by 451 Research, part of S&P Global Market Intelligence)

To address these challenges, the report says: “Organizations should seek out tools and platforms that allow them to leverage a wide array of vendor and cloud offerings to preserve choice while also preserving the ability to negotiate lower prices and superior service levels from their vendors. The ability to provide deeper insight into how data is being used is invaluable given the price disparity between high performance storage tiers and low-cost archive tiers.”

Commenting on the report, Michael Jack, Datadobi CRO, said: “Data growth continues to be a top challenge in most organizations. As cloud adoption continues, efficient data mobility takes on new importance and must be automated as much as possible. Retention policies, data protection policies, and data disposition policies need to be clearly defined through partnerships between business data owners and the infrastructure owners who provide storage services to the business.”

Jack continued: “By delivering enterprise-class NAS migration software to migrate and protect data anywhere, Datadobi helps customers around the world address these challenges and remain in control of their storage management strategy.”

Download a full copy of the report.

Hitachi Vantara Supports Data-Driven Business

This week’s virtual conference is Hitachi Vantara Social Innovation. I watched a few sessions yesterday. I’ll be back today for a little more between household chores and an interview. The key theme is data-driven business. Showcased were the Disney Company and the American Heart Association, with executives discussing the business value of successfully employing a data-driven culture with Hitachi Vantara infrastructure.

I have been sitting on this press release for three weeks because I had a huge backlog of news. It begins by touting Hitachi Vantara’s position as a Leader in Gartner’s Magic Quadrant for Industrial IoT Platforms, in great company with PTC and Microsoft.

Highlights of the news include: 

  • Lumada Manufacturing Insights: The industry-specific solution integrates data from multiple sources – from vibration, video, lidar and audio – to detect disturbances in the supply chain and allows for greater visibility and planning. 
  • Smart Spaces & Lumada Video Insights: With improved team collaboration and incident response, these solutions use data sourced from trains, first responder vehicles and factories, among others, to improve quality and safety in the human-machine interaction. 
  • Lumada Edge Intelligence: Made faster than ever by an integration with Google Cloud, this solution accelerates data preparation and ML workflows for automated decision-making, leaving more time to focus on the value of insights.

Hitachi Vantara, the digital infrastructure, data management, and digital solutions subsidiary of Hitachi, announced advancements to the Lumada software platform and industry solutions to accelerate the digital transformation of industrial processes.

Improving manufacturing operational outcomes involves comprehensive data analysis and integration from thousands of moving parts across remote and industrial environments. Lumada is Hitachi’s digital platform that connects data, assets, and people to fuel industry innovation. It is the software foundation for Lumada Industry Solutions, which extract data-driven insights and drive better operational and business outcomes. The updated Lumada portfolio allows customers to automate tasks and make faster decisions by training data models in the cloud and deploying them to edge devices, creating actionable insights from diverse data sets at lower infrastructure cost.

“Across the globe, industries are dealing with increasing complexity, a faster-changing environment and greater competition that together are driving a need for accelerated digitalization. Supply chain disruptions, health and safety measures and operational challenges have highlighted this need for data-driven innovation,” said Radhika Krishnan, Chief Product Officer, Hitachi Vantara. “Today’s advancements allow our customers to make faster, more informed decisions so industries can thrive in our rapidly digitalizing future.”

Delivering Deeper Insights and Faster Time to Value 

Hitachi Vantara is accelerating industrial digitalization with major enhancements to data-driven offerings for manufacturing, extending AI and automation from edge to core, and delivering deep real-time insights from new combinations of data and connections.  

 Lumada Manufacturing Insights:  

  • This industry solution delivers greater visibility across a customer’s supply chain subsystems with the supply chain module’s ability to implement supply chain control tower solutions and take direct, demand-driven action.
  • Integrating and correlating data from multiple sources – from asset health data to vibration, video, lidar and audio – to detect potential failure of a machine, manufacturers can better predict points of failure and perform preventive maintenance, reducing downtime and improving output.
  • Also new is the ability to automate forms for digitization of factory floor processes – a practice that is still largely done with pen, paper and spreadsheets – to establish ‘if this, then that’ protocols across manufacturing processes.  
  • Lumada Manufacturing Insights is now also available on the Microsoft Azure Marketplace for easier integration with Microsoft cloud environments. 
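
As a thought experiment, an “if this, then that” protocol over a digitized form boils down to evaluating condition/action pairs against submitted readings. This is my own minimal sketch; none of the names, fields, or actions below are Lumada APIs.

```python
# Minimal sketch of "if this, then that" rules over digitized form data.
# All names (evaluate_rules, the field names, the actions) are illustrative,
# not part of Lumada Manufacturing Insights.

def evaluate_rules(form, rules):
    """Return the actions whose conditions match the submitted form."""
    return [action for condition, action in rules if condition(form)]

rules = [
    (lambda f: f["vibration_mm_s"] > 7.1, "schedule_inspection"),
    (lambda f: f["temperature_c"] > 85, "alert_supervisor"),
]

form = {"vibration_mm_s": 9.4, "temperature_c": 62}
print(evaluate_rules(form, rules))  # ['schedule_inspection']
```

A real system would persist rules, forms, and an audit trail, but the replacement for pen, paper and spreadsheets is essentially this: structured fields plus machine-checkable conditions.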

Smart Spaces & Lumada Video Insights:  

  • These industry solutions leverage new workflow automation within Hitachi Visualization Suite and a mobile application for improved team collaboration and incident response.  
  • An expanded Hitachi Edge Gateway portfolio includes industry-tailored and “ruggedized” versions that allow for data integration from sources such as trains, industrial spaces, or first responder vehicles and equipment. The portfolio also brings higher compute power to the edge, with CPU or GPU options, to enable new outcomes and faster, more data-driven decisions.
  • Sensor fusion creates the ability to co-analyze video, lidar, and other data to enable new use cases such as quality assurance and analysis of human-machine interaction, while improving accuracy. 

Lumada Edge Intelligence:  

  • Integration between the Lumada software platform and Google Cloud allows customers to speed up data preparation by adjusting resources on demand and combining multiple data types for better insights.
  • Updates to Lumada Edge Intelligence also simplify Machine Learning workflows by pushing models to edge devices for faster automated decision-making without reliance on point tools.  
  • New APIs for edge management and data access allow reuse of assets, gateways, and software to create integrated solutions utilizing existing infrastructure. 

“Meat & Livestock Australia has been collaborating with Hitachi Vantara on a number of digital projects over the past five years leading up to our latest project, the Connected Beef Supply Chain Control Tower,” says Dr. Nigel Tomkins, program manager, grassfed productivity at Meat & Livestock Australia. “Hitachi Vantara’s Lumada Manufacturing Insights has allowed our industry to integrate both sensor and system data to provide insights across the supply chain—this has led to improved productivity and quality outcomes. We look forward to leveraging the capabilities of the Supply Chain Control Tower even further—gaining insight on factors impacting supply and consumer demand.” 

“Our industry is experiencing rapid digitalization and a distinct increase in the pace of business. This underscores the need for more agility and predictability in everything we do and what we deliver to our customers,” said Petra Sundström, VP & Head of Digital Offering, Sandvik Rock Processing, at industrial manufacturer Sandvik. “We’re collaborating with Hitachi to innovate our business models. With Lumada Manufacturing Insights, we are now able to offer predictive maintenance as a service, delivering the outcomes our customers are looking for in this digital era.”

“For manufacturers to get real end-to-end benefits from data-driven solutions, it’s important to focus not only on the obvious areas of production. Other data sources and solutions beyond the factory floor should also be looked at. For example, use video analytics to study material flow from receiving dock to warehouse to shop floor; use lidar to monitor employee movements from a safety perspective. There are so many ways to use this technology, and the applications become more apparent as the team familiarizes itself with the sensors and analytics,” says Allen Ahlert, senior director, Engineering, with Hitachi Computer Products (America), Inc., which leverages Lumada solutions at its 352,000-sq.-ft. manufacturing and supply facility in Norman, OK. “Hitachi Vantara has been able to approach this holistically, beyond what point solutions can do, to create comprehensive, rich insights across facilities and processes.”

Plethora of News from Rockwell Automation

For several years I received few news releases from Rockwell Automation. Suddenly I have gained a new friend, Jack, who sends something almost every week. It’s good to know that one of the largest control and automation suppliers in North America is still churning out updated products.

I’ve been saving these up for a bit. Included in this post:

  • CIP Security Proxy Device
  • Plant floor asset management
  • Stack light
  • Medium voltage drives
  • Connected Components Software Workbench
  • Managed Ethernet switch
  • Network security threat detection

CIP Security Proxy Device 

Industrial companies can now implement CIP Security expansively in their systems with the Allen-Bradley CIP Security Proxy, which allows users to bring CIP Security to most devices on their network.

The CIP Security Proxy works with EtherNet/IP-compliant devices. CIP Security is part of the defense in depth strategy, which can help defend against attacks where threat actors can remotely access a network and act maliciously. With the ability to provide CIP Security for a single device, a layer of security is added that can help protect the system.

Configuration for the proxy device can be achieved through FactoryTalk Policy Manager software and FactoryTalk system services. In addition, the device supports motion for Kinetix drives and offers a web server for viewing diagnostics. It allows for secure event generation with syslog support and includes rotary switches for 192.168.1.xyz IP addressing. The proxy device also contains three one-gigabit EtherNet/IP ports and can operate in temperatures from -25° to +70° Celsius, adding to its ease of use.
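
The rotary-switch addressing is a nice ease-of-use touch. As a rough illustration of the idea (the exact device behavior is not documented here, and reserved switch settings may exist), three switch positions x, y, z simply become the last octet of the 192.168.1.xyz address:

```python
# Illustrative sketch only: mapping three rotary switch digits to an IP address.
# The actual Allen-Bradley proxy behavior may differ.

def address_from_switches(x, y, z):
    octet = 100 * x + 10 * y + z
    if not 1 <= octet <= 254:  # usable host addresses in a /24 network
        raise ValueError("switches must select a usable host octet")
    return f"192.168.1.{octet}"

print(address_from_switches(0, 4, 2))  # 192.168.1.42
```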

Plant-Floor Asset Management with Enhanced Software

Industrial workers can now more easily manage their hundreds or thousands of automation assets using the enhanced FactoryTalk AssetCentre software from Rockwell Automation. The latest release provides firmware and software lifecycle information for all assets in one place. This saves time because workers no longer need to connect to control cabinets and manually record information for each device. 

With the software’s enhanced asset inventory functionality, workers can quickly scan a network and see which devices are in a specific lifecycle state. Examples include devices running retired firmware or forecast to be discontinued in the next six months. This helps identify products in the same lifecycle state, so workers can better plan for replacements and upgrades.
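
Conceptually, that scan-and-filter step looks something like the following sketch. The field names and lifecycle labels are mine, not the FactoryTalk AssetCentre data model.

```python
# Hypothetical sketch: flagging scanned assets in a risky lifecycle state.
# Field names and state labels are illustrative, not FactoryTalk AssetCentre's.

assets = [
    {"name": "PLC-01", "firmware": "20.11", "lifecycle": "active"},
    {"name": "HMI-03", "firmware": "8.2", "lifecycle": "retired"},
    {"name": "Drive-07", "firmware": "5.001", "lifecycle": "discontinued-soon"},
]

flagged = [a for a in assets if a["lifecycle"] != "active"]
print([a["name"] for a in flagged])  # ['HMI-03', 'Drive-07']
```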

The FactoryTalk AssetCentre software also has a new security feature called archive management of change, which automates the process of authorizing who can change files and what they can change. It requires workers to explain why files need to be changed and verifies that only necessary files are being checked out. It also locks a file until changes are approved and escalates approval requests when needed. 

This helps enhance system security, which is particularly useful for industries such as oil and gas that require added levels of control over when changes are permitted. The software can also reduce downtime caused by change management itself. For example, one major food company reduced its downtime events from unknown or unauthorized changes by 7% using FactoryTalk AssetCentre software.

These updates add to the software’s existing ability to report discontinuation dates and the availability of replacement products. The enhanced software now also provides disaster-recovery support for more Rockwell Automation devices as well as third-party devices.

Stack Light

The new Allen-Bradley 856T Control Tower Stack Light system uses a modular design that incorporates brighter LED illumination and a broad offering of sound technologies. All signals in the system are 24V AC/DC powered, which means that just three power modules can cover the entire system. The latest additions to the 856T Control Tower Stack Light family are IO-Link enabled versions that provide diagnostic information and ease integration into a Connected Enterprise. 

IO-Link enabled versions of Bulletin 856T Control Tower Stack Lights enable users to monitor tower light and machine status in real-time, while allowing for simple remote set-up and troubleshooting.

Medium Voltage Drives

Allen-Bradley PowerFlex 6000T medium voltage drives now include TotalFORCE technology from Rockwell Automation, which provides precise control of speed and torque, diagnostic information for tracking system health and automatic adjustments to keep operations running smoothly.

The PowerFlex 6000T drives follow speed or torque commands closely in both open- and closed-loop vector control modes to deliver the precise control required for high performance and large loads.

The drives also continuously monitor operations to track the health of electrical components in the drive and motor and provide real-time diagnostic information to the control system.

Additionally, adaptive control features within the PowerFlex 6000T drives help isolate potentially harmful vibration and resonances, and automatically compensate for variances to help keep applications running. With load-observer technology, they also effectively reject disturbances when loads change suddenly, helping to keep operations running smoothly and increasing output.

Connected Components Software Workbench

Industrial engineers can more efficiently design and configure stand-alone machines using the latest release of Connected Components Workbench software from Rockwell Automation. With several new and enhanced features, the software improves download and build performance to create more efficient, user-friendly design processes.

Highlights of what’s new in version 13 of Connected Components Workbench software include:

  • A new Global and Local variable data grid that delivers capabilities to help engineers develop projects faster. For example, a quick declaration feature allows users to create multiple variables with the same prefix, suffix and data type in one click. An intuitive filter bar allows users to find tags quickly.
  • An enhanced Run Mode Change (RMC) capability that enables users to make edits without downloading project source code. This can speed up online edits and create smoother, more seamless design experiences.
  • A new Controller Organizer view that gives engineers the option to switch to a Logix Theme programming experience. This allows them to work in a more familiar environment and copy and paste ladder logic from the Studio 5000 Logix Designer application.
  • An enhanced Global Connection capability on existing system tags in the PanelView 800 DesignStation that gives users greater flexibility to configure remote system connections.

The Connected Components Workbench software helps simplify the development of stand-alone machines that are built with the Rockwell Automation Micro Control system. Engineers can configure, program and visualize the major control components of their stand-alone machines in a single software environment. They can also use tools like the Micro800 Simulator to validate their application code without the need for hardware.

Managed Ethernet Switch

The Allen-Bradley Stratix 5800 managed industrial Ethernet switch supports layer 2 access switching and layer 3 routing for use in multiple layers of the architecture. Robust security capabilities and ISA/IEC 62443-4-2 certification help enhance network security.

The Stratix 5800 switch has fixed and modular designs, giving users flexibility to configure it based on application needs. It offers combinations of copper, fiber and Power over Ethernet (PoE) ports to support a wide range of architectures.

The switch helps ease integration by addressing the needs of both operations technology (OT) and IT teams. Studio 5000 Add-on Profiles enable premier integration into the Rockwell Automation Integrated Architecture. And the Cisco IOS-XE operating system helps ease integration with the enterprise.

“Reducing the complexity of IT/OT convergence is a priority today as companies need to connect their operations while managing challenges like skills shortages and security threats,” said Mark Devonshire, product manager, Rockwell Automation. “The Stratix 5800 managed switch helps simplify the jobs of IT and OT teams, and helps improve security and high performance for industrial environments.”

Certification to ISA/IEC 62443-4-2 verifies that the switch meets the standard’s technical requirements to security level 2 for industrial automation and control systems. This continues the efforts of Rockwell Automation to help secure industrial operations through certifications, expertise, products and services.

Rockwell Automation Expands Threat Detection Services with Cisco Cyber Vision

The longstanding alliance between Rockwell Automation and Cisco continues to find new ways to provide customer value with the announcement that Rockwell Automation is adding Cisco’s Cyber Vision solution to its existing LifecycleIQ Services portfolio of cybersecurity threat detection offerings.

While convergence is essential to a digital transformation, it also presents challenges such as siloed networks, cybersecurity threats, skills shortages, and an abundance of production data and solutions. The two companies, leaders in their respective industries, have worked together to offer jointly developed architectures, services and products to help companies address these challenges as they work toward building a Connected Enterprise.

As this deeper integration between IT, cloud and industrial networks creates security issues that become digitization obstacles, Cyber Vision provides full visibility into industrial control systems to build secure infrastructures and enforce security policies – achieving the continuity, resilience, and safety of industrial operations. The addition of Cyber Vision to the LifecycleIQ Services threat detection offerings provides a unique switch-based architecture for customers with existing Cisco solutions, greenfield networks or those updating their Cisco network infrastructure.

Talking Digital Transformation with Rockwell Automation

I have not talked with anyone from Rockwell Automation for months. So, it was time to catch up with Keith Higgins who joined the company within the past couple of years as VP of Digital Transformation leading the software group. As we might expect, digital transformation technologies and products include the analytics portfolio, MES, and the coordination with PTC’s products including ThingWorx, Kepware, and Vuforia.

Since I was fresh from a conversation with another supplier about the Edge, I brought that up in the context of analytics and ThingWorx. Higgins began to explain the power of using the PLC as an edge device. Rockwell has not talked to me about the PLC for years, but I remember that it has long been adding compute and networking capability into that platform. Time for me to get an update there, too. My wild guess is that no sufficiently enticing partnership could be hacked out with Dell Technologies or HPE using their Edge compute. And Rockwell already had a powerful Edge device that just needed IT-level bolstering. This will be interesting to watch.

Higgins brought up a tire plant example where having production data in context at the edge, with the ability to perform predictive analytics, combined into a powerful management tool.

One theme that recurs in this discussion is the necessity of solid context for data. Higgins, having brought that up with the tire plant example, continued to a discussion of a technology developed in partnership with Microsoft called SmartObjects. This is a rich data model that adds deep context to data. My feeble way of thinking of this would be something like a modern data model, MQTT and OPC UA on steroids (no disparagement of either of those technologies meant).
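
To make the “context” point concrete, here is a hedged sketch of my own (emphatically not the SmartObjects schema) contrasting a bare reading with one wrapped in a richer data model:

```python
import json

# A bare value versus a contextualized one. Every field name below is
# illustrative; SmartObjects defines its own, much richer, model.

raw = "72.4"  # what am I? which asset? what unit? when?

contextualized = {
    "value": 72.4,
    "unit": "degC",
    "signal": "bladder_temperature",
    "asset": "curing-press-07",
    "line": "tire-line-2",
    "timestamp": "2021-10-14T13:05:22Z",
    "quality": "good",
}

payload = json.dumps(contextualized)  # ready to publish over, say, MQTT
```

Analytics at the edge can act on the second payload without out-of-band lookups; that, in a nutshell, is what context buys you.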

I’ve been thinking deeply about productivity lately, so I asked about it. Rockwell views its contribution to its customers’ productivity in three buckets:

  • Assets—building on predictive analytics, predictive maintenance, condition monitoring, and the like;
  • Production line—improving utilization of the production assets;
  • Human productivity—for example, the recent acquisition of CMMS supplier Fiix

I’m definitely interested in seeing where Rockwell’s new emphasis in software and edge goes. Many years ago, I asked then-CEO Keith Nosbusch about the software business. He said at that time it was an experiment. Higgins didn’t say that exact thing, but his remarks left no doubt that his area is primed to be a Rockwell growth vehicle.

The Converged Edge Explained

Our schedules finally converged. I caught up with Tom Bradicich, PhD, known within Hewlett Packard Enterprise (HPE) as “Dr. Tom,” to learn the latest on the converged edge. Tom is one of the half-dozen or so people I know who can dump so much information on my brain that it takes some time to digest and organize it. He led development of the Edgeline device connecting with the Industrial Internet of Things. He is now VP and HPE Fellow leading HPE Labs developing software to come to grips with the complexities of the converged edge and “Converged Edge-as-a-Service”.

He likes to organize his thoughts in numerical groups. I’m going to discuss the converged edge below using his groupings:

  • 3 C’s
  • 4 Stages of the Edge
  • 7 Reasons for IoT and the Edge
  • 3-Act Play
  • 12 Challenges

The foundation of the converged edge is found in the 3 C’s:

  1. Perpetual Connectivity
  2. Pervasive Computing
  3. Precision Controls

I remember Tony Perkins following up the demise of Red Herring magazine (charting the hot startup and M&A craze of the 90s, the magazine grew so large it came in two volumes for a while) with an online group called AlwaysOn. Trouble is, back in the 90s, we weren’t “always on.” Persistent connectivity was beyond our technology back then. Now, however, things have changed. We have so much networking, with more to come, that perpetual connectivity is not only possible, but also mundane.

HPE didn’t take a personal computer and package it for the edge. It developed Edgeline with the power of its enterprise compute along with enterprise grade stacks. It is powerful.

Then we have the 4 Stages of the Edge:

  1. Things—sensors and actuators
  2. Data Capture & Controls
  3. Edge IT (networking, compute, storage)
  4. Remote Cloud or Data Center

This is where Internet of Things meets the Enterprise.

Why do we need edge compute and not just IoT-to-Cloud? 7 Reasons:

  1. Minimize Latency
  2. Reduce bandwidth
  3. Lower cost
  4. Reduce threats
  5. Avoid duplication
  6. Improve reliability
  7. Maintain compliance
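
The first few reasons are easy to demonstrate. Here is a minimal sketch of my own, with made-up numbers, of bandwidth reduction: an edge node forwards a reading only when it moves outside a deadband, rather than streaming every sample to the cloud.

```python
# Deadband filtering at the edge: forward a reading only when it differs from
# the last forwarded value by more than a threshold. Data and threshold are
# illustrative.

def deadband_filter(readings, threshold):
    """Yield only the readings worth sending upstream."""
    last = None
    for r in readings:
        if last is None or abs(r - last) > threshold:
            yield r
            last = r

stream = [50.0, 50.1, 50.05, 52.3, 52.4, 49.0, 49.1]
sent = list(deadband_filter(stream, threshold=1.0))
print(sent)  # [50.0, 52.3, 49.0]  (3 of 7 samples leave the edge)
```

The same local-first idea underlies the latency, cost, and reliability arguments: the decision happens where the data is born, and only the interesting part travels.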

The Converged Edge is a 3-Act Play:

  1. Edgeline systems & software; stack identicality
  2. Converged embedded PXI and OT Link
  3. Converged Edge-as-a-Service

At this point in time, we are faced with 12 challenges to implementation:

  1. Limited bandwidth
  2. Smaller footprint for control plane and container
  3. Limited to no IT skills at the edge
  4. Higher ratio of control systems for compute/storage nodes
  5. Provisioning & lifecycle management of OT systems and IoT devices
  6. OT applications are primarily “stateful”, cloud unfriendly
  7. Data from analog world & industrial protocols
  8. Unreliable connectivity—autonomous disconnect operation
  9. Higher security vulnerabilities
  10. Hostile and unfamiliar physical environments and locations
  11. Long-tail hardware and software revenue model—many sites, fewer systems
  12. Deep domain expertise needed for the many unique edges

Of course, we could go into each of these items. Dr. Tom does in one of his latest talks (I believe it was at Hannover). We should pause at number 12, though. This is a necessity often overlooked by AI evangelists and other would-be predictive maintenance disrupters. When you begin messing with industrial systems, whether process or discrete manufacturing, it really pays to know the process deeply.

I can’t believe I summarized this in less than 600 words (is that still the common university essay requirement?). It is just an outline, but it should reveal where HPE has been and where it is going. I think its power will be disruptive to industrial architectures.