The IT architecture of industrial and manufacturing applications increasingly emphasizes cloud and edge computing. These technologies have become core to the Industrial Internet of Things (IIoT) and to improved Software as a Service (SaaS) offerings.
Two recent acquisition announcements reflect the acceleration of this trend: one from Siemens and the other from PTC.
Siemens plans acquisition of Edge technology
- Siemens further expands its digitalization portfolio for industry
- Technology basis is the Docker IT standard
- Siemens Industrial Edge ecosystem enables easy and flexible use of Edge apps
Siemens is planning the acquisition of Edge technology from the US company Pixeom. With this move, Siemens is strengthening its Industrial Edge portfolio by adding software components for Edge runtime and for device management. Siemens Industrial Edge provides an ecosystem that enables the flexible provision and use of apps. This means, for example, that appropriate apps can analyze data locally at the machine and send relevant data to the higher-level Industrial Edge Management System for global analytics. With this acquisition, Siemens is driving forward the expansion of its Digital Enterprise portfolio and the integration of cutting-edge technologies for the digital transformation of industry.
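The pattern described here, local analysis at the machine with only relevant data forwarded upward, can be sketched in a few lines. This is an illustration only: the readings, threshold, and payload shape are my own assumptions, not part of any Siemens Industrial Edge API.

```python
import statistics

# Hypothetical local readings from a machine sensor (units arbitrary).
readings = [20.1, 20.3, 19.9, 20.2, 35.7, 20.0, 20.4]

def summarize_and_filter(values, z_threshold=1.5):
    """Analyze data locally: build a summary plus only the anomalous points."""
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    anomalies = [v for v in values if abs(v - mean) > z_threshold * stdev]
    # Only this compact payload would be sent to the higher-level
    # management system; the raw stream stays at the edge.
    return {"mean": round(mean, 2), "stdev": round(stdev, 2),
            "anomalies": anomalies}

payload = summarize_and_filter(readings)
```

The point of the design is bandwidth and locality: the full raw stream never leaves the machine, while the management layer still receives enough (summary statistics plus outliers) for global analytics.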
With the resulting Industrial Edge ecosystem, industrial companies can use production data even more efficiently and react more flexibly to changes in conditions.
Ralf-Michael Franke, CEO of Siemens’ Factory Automation Business Unit, explains: “Cutting edge technologies such as Edge Computing open up new scope for automation. With Siemens Industrial Edge, we are creating an open edge ecosystem which offers benefits for companies of any size.”
Siemens is using standard Docker container technology: provisioning apps through the management system, as well as functional upgrades and updates of Edge devices in the factory, can therefore be handled simply from a central point.
Siemens intends to acquire this technology from Pixeom and use it in the Factory Automation Business Unit, which is part of Siemens Digital Industries. Pixeom has sites in San José, California and Udaipur, India and employs 81 people worldwide. Closing of the transaction is planned for the fourth quarter of 2019. Both companies have agreed not to comment on the financial details of the transaction.
PTC Makes SaaS Acquisition
I sat in on the analyst/press conference where PTC president and CEO Jim Heppelmann discussed the rationale for this announced acquisition of Onshape, creators of the “first” Software as a Service product development platform. The company had also just released fourth-quarter results. PTC has a little more than $1 billion in revenues, with about 45% from CAD and 35% from PLM. Interestingly, the IoT business contributes just over 10% of revenues.
PTC is acquiring Onshape, whose product development platform unites computer-aided design (CAD) with data management and collaboration tools, for approximately $470 million, net of cash acquired. The acquisition is expected to accelerate PTC’s ability to attract new customers with a SaaS-based product offering and position the company to capitalize on the inevitable industry transition to SaaS. Heppelmann believes that cloud-based SaaS is the future of CAD. Pending regulatory approval and satisfaction of other closing conditions, the transaction is expected to be completed in November 2019.
Located in Cambridge, MA, Onshape was founded in 2012 by CAD pioneers and tech legends, including Jon Hirschtick, John McEleney, and Dave Corcoran, inventors and former executives of SolidWorks. Onshape has secured more than $150 million in funding from leading venture capital firms and has more than 5,000 subscribers around the world. The company’s software offering is delivered in a SaaS model, making it accessible from any connected location or device, eliminating the need for costly hardware and administrative staff to maintain. Distributed and mobile teams of designers, engineers, and others can benefit from the product’s cloud nature, enabling them to improve collaboration and to dramatically reduce the time needed to bring new products to market – while simultaneously staying current with the latest software.
“PTC has earned a reputation for successfully pursuing new innovations that drive corporate growth,” said Heppelmann. “Building on the strong momentum we have with our on-premises CAD and PLM businesses, we look to our future and see a new growth play with SaaS.”
This acquisition is the logical next step in PTC’s overall evolution to a recurring revenue business model, the first step of which was the company’s successful transition to subscription licensing, completed in January 2019. The SaaS model, while nascent in the CAD and PLM market, is rapidly becoming industry best practice across most other software domains.
“Today, we see small and medium-sized CAD customers in the high-growth part of the CAD market shifting their interest toward SaaS delivery models, and we expect interest from larger customers to grow over time,” continued Heppelmann. “The acquisition of Onshape complements our on-premises business with the industry’s only proven, scalable pure SaaS platform, which we expect will open new CAD and PLM growth opportunities while positioning PTC to be the leader as the market transitions toward the SaaS model.”
For customers, the SaaS model enables faster work, improved collaboration and innovation, with lower up-front costs and with no IT infrastructure to administer and maintain. For software providers, the SaaS model has been proven to generate a more stable and predictable revenue stream, increase customer loyalty as customers benefit from earlier adoption of technology innovations, and enable expansions into new segments and geographies.
“At Onshape, we share PTC’s vision for helping organizations transform the way they develop products,” said Jon Hirschtick, CEO and co-founder, Onshape. “We and PTC believe that the product development industry is nearing the ‘tipping point’ for SaaS adoption of CAD and data management tools. We look forward to empowering the customers we serve with the latest innovations to improve their competitive positions.”
Onshape will operate as a business unit within PTC, with current management reporting directly to Heppelmann.
This is still more follow-up from Emerson Global Users Exchange relative to sessions on projects and pilot purgatory. I thought I had already written this, but just discovered it languishing in my drafts folder. While in Nashville, I ran into Jonas Berge, senior director, applied technology for Plantweb at Emerson Automation. He has been a source of technology updates for years. We followed up a brief conversation with a flurry of emails in which he updated me on some presentations.
One important topic centered on IoT projects—actually applicable to other types of projects as well. He told me the secret sauce is to start small. “A World Economic Forum white paper on the fourth industrial revolution, in collaboration with McKinsey, suggests that to avoid getting stuck in prolonged ‘pilot purgatory,’ plants shall start small with multiple projects – just like we spoke about at EGUE and just like Denka and Chevron Oronite and others have done,” he told me.
“I personally believe the problem is when plants get advice to take a ‘big bang’ approach starting by spending years and millions on an additional ‘single software platform’ or data lake and hiring a data science team even before the first use case is tackled,” said Berge. “My blog post explains this approach to avoiding pilot purgatory in greater detail.”
I recommend visiting Berge’s blog for more detail, but I’ll provide some teaser ideas here.
First, he recommends:
- Think Big
- Start Small
- Scale Fast
Plants must scale digital transformation across the entire site to fully enjoy the safety benefits: fewer incidents, faster incident response times, and reduced instances of non-compliance. The same goes for reliability benefits such as greater availability, reduced maintenance cost, extended equipment life, greater integrity (fewer instances of loss of containment), shorter turnarounds, and longer intervals between turnarounds. It also holds true for energy benefits like lower energy consumption and cost and reduced emissions and carbon footprint, as well as production benefits like reduced off-spec product (higher quality/yield), greater throughput, greater flexibility (in feedstock use and products/grades), reduced operations cost, and shorter lead times.
The organization can only absorb so much change at any one time. If too many changes are introduced in one go, the digitalization will stall:
- Too many technologies at once
- Too many data aggregation layers
- Too many custom applications
- Too many new roles
- Too many vendors
Multiple Phased Projects
McKinsey research shows that plants successfully scaling digital transformation instead run smaller digitalization projects: multiple small projects across the functional areas. This matches what I have personally seen in projects I have worked on.
From what I can tell it is plants that attempt a big bang approach with many digital technologies at once that struggle to scale. There are forces that encourage companies to try to achieve sweeping changes to go digital, which can lead to counterproductive overreaching.
The Boston Consulting Group (BCG) suggests a disciplined phased approach rather than attempting to boil the ocean. I have seen plants focus on a technology that can digitally transform and help multiple functional areas with common infrastructure. A good example is wireless sensor networks. Deploying wireless sensor networks in turn enables many small projects that help many departments digitally transform the way they work. The infrastructure for one technology can be deployed relatively quickly after which many small projects are executed in phases.
Small projects are low-risk. A small trial of a solution in one plant unit finishes fast. After a quick success, the team scales it to the full plant area, and then to the entire plant, before moving on to start the next pilot project. This way plants move from proof of concept (PoC) to full-scale, plant-wide implementation at speed. For large organizations with multiple plants, innovations often emerge at an individual plant, then get replicated at other sites and rolled out nationwide and globally.
Use Existing Platform
I have also seen the big-bang approach, where a plant pours a lot of money and resources into an additional “single software platform” layer for data aggregation before the first use case even gets started. This new data aggregation layer is meant to sit above the ERP, with the intention of collecting data from the ERP and the plant historian before making it available to analytics through a proprietary API that requires custom programming.
Instead, successful plants start small projects using the existing data aggregation platform: the plant historian. The historian can be scaled with additional tags as needed. This way a project can be implemented within two weeks, with the pilot running an additional three months, at low risk.
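As a rough illustration of piloting against the existing historian rather than a new platform, the sketch below computes a simple KPI from historian-style tag samples. The `fetch_samples` function and the tag names are hypothetical stand-ins for whatever query interface a site's historian actually provides.

```python
# Hypothetical stand-in for a historian query API; a real pilot would call
# the site historian's own interface (SQL, REST, OPC, etc.) instead.
def fetch_samples(tag):
    history = {
        "FT-101.PV": [12.0, 12.4, 11.8, 12.1],   # flow, m3/h
        "TT-202.PV": [80.5, 81.0, 79.8, 80.2],   # temperature, degC
    }
    return history[tag]

def average(tag):
    """A small pilot KPI: the average of a tag over the sample window."""
    samples = fetch_samples(tag)
    return sum(samples) / len(samples)

avg_flow = average("FT-101.PV")
```

The design point is that nothing new is deployed: adding a tag to the historian and a small calculation like this is a two-week job, which is what keeps the pilot low-risk.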
I would personally add that you must also keep the bigger vision in mind. A plant cannot run multiple small projects in isolation, or it will end up with siloed solutions. Plants successful with digital transformation establish, early on, a vision of what the end goal looks like. Based on this they can select the technologies and architecture to build the infrastructure that supports that end goal.
NAMUR Open Architecture (NOA)
The system architecture for the digital operational infrastructure (DOI) is important. The wrong architecture leads to delays and an inability to scale. NAMUR (User Association of Automation Technology in Process Industries) has defined the NAMUR Open Architecture (NOA) to enable Industry 4.0. I have found that plants that have deployed a DOI modeled on the same principles as NOA are able to pilot and scale very fast.
Flying Start
The I&C department in plants can accelerate digital transformation to achieve operational excellence and top-quartile performance by remembering Think Big, Start Small, Scale Fast. These translate into a few simple design principles:
- Phased approach
- Architecture modeled on the NAMUR Open Architecture
- Ready-made apps
- Easy-to-use software
- Digital ecosystem
The design engineering function originates data. It includes data about the structure of the plant or factory, data about the equipment and processes used to make the product, and data about the product(s) itself. In my early career, I lived the movement of data from design to operations and then back to design in a continuous loop of as designed → as built → as designed. I was also involved for a while in the development of a platform to automate this process using standards.
To say I’m interested in this area would be an understatement. And this process is important to all of you, too, including those who siphon off some data for other uses such as accounting, customer service, maintenance, and reliability.
AVEVA, the company formed by combining its iconic design engineering software with Schneider Electric’s software business, has just introduced integrated engineering software designed to help customers transform the way capital projects are engineered, executed, and integrated into operations and maintenance.
The integrated portfolio comprises three software solutions. AVEVA Unified Engineering integrates process design with front-end engineering and detailed 3D-based design. AVEVA Unified Project Execution links and streamlines procurement and construction processes for capital projects. AVEVA Enterprise Learning enables the rapid skilling of operators and engineers using Extended Reality (XR) and simulation tools to ensure efficient startups and shutdowns, normal operations, and the ability to handle abnormal situations.
“This launch builds on the recent news describing AVEVA’s capabilities as the first company in the engineering and industrial software market to comprehensively address the end-to-end digital transformation imperatives with an integrated portfolio of solutions that deliver efficiency, unlock value and empower people across the lifecycle of capital assets and operational value chains,” commented Craig Hayman, CEO, AVEVA. “It changes the way that owner operators engage with Engineering, Procurement and Construction (EPC) companies in designing, building, commissioning, and operating their capital assets.”
The functionality provided in these integrated solutions enables the realization of an EPC 4.0 strategy for owner operators, central to digital transformation in the capital-intensive process sectors. This allows collaboration on a global scale, through hybrid cloud architectures and on a common platform. The entire manufacturing process can be traced, tracked, and linked – from engineering and design, through procurement and construction, to handover and to operations and maintenance, as a comprehensive Digital Twin for the capital asset.
“As competition in the business world accelerates, the time has come for industrial organizations to innovate, facilitating the transition from manual, document-centric processes toward a data-driven vision of project design, procurement, and execution in order to increase safety, reduce costs, and minimize delays,” commented Hayman. “With the launch of AVEVA Unified Engineering, a first-of-its-kind solution, we are breaking down the silos between engineering disciplines and enabling our customers to turn conceptual designs into 3D models quickly, accelerating engineering to estimation and ensuring designs can be operated before committing billions of dollars.”
The new AVEVA Unified Engineering enables the integration of the process model and plant model lifecycles from concept to detailed design, allowing multi-discipline engineers to collaborate frictionlessly in the cloud. The net result is a minimum 50% improvement in engineering efficiency in FEED and up to 30% in detailed design, which can yield a 3% total installed cost improvement. These savings can be re-invested to ensure engineering quality, accuracy, and maturity for downstream project execution business processes.
AVEVA Unified Project Execution solutions integrate with AVEVA Unified Engineering to further break down the silos within procurement and construction by combining key disciplines covering Contract Risk Management, Materials and Supply Chain Control, and Construction Management into one cloud-based digital project execution environment. These solutions deliver up to a 15% reduction in material costs and a 10% reduction in field labor costs, and reduce unbudgeted supplier change orders by up to 50%, which translates to a 10% total installed cost savings opportunity for customers.
AVEVA’s Enterprise Learning solutions combine traditional simulation-based learning with 3D connected learning management solutions. AVEVA’s learning solutions extend process models and 3D models from AVEVA Unified Engineering to fast track DCS panel operator training, field operator training, process and maintenance procedural training, and process safety situational awareness training using cloud and Extended Reality (XR) technology to deliver up to 2% Total Installed Cost reduction by improved operations readiness.
“Our Engineering portfolio enhancements will deliver increased agility for our customers, enabling them to reduce cost, risk, and delays, minimizing errors and driving rapid capital project execution. The cost savings are realized by mitigating capital investment risks at the process design stage, cutting engineering man-hours by up to 30% in plant design, reducing material costs in procurement by up to 15% as well as reducing field labor costs in construction by up to 10%,” commented Amish Sabharwal, SVP, Engineering Business, AVEVA. “With these new solutions AVEVA is providing integration across all stages of the capital project, from conceptual design to handover, to optimize collaboration and break down silos between both engineering disciplines and project stages.”
DataOps—a phrase I had not heard before. Now I know. Last week while I was in California I ran into John Harrington, who, along with other former Kepware leaders Tony Paine and Torey Penrod-Cambra, had left Kepware following its acquisition by PTC to found a new company in the DataOps for Industry market. The news he told me about went live yesterday. HighByte announced that its beta program for HighByte Intelligence Hub is now live. More than a dozen manufacturers, distributors, and system integrators from the United States, Europe, and Asia have already been accepted into the program and granted early access to the software in exchange for their feedback.
HighByte Intelligence Hub will be the company’s first product to market since incorporating in August 2018. HighByte launched the beta program as part of its Agile approach to software design and development. The aim of the program is to improve performance, features, functionality, and user experience of the product prior to its commercial launch later this year.
HighByte Intelligence Hub belongs to a new classification of software in the industrial market known as DataOps solutions. HighByte Intelligence Hub was developed to solve data integration and security problems for industrial businesses. It is the only solution on the market that combines edge operations, advanced data contextualization, and the ability to deliver secure, application-specific information. Other approaches are highly customized and require extensive scripting and manual manipulation, which cannot scale beyond initial requirements and are not viable solutions for long-term digital transformation.
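The "data contextualization" idea—raw tag values merged with engineering metadata into a modeled, application-specific payload—can be sketched as below. The tag names, metadata, and model shape are illustrative assumptions of mine, not HighByte's actual data model.

```python
# Raw, flat tag values as an edge connector might deliver them.
raw = {"plc1.temp": 72.4, "plc1.rpm": 1450}

# Engineering metadata that gives the raw values context.
metadata = {
    "plc1.temp": {"name": "bearing_temperature", "units": "degF"},
    "plc1.rpm":  {"name": "spindle_speed", "units": "rpm"},
}

def contextualize(raw, metadata, asset_id):
    """Merge raw values with metadata into a modeled, named payload."""
    return {
        "asset": asset_id,
        "measurements": {
            metadata[tag]["name"]: {"value": value,
                                    "units": metadata[tag]["units"]}
            for tag, value in raw.items()
        },
    }

payload = contextualize(raw, metadata, asset_id="pump-7")
```

A consuming application now receives named, unit-annotated measurements tied to an asset, rather than opaque PLC addresses—the consistency problem Paine describes in the quote below.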
“We recognized a major problem in the market,” said Tony Paine, Co-Founder & CEO of HighByte. “Industrial companies are drowning in data, but they are unable to use it. The data is in the wrong place; it is in the wrong format; it has no context; and it lacks consistency. We are looking to solve this problem with HighByte Intelligence Hub.”
The company’s R&D efforts have been fueled by two non-equity grants awarded by the Maine Technology Institute (MTI) in 2019. “We are excited to join HighByte on their journey to building a great product and a great company here in Maine,” said Lou Simms, Investment Officer at MTI. “HighByte was awarded these grants because of the experience and track record of their founding team, large addressable market, and ability to meet business and product milestones.”
To further accelerate product development and go-to-market activities, HighByte is actively raising a seed investment round. For more information, please contact [email protected].
Learn more about the HighByte founding team—all people I’ve known for many years in the data connectivity business.
From Wikipedia: DataOps is an automated, process-oriented methodology, used by analytic and data teams, to improve the quality and reduce the cycle time of data analytics. While DataOps began as a set of best practices, it has now matured to become a new and independent approach to data analytics. DataOps applies to the entire data lifecycle from data preparation to reporting, and recognizes the interconnected nature of the data analytics team and information technology operations.
DataOps incorporates the Agile methodology to shorten the cycle time of analytics development in alignment with business goals.
DataOps is not tied to a particular technology, architecture, tool, language or framework. Tools that support DataOps promote collaboration, orchestration, quality, security, access and ease of use.
From Oracle, DataOps, or data operations, is the latest agile operations methodology to spring from the collective consciousness of IT and big data professionals. It focuses on cultivating data management practices and processes that improve the speed and accuracy of analytics, including data access, quality control, automation, integration, and, ultimately, model deployment and management.
At its core, DataOps is about aligning the way you manage your data with the goals you have for that data. If you want to, say, reduce your customer churn rate, you could leverage your customer data to build a recommendation engine that surfaces products that are relevant to your customers — which would keep them buying longer. But that’s only possible if your data science team has access to the data they need to build that system and the tools to deploy it, and can integrate it with your website, continually feed it new data, monitor performance, etc., an ongoing process that will likely include input from your engineering, IT, and business teams.
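The DataOps emphasis on quality control and automation can be sketched as a small, testable pipeline stage. The field names and validation rules here are made up for illustration; the pattern—automated data tests gating what flows downstream—is the point.

```python
def validate(record):
    """Data test: reject records that would corrupt downstream analytics."""
    errors = []
    if record.get("customer_id") is None:
        errors.append("missing customer_id")
    if not (0 <= record.get("churn_score", -1) <= 1):
        errors.append("churn_score out of range")
    return errors

def run_pipeline(records):
    """Automated stage: pass clean records on, quarantine the rest."""
    clean, quarantined = [], []
    for record in records:
        (quarantined if validate(record) else clean).append(record)
    return clean, quarantined

clean, bad = run_pipeline([
    {"customer_id": 1, "churn_score": 0.2},
    {"customer_id": None, "churn_score": 0.9},
])
```

Because the checks run automatically on every cycle, the analytics team shortens its iteration time without sacrificing accuracy—the Agile alignment the definitions above describe.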
As we move further along the Digital Transformation path of leveraging digital data to its utmost, this looks to be a good tool in the utility belt.
A short blurb on a product that I didn’t know that Rockwell Automation had—a rack-mount compute platform that can be used as a virtual machine server.
The information did not come through a traditional press release. It was a New Product Spotlight with a request to run in the products section. Well, I’m not a magazine or traditional media, so I don’t have a “products section.” However, I have great empathy for PR firms these days. They really have to push to keep clients happy in a tough market with demanding client executives.
Given that I’ve been spending so much time at IT conferences and everyone speculates about what Rockwell is up to, I found this one intriguing. “The new VersaVirtual appliance from Rockwell Automation provides all the computing, networking and storage capabilities needed to deploy and maintain up to 15 virtual machines in one ready-to-use appliance.”
Two key features:
First, it avoids the potential pitfalls of a do-it-yourself virtualized architecture. This appliance is pre-engineered. It arrives as a complete product from one source.
Second, it is an Industrial Data Center with scaled-down cost and complexity for smaller applications.
They even remove the objection of needing an IT department. The VersaVirtual appliance comes with one-year remote monitoring and administration so that users receive around-the-clock system monitoring to help prevent downtime. Customers will also receive support from certified IT/OT professionals who have an average response time of three minutes to help resolve technical issues.
It is a hyperconverged (compute, networking, and storage) appliance for entry-level virtualization. And the benefits: virtualization brings an average 74% decrease in total cost of ownership, reduces downtime, adds compute capability, and comes with trusted IT/OT services and support.
Looks like an entry-level IT platform from the OT leader. Interesting.