Suppliers of manufacturing software, some from surprising places, are putting sizable investments into products that will help customers reap the rewards of digitalization. Today, I’m looking at both ABB and Emerson Automation Solutions. Previously I checked out GE Digital and Rockwell Automation. Each has taken a slightly different course toward the goal, but notice the common thread of enhancing software products to help customers prosper.
ABB enhances manufacturing management technology
The new version of ABB Ability Manufacturing Operations Management will offer new features including:
- An enhanced user experience based on a new HTML5 web client;
- A new smart interactive dashboard application that provides greater visibility and collaboration;
- A new statistical process control (SPC) application to determine whether each process is in a state of control;
- A new Batch Compare application for advanced batch analysis.
“ABB Ability Manufacturing Operations Management is a comprehensive, scalable and modular software suite that optimizes visibility, knowledge and control throughout the operations domain,” said Narasimham Parimi, Head of Digital Products – Product Management, Process Control Platform. “This release provides a range of rich new functionality and a new enhanced user experience that enables operations to become more productive and responsive.”
ABB Ability Manufacturing Operations Management is designed to simplify production management by enabling performance monitoring, downtime management, and maintenance support, as well as providing statistical production analysis tools. It provides solutions and tools to facilitate the collection, consolidation and distribution of production, quality and energy information via the plant’s web-based reports, trends, and graphs.
A new, self-service dashboard application promotes increased collaboration, providing visibility from shop floor to top floor and spanning IT and OT environments. It increases data connectivity to all apps and modules within the MOM suite, combining historic and manufacturing data and providing the user with improved customization capabilities. Dashboards can be shared amongst users, further promoting collaboration between teams. Trends and events are displayed together, which enables customers to identify issues and opportunities enabling informed and timely decisions.
The new common services platform features an HTML5 web client that runs across the whole suite, ensuring a seamless user experience: applications can be viewed on different devices, right down to a 10-inch tablet.
Statistical process control (SPC) is used in manufacturing to determine if each process is in a state of control. The new SPC application works across all the different apps and modules and helps the user to improve quality and production related performance.
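To make the SPC idea concrete, here is a minimal sketch (my own illustration, not ABB's implementation) of Shewhart-style control limits: derive mean ± 3 sigma from a baseline run, then flag any new reading that falls outside those limits as out of control. The measurements are invented.

```python
import statistics

def control_limits(baseline):
    """Compute X-bar control limits (mean +/- 3 sigma) from a baseline run."""
    mean = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return mean - 3 * sigma, mean + 3 * sigma

def out_of_control(samples, lcl, ucl):
    """Return the indices of measurements that fall outside the limits."""
    return [i for i, x in enumerate(samples) if x < lcl or x > ucl]

baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 10.0, 9.9]
lcl, ucl = control_limits(baseline)

new_run = [10.0, 10.1, 12.5, 9.9]
print(out_of_control(new_run, lcl, ucl))  # [2] -- the 12.5 reading is flagged
```

A production SPC tool adds run rules (trends, consecutive points near a limit) on top of this basic limit check.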
In addition to the existing Batch View and Batch Investigate features, a comparison option has been added to the platform’s batch analysis applications, allowing different types of comparison.
Cyber security remains one of the key issues in the advancement of Industry 4.0, and the new features in MOM include enhanced security.
Emerson Expands Analytics Platform
Plantweb Insight platform adds two new Pervasive Sensing applications that manage wireless networks more efficiently with a singular interface to the enterprise.
Emerson has added two new IIoT solutions to its Plantweb Insight data analytics platform that will enable industrial facilities to transform the way they manage their enterprise-level wireless network infrastructure.
As digitalization and wireless technology adoption continue to rapidly expand in industrial facilities throughout the world, the need for greater visibility of network infrastructure performance is key. These new Plantweb Insight applications provide a quick-to-implement, scalable IIoT solution that helps customers advance their digital transformation strategies and achieve greater operational efficiencies.
The new Plantweb Insight Network Management application provides continuous, centralized monitoring of WirelessHART networks. This first-of-its-kind application provides a singular, consolidated view of the status of all wireless networks in a facility, with embedded expertise and guidance for advanced network management.
A key feature of the Plantweb Insight Network Management application is a configurable mesh network diagram, providing visualization of network design and connections along with device-specific information. It also provides an exportable record of syslog alerts, network details outlining conformance to network best practices and more.
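As a rough sketch of what an "exportable record of syslog alerts" might involve (the alert format and field names below are invented for illustration; real WirelessHART gateway logs vary), one could parse alert lines into structured records and write them out as CSV:

```python
import csv
import io
import re

# Invented example alert lines; actual gateway syslog formats differ.
raw = """\
2019-06-01T08:15:02 gw01 ALERT path_down device=PT-101
2019-06-01T08:17:45 gw01 WARN  low_battery device=TT-204
"""

pattern = re.compile(
    r"(?P<ts>\S+) (?P<gateway>\S+) (?P<level>\S+)\s+(?P<event>\S+) device=(?P<device>\S+)"
)

# Keep only lines that match the expected alert shape.
rows = [m.groupdict() for m in (pattern.match(line) for line in raw.splitlines()) if m]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["ts", "gateway", "level", "event", "device"])
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```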
While the new network management application provides a holistic look at wireless networks, the Plantweb Insight Power Module Management application drills down to the device level, allowing facilities to keep their wireless devices appropriately powered so they can continuously transmit key monitoring data. By aggregating power module statuses, users can evolve traditional maintenance planning and implement more efficient and cost-effective practices.
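The aggregation idea can be sketched in a few lines of Python. This is my illustration, not Emerson's software: device tags and remaining-life figures are invented, and the point is simply that rolling up power-module status across a fleet lets maintenance be batched into planned visits instead of handled one failure at a time.

```python
from collections import Counter

# Hypothetical fleet snapshot: device tag -> remaining power-module life in months.
fleet = {
    "PT-101": 18, "TT-204": 2, "FT-310": 9, "LT-117": 1, "PT-220": 30,
}

def maintenance_plan(fleet, threshold_months=3):
    """Group devices whose power modules are near end of life into one planned visit."""
    due = sorted(tag for tag, months in fleet.items() if months <= threshold_months)
    status = Counter(
        "due" if months <= threshold_months else "ok" for months in fleet.values()
    )
    return due, status

due, status = maintenance_plan(fleet)
print(due)            # ['LT-117', 'TT-204']
print(status["due"])  # 2
```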
“We were able to infuse a decade of experience with wireless technology into these new offerings,” said Brian Joe, wireless product manager with Emerson’s Automation Solutions business. “Our customers will now be able to manage and improve hundreds of networks through a singular interface, realizing significant efficiencies in individual network and wireless device management and maintenance.”
These new applications further enhance the Plantweb Insight platform, a set of pre-built analytics primarily focusing on monitoring key asset health. Other applications in the platform include pressure relief valve monitoring, heat exchanger monitoring and steam trap monitoring.
This announcement hits many trends and things you will eventually grow tired of hearing—partnerships, collaboration among companies, ecosystems, Kubernetes, containers, and, yes, 5G. The latter is coming. We just don’t know when and how, yet.
Wind River, a leader in delivering software for the intelligent edge, announced that it is collaborating with Dell EMC as a key hardware partner for distributed edge solutions. A combined software and hardware platform would integrate Wind River Cloud Platform, a Kubernetes-based software offering for managing edge cloud infrastructure, with Dell EMC PowerEdge server hardware. The initial target use case will be virtual RAN (vRAN) infrastructure for 5G networks.
“As telecom infrastructure continues to evolve, service providers are facing daunting challenges around deploying and managing a physically distributed, cloud native vRAN infrastructure,” said Paul Miller, vice president of Telecommunications at Wind River. “By working with Dell EMC to pre-integrate our technologies into a reference distributed cloud solution, we can cost-effectively deliver carrier grade performance, massive scalability, and rapid service instantiation to service providers as their foundation for 5G networks.”
“In a 5G world, new services and applications will not be driven by massively scaled, centralized data centers but by intelligently distributed systems built at the network edge,” said Kevin Shatzkamer, vice president of Enterprise and Service Provider Strategy and Solutions at Dell EMC. “The combination of Dell EMC and Wind River technology creates a foundation for a complete, pre-integrated distributed cloud solution that delivers unrivaled reliability and performance, massive scalability, and significant cost savings compared to conventional RAN architectures. The solution will provide CSPs with what they need to migrate to 5G vRAN and better realize a cloud computing future.”
Wind River Cloud Platform combines a fully cloud-native, Kubernetes and container-based architecture with the ability to manage a truly physically and geographically separated infrastructure for vRAN and core data center sites. Cloud Platform delivers single pane of glass, zero-touch automated management of thousands of nodes.
Dell EMC hardware delivers potent compute power along with high-performance, high-capacity memory well suited to low-latency applications.
A commercial implementation of the open source project StarlingX, Cloud Platform scales from a single compute node at the network edge, up to thousands of nodes in the core to meet the needs of high value applications. With deterministic low latency required by edge applications and tools that make the distributed edge manageable, Cloud Platform provides a container-based infrastructure for edge implementations in scalable solutions ready for production.
The IoT group that I’ve been working with for the past few years has been absorbed into the OEM group which is carrying on an expanded function. This blog post from Steve Todd, Dell Technologies Fellow, details the development of data confidence work that has been contributed to the open source Linux Foundation to seed Project Alvarium.
Following is a quick summary. Go to the blog for additional information about trusted data work.
A team of Dell Technologies specialists finished building the first-ever Data Confidence Fabric (DCF for short). The prototype code will be contributed to the Linux Foundation to seed Project Alvarium.
For several years, the CTO of the Dell Technologies Edge and IoT business unit has been touting a vision of data monetization. However, it’s hard to monetize untrusted Edge and IoT data. As he likes to say, “It’s midnight. Do you know where your data has been?”
Enterprise storage systems have delivered trusted data to applications for a long time. We started our initial investigation wondering if these same trust principles could be applied to Edge and IoT ecosystems. Recent developments in data valuation, distributed ledgers, and data marketplaces facilitated everything coming together.
Five Levels of Trust
We started with Trevor Conn, chair of the EdgeX Foundry Core Working Group. Trevor wrote the first-ever Data Confidence Fabric software in Go, the same programming language EdgeX is written in. His Data Confidence Fabric software registered with EdgeX as a client and began processing simulated device data. The initial confidence score for this data was “0” (no trust was inserted).
Dell Technologies then hired three computer science interns from Texas A&M to deploy EdgeX and the Data Confidence Fabric software on a Dell Gateway 3000 with a Trusted Platform Module (TPM) chip.
EdgeX was then adjusted to support N-S-E-W authentication by using VMware’s open-source Lightwave technology.
Dell Boomi software was invoked by the Data Confidence Fabric software to gather provenance and append this metadata to the sensor reading.
The Data Confidence Fabric software then stored the data locally using IPFS (an immutable, open-source storage system). This fourth level of trust insertion gives an application confidence that the data/provenance has not been tampered with. It also has the additional benefit of enabling analytics to access data closer to the source.
The Data Confidence Fabric software then registered the data into VMware’s blockchain (based on the open-source Project Concord consensus algorithm).
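The levels above can be sketched as annotations that accumulate on a sensor reading, each insertion raising a confidence score. This is purely illustrative: the insertion names, equal weights, and scoring scheme below are my invention, not Project Alvarium's actual model. The content hash stands in for the tamper evidence that IPFS-style immutable storage provides.

```python
import hashlib
import json

# Illustrative trust insertions with equal weights; a real DCF would define its own.
TRUST_WEIGHTS = {
    "tpm_device_identity": 1.0,
    "authenticated_transport": 1.0,
    "provenance_metadata": 1.0,
    "immutable_storage": 1.0,
    "ledger_registration": 1.0,
}

def annotate(reading, insertions):
    """Wrap a sensor reading with its trust annotations and a confidence score."""
    payload = {
        "reading": reading,
        "annotations": sorted(insertions),
        "confidence": sum(TRUST_WEIGHTS[i] for i in insertions),
    }
    # A content digest lets any later consumer detect tampering.
    payload["digest"] = hashlib.sha256(
        json.dumps(payload, sort_keys=True).encode()
    ).hexdigest()
    return payload

raw = annotate({"sensor": "temp-01", "value": 21.7}, [])
trusted = annotate({"sensor": "temp-01", "value": 21.7}, list(TRUST_WEIGHTS))
print(raw["confidence"], trusted["confidence"])  # 0 5.0
```

An application can then decide, per use case, what minimum confidence it requires before acting on, or paying for, a piece of edge data.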
Keynoters have a tough time with originality these Digital Days, with everyone emphasizing Digital Transformation. Steve Lomholt-Thomson, chief revenue officer of AVEVA, took us on a Digital Journey this morning, setting the tone for the three days of the AVEVA World Congress (North America edition).
Three technology trends to watch: an IoT boom; cloud/empowered edge; and, AI / ML. The theme is digital. The Digital Organization discovers its Digital DNA, figures out how to build that Digital DNA through people who challenge the status quo; and then figures out how to track talent flow.
Which all starts us on our Digital Journey. On this journey, we unify end-to-end data, connect data silos to take a holistic view of the business, and then visualize our assets and supply chain. I believe implied in all this is the company’s product AVEVA System Platform. The company touted six customer stories, with at least five of them (and probably the sixth) leveraging System Platform.
Oh, and the only time the “W” word was used, it was in the past tense.
Other areas of the company were highlighted:
Focus on assets–asset performance management, including how to use machine learning (ML) and artificial intelligence (AI) for predictive analytics (predictive maintenance).
How to combine it all into a Digital Twin–bringing the design lifecycle and physical lifecycle into congruence.
Recently hired head of North America business, Christine Harding, interviewed customers from Campbell’s (soup/snacks), Quantum Solutions (integration project at St. Louis/Lambert airport), and Suncor (Canadian oil sands).
I have the rest of today and then tomorrow to take deeper dives into many of these topics. If there is anything you want me to ask, send a note.
DataOps—a phrase I had not heard before. Now I know. Last week while I was in California I ran into John Harrington, who, along with other former Kepware leaders Tony Paine and Torey Penrod-Cambra, left Kepware following its acquisition by PTC to found a new company in the DataOps for Industry market. The news he told me about went live yesterday. HighByte announced that its beta program for HighByte Intelligence Hub is now live. More than a dozen manufacturers, distributors, and system integrators from the United States, Europe, and Asia have already been accepted into the program and granted early access to the software in exchange for their feedback.
HighByte Intelligence Hub will be the company’s first product to market since incorporating in August 2018. HighByte launched the beta program as part of its Agile approach to software design and development. The aim of the program is to improve performance, features, functionality, and user experience of the product prior to its commercial launch later this year.
HighByte Intelligence Hub belongs to a new classification of software in the industrial market known as DataOps solutions. HighByte Intelligence Hub was developed to solve data integration and security problems for industrial businesses. It is the only solution on the market that combines edge operations, advanced data contextualization, and the ability to deliver secure, application-specific information. Other approaches are highly customized and require extensive scripting and manual manipulation, which cannot scale beyond initial requirements and are not viable solutions for long-term digital transformation.
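To illustrate what "data contextualization" in a DataOps hub means in practice (my own sketch, not HighByte's product; all tag names and the context model are invented), the core move is merging flat, cryptic controller tags with a model that adds asset, measure, and unit information, producing self-describing records an application can consume directly:

```python
# Invented raw PLC tags: flat names, no units, no asset context.
raw_tags = {"PLC1.T001": 72.4, "PLC1.P001": 101.3}

# Invented context model mapping each tag to business meaning.
context_model = {
    "PLC1.T001": {"asset": "Reactor A", "measure": "temperature", "unit": "degF"},
    "PLC1.P001": {"asset": "Reactor A", "measure": "pressure", "unit": "kPa"},
}

def contextualize(raw, model):
    """Turn flat tag/value pairs into self-describing records for a target app."""
    return [
        {**model[tag], "value": value, "source_tag": tag}
        for tag, value in raw.items()
        if tag in model
    ]

for record in contextualize(raw_tags, context_model):
    print(record)
```

Doing this once, in a hub, is what replaces the per-application custom scripting the paragraph above describes.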
“We recognized a major problem in the market,” said Tony Paine, Co-Founder & CEO of HighByte. “Industrial companies are drowning in data, but they are unable to use it. The data is in the wrong place; it is in the wrong format; it has no context; and it lacks consistency. We are looking to solve this problem with HighByte Intelligence Hub.”
The company’s R&D efforts have been fueled by two non-equity grants awarded by the Maine Technology Institute (MTI) in 2019. “We are excited to join HighByte on their journey to building a great product and a great company here in Maine,” said Lou Simms, Investment Officer at MTI. “HighByte was awarded these grants because of the experience and track record of their founding team, large addressable market, and ability to meet business and product milestones.”
To further accelerate product development and go-to-market activities, HighByte is actively raising a seed investment round. For more information, please contact [email protected]
Learn more about the HighByte founding team—all people I’ve known for many years in the data connectivity business.
From Wikipedia: DataOps is an automated, process-oriented methodology, used by analytic and data teams, to improve the quality and reduce the cycle time of data analytics. While DataOps began as a set of best practices, it has now matured to become a new and independent approach to data analytics. DataOps applies to the entire data lifecycle from data preparation to reporting, and recognizes the interconnected nature of the data analytics team and information technology operations.
DataOps incorporates the Agile methodology to shorten the cycle time of analytics development in alignment with business goals.
DataOps is not tied to a particular technology, architecture, tool, language or framework. Tools that support DataOps promote collaboration, orchestration, quality, security, access and ease of use.
From Oracle, DataOps, or data operations, is the latest agile operations methodology to spring from the collective consciousness of IT and big data professionals. It focuses on cultivating data management practices and processes that improve the speed and accuracy of analytics, including data access, quality control, automation, integration, and, ultimately, model deployment and management.
At its core, DataOps is about aligning the way you manage your data with the goals you have for that data. If you want to, say, reduce your customer churn rate, you could leverage your customer data to build a recommendation engine that surfaces products that are relevant to your customers — which would keep them buying longer. But that’s only possible if your data science team has access to the data they need to build that system and the tools to deploy it, and can integrate it with your website, continually feed it new data, monitor performance, etc., an ongoing process that will likely include input from your engineering, IT, and business teams.
As we move further along the Digital Transformation path of leveraging digital data to its utmost, this looks to be a good tool in the utility belt.
Digitalization requires digital data, which in turn requires a place to robustly store that data. OSIsoft PI System must be the most widely used industrial historian database. Last November I wrote about the company bringing its PI System to Amazon Web Services (AWS). The company has released OSIsoft Cloud Services—a cloud-native, real-time data management system for unifying and augmenting critical operations data from across an organization to accelerate industrial analytics, data science projects, data sharing, and other digital transformation initiatives.
Given how deep I’ve been led into IT over the past few years, this advancement from OSIsoft becomes part of a significant trend of blending IT and OT just above the control layer. In fact, if you segregate off the actual “control” part of automation, that system itself has become an important element of Internet of Things (IoT) blending into the overall IT infrastructure. If you are not thinking the big picture in today’s industrial environment, then you will be missing important paths to profitability.
Back to today’s OSIsoft news. The OSIsoft Cloud Services (OCS) in a capsule:
- Data sharing – partner companies can access a shared data stream to remotely monitor technology
- Functionality – seamless crossover between the PI System and OCS to compare facilities, perform root cause analysis and run hypotheticals
- Scalability – tests proved OCS can simultaneously manage over two billion data streams, and safely share information with partners
- Petuum uses OCS to stream historical and live data on production, temperature and variability to its AI platform, helping Cemex, a global cement manufacturer, improve yield and energy performance from 2% to 7%.
- DERNetSoft uses OCS to aggregate data in one place, allowing users to access useful analytics for ways to reduce power and save money.
- Pharma companies will use OCS to give regulators access to anonymized drug testing or production data without the risk of unauthorized users entering manufacturing networks.
With OCS, an engineer at a chemical producer, for example, could combine maintenance and energy data from multiple facilities into a live superset of information to boost production in real-time while planning analysts could merge several years’ worth of output and yield data to create a ‘perfect plant’ model for capital forecasts.
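The "live superset" idea reduces to combining per-facility time series into one indexed structure and rolling it up fleet-wide. Here is a generic sketch in pandas with invented sample data; it illustrates the data shape, not the OCS API itself:

```python
import pandas as pd

def facility_frame(values):
    """Build an hourly time-series frame standing in for one facility's extract."""
    idx = pd.date_range("2019-06-01", periods=len(values), freq="h")
    return pd.DataFrame({"energy_kwh": values}, index=idx)

# Invented per-plant energy data.
plants = {
    "plant_a": facility_frame([120, 130, 125, 128]),
    "plant_b": facility_frame([200, 210, 205, 199]),
}

# Combine facilities into one "superset" frame, keyed by plant.
superset = pd.concat(plants, names=["plant", "timestamp"])

# Fleet-wide hourly total: the kind of live roll-up the paragraph describes.
fleet_total = superset.groupby(level="timestamp")["energy_kwh"].sum()
print(fleet_total.iloc[0])  # 320
```

The planning-analyst case is the same operation over years of history instead of hours.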
OCS can also be leveraged by software developers and system integrators to build new applications and services or to link remote assets.
“OSIsoft Cloud Services is a fundamental part of our mission to help people get the most out of the data that is at the foundation of their business. We want their cost of curiosity to be as close to zero as possible,” said Gregg Le Blanc, Vice President of Product at OSIsoft. “OCS is designed to complement the PI System by giving customers a way to uncover new operational insights and use their data to solve new problems that would have been impractical or impossible before.”
Data scientists spend 50 percent or more of their time curating large data sets instead of conducting analytics. IT teams get bogged down in managing VPNs for third parties or writing code for basic administrative tasks. Data becomes inaccessible and locked in silos. Over 1,000 utilities, 80% of the largest oil and gas companies, and 65% of the Fortune 500 industrial companies already use the PI System to harness critical operations data, turning it into an asset for improving productivity, saving money, and developing new services.
Natively compatible with the PI System, OCS extends the range of possible applications and use cases of OSIsoft’s data infrastructure while eliminating the challenges of capturing, managing, enhancing, and delivering operations data across an organization. Within a few hours, thousands of data streams containing years of historical data can be transferred to OCS, allowing customers to explore, experiment, and share large data sets the same day.
The core of OCS is a highly scalable sequential data store optimized for time series data, depth measurements, temperature readings, and similar data. OSIsoft has also embedded numerous usability features for connecting devices, managing users, searching, transferring data from the PI System to OCS, and other functions. OCS can also accept data from devices outside of traditional control networks or other sources.
“The scale and scope of data that will be generated over the coming decades is unprecedented, but our mission remains the same,” said Dr. J. Patrick Kennedy, CEO and Founder of OSIsoft. “OSIsoft Cloud Services represent the latest step in a nearly 40 year journey and there’s more to come.”
OCS is a subscription service currently available to customers and partners for use in facilities in North America. OCS will be extended to Europe and to other regions in the near future.