Gasp! Signs of common sense begin to pervade the discussion of digitalization and its cousins–connected (everything?), digital twin, cyber-physical, and so forth. By that I mean: it’s all about leadership.
Suppliers constantly develop or enhance technologies within products. But I’m betting that just about all of you already have more digital data than you know what to do with. I’m betting most of you already have some products and connectivity–and have had for 15 years or longer.
What is always lacking is the will, the ingenuity, the, yes, leadership, to use all of this to its most beneficial effect.
Leadership doesn’t just appoint someone to head an exploratory team. It sets vision and expectations about how a new business model can send the company on a growth and success trajectory.
Leadership sees data as an asset and asks how it can be used to further goals of profitability, process uptime, improved quality, faster time to market, better/faster customer service, supply chain smoothing, and more.
Leadership organizes and motivates people to forge new paths into the economy.
Simply compiling digital data is a waste of time and resources. Leadership treats it as a foundation for success.
I’ve received a couple of news items about something called the Open Manufacturing Platform (OMP). I have searched in vain for a website–maybe a GitHub or Linux Foundation or something. This is sponsored by Microsoft, so no surprise that it is built on Microsoft Azure. I guess the open part is open connectivity to Azure.
I had a brief chat in Hannover a couple of weeks ago and picked up this press release. The companies putting this together have added members. Just a few right now. Like always, I adopt a “wait-and-see” attitude to see how this develops.
- Anheuser-Busch InBev, BMW Group, Bosch Group, Microsoft, ZF Friedrichshafen AG named OMP steering committee members
- OMP was established in 2019 as an independent initiative under the umbrella of the Joint Development Foundation
- First working groups created: IoT Connectivity, Semantic Data Model, Industrial IoT Reference Architecture and Core Services for Autonomous Transport Systems
- The first appearance of the Open Manufacturing Platform
The Open Manufacturing Platform (OMP) has expanded, with new steering committee members and new working groups established. OMP is an alliance founded in 2019 to help manufacturing companies accelerate innovation at scale through cross-industry collaboration, knowledge and data sharing as well as access to new technologies. The OMP was founded under the umbrella of the Joint Development Foundation, which is part of the Linux Foundation.
Original members The BMW Group and Microsoft welcome Anheuser-Busch InBev (AbInBev), Bosch Group and ZF Friedrichshafen AG as steering committee members. The OMP steering committee has approved a number of working groups to focus on core areas important to the industry, including IoT Connectivity, semantic data models, Industrial IoT reference architecture, and core services for ATS (autonomous transport systems).
Common approach to industry challenges
The expansion of intelligent manufacturing is driving new efficiencies and increased productivity, as well as revealing new challenges. Within the industry, legacy and proprietary systems have resulted in data silos, making operation-wide insight and transformation daunting. These challenges are common across the industry, and they often require a high degree of investment for modest returns within any one organization. The OMP was developed to address this: manufacturers and their value chains come together to identify and develop solutions to these non-differentiating problems. It brings together experts across the manufacturing sector — including discrete and process manufacturing, transportation and consumer goods, industrial equipment, and more.
“Our goal is to drive manufacturing innovation at scale, accelerate time-to-value and drive production efficiencies by jointly solving mutual challenges, based on an open community approach. The OMP helps manufacturing companies unlock the potential of their data, implement industrial solutions faster and more securely, and benefit from industrial contributions while preserving their intellectual property (IP) and competitive advantages, mitigating operational risks and reducing financial investments,” said Jürgen Maidl, Senior Vice President Production Network and Supply Chain Management at the BMW Group.
Scale innovation through common data models and open technology standards
The OMP operates under the umbrella of the Joint Development Foundation (JDF). The JDF is part of the Linux Foundation and provides the OMP with infrastructure and an organizational framework to create technical specifications and support open industry standards. The OMP supports other alliances, including the OPC Foundation and Plattform Industrie 4.0, and leverages existing industry standards, open source reference architectures and common data models.
“Through the open collaboration approach that is the cornerstone of OMP, manufacturing companies will be able to bring offerings to market faster, with increased scale and greater efficiency,” said Scott Guthrie, Executive Vice President Cloud & AI at Microsoft. “Solutions will be published and shared across the community, regardless of technology, solution provider or cloud platform.”
The heart of OMP: working groups to address common manufacturing challenges
“Comprised of members from across the manufacturing industry, the collaboration framework and heart of the OMP are its working groups. We are very excited to join at a moment when our manufacturing facilities are becoming increasingly connected, and we are looking for innovative ways to make use of the treasure trove of data that is being generated,” said Tassilo Festetics, Global Vice President of Solutions at AB InBev. The OMP’s first working groups will focus on topics such as IoT Connectivity, Semantic Data Model, IIoT Reference Architecture and Core Services for ATS (autonomous transport systems). Initial focus areas include:
The OMP steering committee will support industry efforts to connect IoT devices and machines to the cloud. It is one of the first steps to digitize production lines and leverage cloud-connected Industrial IoT applications.
“Today, it is all about analytics and predictions, but without data, no analytics, and without connectivity, no data. Modern devices can easily be connected via the OPC Unified Architecture (OPC UA). Connecting machines and applications to the cloud that have been in production for decades comes with bigger interoperability challenges, as various standards and interfaces must be addressed to interconnect these historically developed legacy systems (the ‘brownfield approach’). The IoT Connectivity working group will focus on providing industrial-grade edge and cloud functionalities for the integration and management of OPC UA devices in brownfield environments,” said Werner Balandat, Head of Production Management, ZF Friedrichshafen AG.
Another OMP working group focuses on semantic data modeling: Machine and manufacturing data are crucial for industrial companies to optimize production with artificial intelligence (AI). However, managing data in a common format across multiple sources with constantly evolving semantics is a real challenge.
“Data is the raw material for Industry 4.0 and a prerequisite for optimizing production with the help of artificial intelligence. At OMP, we are developing a semantic model that makes data understandable and illustrates its relations and dependencies. Users no longer receive cryptic, incomprehensible numbers and characters, but production-relevant information including their context. This semantic data structure ensures improvements along the entire value chain and makes AI-based business models possible on a large scale,” said Dr.-Ing. Michael Bolle, Member of the Board of Management, Robert Bosch GmbH.
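Bolle’s point about context is easy to illustrate. Here is a minimal sketch of what a semantic layer adds to a raw historian tag; the tag names, units, and asset hierarchy are hypothetical illustrations, not drawn from the OMP model itself:

```python
# A raw historian delivers cryptic tag/value pairs like ("PX_4711_T1", 73.2).
# A semantic model wraps each tag with meaning, units, and its place in the
# asset hierarchy, so applications (and AI pipelines) can interpret the value.

SEMANTIC_MODEL = {
    "PX_4711_T1": {
        "description": "Bearing temperature, spindle motor",
        "unit": "degC",
        "asset_path": ["PlantA", "Line3", "MillingStation2"],
        "alarm_high": 85.0,
    },
}

def contextualize(tag, value):
    """Turn a raw tag/value pair into production-relevant information."""
    meta = SEMANTIC_MODEL.get(tag)
    if meta is None:
        return {"tag": tag, "value": value, "context": "unknown"}
    return {
        "asset": "/".join(meta["asset_path"]),
        "measurement": meta["description"],
        "value": value,
        "unit": meta["unit"],
        "in_alarm": value > meta["alarm_high"],
    }

reading = contextualize("PX_4711_T1", 73.2)
print(reading["asset"])     # PlantA/Line3/MillingStation2
print(reading["in_alarm"])  # False
```

The payoff is exactly what Bolle describes: a downstream user sees “bearing temperature on Milling Station 2, in degrees Celsius, within limits” instead of an anonymous number.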
The “Edge” is a hot space right now, although sometimes I’m not sure that everyone agrees what “Edge” is as they develop products and solutions. However, first thing this morning I saw this tweet from Tom Bradicich of Hewlett Packard Enterprise (@HPE) referring to an article that mentions him on ComputerWeekly.com. I’ve written about HPE at the edge and with IoT before. Looks like something’s up.
Tweet from @TomBradicichPhD Not only computing at the #edge, but also a new product category of “converging IT with #OT” systems (such as controls, DAQ, industrial protocols). Watch this space, my team’s next innovation is all this as-a-service. #aaS
Here is the rationale from the Computer Weekly article. “The benefits of edge computing have the potential to help businesses dramatically speed up their data analysis time while cutting down costs. @HPE’s Mark Potter and @TomBradicichPhD share how we can make this possible.”
In the past, all data processing was run locally on the industrial control system. But while there is industry consensus that real-time data processing for decision-making, such as the data processing needed in an industrial control system, should be run at the edge and not in the public cloud, there are many benefits in using the public cloud or an on-premise datacentre to assimilate data across installations of internet of things (IoT)-connected machines. Such data aggregation can be used to improve machine learning algorithms.
It is fascinating to see our environment described by an enterprise IT writer. The truth is that following the Purdue Model, suppliers tried to make PLCs and DCSs part of the information infrastructure in parallel to supervising or executing control functions. That proved too unwieldy for control engineers to manage within the programming tools used. It was also too slow and not really optimized for the task.
Along came IT companies. I have followed a few over the past five years. They have had trouble figuring out how to make a business out of edge compute, gateways, networking, and the like.
In the past, data acquisition and control systems were considered operational technology, and so were outside the remit of enterprise IT. But, as Tom Bradicich, global head of the edge and IoT labs at HPE explains, IT has a role to play in edge computing.
Bradicich’s argument is that edge computing can provide a converged system, removing the need for standalone devices that were previously managed by those people in the organisation responsible for operational technology (OT). According to Bradicich, convergence is a good thing for the industry because it is convenient, makes it easy to buy devices, lowers cost, improves reliability, and offers better power consumption because all the disparate functions required by an industrial system are integrated in one device.
Bradicich believes convergence in IoT will be as big as the convergence of camera and music players into a device like the iPhone, which made Apple the biggest music and camera company in the world. For Bradicich, convergence at the edge will lead to industry disruption, similar to what happened when smartphones integrated several bits of functionality that were previously only available as separate devices. “The reason Uber exists is because there is a convergence of GPS, the phone and the maps,” he says. “This disrupts the whole industry.”
I get this analogy to converging technologies into a device such as the iPhone. I don’t know if we want to cede control over to an HPE compute platform (although it has plenty of horsepower), but the idea is tempting. And it would be thoroughly disruptive.
Forrester has forecast that the edge cloud service market will grow by at least 50%. Its Predictions 2020 report notes that public cloud providers such as Amazon Web Services (AWS) and Microsoft; telecommunication companies such as AT&T, Telstra and Vodafone Group; platform software providers such as Red Hat and VMware; content delivery networks including Akamai Technologies; and datacentre colocation providers such as Digital Realty are all developing basic infrastructure-as-a-service (IaaS) and advanced cloud-native programming services on distributed edge computing infrastructure.
With HPE’s investment in a new company called Pensando–which recently emerged from stealth mode, was founded and staffed by former Cisco technologists, and has former Cisco CEO John Chambers installed as chairman–new categories of device aimed at edge computing could come to market, perhaps to perform data acquisition and real-time data processing.
Mark Potter recently wrote in a blog post: By becoming the first solutions providers to deliver software-defined compute, networking, storage and security services to where data is generated, HPE and Pensando will enable our customers to dramatically accelerate the analysis and time-to-insight of their data in a way that is completely air-gapped from the core system.
These are critically important requirements in our hyper-connected, edge-centric, cloud-enabled and data-driven world – where billions of people and trillions of things interact.
This convergence is generating unimaginable amounts of data from which enterprises seek to unearth industry-shaping insights. And as emerging technologies like edge computing, AI and 5G become even more mainstream, enterprises have an ever-growing need to harness the power of that data. But moving data from its point of generation to a central data center for processing presents major challenges — from substantial delays in analysis to security, governance and compliance risks.
That’s where Pensando and HPE are making an industry-defining difference. By moving the traditionally data center-bound network, storage and security services to the server processing the data, we will eliminate the need for round-trip data transfer to centralized network and security appliances – and at a lower cost, with more efficiency and higher performance.
Here are benefits that Potter listed:
- Lower latency than competitive solutions, as operations will be carried out at 100Gbps network line-rate speed;
- Controller management framework to scale across thousands of nodes with a federation of controllers allowing scale to 1M+ endpoints; and
- Security, governance and compliance policies that are consistently applied at the edge.
Announcements and discussions at this year’s iteration of the Industry Forum sponsored by ARC Advisory Group were amazingly diverse. Another IT supplier appeared. Security remained an issue. Most conversations revolved around open (open source and open interoperability), edge, 5G, collaboration/partnerships, software-defined, machine learning, MQTT and its companion Sparkplug, and most importantly, solving problems for end users.
Following is a brief recap. Follow the links for in-depth information. Of course, many company announcements fit into more than one bucket.
Examples of the variety of “open” include the Eclipse Foundation and The Open Group Open Process Automation Forum, with the IT technology of Kubernetes thrown in.
The Eclipse Foundation launched the Sparkplug Working Group. Founding members Chevron, Canary Labs, Cirrus Link Solutions, HiveMQ, Inductive Automation, and ORing are defining an open standard specification to create interoperable MQTT IIoT solutions.
The Working Group will encourage the definition of technical specifications and associated implementations that rationalize access to industrial data, improve the interoperability and scalability of IIoT solutions, and provide an overall framework for supporting Industry 4.0 for oil and gas, energy, manufacturing, smart cities and other related industries.
Sparkplug is relatively new. MQTT by itself defines no topic namespace or payload format, which leads to interoperability problems, since each supplier and end user must create all of the data definitions. Success of this working group is essential for any widespread adoption. The Eclipse Foundation pointed out that the intent and purpose of the Sparkplug specification is to define an MQTT topic namespace, payload, and session state management that can be applied generically. By meeting the operational requirements for these systems, Sparkplug will enable MQTT-based infrastructures to provide more valuable real-time information to business stakeholders as well.
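To make the topic-namespace idea concrete, here is a minimal Python sketch of how Sparkplug B structures its MQTT topics. The group, node, and device names are hypothetical, and note that a real Sparkplug B payload is a protobuf-encoded structure–the topic string is only half the story:

```python
# Sparkplug B topic form: spBv1.0/<group_id>/<message_type>/<edge_node_id>[/<device_id>]
# The fixed namespace and message types come from the Sparkplug specification.

SPARKPLUG_NS = "spBv1.0"
MESSAGE_TYPES = {"NBIRTH", "NDEATH", "DBIRTH", "DDEATH",
                 "NDATA", "DDATA", "NCMD", "DCMD", "STATE"}

def sparkplug_topic(group_id, message_type, edge_node_id, device_id=None):
    """Build a Sparkplug topic string; rejects unknown message types."""
    if message_type not in MESSAGE_TYPES:
        raise ValueError(f"unknown Sparkplug message type: {message_type}")
    parts = [SPARKPLUG_NS, group_id, message_type, edge_node_id]
    if device_id is not None:
        parts.append(device_id)
    return "/".join(parts)

print(sparkplug_topic("PlantA", "DDATA", "Line1Gateway", "FlowMeter42"))
# spBv1.0/PlantA/DDATA/Line1Gateway/FlowMeter42
```

Because every publisher uses the same namespace and state-management messages (the BIRTH/DEATH pairs), any Sparkplug-aware subscriber can discover devices and interpret their data without vendor-specific configuration–which is exactly the interoperability gap the working group is chartered to close.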
The Open Group Open Process Automation Forum progresses. This topic touches on both open and software-defined control. In the first major update to the standard for open process automation systems since February 2019, the Forum (OPAF) has progressed perhaps more than I would have predicted after its unveiling only a few years ago at the ARC Forum.
Its first release focused on interoperability while the O-PAS Standard Version 2.0 provides a vendor-neutral Reference Architecture which enables the construction of scalable, reliable, interoperable, and secure process automation systems. The latest release, which is a Preliminary Standard of The Open Group, has further emphasis on standardized system configuration portability to significantly reduce capital cost and time investment for end-users. With these capabilities, end-users can easily exchange equipment without being tied to a single vendor or requiring individual configuration parameters to be written in different operating languages.
With their standard moving from interoperability to portable configurations, leaders told me that the next release will expand on this portability theme.
Bedrock Automation integrates Flow-Cal Flow Measurement into Open Secure Automation (OSA) Platform.
Speaking of both software-defined and open, Bedrock Automation Founder, CEO, and CTO Albert Rooyakkers explained its extension to the “Open Secure Automation (OSA)” platform with the addition of Flow-Cal algorithms “bringing oil and gas measurement and custody transfer securely into the digital age.” This was essentially a software addition to the platform to bring a new twist on flow computer functionality.
The new OSA +Flow family embeds industry-leading Flow-Cal measurement applications. Flow-Cal’s software has long been the industry’s choice for flow measurement and production-accounting data. Affirming Flow-Cal’s stature is the fact that the American Petroleum Institute (API) has selected it to develop, support, and distribute its standard flow measurement calculations.
The OSA +Flow software has been incorporated across all Bedrock controllers providing scalability for PLC, RTU, or DCS flow control requirements at custody transfer stations, separators, and other oil and gas production facilities. These solutions include full support of multi-drop serial, Ethernet, and HART for Coriolis, ultrasonic, and smart transmitters.
The system includes an API-compliant calculation library, OPC UA, Inductive Automation software, and MQTT, as well as software-defined I/O.
Diamanti Accelerates Energy and Service Organizations’ Adoption of AI/ML
AI and ML applications often leverage GPU processing for training models and they benefit from containers and Kubernetes—an open source container project. However, these processes are often complicated to adopt and run at scale. With the recent announcement of GPU support in the Diamanti AI/ML platform, enterprises have an easier on-ramp to managing large-scale containerized workloads under Kubernetes.
“We’re pleased to share the early customer traction we are seeing on our newest solutions in a wide range of industries including energy, services and more,” said Tom Barton, CEO of Diamanti. “These customers are validating state-of-the-art technologies internally while also benefiting from the reduced physical footprint and cost-savings that come with the Diamanti AI/ML platform.”
The new solution, announced in late 2019, is in early access today and fully supports Nvidia’s NVLink technology for higher performing workloads, as well as Kubeflow, an open source machine learning framework for Kubernetes that provides highly available Jupyter notebooks and ML pipelines. Combined with Diamanti’s Kubernetes control plane, this allows customers to deliver highly scalable environments for performance-intensive AI/ML workloads, accelerating model development and training.
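For readers unfamiliar with how GPU workloads are expressed under Kubernetes, the general mechanism (not Diamanti’s specific implementation) is an extended resource request in the pod spec. A sketch, with a hypothetical image name, built as a plain dict for clarity:

```python
# Kubernetes schedules GPU workloads via the extended resource "nvidia.com/gpu",
# exposed by NVIDIA's device plugin. In practice this manifest would be YAML or
# built through a Kubernetes client library.

def gpu_pod_manifest(name, image, gpus=1):
    return {
        "apiVersion": "v1",
        "kind": "Pod",
        "metadata": {"name": name},
        "spec": {
            "containers": [{
                "name": name,
                "image": image,
                "resources": {
                    # GPUs are requested as limits; they cannot be overcommitted.
                    "limits": {"nvidia.com/gpu": gpus},
                },
            }],
            "restartPolicy": "Never",
        },
    }

pod = gpu_pod_manifest("ocr-training", "example.com/ocr-train:latest", gpus=2)
print(pod["spec"]["containers"][0]["resources"]["limits"])  # {'nvidia.com/gpu': 2}
```

Platforms like Diamanti’s layer scheduling, storage, and networking on top of this primitive so that teams don’t hand-write such manifests for every training job.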
A major energy company turned to Diamanti for a new workload leveraging AI/ML for optical character recognition (OCR) to scan invoices. The customer needed to scan more than 15,000 invoices a day. The legacy infrastructure could not keep up with the demand and eventually accrued a backlog of more than 200,000 invoices. Deploying the Diamanti solution with GPU support eliminated that backlog within hours.
Edge – 5G
As the other influencers at an HPE event told me once, “Gary, everything you do is the edge.” So it is not surprising that I had many conversations about the Edge. But 5G technology was also on many minds. The consensus opinion–5G will drive decision making to the edge.
As an example of edge at the Forum, here is an announcement from Opto 22. For as long as I’ve known the company, it continues to push the latest IT technologies mashed up with control and automation. This product release highlights its pioneering role in IoT.
Industrial automation manufacturer and industrial internet of things (IIoT) developer Opto 22 announced groov RIO, a family of intelligent, distributed input/output (I/O) for IIoT and automation applications. groov RIO represents a first-in-class solution for its ability to quickly connect traditional wired switches and sensors directly to Ethernet networks, software applications, and cloud platforms without intermediary control or communication hardware, such as PLCs, PACs, or PCs.
The first shipping version of groov RIO is the GRV-R7-MM1001-10, a standalone, 10-channel, multi-signal, multifunction I/O unit for signals including thermocouples (TCs), integrated circuit temperature devices (ICTDs), voltage inputs, current inputs, millivolt inputs, discrete DC inputs, self-wetting discrete inputs, discrete DC sinking outputs, and Form C mechanical relays. In addition, two channels provide special features like pulse counting, on- and off-time totalization, software latching, frequency measurement, and more. GRV-R7-MM1001-10 is completely standalone and software-configurable through a browser-based interface.
“When we designed groov RIO, we were looking for ways to democratize I/O data, because that’s what the IIoT is all about,” said Vice President of Product Strategy at Opto 22, Benson Hougland. “Although groov RIO can be used as remote I/O with our groov EPIC system or another control system, we also wanted it to operate autonomously, facilitating direct connection between I/O signals and databases, business software, or cloud IoT platforms.”
GRV-R7-MM1001-10 supports 12 different types of field I/O circuits. It also provides no-hassle, enclosure-free installation with multiple power options, including standard 802.3af Power-over-Ethernet (PoE); an extended operating temperature range; and UL Hazardous Locations and ATEX approvals.
Once installed, groov RIO can be independently managed and configured through browser-based tools. Per-channel I/O type and signal-processing options through groov Manage eliminate the need for a master control unit, and support for standard enterprise network services like DNS, DHCP, and VPN facilitates network connectivity. Embedded communication options range from efficient data publishing with MQTT Sparkplug to advanced signal processing, data aggregation, and transactions with databases and web services, using the low-code Node-RED environment and runtime.
Data —> Action
It’s all about data, they all say. But when I talked with Mike Brooks, who is now advising at AspenTech, he counseled, “Not too much data.” The action is in using data, not collecting it. Hence the drawback (indeed, failure?) of data lakes: too much storage, not enough usability. AspenTech exemplifies using machine learning not just to claim an AI story, but to find usable information companies can apply to improve operations.
Collaboration – Partnerships
The Eclipse Foundation and OPAF exemplify collaboration and partnerships. Inductive Automation has community as a strategic initiative. Both founder Steve Hechtman and chief strategy officer Don Pearson highlighted it at last year’s Ignition Community Conference.
This announcement highlights community along with edge and other trends. Inductive Automation announced improvements to three products and a new development resource within Ignition by Inductive Automation. Ignition is an industrial application platform with tools for building solutions in human-machine interface (HMI), supervisory control and data acquisition (SCADA), and the Industrial Internet of Things (IIoT).
The solutions include:
- New and improved products for Ignition Edge.
- An expansion of the Ignition Onboard program.
- Improvements to the Ignition Perspective Module.
- A new, free resource for developers: Ignition Exchange.
Ignition Edge will soon have three new products. Ignition Edge is a line of lightweight, limited, low-cost Ignition software solutions designed for embedding into field and OEM devices at the edge. They allow organizations to extend data collection, visualization, and system management to the edge of the network. With the new products coming soon, the lineup will include Ignition Edge Panel, Ignition Edge Compute, Ignition Edge Sync Services, Ignition Edge EAM (Enterprise Administration Module), and Ignition Edge IIoT.
The Ignition Onboard program now has easier access to industrial hardware that comes with Ignition already installed, configured, and licensed. Numerous device manufacturers are embedding Ignition and Ignition Edge into their devices — including Advantech, Moxa, OnLogic, Opto 22, and ORing.
The Ignition Perspective Module lets users easily build mobile industrial applications in HTML5 for monitoring and control of their processes directly from their mobile phones.
A significant part of the Inductive Automation strategy is to promote community among its customers and partners. The development has been ongoing for some time culminating into Ignition Exchange — a new, online space where developers can get free Ignition resources provided by Inductive Automation and the Ignition community. These resources can save time for developers.
OPAF and Bedrock Automation–as in, take a hardware platform and add flow metering–exemplify the trend toward software-defined hardware.
I discussed ML in relation to AspenTech for decision making. Perhaps the industry is moving past the SciFi “artificial intelligence” part of the technology to emphasize real use cases deployed today.
To name a trend “operations” may sound archaic, but many conversations moved from technology to solving real problems for customers. This announcement from AVEVA exemplifies that trend.
AVEVA unveiled its new Discrete Lean Management software. The new offering improves operational efficiency through the digitalization of lean work management for both manual and automated production lines. AVEVA’s quick-to-deploy and easy to use digital tools enable access to production information, KPIs and notifications on dashboards, workstations and mobile devices to improve overall equipment and labor effectiveness, and to facilitate data-driven continuous improvement.
AVEVA Discrete Lean Management is designed to address the issues faced by operating manufacturing plants still using paper-based systems for lean and work order management, work instructions and data collection procedures. It enables physical records to be replaced with digital tools that mitigate the risk of manual processes and provide real-time visibility into production performance, allowing team collaboration in response to production issues.
The AVEVA Discrete Lean Management software solution is used in Schneider Electric’s manufacturing plants and has been successfully deployed in more than 70 smart factories globally, resulting in a 10% productivity increase due to downtime mitigation and 70% improved response time due to automated escalation of production issues.
I actually visited one of the plants in the deployment—one in Lexington, KY. It was an excellent example of using software tools to enhance a lean process rather than getting in the way.
MQTT was mentioned all over the conference. This data transport technology is usable both for OPC UA and for Sparkplug, and many companies touted their use of it.
I didn’t have as many security conversations as the past few years, but I did chat with some PAS Global executives, and the company announced several new products, along with some new branding.
PAS, now PAS Global, keeps building on its platform of alarm management and safety, and on its ability to see what is on the process plant’s network, assuring the integrity of the process control system.
While at ARC Forum, company executives stressed industrial operations must increase focus on cybersecurity while maintaining continuous vigilance on safety. Stated simply, organizations need to ensure OT integrity in the face of unprecedented opportunity and risk. PAS has introduced new and updated products to optimize the integrity of industrial assets and reduce cyber risk, improve process safety and reliability, and ensure OT data health.
PAS Cyber Integrity prevents, detects, and remediates industrial cyber threats. Version 6.5 introduces an enhanced user experience for OT asset inventory information and data collection and transfer. This release also provides support for multiple integration methods (REST API, Syslog, SQL, CSV, SDK), integration with Darktrace, and Microsoft Windows event analytics;
PAS PlantState Integrity Version 8.7 introduces enhancements to Independent Protection Layer (IPL) Assurance that include sensor monitoring and voting, analysis filtering, and process trip reporting.
PAS Decision Integrity enables trusted data for decision-making. Version 1.0 leverages capabilities from PAS Automation Integrity and adds support for OT data health monitoring (data chain accuracy and visualization) and data lake enrichment.
These new product releases will be generally available by the end of March.
I picked this news item up from The Economist Espresso app.
For years, technologists have gushed about the promise of the “Internet of Things”, enabling ordinary objects—from kettles to cargo ships—to communicate autonomously with each other. The two essential technologies speeding the IoT’s arrival, inexpensive sensors and super-fast networking kit, are advancing rapidly. Gartner, a research group, predicts that the global number of devices embedded with sensors will leap from 8.4bn in 2017 to 20.4bn in 2020. 5G, a telecoms-networking technology superior to today’s 4G mobile networks, is advancing as well. But the world’s 5G system could split into two different and potentially incompatible entities. One has been developed by Huawei, a Chinese telecoms-equipment giant, at a cost of $46bn. But some are worried about the company’s links to the Chinese Communist Party. Several countries, led by America, have banned the use of Huawei’s gear in their systems for security reasons. The year 2020 could herald the arrival of the Splinternet of Things.
I daresay that most likely many countries in the world are concerned about the ability of the US government to monitor internet traffic through the technology of American companies. These swords always cut two ways when you take the larger view.
More relevant to this topic, though, could a potential splintering into two 5G systems globally impact IoT?
In the short term, from what I can gather interviewing technologists, benefits from 5G will accrue from the ability to build private, plant-wide broadband networks rather than from some global linking of sensors.
Perhaps it is a bit early for journalists to be raising fear, uncertainty, and doubt. Listening to people actually building out the technology, I think we are going to experience much benefit from 5G in the not-too-distant future.
Suppliers of manufacturing software, some from surprising places, are putting sizable investments into products that will help customers reap the rewards of digitalization. Today, I’m looking at both ABB and Emerson Automation Solutions. Previously I checked out GE Digital and Rockwell Automation. Each has taken a slightly different course toward the goal, but notice the common thread of enhancing software products to help customers prosper.
ABB enhances manufacturing management technology
The new version of ABB Ability Manufacturing Operations Management will offer new features including:
- Enhanced user experience based on new HTML5 web client;
- A new smart interactive dashboard application that provides greater visibility and collaboration;
- A new statistical process control (SPC) application, to determine if each process is in a state of control;
- A new Batch Compare application – for advanced batch analysis.
“ABB Ability Manufacturing Operations Management is a comprehensive, scalable and modular software suite that optimizes visibility, knowledge and control throughout the operations domain,” said Narasimham Parimi, Head of Digital Products – Product Management, Process Control Platform. “This release provides a range of rich new functionality and a new enhanced user experience that enables operations to become more productive and responsive.”
ABB Ability Manufacturing Operations Management is designed to simplify production management by enabling performance monitoring, downtime management, and maintenance support, as well as providing statistical production analysis tools. It provides solutions and tools to facilitate the collection, consolidation and distribution of production, quality and energy information via the plant’s web-based reports, trends, and graphs.
A new, self-service dashboard application promotes increased collaboration, providing visibility from shop floor to top floor and spanning IT and OT environments. It increases data connectivity to all apps and modules within the MOM suite, combining historical and manufacturing data and providing the user with improved customization capabilities. Dashboards can be shared among users, further promoting collaboration between teams. Trends and events are displayed together, which helps customers identify issues and opportunities and make informed, timely decisions.
The new common services platform features an HTML5 web platform that runs across all suites, ensuring customers have a seamless user experience; applications can be viewed on different devices, right down to a 10-inch tablet.
Statistical process control (SPC) is used in manufacturing to determine whether each process is in a state of control. The new SPC application works across all the different apps and modules and helps the user improve quality- and production-related performance.
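For readers new to SPC, the core idea is simpler than the acronym suggests: establish control limits (typically the mean plus or minus three standard deviations) from a baseline run known to be in control, then flag any later measurement that falls outside them. The sketch below is a minimal, generic illustration of that idea — the variable names and the 3-sigma rule are textbook SPC, not anything specific to ABB’s application.

```python
# Minimal sketch of the SPC idea: derive 3-sigma control limits from an
# in-control baseline run, then test new readings against those limits.
# This is generic textbook SPC, not ABB's implementation.

def control_limits(baseline):
    """Return (lower, upper) 3-sigma control limits from a baseline run."""
    n = len(baseline)
    mean = sum(baseline) / n
    # Sample standard deviation of the baseline measurements
    sigma = (sum((x - mean) ** 2 for x in baseline) / (n - 1)) ** 0.5
    return mean - 3 * sigma, mean + 3 * sigma

def in_control(value, limits):
    """A process reading is 'in control' if it stays within the limits."""
    lo, hi = limits
    return lo <= value <= hi

# Baseline readings from a period known to be running normally
baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 10.0, 9.9]
limits = control_limits(baseline)

print(in_control(10.1, limits))  # True  -- within normal variation
print(in_control(13.5, limits))  # False -- out of control, investigate
```

Real SPC applications layer more rules on top (runs, trends, zone tests), but out-of-limit detection like this is the foundation.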
In addition to the existing Batch View and Batch Investigate features, a comparison option has been added to the platform’s batch analysis applications, allowing different types of comparison.
Cyber security remains one of the key issues in the advancement of Industry 4.0, and the new features in MOM include enhanced security.
Emerson Expands Analytics Platform
Plantweb Insight platform adds two new Pervasive Sensing applications that manage wireless networks more efficiently with a singular interface to the enterprise.
Emerson has added two new IIoT solutions to its Plantweb Insight data analytics platform that will enable industrial facilities to transform the way they manage their enterprise-level wireless network infrastructure.
As digitalization and wireless technology adoption continue to rapidly expand in industrial facilities throughout the world, the need for greater visibility of network infrastructure performance is key. These new Plantweb Insight applications provide a quick-to-implement, scalable IIoT solution that helps customers advance their digital transformation strategies and achieve greater operational efficiencies.
The new Plantweb Insight Network Management application provides continuous, centralized monitoring of WirelessHART networks. This first-of-its-kind application provides a singular, consolidated view of the status of all wireless networks in a facility, with embedded expertise and guidance for advanced network management.
A key feature of the Plantweb Insight Network Management application is a configurable mesh network diagram, providing visualization of network design and connections along with device-specific information. It also provides an exportable record of syslog alerts, network details outlining conformance to network best practices and more.
While the new network management application provides a holistic look at wireless networks, the Plantweb Insight Power Module Management application drills down to the device level, allowing facilities to keep their wireless devices appropriately powered so they can continuously transmit key monitoring data. By aggregating power module statuses, users can evolve traditional maintenance planning and implement more efficient and cost-effective practices.
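The aggregation idea behind that last point is worth making concrete: rather than servicing each wireless device as its battery dies, a fleet-wide view lets maintenance batch the devices nearing end of life into one planned visit. The sketch below is purely illustrative — the device tags, the remaining-life fractions, and the 20% threshold are my assumptions, not Emerson’s data model or API.

```python
# Hypothetical sketch of aggregating power-module status for maintenance
# planning: batch devices nearing end of life into one visit instead of
# reacting to each failure. Tags, values, and threshold are assumptions.

def batch_replacements(devices, threshold=0.20):
    """Return tags of devices whose remaining battery life is below threshold."""
    return sorted(tag for tag, remaining in devices.items() if remaining < threshold)

# Fraction of battery life remaining per wireless device tag
fleet = {"PT-101": 0.82, "TT-204": 0.15, "LT-330": 0.05, "FT-412": 0.55}

print(batch_replacements(fleet))  # ['LT-330', 'TT-204']
```

The payoff is exactly what the release describes: one consolidated view turns ad-hoc battery swaps into planned, cost-effective maintenance rounds.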
“We were able to infuse a decade of experience with wireless technology into these new offerings,” said Brian Joe, wireless product manager with Emerson’s Automation Solutions business. “Our customers will now be able to manage and improve hundreds of networks through a singular interface, realizing significant efficiencies in individual network and wireless device management and maintenance.”
These new applications further enhance the Plantweb Insight platform, a set of pre-built analytics primarily focusing on monitoring key asset health. Other applications in the platform include pressure relief valve monitoring, heat exchanger monitoring and steam trap monitoring.