Analytics and AI Software Helps Optimize Operations

The idea that manufacturing and production enterprises make use of only a small fraction of the data they have accumulated has apparently become common knowledge. ABB cites this as the driver for a new software platform called ABB Ability Genix Industrial Analytics and AI Suite.

ABB says it “operates as a digital data convergence point where streams of information from diverse sources across the plant and enterprise are put into context through a unified analytics model. Application of artificial intelligence on this data produces meaningful insights for prediction and optimization that improve business performance.”

“We believe that the place to start a data analytics journey in the process, energy and hybrid industries is by building on the existing digital technology – the automation that controls the production processes,” said Peter Terwiesch, President of ABB Industrial Automation. “We see a huge opportunity for our customers to use their data from operations better, by combining it with engineering and information technology data for multi-dimensional decision making. This new approach will help our customers make literally billions of better decisions.”

ABB Ability Genix is composed of a data analytics platform and applications, supplemented by ABB services, that help customers decide which assets, processes and risk profiles can be improved, and that assist customers in designing and applying those analytics. The suite features a library of applications from which customers can subscribe to a variety of analytics on demand, as business needs dictate, speeding up the traditional process of requesting and scheduling support from suppliers.

Get used to seeing these IT architecture phrases: the suite supports a variety of deployments, including cloud, hybrid and on-premise. Microsoft has also done a good job working with manufacturing; ABB leverages Microsoft Azure for integrated cloud connectivity and services.

“The ABB Ability Genix Suite brings unique value by unlocking the combined power of diverse data, domain knowledge, technology and AI,” said Rajesh Ramachandran, Chief Digital Officer for ABB Industrial Automation. “ABB Ability Genix helps asset-intensive producers with complex processes to make timely and accurate decisions through deep analytics and optimization across the plant and enterprise.

“We have designed this modular and flexible suite so that customers at different stages in their digitalization journey can adopt ABB Ability Genix to accelerate business outcomes while protecting existing investments.”

A key component of ABB Ability Genix is the ABB Ability Edgenius Operations Data Manager that connects, collects, and analyzes operational technology data at the point of production. ABB Ability Edgenius uses data generated by operational technology such as DCS and devices to produce analytics that improve production processes and asset utilization. ABB Ability Edgenius can be deployed on its own or integrated with ABB Ability Genix so that operational data is combined with other data for strategic business analytics.

“There is great value in data generated by automation that controls real-time production,” said Bernhard Eschermann, Chief Technology Officer for ABB Industrial Automation. “With ABB Ability Edgenius, we can pull data from these real-time control systems and make it available to predict issues and prescribe actions that help us use assets better and fine-tune production processes.”

Why Invest in DataOps?

DataOps began popping onto my radar last fall. First there was a startup of former Kepware people developing DataOps for manufacturing enterprises, and then the topic had a featured role at an IT conference.

I have mentioned both previously, which attracted the attention of Kevin E. Kline, who works with Sentry One. He has a heck of a bio: Principal Program Manager, bestselling author of SQL in a Nutshell, Founder and President Emeritus of PASS.org, and a Microsoft MVP since 2003. He pointed me to a blog post he had written that explains much about the topic.

These passages are lifted from that blog to give you a taste. Check out the entire post for more details. Here is a description.

DataOps is a collaborative practice that improves integration, reliability, and delivery of data across the enterprise. It builds on the foundation of strong DevOps processes. Like DevOps, DataOps fosters communication between business functions like data platform, IT operations, business analytics, engineering, and data science. It focuses on streamlining and automating the data pipeline throughout the data lifecycle:

  • Data integration—simplifying the process of connecting to disparate data sources
  • Data validation—testing data to ensure that business decisions are supported by accurate information
  • Metadata management—maintaining a clear understanding of the topography of the data estate, its origin, dependencies, and how the data has changed over time
  • Observability—capturing granular insights about data systems along with rich context to help DataOps teams better understand system behavior and performance

DataOps paves the way for effective data operations and a reliable data pipeline, delivering information that people trust with shorter development and delivery cycles.
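
Kline's post describes the pipeline in prose; as an illustration of what the data-validation step can look like once it is automated, here is a minimal Python sketch. The record fields, limits, and checks are hypothetical and are not drawn from his post.

```python
from datetime import datetime

# Hypothetical validation rules for a stream of production records.
# Field names and limits are illustrative only.
REQUIRED_FIELDS = {"machine_id", "timestamp", "temperature_c"}
TEMP_RANGE = (-40.0, 200.0)  # plausible sensor limits, not a real spec

def validate_record(record: dict) -> list:
    """Return a list of validation errors for one record (empty list = valid)."""
    errors = []

    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
        return errors  # cannot check values that are not there

    try:
        datetime.fromisoformat(record["timestamp"])
    except ValueError:
        errors.append(f"unparseable timestamp: {record['timestamp']!r}")

    temp = record["temperature_c"]
    if not isinstance(temp, (int, float)) or not TEMP_RANGE[0] <= temp <= TEMP_RANGE[1]:
        errors.append(f"temperature out of range: {temp!r}")

    return errors

if __name__ == "__main__":
    batch = [
        {"machine_id": "M-101", "timestamp": "2020-06-30T08:15:00", "temperature_c": 72.4},
        {"machine_id": "M-102", "timestamp": "not-a-time", "temperature_c": 999.0},
    ]
    for rec in batch:
        problems = validate_record(rec)
        print(rec["machine_id"], "->", "OK" if not problems else "; ".join(problems))
```

The point is not these particular checks but that they run automatically on every load, so bad records are flagged the same way every time rather than caught, or missed, by hand.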

This part discusses benefits. Later he discusses obstacles.

4 Benefits of DataOps Maturity

1. Collaboration

Terms that refer to effective collaboration are alignment, tearing down silos, “synergy,” and a newer term—interlock. These terms are prevalent in business because getting them right creates a force multiplier across departments. Imagine being in a rowboat with 10 other people, and none of them are rowing in the same direction. You might never get to where you’re trying to go.

A mature DataOps practice promotes up-front planning and construction, then automated ongoing execution. In other words, teams work together to define what will happen, and various software tools ensure that it happens the same way every time.

2. Reliability

Similar to the benefit of collaboration, the automation of data and analytics operations removes a potential element of human unpredictability. We, as human beings, are capable of great things like free thought and reason. These abilities serve us well in many situations. However, they can introduce problems when dealing with repetitive processes that must always follow the same steps.

3. Adaptability

With a mature, documented, and automated DataOps process, plans to introduce change require fewer hands, less time, and a lower probability of introducing errors. Using this approach also makes it easier to adapt testing procedures. This effectively reduces the time it takes to move from development to production for changes.

4. Agility

DevOps and DataOps have emerged from Agile project management practices. Because of those roots, agility becomes table stakes in DataOps processes. Data teams that already practice Agile methodologies will find it easier to define, implement, and mature their DataOps practice.

Digital Twin Alliance to Address Complex Digital Transformational Challenges

In brief: Three Organizations Combine Expertise to Bring Digital Twins to Life, Create Added Value, and Deliver Support Across the Asset Lifecycle

The idea of an open system for data flow from engineering through construction to startup to operation & maintenance, and perhaps even to decommissioning has intrigued me for years. I have worked with MIMOSA and its Open Industrial Interoperability Ecosystem for many years. Check it out.

For the most part, suppliers have been a bit slow to this game. The way of the world is that automation vendors never liked the “open” part, since their design emphasizes tight integration of as many parts as possible under their proprietary umbrella.

An ecosystem is one thing, and a partnership is another. Sometimes companies announce partnerships with great flourish and publicity only to see the great promise wither from neglect. Sometimes end users (owner/operators) reap significant benefit.

With that background, I approach the announcement of a partnership. I like the idea, but execution and sustainability will be proof of the strength of this partnership. Note that two of the companies are sort of like “conjoined twins” joined at the hip.

From the announcement:

DORIS Group, a global engineering and project management company in the energy industry; Schneider Electric, a supplier of products and solutions for the digital transformation of energy management and automation; and AVEVA, an engineering and industrial software supplier, have agreed to develop a strategic partnership to deliver Digital Twin technology for the upstream oil and gas markets.

These new solutions will support the goals of oil & gas organizations to improve asset performance, increase sustainability and maximize return on capital on projects.

The three companies will combine offerings to bring engineering capabilities, an asset lifecycle software solution and digital specialization in order to create a fully formed digital twin to serve as a backbone for improving performance for the upstream sector. The new solution will:

  • Bring new assets on stream faster through the use of cloud-enabled software that improves collaboration and increases engineering efficiencies
  • Deliver enhanced safety leading to better business outcomes
  • Improve traceability through a single point of accountability
  • Enable remote operations and production assurance through a fully functional Living Digital Twin that mirrors all aspects of the operating asset

Oil & gas owner operators have struggled to go digital due to the lack of a structured offering and orchestration, as no single vendor currently delivers what is required to achieve this. Large amounts of data of various types, from different sources, are another challenge they face, often leading to data inaccuracy and incompatibility, as well as difficulties in organizing that data and identifying trends.

Similarly, the oil & gas sector is under considerable pressure to quantify, track and reduce CO2 emissions as well as reduce overall pollution – this can be even more difficult with limited monitoring, no established method and no data-driven decision making.

Together, DORIS, AVEVA, and Schneider Electric will offer a structured digital and collaborative solution across the lifecycle of projects that will help oil & gas owner operators address many of these challenges.

Christophe Debouvry, CEO of DORIS Group, stated, “DORIS Group is excited to be strategically partnering with Schneider Electric and AVEVA in this unique venture which will allow us to accelerate the building out of our digital transformation strategy. Combining our complementary expertise will go a long way to providing a powerful enabler to offer our customers embarking on their digital transformational journeys with optimized solutions throughout their assets lifecycle.”

Craig Hayman, CEO AVEVA, also commented, “Leaders driving the next wave of transformation are moving quickly and that’s why this partnership with Schneider Electric and DORIS Group is so opportune. Our common aim is to support organizations on their digital journey especially in the current environment, helping them accelerate the use of digital technology, realize the value of a digital twin and also work towards a more sustainable future. It’s never been easier to begin a digital transformation program, as access to cloud computing, great connectivity, a merged edge and enterprise combined with analytics and machine learning, means that the ability to digitally drive productivity improvements into the industrial world is now unprecedented.”

Christopher Dartnell, President Oil & Gas and Petrochemicals at Schneider Electric, commented, “This partnership is in line with Schneider Electric’s objectives around Digitization and Energy Transition, and we will bring our expertise in both energy and process efficiency to the industry. Our goal is to support customers looking to adopt a digital twin model by offering our experience to facilitate the overall digital transformation for our clients, enabling them to improve lifecycle performance and safe operations while also making their operations more sustainable.”

Update on Plex Systems, Stand-alone MES and IIoT

I had an opportunity to talk with Ben Stewart, VP Product Strategy for Plex Systems, the other day to get my first deep dive in years. I have talked with the company occasionally, but it was primarily an ERP developer with a robust MES incorporated. Much has happened lately.

What has set Plex apart for years is its SaaS, multi-tenant cloud offering. Competitors have only recently found ways to move from the license-based, client-server model to some form of cloud offering, using browser-based connectivity with HTML5 to offer visualization on tablets and smartphones.

Plex has made several moves within the past year to bolster its presence and offering, and along the way garnered a Gartner recognition. First, Plex has made its MES available without its ERP, so that users of other ERP solutions can add Plex SaaS MES. The second news item is the Gartner recognition. And the third news item covers Plex entering the IoT space through acquisition, giving it a robust and comprehensive solution to offer its customers.

Plex Manufacturing Execution Suite (Plex MES)—a flexible cloud-based suite.

Plex Systems’ manufacturing operations capabilities are now available as a best-of-breed, shop-floor-specific offering called the Plex Manufacturing Execution Suite (Plex MES). This cloud-based suite comprises packages that satisfy the spectrum of smart manufacturing needs, from MES to manufacturing operations management (MOM).

“Plex allowed us to quickly standardize our systems and processes across eight facilities globally, helping the company record production and quality checks in real-time,” said Jennifer McIntosh, ERP Manager of Gill Industries, a world-class supplier of advanced mechanisms and welded assemblies. “With the entire company now working from a single system of record, we are able to leverage MES capabilities to continuously improve our quality standards and optimize processes while reallocating our staff to focus on value-added business activities instead of system and server maintenance.”

Plex MES is designed to seamlessly connect insights from the shop floor up to the top floor, enabling production to deliver relevant real-time, operational information to key roles throughout the organization and empowering everyone to make better business decisions.

Plex MES gives manufacturers access to key capabilities required for smart manufacturing, including:

  • Error-proofed control: Choreographed production processes are driven directly from the quality control plan to shorten cycle times and improve efficiency. A unique operator control panel is paperless and easy to use, allowing for increased productivity and fewer manual input errors. In-line quality control governs quality activities to ensure check sheet compliance (a simple sketch of such a check follows this list), and real-time production reporting allows for real-time decisions.
  • High-resolution visibility: The full production lifecycle—from raw materials through finished goods—is accessible from anywhere on any connected device. Operations are monitored in real-time, delivering manufacturing intelligence for more accurate decision-making. Compliance risk is mitigated through database-driven traceability information, while increased visibility to asset performance creates more opportunities for continuous improvement.
  • Seamless connectivity: Flexible, configurable cloud MES is connected by design. It is easy to deploy and standardize enterprise-wide while connecting to enterprise systems like a corporate ERP. Edge connectivity to industrial automation ensures at-rate production recording to Plex MES in the cloud. The Plex MES solution is fully unified, reducing the risk of disruptions common with an MES comprised of multiple point-solutions.
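
As a loose illustration of what quality-plan-driven, error-proofed control means in practice, here is a minimal Python sketch of an in-line check sheet validation. The characteristic names and limits are invented and do not reflect Plex's actual data model or APIs.

```python
# Hypothetical in-line quality check: measurements are validated against the
# quality control plan before the operation can be recorded as complete.
# Characteristic names and limits are invented for illustration.
CONTROL_PLAN = {
    "bore_diameter_mm": (24.95, 25.05),
    "surface_finish_ra": (0.0, 1.6),
}

def check_sheet_passes(measurements: dict) -> bool:
    """Return True only if every required characteristic is present and in spec."""
    for name, (low, high) in CONTROL_PLAN.items():
        value = measurements.get(name)
        if value is None:
            print(f"BLOCKED: {name} not measured")
            return False
        if not low <= value <= high:
            print(f"BLOCKED: {name}={value} outside [{low}, {high}]")
            return False
    return True

if __name__ == "__main__":
    operator_entry = {"bore_diameter_mm": 25.02, "surface_finish_ra": 2.1}
    if check_sheet_passes(operator_entry):
        print("Operation recorded; part moves to the next step.")
    else:
        print("Operation held; supervisor review required.")
```

In a real MES the control plan, limits, and dispositions come from the system itself; the sketch only shows the gating logic that keeps an out-of-spec operation from being recorded as complete.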

“There is an underserved need among large organizations to tap into the invaluable data generated in plants around the world,” said Bill Berutti, CEO of Plex Systems. “Plex MES answers that need by consolidating our 20 years of manufacturing expertise into a highly targeted smart manufacturing solution that enables operational visibility, transparency, and accuracy, helping manufacturers standardize operations across multiple plants.”

Gartner

Plex Systems announced that Gartner has recognized it as a Challenger in the 2019 Magic Quadrant for Manufacturing Execution Systems. For this report, Gartner evaluates vendors on their Completeness of Vision and Ability to Execute. Plex is positioned furthest for Completeness of Vision in the Challengers quadrant and has improved its position on Ability to Execute compared to the previous year.

“We feel that recognition of Plex Systems as a Challenger is further validation of our ability to disrupt the MES market,” said Bill Berutti, CEO of Plex Systems. “Our manufacturing expertise is based on decades of helping customers exercise control over their shop floor operations while gaining access to invaluable data. Plex MES is flexible and scalable, answering a growing need among manufacturers to standardize their shop floors anywhere in the world.”

According to the 2019 Magic Quadrant for Manufacturing Execution Systems report, “The global MES market is a key pillar of smart factories and digital business for manufacturers. New technologies are starting to be leveraged, and disruptors are emerging. Supply chain technology leaders should use this research to select appropriate vendors and solutions.”

Plex delivers cloud MES and ERP to nearly 700 global process and discrete manufacturers. As a multi-tenant SaaS solution, Plex lets manufacturers easily implement, scale, and standardize operations across their plants throughout the enterprise.

Plex is also rated on Gartner Peer Insights, an online platform of ratings and reviews of IT software and services written and read by IT professionals and decision-makers. Verified, anonymous reviews are provided by members of the Plex worldwide customer base.

Plex Systems releases Plex Industrial IoT, a suite of solutions designed to solve business challenges originating on the shop floor.

Plex Systems released Plex Industrial IoT, which connects machines to the cloud, manages the resulting data streams, and contextualizes the information in real time. The first available offering will focus on asset performance management (APM), helping companies avoid manufacturing disruption caused by common problems like unplanned downtime, diminished machine performance, and substandard quality output.

Plex’s new solution enables manufacturers to implement and leverage connectivity in the era of Industry 4.0, breaking down silos created by the varying protocols and data types used by equipment and sensors, by simplifying the connection to machines and the contextualization of data. Plex Industrial IoT grants access to the underlying machine intelligence, delivering timely and accurate insight to manufacturers in a single solution and eliminating operational surprises.

Plex Industrial IoT delivers:

  • Continuous improvement through access to historical IIoT data: The first solution within the suite focuses on asset performance management (APM), starting with an understanding of current and historical activity. This real-time assessment empowers shift supervisors and plant managers with the data to understand behaviors and trends and to diagnose the root cause of common challenges like machine failures, efficiency dips, or substandard quality output.
  • Improved productivity with real-time asset dashboards: Plex Industrial IoT helps manufacturers monitor what is happening with any asset, in any facility, from any connected device with comprehensive pre-built dashboards. Customizable and real time, these dashboards deliver access to up-to-the-minute metrics and analytics. The information delivered by the dashboards enables manufacturing leaders to respond to live data immediately and accurately to improve operator performance or overall equipment effectiveness (OEE) mid-shift; a minimal OEE calculation is sketched after this list.
  • Minimized operational disruptions by predicting and preempting unplanned downtime: Data collected over time with Plex Industrial IoT, analyzed against historical trends and contextualized against MES and ERP data from the same facilities, will expose an unprecedented quality of shop floor to top floor insights. This reduces operational disruptions to help manufacturers better plan for the previously unanticipated.
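
The OEE metric mentioned in the dashboard bullet above is conventionally calculated as availability times performance times quality. Here is a minimal Python sketch with invented shift numbers; it reflects the standard formula, not Plex's dashboards or APIs.

```python
# OEE = Availability x Performance x Quality (standard definition).
# The shift numbers below are invented for illustration.
planned_time_min = 480          # one 8-hour shift
downtime_min = 45               # unplanned stops
ideal_cycle_time_min = 1.0      # ideal minutes per part
total_parts = 400
good_parts = 380

run_time_min = planned_time_min - downtime_min
availability = run_time_min / planned_time_min                      # 435/480 ~ 0.906
performance = (ideal_cycle_time_min * total_parts) / run_time_min   # 400/435 ~ 0.920
quality = good_parts / total_parts                                  # 380/400 = 0.950

oee = availability * performance * quality
print(f"Availability {availability:.1%}, Performance {performance:.1%}, "
      f"Quality {quality:.1%} -> OEE {oee:.1%}")                    # roughly 79%
```

A dashboard that surfaces the three components separately shows at a glance whether downtime, slow cycles, or scrap is dragging the overall number down.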

This is Plex’s initial Industrial IoT offering following the acquisition of IIoT leader DATTUS in July 2018.

Connecting Plant to Execution to Business

When it was time to leave paid employment and head out on my own, I looked for a domain name I could buy with “connection” in it. Surveying the landscape of technology and applications in 2013, I saw that, despite a couple of decades of thought on connecting sensors at one end to business systems at the other, this was still the future.

I also thought that I’d make a living in the MES space. Wrong! That space has never supported media or analysts. But Internet of Things came along and changed the conversation. And the technologies that evolved with IoT have enabled connectivity at new levels.

Here is the story of plant level (Ignition SCADA from Inductive Automation) to the execution level (Sepasoft MES) to business level (SAP).

In brief, Sepasoft launches Sepasoft Business Connector & Interface For SAP ERP Modules for connecting Ignition to business systems. Easy-to-integrate toolset democratizes connectivity between manufacturing and the enterprise.

Sepasoft Business Connector and Interface for SAP ERP is a suite of modules for connecting Ignition by Inductive Automation to business systems like SAP ERP. Connecting the Enterprise Resource Planning (ERP) and Manufacturing Execution System (MES) layers of manufacturing operations, the Sepasoft Business Connector provides intuitive drag-and-drop interfaces for developing business logic, transforming data, and mapping data between Ignition and other systems.

The combination of Sepasoft and Ignition by Inductive Automation becomes a first-class MES and connectivity platform for the enterprise.

To ensure that the new product suite would present a unique offering and adhere to industry best practices, Sepasoft teamed up with 4IR Solutions due to their deep expertise in enterprise and SAP integration. “The Sepasoft Business Connector offers increased flexibility and substantial cost savings versus existing solutions. When factoring in licensing, maintenance, training, engineering, and support, customers can expect to pay an order of magnitude less compared to leveraging other middleware or developing custom solutions,” said Joseph Dolivo, Principal of Digital Transformation at 4IR Solutions.

“The Sepasoft Business Connector will allow our customers to improve their operations while reducing their costs,” said Keith Adair, Product Manager at Sepasoft. “The Sepasoft Business Connector can link directly to SAP via the Interface for SAP ERP module and provides built-in templates for common data exchange scenarios. With the addition of our Web Services module, the Sepasoft Business Connector can communicate with other ERP and business systems that support SOAP and RESTful endpoints.”
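
Sepasoft's connector is configured through drag-and-drop interfaces rather than hand-written code, but to give a flavor of what a RESTful exchange between the MES layer and a business system involves, here is a generic Python sketch using only the standard library. The endpoint URL and payload fields are hypothetical; they are not Sepasoft or SAP APIs.

```python
import json
import urllib.request

# Hypothetical production confirmation sent from the MES layer to a business
# system over a RESTful endpoint. The URL and field names are invented; real
# Sepasoft/SAP integrations are configured through the vendor modules instead.
ENDPOINT = "https://erp.example.com/api/production-confirmations"

confirmation = {
    "work_order": "WO-000123",
    "operation": "0010",
    "quantity_good": 380,
    "quantity_scrap": 20,
}

request = urllib.request.Request(
    ENDPOINT,
    data=json.dumps(confirmation).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

with urllib.request.urlopen(request) as response:
    print("Business system responded with status", response.status)
```

In the Sepasoft toolset, this kind of mapping would instead be built with the Business Connector's graphical tools, with the Web Services module handling the SOAP or REST transport described in the quote above.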

“We’re very excited to welcome a native SAP connector and business connectivity tool to the Ignition ecosystem,” said Don Pearson, Chief Strategy Officer for Inductive Automation. “As an integrated part of the Ignition platform, the Sepasoft Business Connector supports familiar idioms like drag-and-drop, tag binding, and scripting, while requiring minimal configuration of third-party systems.”

4IR Solutions Corp is the leading provider of Life Sciences and enterprise integration consulting services for the Fourth Industrial Revolution and beyond. 4IR’s consultants have decades of experience providing companies with enterprise-wide connectivity, partnering with leading vendors to capture and encapsulate that experience into new and innovative products.

HPE Emphasizing Software As Part of Pivot to As-a-Service Vision

HPE Discover Virtual Experience wrapped up last week, but I have much to think about and report. The HPE team did an excellent job pulling together a conference where we saw many different living rooms and home offices. Tough job; well done.

The release of a new software portfolio from HPE may sound like something mainly of interest to enterprise architects, but I have already seen demos where it also helps bring OT and IT together, making the production side of an enterprise more valuable to the business. This matters in counteracting recent enterprise history, in which production was a “black box” that corporate financial geniuses viewed as something that could be moved around chasing low cost.

From the blog of Kumar Sreekanti, CTO and head of software at HPE, we learn about Ezmeral, the brand name for the unified software portfolio.

Digital transformation is being amplified by an order of magnitude. In fact, many business leaders that I’ve spoken with are now embracing a digital-first strategy—to compete and thrive in the midst of a global pandemic. And the enterprises that use data and artificial intelligence effectively are better equipped to evolve rapidly in this dynamic environment. Now these data-driven transformation initiatives are being accelerated to enable faster time-to-market, increased innovation, and greater responsiveness to the business and their customers.

As CTO and head of software at HPE, my focus is on delivering against our edge-to-cloud strategy and vision of providing everything as a service. Software is a very critical and important component of this strategy. It’s also essential to helping our customers succeed in their data-driven digital transformation journeys, now more than ever.

We’re committed to providing a differentiated portfolio of enterprise software to help modernize your applications, unlock insights from your data, and automate your operations—from edge to cloud. Today, we announced that we’ve unified our software portfolio with a new brand: HPE Ezmeral.

The HPE Ezmeral portfolio allows you to:

  • Run containers and Kubernetes at scale to modernize apps, from edge to cloud
  • Manage your apps, data, and ops – leveraging AI and analytics for faster time-to-insights
  • Ensure control for governance, compliance, and lower costs
  • Provide enterprise-grade security and authentication to reduce risk

Business innovation relies on applications and data. The apps and data running the enterprise now live everywhere—in data centers, in colocation centers, at the edge, and in the cloud. Most of the applications running businesses today are non-cloud-native, and data is everywhere, with more and more of it being generated at the edge. Our customers are having real issues with non-cloud-native systems that will not or cannot move to the public cloud due to data gravity, latency, application dependency, and regulatory compliance reasons. Data has gravity, so our customers want to bring compute to the data, not data to the compute. And because data is exploding, it’s driving the need for AI and machine learning at enterprise-scale—with the ability to harness and leverage petabytes of data.

Our customers want flexibility and openness; they want to eliminate lock-in. They want pay-per-use consumption in an as-a-service model. They want open solutions that give them the best of both worlds—with a modern cloud experience in any location, from edge to cloud. We address these needs by providing HPE GreenLake in the environment of your choice, with a consistent operating model, and with visibility and governance across all enterprise applications and data. Our software provides differentiated IP to deliver these cloud services through HPE GreenLake.

And in today’s news, we announced new cloud services from HPE GreenLake. This includes new HPE GreenLake cloud services for containers and machine learning operations—powered by our HPE Ezmeral Container Platform software to run containerized applications with open source Kubernetes, and HPE Ezmeral ML Ops software to operationalize the machine learning model lifecycle.