The idea that manufacturing and production enterprises make use of only a small fraction of the data they have accumulated has apparently become common knowledge. ABB cites this as the driver for a new software platform called ABB Ability Genix Industrial Analytics and AI Suite.
ABB says it “operates as a digital data convergence point where streams of information from diverse sources across the plant and enterprise are put into context through a unified analytics model. Application of artificial intelligence on this data produces meaningful insights for prediction and optimization that improve business performance.”
“We believe that the place to start a data analytics journey in the process, energy and hybrid industries is by building on the existing digital technology – the automation that controls the production processes,” said Peter Terwiesch, President of ABB Industrial Automation. “We see a huge opportunity for our customers to use their data from operations better, by combining it with engineering and information technology data for multi-dimensional decision making. This new approach will help our customers make literally billions of better decisions.”
ABB Ability Genix is composed of a data analytics platform and applications, supplemented by ABB services, that help customers decide which assets, processes and risk profiles can be improved, and that assist customers in designing and applying those analytics. The suite features a library of applications to which customers can subscribe on demand, as business needs dictate, speeding up the traditional process of requesting and scheduling support from suppliers.
Get used to seeing these IT architecture phrases: the suite supports a variety of deployments, including cloud, hybrid and on-premises. Microsoft has done a good job working with manufacturing; ABB leverages Microsoft Azure for integrated cloud connectivity and services.
“The ABB Ability Genix Suite brings unique value by unlocking the combined power of diverse data, domain knowledge, technology and AI,” said Rajesh Ramachandran, Chief Digital Officer for ABB Industrial Automation. “ABB Ability Genix helps asset-intensive producers with complex processes to make timely and accurate decisions through deep analytics and optimization across the plant and enterprise.
“We have designed this modular and flexible suite so that customers at different stages in their digitalization journey can adopt ABB Ability Genix to accelerate business outcomes while protecting existing investments.”
A key component of ABB Ability Genix is the ABB Ability Edgenius Operations Data Manager that connects, collects, and analyzes operational technology data at the point of production. ABB Ability Edgenius uses data generated by operational technology such as DCS and devices to produce analytics that improve production processes and asset utilization. ABB Ability Edgenius can be deployed on its own or integrated with ABB Ability Genix so that operational data is combined with other data for strategic business analytics.
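ABB has not published the Edgenius programming interface here, so purely as a hypothetical sketch (all names and thresholds are my own invention), an edge data manager of this kind conceptually does two things before anything leaves the plant: buffer device readings at the point of production and compute analytics locally.

```python
from collections import deque
from statistics import mean


class EdgeCollector:
    """Hypothetical edge-side collector: buffers recent device readings
    and computes a rolling analytic at the point of production."""

    def __init__(self, window: int = 10):
        # Fixed-size buffer: old readings fall off automatically.
        self.window = deque(maxlen=window)

    def ingest(self, reading: float) -> None:
        """Collect one reading from a device or DCS tag."""
        self.window.append(reading)

    def rolling_average(self) -> float:
        """Local analytic computed at the edge, before any summary is
        forwarded to plant- or enterprise-level systems."""
        return mean(self.window)


collector = EdgeCollector(window=3)
for value in (71.0, 72.5, 74.0):
    collector.ingest(value)
print(collector.rolling_average())  # 72.5
```

The design point the sketch illustrates is that only the derived analytic, not the raw high-frequency stream, needs to travel upstream for strategic business analytics.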
“There is great value in data generated by automation that controls real-time production,” said Bernhard Eschermann, Chief Technology Officer for ABB Industrial Automation. “With ABB Ability Edgenius, we can pull data from these real-time control systems and make it available to predict issues and prescribe actions that help us use assets better and fine-tune production processes.”
I first started hearing seriously about DataOps last year at a couple of IT conferences. Then a group of former Kepware executives founded HighByte to bring DataOps specifically to the manufacturing industry. I told them I thought they had something there.
A representative of Seagate Technology sent me information about a study done with IDC about data in organizations. I haven’t had a relationship with Seagate for many years, but this is a timely report about enterprise data pointing out that 68% of data available goes unleveraged and that manufacturing is a laggard in this arena.
As enterprise data proliferates at an unprecedented pace – set to grow at a 42.2% annual rate over the next two years – a new report from Seagate and IDC has revealed that the majority (68%) of data available to enterprises goes unleveraged, meaning data management has become more important than ever.
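To put that growth figure in perspective, a quick compound-growth calculation (my own arithmetic, not from the report) shows that a 42.2% annual rate roughly doubles data volume in two years:

```python
def compound_growth(initial: float, annual_rate: float, years: int) -> float:
    """Volume after compounding growth at a fixed annual rate."""
    return initial * (1 + annual_rate) ** years


# 42.2% annual growth for two years: 1.422 ** 2 ≈ 2.02,
# i.e. the data volume roughly doubles.
print(round(compound_growth(1.0, 0.422, 2), 3))  # 2.022
```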
Furthermore, and somewhat surprisingly, the manufacturing sector shows the lowest level of task automation in data management, the lowest rate of full integration of data management functions, and low adoption of both multicloud and hybrid cloud infrastructures.
The report also identifies the missing link of data management—DataOps—which can help organizations harness more of their data’s value and lead to better business outcomes.
The report, Rethink Data: Put More of Your Data to Work—From Edge to Cloud, is based on a survey of 1,500 global enterprise leaders commissioned by Seagate and conducted by the research firm IDC.
“The report and the survey make clear that winning businesses must have strong mass data operations,” says Seagate CEO Dave Mosley. “The value that a company derives from data directly affects its success.”
Some additional findings include:
- The top five barriers to putting data to work are: 1) making collected data usable, 2) managing the storage of collected data, 3) ensuring that needed data is collected, 4) ensuring the security of collected data, and 5) making the different silos of collected data available.
- Managing data in the multicloud and hybrid cloud are top data management challenges expected by businesses over the next two years.
- Two thirds of survey respondents report insufficient data security, making data security an essential element of any discussion of efficient data management.
The missing link of data management is reported to be data operations, or DataOps. IDC defines DataOps as “the discipline connecting data creators with data consumers.” While the majority of respondents say that DataOps is “very” or “extremely” important, only 10% of organizations report having implemented it fully. The survey demonstrated that, along with other data management solutions, DataOps leads to measurably better business outcomes: it boosts customer loyalty, revenue, profit and cost savings, among other benefits.
“The findings of this study illustrating that more than two-thirds of available data lies fallow in organizations may seem like disturbing news,” said Phil Goodwin, research director, IDC and principal analyst on the study. “But in truth, it shows how much opportunity and potential organizations already have at their fingertips. Organizations that can harness the value of their data wherever it resides—core, cloud or edge—can generate significant competitive advantage in the marketplace.”
Guidance to help organizations achieve better business outcomes
White papers can be an excellent learning tool. I’ve told marketing people for years that they should write these instead of all the overtly sales-y stuff they put out. Build trust and a sense of expertise by publishing documents that teach. It’s a bit like my sales “technique” back in the day. Here is a new one to add to the library on Digital Transformation.
The Industrial Internet Consortium (IIC) has published the Digital Transformation in Industry White Paper. This new white paper focuses on digital transformation in industry and the role innovation processes play in it. It also covers the disruptive technologies that transform the way companies operate, service and maintain equipment. The white paper is designed as a guide that business managers, technology managers and risk, security and safety managers can use to develop business models, leverage key technologies and determine the level of trustworthiness they will need as they begin their digital transformation journey.
“Digital Transformation is the next disruptive wave hitting industry. With this publication, we have described the key technologies that underpin digital transformation and the first steps for any enterprise looking to deploy them,” said Jim Morrish, Founding Partner of Transforma Insights and Co-chair of the IIC Digital Transformation Working Group.
Digital transformation initiatives fall into three categories:
- New business models – entails an enterprise transforming to offer a substantially changed service to technology users, often associated with new ways of charging for services
- Enterprise operations – focuses primarily on increasing the efficiency (or reducing the cost, or risk) of providing products and services to technology users
- Customer experience – focuses on changing the customer experience in the absence of other changes. These projects tend to center on generating new service revenues or providing new services to customers, particularly field services.
“Digital transformation is a business strategy with the objective to improve business and industrial models and create new ones. This is achieved through the innovative and principled application of digital technologies along with business and organizational realignment,” said Bassam Zarkout, Founder of IGnPower and Co-chair of the IIC Digital Transformation Working Group. “Digital transformation is not a project. It is strategy led by a vision and powered by a committed program, which may involve multiple IIoT projects.”
The white paper covers a wide range of technologies that can enable digital transformation, such as:
- Edge Technology
- Hyper Connectivity
- Data Security
- Artificial Intelligence and Analytics
- Digital Twin
- Distributed Ledger
- Human-Machine Interface
- Additive Manufacturing
- Data Sharing
- Autonomous Robotic Systems
- Innovation at the IT/OT Boundary
- Micropower Generation–Energy Harvesting
- Technical Platforms for New Business Models and Payment Methods
Trustworthiness of systems is a key element of a digital transformation strategy; a lack of trustworthiness may place an organization at a disadvantage vis-à-vis its competitors and can have dire consequences, including human injury or worse, interruption of critical infrastructure, unintended disclosure of sensitive data, destruction of equipment, economic loss and reputational damage.
Overinvesting in trustworthiness, on the other hand, can increase capital and maintenance costs, reduce flexibility and functionality, and introduce cumbersome processes. “Companies embarking on digital transformation must weigh the risks and benefits of both underinvesting or overinvesting in trustworthiness,” added Morrish.
The IIC members who wrote the Digital Transformation in Industry White Paper include: Jim Morrish, Transforma Insights; Bassam Zarkout, IGnPower Inc.; Marcellus Buchheit, Wibu-Systems; Alex Ferraro, PwC; Chaisung Lim, Korea Industry 4.0 Association and Shi-Wan Lin, Yo-i Information Technology.
Also offered is a free IIC Webinar, “Digital Transformation – The Next Disruptive Wave Across Industries,” which runs on September 15, 2020 at 11am EDT, 8am PDT, 1700 CEST.
The Industrial Internet Consortium is a membership program transforming business and society by accelerating the Industrial Internet of Things (IIoT). The IIC delivers a trustworthy IIoT in which the world’s systems and devices are securely connected and controlled to deliver transformational outcomes. The Industrial Internet Consortium is a program of the Object Management Group (OMG).
It almost sounds like a ’50s SciFi movie.
For a couple of months into the Covid pandemic, my inbox collected a steady stream of press releases about what this or that company was doing either to fight the coronavirus or to prepare workplaces and workforces for the return to the office. That mighty river has slowed to a trickle at the end of summer.
The CTO of a Siemens company appeared on NPR’s Tech Nation with Moira Gunn (good podcast, by the way), and I have interviewed Siemens about how it combines technologies to provide safer workplaces in light of infectious viruses.
Then I received this note from Marty Edwards, VP of OT Security, Tenable, whom I’ve known for years as a reputable security specialist. “Prediction: Workers who return to the office may well bring new vulnerabilities with them.”
“While many critical infrastructure workers who operate, manage and secure the OT that underpins our economy can’t bring their work home, some of their colleagues certainly can. It’s likely that functions such as sales, marketing, HR, finance and legal of many essential services –food and beverage, manufacturing and pharmaceutical companies — have shifted to a remote-work model. When stay-at-home orders are eventually lifted, many of these folks will return to their offices with equipment that will be re-connected to corporate networks. With this comes the added risk of new vulnerabilities and threats being introduced to either the IT or OT side of mission- and safety-critical operations. During this transition, it’s imperative security teams have visibility into where the organization is exposed and to what extent, enabling them to effectively manage risk on a day-to-day basis. Put simply, the security challenges aren’t gone once everyone is back in the office.”
I have not worked in an office for years, unless you call a coffee house an office. But, many people will be returning to offices in the next few months. They will expect safe workspaces. As will all the factory workers (think about the morons running meat processing plants).
It took a while for cybersecurity to catch up with the sudden working-from-home IT challenge. Now, we’ll have millions returning to the corporate intranet bringing who knows what (computer) viruses with them. Another type of security to deal with.
One way or another, engineers will be busy dealing with this crisis for many months. Probably along with all their other work.
DataOps began popping onto my radar last fall. First there was a startup of former Kepware people developing DataOps for manufacturing enterprises, and then it had a featured role at an IT conference.
I have mentioned the two previously, which attracted the attention of Kevin E. Kline, who is working with SentryOne. He has a heck of a bio—Principal Program Manager, bestselling author of SQL in a Nutshell, Founder and President Emeritus of PASS.org, and a Microsoft MVP since 2003. He pointed me to a blog post he had written that explains much about the topic.
These passages are lifted from that blog to give you a taste. Check out the entire post for more details. Here is a description.
DataOps is a collaborative practice that improves integration, reliability, and delivery of data across the enterprise. It builds on the foundation of strong DevOps processes. Like DevOps, DataOps fosters communication between business functions like data platform, IT operations, business analytics, engineering, and data science. It focuses on streamlining and automating the data pipeline throughout the data lifecycle:
- Data integration—simplifying the process of connecting to disparate data sources
- Data validation—testing data to ensure that business decisions are supported by accurate information
- Metadata management—maintaining a clear understanding of the topography of the data estate, its origin, its dependencies, and how the data has changed over time
- Observability—capturing granular insights about data systems along with rich context to help DataOps teams better understand system behavior and performance
DataOps paves the way for effective data operations and a reliable data pipeline, delivering information that people trust with shorter development and delivery cycles.
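The four pipeline stages above can be sketched in miniature. This is my own illustrative code, not from Kline’s post; the source names, range thresholds, and metric names are all invented:

```python
from dataclasses import dataclass


@dataclass
class Reading:
    source: str
    value: float


def integrate(*sources):
    """Data integration: merge disparate data sources into one stream."""
    return [reading for source in sources for reading in source]


def validate(readings, low=0.0, high=100.0):
    """Data validation: keep only readings in a plausible range, so
    business decisions rest on accurate information."""
    return [r for r in readings if low <= r.value <= high]


def run_pipeline(*sources):
    merged = integrate(*sources)
    clean = validate(merged)
    # Metadata management: record the origin of each surviving reading.
    lineage = [(r.source, r.value) for r in clean]
    # Observability: counts that tell the team what the pipeline did.
    metrics = {
        "ingested": len(merged),
        "passed": len(clean),
        "rejected": len(merged) - len(clean),
    }
    return clean, lineage, metrics


plant = [Reading("plant_dcs", 42.0), Reading("plant_dcs", -5.0)]
erp = [Reading("erp", 88.0)]
clean, lineage, metrics = run_pipeline(plant, erp)
print(metrics)  # {'ingested': 3, 'passed': 2, 'rejected': 1}
```

In a real practice each stage would be a tested, automated job rather than a function call, but the shape is the same: the pipeline runs the same way every time, and the metrics make its behavior visible.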
This part discusses benefits. Later he discusses obstacles.
4 Benefits of DataOps Maturity
Terms that refer to effective collaboration include alignment, tearing down silos, “synergy,” and a newer term—interlock. These terms are prevalent in business because getting collaboration right creates a force multiplier across departments. Imagine being in a rowboat with 10 other people, none of them rowing in the same direction. You might never get where you’re trying to go.
A mature DataOps practice promotes up-front planning and construction, then automated ongoing execution. In other words, teams work together to define what will happen, and various software tools ensure that it happens the same way every time.
Similar to the benefit of collaboration, the automation of data and analytics operations removes a potential element of human unpredictability. We, as human beings, are capable of great things like free thought and reason. These abilities serve us well in many situations. However, they can introduce problems when dealing with repetitive processes that must always follow the same steps.
With a mature, documented, and automated DataOps process, plans to introduce change require fewer hands, less time, and a lower probability of introducing errors. Using this approach also makes it easier to adapt testing procedures. This effectively reduces the time it takes to move from development to production for changes.
DevOps and DataOps have emerged from Agile project management practices. Because of those roots, agility becomes table stakes in DataOps processes. Data teams that already practice Agile methodologies will find it easier to define, implement, and mature their DataOps practice.
Thanks to Terrence O’Hanlon of ReliabilityWeb for cluing me in, on LinkedIn, to this latest open source project regarding digital twins. Somehow the OMG and I missed connections on the press release. Yet another case of cooperation among suppliers and users to promote the common good. Digital twins form the bedrock of Industry 4.0 and whatever other modern industrial advance comes next.
News in brief: Users to create standard terminology and reference architectures and share use cases across industries
Non-profit trade association Object Management Group (OMG), with founding members Ansys, Dell Technologies, Lendlease, and Microsoft, announced the formation of Digital Twin Consortium. Digital twin technology enables companies to head off problems before they occur, prevent downtime, improve the customer experience, develop new opportunities, drive innovation and performance and plan for the future using simulations. Members of Digital Twin Consortium will collaborate across multiple industries to learn from each other and develop and apply best practices. This new open membership organization will drive consistency in vocabulary, architecture, security and interoperability to help advance the use of digital twin technology in many industries from aerospace to natural resources.
Digital twins, virtual models of a process, product or service that allow for data analysis and system monitoring via simulations, can be challenging to implement due to a lack of open-source software, interoperability issues, market confusion and high costs. In order to ensure the success of Digital Twin Consortium, several leading companies involved in digital twin technology have joined the consortium prior to inception. This category of early innovators, called Groundbreakers, includes: Air Force Research Lab (US), Bentley Systems, Executive Development, Gafcon, Geminus.AI, Idun Real Estate Solutions AB, imec, IOTA Foundation, IoTIFY, Luno UAB, New South Wales Government, Ricardo, Willow Technology, and WSC Technology.
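As a minimal sketch of the definition above (my own toy example, not Consortium code; the tank, its parameters, and the class name are all invented): a digital twin mirrors telemetry from a physical asset into a virtual model, then runs simulations against the model instead of the real thing.

```python
class TankTwin:
    """Toy digital twin of a storage tank: mirrors sensor telemetry
    and simulates future levels without touching the physical asset."""

    def __init__(self, capacity: float):
        self.capacity = capacity
        self.level = 0.0
        self.inflow_rate = 0.0  # units per hour

    def sync(self, level: float, inflow_rate: float) -> None:
        """Update the virtual model from the latest telemetry."""
        self.level = level
        self.inflow_rate = inflow_rate

    def hours_until_full(self) -> float:
        """Simulation: predict time to overflow at the current inflow,
        so operators can head off the problem before it occurs."""
        if self.inflow_rate <= 0:
            return float("inf")
        return (self.capacity - self.level) / self.inflow_rate


twin = TankTwin(capacity=1000.0)
twin.sync(level=400.0, inflow_rate=150.0)
print(twin.hours_until_full())  # 4.0
```

Real digital twins add physics models, historical data and much richer state, but the monitor-mirror-simulate loop is the core idea the Consortium is trying to standardize.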
Membership is open to any business, organization or entity with an interest in digital twins.
“Most definitions of digital twin are complicated, but it’s not a complicated idea. Digital twins are used for jet engines, a Mars rover, a semiconductor chip, a building and more. What makes a digital twin difficult is a lack of understanding and standardization,” said Dr. Richard Soley, Digital Twin Consortium Executive Director. “Similar to what we’ve done for digital transformation with the Industrial Internet Consortium and for software quality with the Consortium for Information and Software Quality, we plan to build an ecosystem of users, drive best practices for digital twin usage and define requirements for new digital twin standards.”
Digital Twin Consortium will:
- Accelerate the market for digital twin technology by setting roadmaps and industry guidelines through an ecosystem of digital twin experts.
- Improve interoperability of digital twin technologies by developing best practices for security, privacy and trustworthiness and influencing the requirements for digital twin standards.
- Reduce the risk of capital projects and demonstrate the value of digital twin technologies through peer use cases and the development of open source code.
An ecosystem of companies, including those from the property management, construction, aerospace and defense, manufacturing and natural resources sectors, will share lessons learned from their various industries and will work together to solve the challenges inherent in deploying digital twins. As requirements for new standards are defined, Digital Twin Consortium will share those requirements with standards development organizations such as parent organization OMG.
Founding members, Ansys, Dell Technologies, Lendlease and Microsoft will each hold permanent seats on an elected Steering Committee, providing the strategic roadmap and creating member working groups.
Sam George, Corporate Vice President, Azure IoT, Microsoft Corp. said, “Microsoft is joining forces with other industry leaders to accelerate the use of digital twins across vertical markets. We are committed to building an open community to promote best practices and interoperability, with a goal to help establish proven, ready-to-use design patterns and standard models for specific businesses and domain-spanning core concepts.”
“The application of the Digital Twin technology to Lendlease’s portfolio of work is well underway and we are already realising the benefits of this innovation to our overall business,” said Richard Ferris, CTO, Digital Twin R&D, Lendlease. “The time for disruption is now, and requires the entire ecosystem to collaborate together, move away from the legacy which has hindered innovation from this industry, and embrace Digital twin technology for the future economic and sustainable prosperity of the built world. Digital Twin Consortium is key to the global acceleration of this collaboration and the societal rewards we know to be possible with this technology and approach.”
“Dell Technologies is proud to be one of the founding members of Digital Twin Consortium. As the rate of digital transformation continues to accelerate, industry-standard methods for Digital Twins are enabling large scale, highly efficient product development and life cycle management while also unlocking opportunities for new value creation. We are delighted to be part of this initiative as we work together with our industry peers to optimize the technologies that will shape the coming data decade for our customers and the broader ecosystem,” said Vish Nandlall, Vice President, Technology Strategy and Ecosystems, Dell Technologies.
“The Consortium is cultivating a highly diverse partner ecosystem to speed implementation of digital twins, which will substantially empower companies to slash expenses, speed product development and generate dynamic new business models,” said Prith Banerjee, chief technology officer, Ansys. “Ansys is honored to join the Consortium’s esteemed steering committee and looks forward to collaborating closely with fellow members to further the Consortium’s success and help define the future of digital twins.”
Digital Twin Consortium members are committed to using digital twins throughout their operations and supply chains and capturing best practices and standards requirements for themselves and their clients. Membership fees are based on annual revenue.
Digital Twin Consortium is The Authority in Digital Twin. It coalesces industry, government and academia to drive consistency in vocabulary, architecture, security and interoperability of digital twin technology. It advances the use of digital twin technology from aerospace to natural resources. Digital Twin Consortium is a program of Object Management Group.