Why Invest in DataOps?

DataOps began popping onto my radar last fall. First there was a startup founded by former Kepware people developing DataOps for manufacturing enterprises, and then it had a featured role at an IT conference.

I have mentioned the two previously, which attracted the attention of Kevin E. Kline, who is working with Sentry One. He has a heck of a bio—Principal Program Manager, bestselling author of SQL in a Nutshell, Founder and President Emeritus of PASS.org, and a Microsoft MVP since 2003. He pointed me to a blog post he had written that explains much about the topic.

These passages are lifted from that blog to give you a taste. Check out the entire post for more details. Here is a description.

DataOps is a collaborative practice that improves integration, reliability, and delivery of data across the enterprise. It builds on the foundation of strong DevOps processes. Like DevOps, DataOps fosters communication between business functions like data platform, IT operations, business analytics, engineering, and data science. It focuses on streamlining and automating the data pipeline throughout the data lifecycle:

  • Data integration—simplifying the process of connecting to disparate data sources
  • Data validation—testing data to ensure that business decisions are supported by accurate information
  • Metadata management—maintaining a clear understanding of the topography of the data estate, origin, dependencies, and how the data has changed over time
  • Observability—capturing granular insights about data systems along with rich context to help DataOps teams better understand system behavior and performance

DataOps paves the way for effective data operations and a reliable data pipeline, delivering information that people trust with shorter development and delivery cycles.
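The data validation stage described above can be sketched in a few lines. This is a hypothetical illustration, not from Kline's post: records are checked against simple business rules before moving downstream, so bad rows are quarantined rather than silently loaded. The field names and rules are invented for the example.

```python
def validate_record(record):
    """Return a list of rule violations for one pipeline record."""
    errors = []
    if record.get("order_id") is None:
        errors.append("missing order_id")
    amount = record.get("amount")
    if amount is None or amount < 0:
        errors.append("amount must be a non-negative number")
    return errors

def validate_batch(records):
    """Split a batch into clean rows and quarantined (row, errors) pairs."""
    clean, quarantined = [], []
    for rec in records:
        problems = validate_record(rec)
        if problems:
            quarantined.append((rec, problems))
        else:
            clean.append(rec)
    return clean, quarantined

batch = [
    {"order_id": 1001, "amount": 250.0},
    {"order_id": None, "amount": 99.0},   # fails: no order_id
    {"order_id": 1003, "amount": -5.0},   # fails: negative amount
]
clean, quarantined = validate_batch(batch)
print(len(clean), len(quarantined))  # → 1 2
```

In a mature DataOps practice, checks like these run automatically on every pipeline execution, which is what makes the "same way every time" guarantee possible.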

This part discusses benefits. Later he discusses obstacles.

4 Benefits of DataOps Maturity

1. Collaboration

Terms that refer to effective collaboration are alignment, tearing down silos, “synergy,” and a newer term—interlock. These terms are prevalent in business because getting them right creates a force multiplier across departments. Imagine being in a rowboat with 10 other people, none of them rowing in the same direction. You might never get to where you’re trying to go.

A mature DataOps practice promotes up-front planning and construction, then automated ongoing execution. In other words, teams work together to define what will happen, and various software tools ensure that it happens the same way every time.

2. Reliability

Similar to the benefit of collaboration, the automation of data and analytics operations removes a potential element of human unpredictability. We, as human beings, are capable of great things like free thought and reason. These abilities serve us well in many situations. However, they can introduce problems when dealing with repetitive processes that must always follow the same steps.

3. Adaptability

With a mature, documented, and automated DataOps process, plans to introduce change require fewer hands, less time, and a lower probability of introducing errors. Using this approach also makes it easier to adapt testing procedures. This effectively reduces the time it takes to move from development to production for changes.

4. Agility

DevOps and DataOps have emerged from Agile project management practices. Because of those roots, agility becomes table stakes in DataOps processes. Data teams that already practice Agile methodologies will find it easier to define, implement, and mature their DataOps practice.

Object Management Group Forms Digital Twin Consortium

Thanks to Terrence O’Hanlon of ReliabilityWeb for cluing me in to this latest open source project regarding Digital Twins on LinkedIn. Somehow the OMG and I missed connections on the press release. Yet another case of cooperation among suppliers and users to promote the common good. Digital Twins form the bedrock of Industry 4.0 and other modern industrial advances.

News in brief: Users to create standard terminology and reference architectures and share use cases across industries

Non-profit trade association Object Management Group (OMG), with founders Ansys, Dell Technologies, Lendlease, and Microsoft, announced the formation of Digital Twin Consortium. Digital twin technology enables companies to head off problems before they occur, prevent downtime, improve the customer experience, develop new opportunities, drive innovation and performance and plan for the future using simulations. Members of Digital Twin Consortium will collaborate across multiple industries to learn from each other and develop and apply best practices. This new open membership organization will drive consistency in vocabulary, architecture, security and interoperability to help advance the use of digital twin technology in many industries from aerospace to natural resources.

Digital twins, virtual models of a process, product or service that allow for data analysis and system monitoring via simulations, can be challenging to implement due to a lack of open-source software, interoperability issues, market confusion and high costs. In order to ensure the success of Digital Twin Consortium, several leading companies involved in digital twin technology have joined the consortium prior to inception. This category of early innovators, called Groundbreakers, includes: Air Force Research Lab (US), Bentley Systems, Executive Development, Gafcon, Geminus.AI, Idun Real Estate Solutions AB, imec, IOTA Foundation, IoTIFY, Luno UAB, New South Wales Government, Ricardo, Willow Technology, and WSC Technology.

Membership is open to any business, organization or entity with an interest in digital twins.

“Most definitions of digital twin are complicated, but it’s not a complicated idea. Digital twins are used for jet engines, a Mars rover, a semiconductor chip, a building and more. What makes a digital twin difficult is a lack of understanding and standardization,” said Dr. Richard Soley, Digital Twin Consortium Executive Director. “Similar to what we’ve done for digital transformation with the Industrial Internet Consortium and for software quality with the Consortium for Information and Software Quality, we plan to build an ecosystem of users, drive best practices for digital twin usage and define requirements for new digital twin standards.”

Digital Twin Consortium will:

  • Accelerate the market for digital twin technology by setting roadmaps and industry guidelines through an ecosystem of digital twin experts.
  • Improve interoperability of digital twin technologies by developing best practices for security, privacy and trustworthiness and influencing the requirements for digital twin standards.
  • Reduce the risk of capital projects and demonstrate the value of digital twin technologies through peer use cases and the development of open source code.

An ecosystem of companies, including those from the property management, construction, aerospace and defense, manufacturing and natural resources sectors will share lessons learned from their various industries and will work together to solve the challenges inherent in deploying digital twins. As requirements for new standards are defined, Digital Twin Consortium will share those requirements with standards development organizations such as parent company OMG.

Founding members, Ansys, Dell Technologies, Lendlease and Microsoft will each hold permanent seats on an elected Steering Committee, providing the strategic roadmap and creating member working groups.

Sam George, Corporate Vice President, Azure IoT, Microsoft Corp. said, “Microsoft is joining forces with other industry leaders to accelerate the use of digital twins across vertical markets. We are committed to building an open community to promote best practices and interoperability, with a goal to help establish proven, ready-to-use design patterns and standard models for specific businesses and domain-spanning core concepts.”

“The application of the Digital Twin technology to Lendlease’s portfolio of work is well underway and we are already realising the benefits of this innovation to our overall business,” said Richard Ferris, CTO, Digital Twin R&D, Lendlease. “The time for disruption is now, and requires the entire ecosystem to collaborate together, move away from the legacy which has hindered innovation from this industry, and embrace Digital twin technology for the future economic and sustainable prosperity of the built world. Digital Twin Consortium is key to the global acceleration of this collaboration and the societal rewards we know to be possible with this technology and approach.”

“Dell Technologies is proud to be one of the founding members of Digital Twin Consortium. As the rate of digital transformation continues to accelerate, industry-standard methods for Digital Twins are enabling large scale, highly efficient product development and life cycle management while also unlocking opportunities for new value creation. We are delighted to be part of this initiative as we work together with our industry peers to optimize the technologies that will shape the coming data decade for our customers and the broader ecosystem,” said Vish Nandlall, Vice President, Technology Strategy and Ecosystems, Dell Technologies.

“The Consortium is cultivating a highly diverse partner ecosystem to speed implementation of digital twins, which will substantially empower companies to slash expenses, speed product development and generate dynamic new business models,” said Prith Banerjee, chief technology officer, Ansys. “Ansys is honored to join the Consortium’s esteemed steering committee and looks forward to collaborating closely with fellow members to further the Consortium’s success and help define the future of digital twins.”

Digital Twin Consortium members are committed to using digital twins throughout their operations and supply chains and capturing best practices and standards requirements for themselves and their clients. Membership fees are based on annual revenue.

Digital Twin Consortium is The Authority in Digital Twin. It coalesces industry, government and academia to drive consistency in vocabulary, architecture, security and interoperability of digital twin technology. It advances the use of digital twin technology from aerospace to natural resources. Digital Twin Consortium is a program of Object Management Group.

3D Printing Effort Becomes Linux Foundation Open Standards Project

3MF Consortium joins Linux Foundation, announces new executive director as it moves from development to adoption

Open Source and Open Standards continue to expand influence in developing new technologies and applications. I love to see companies banding together to bring out useful new ideas. This one is interesting.

The 3MF Consortium, the organization dedicated to advancing a universal specification for 3D printing, has announced it is becoming a Linux Foundation member and that HP’s Luis Baldez is its new Executive Director (ED). Baldez succeeds Microsoft’s Adrian Lannin, who has served as ED since the 3MF Consortium was founded in 2015. One of the original creators of the 3MF Consortium, Lannin will remain a strategic advisor to the group.

The 3MF Consortium is among the original members of the Joint Development Foundation (JDF), which became part of the Linux Foundation in recent years to enable smooth collaboration among open source software projects and open standards. 3MF will take advantage of the combined strengths of the Linux Foundation/JDF alliance to advance 3D printing specifications and formats. With the majority of the world’s largest players in the 3D printing industry, 3MF Consortium represents the core of the industry’s innovation in this area.

“The 3MF Consortium has done the important work to create an open standard for 3D printing. The time is now to drive the evolution of 3MF from development to implementation,” said Baldez. “We would not be where we are today without Adrian Lannin’s leadership and contributions, and we’re looking forward to his insights as our ongoing advisor.”

Baldez was recently elected Executive Director by the 3MF Consortium membership to expand upon the technical progress and success of the 3MF standard by building new functionalities for the standard through collaboration with Linux Foundation and JDF. Baldez is a 3D printing veteran with experience across new research, market & business development. It is this combination of expertise that makes him well-suited for the ED role at 3MF Consortium, where the focus is maturing from standards development to implementation and adoption. Baldez has also held R&D engineering leadership positions at other multinationals and startups.

“Luis is a longtime champion of open standards and is an expert in the 3D printing space,” said Alex Oster, chairman of the 3MF technical working group and director of additive manufacturing at Autodesk. “Luis’ leadership and our collaboration with Linux Foundation will accelerate our work on 3D printing and help us build an even more vibrant network of contributions.”

The 3MF Consortium has grown rapidly since its formation in 2015, garnering new member investments and adoption across the industry’s leaders in 3D printing. It is supported by 3D Systems, Autodesk, GE, HP, Materialise, Microsoft, nTopology and Siemens among nearly 20 other companies and has been implemented in nearly 40 products across 22 companies. The 3MF specification is robust and includes six extensions that range from core and production to slice, material and property (including color), beam lattice and security. The Secure Content specification was recently released and establishes an underlying mechanism for payload encryption of sensitive 3D printed data based on modern web standards.

Podcast 210 They Don’t Look Like Athletes

The first writer to seriously examine the new phenomenon of data-driven analytics in baseball found himself allowed to sit in the locker room of a major league baseball team. He observed the players. Something nagged at his consciousness. Then it dawned on him—they didn’t look like athletes. Showering, getting dressed, no one really looked like a standout athlete. Yet, they were winning. Yes, said data-driven baseball exec Billy Beane, everyone else evaluates how players look. We look at their performance and indicators that they have future potential. But I really wanted to discuss Digital Transformation. And to transform digitally, you need to be (digital) data-driven.

Digital Twin Alliance to Address Complex Digital Transformational Challenges

In brief: Three Organizations Combine Expertise to Bring Digital Twins to Life, Create Added Value, and Deliver Support Across the Asset Lifecycle

The idea of an open system for data flow from engineering through construction to startup to operation & maintenance, and perhaps even to decommissioning has intrigued me for years. I have worked with MIMOSA and its Open Industrial Interoperability Ecosystem for many years. Check it out.

For the most part, suppliers have been a bit slow to this game. The way of the world is that automation vendors never liked the “open” part, since their design emphasizes tight integration of as many parts as possible under their proprietary umbrella.

An ecosystem is one thing, and a partnership is another. Sometimes companies announce partnerships with great flourish and publicity only to see the great promise wither from neglect. Sometimes end users (owner/operators) reap significant benefit.

With that background, I approach the announcement of a partnership. I like the idea, but execution and sustainability will be proof of the strength of this partnership. Note that two of the companies are sort of like “conjoined twins” joined at the hip.

From the announcement:

DORIS Group, a global engineering and project management company in the energy industry; Schneider Electric, a supplier of products and solutions for digital transformation of energy management and automation; and AVEVA, an engineering and industrial software supplier, have agreed to develop a strategic partnership to deliver Digital Twin technology for the upstream oil and gas markets.

These new solutions will support the goals of oil & gas organizations to improve asset performance, increase sustainability and maximize return on capital on projects.

The three companies will combine offerings to bring engineering capabilities, an asset lifecycle software solution and digital specialization in order to create a fully formed digital twin to serve as a backbone for improving performance for the upstream sector. The new solution will:

  • Bring new assets on stream faster through the use of cloud-enabled software that improves collaboration and increases engineering efficiencies
  • Deliver enhanced safety leading to better business outcomes
  • Improve traceability through a single point of accountability
  • Enable remote operations and production assurance through a fully functional Living Digital Twin that mirrors all aspects of the operating asset

Oil & Gas owner operators have struggled to go digital due to the lack of a structured offering and orchestration, as no single vendor currently delivers what is required to achieve this. Large amounts of data of various types, from different sources, are another challenge they face, often leading to data inaccuracy and incompatibility, as well as difficulties in organizing that data and identifying trends.

Similarly, the oil & gas sector is under considerable pressure to quantify, track and reduce CO2 emissions as well as reduce overall pollution – this can be even more difficult with limited monitoring, no established method and no data-driven decision making.

Together, DORIS, AVEVA, and Schneider Electric will offer a structured digital and collaborative solution across the lifecycle of projects that will help oil & gas owner operators address many of these challenges.

Christophe Debouvry, CEO of DORIS Group, stated, “DORIS Group is excited to be strategically partnering with Schneider Electric and AVEVA in this unique venture which will allow us to accelerate the building out of our digital transformation strategy. Combining our complementary expertise will go a long way to providing a powerful enabler to offer our customers embarking on their digital transformational journeys with optimized solutions throughout their assets lifecycle.”

Craig Hayman, CEO AVEVA, also commented, “Leaders driving the next wave of transformation are moving quickly and that’s why this partnership with Schneider Electric and DORIS Group is so opportune. Our common aim is to support organizations on their digital journey especially in the current environment, helping them accelerate the use of digital technology, realize the value of a digital twin and also work towards a more sustainable future. It’s never been easier to begin a digital transformation program, as access to cloud computing, great connectivity, a merged edge and enterprise combined with analytics and machine learning, means that the ability to digitally drive productivity improvements into the industrial world is now unprecedented.”

Christopher Dartnell, President Oil & Gas and Petrochemicals at Schneider Electric, commented, “This partnership is in line with Schneider Electric’s objectives around Digitization and Energy Transition and we will bring our expertise in both energy and process efficiency to the industry. Our goal is to support customers looking to adopt a digital twin model, by offering our experience to facilitate the overall digital transformation for our clients, enabling them to improve lifecycle performance and safe operations while also making their operations more sustainable.”

Sparkplug: Open Source Technology to Bridge the OT-IT Gap

Many engineers are looking for better ways to move data with fewer programming hours and headaches. Whereas OPC solved many problems leading to interoperability and data exchange, it also brings with it a higher overhead and programming load. For those searching for something lighter, and also open source, along comes Sparkplug.

Cirrus Link authored the Sparkplug specification and provided it to the Eclipse Foundation, and several other companies support the group as founding members including Chevron, Canary Labs, HiveMQ, Inductive Automation, and ORing. Now additional companies are developing their products using Sparkplug for interoperability.

I recently received a paper authored by Arlen Nipper, president and CTO of Cirrus Link, titled “Sparkplug: Open Source Technology to Bridge the OT-IT Gap.” He begins:

One of the primary pain points in Industrial IoT (IIoT) is disparate systems with both modern and legacy assets. Companies in any industry ranging from oil and gas to manufacturing can hardly imagine a world where they can choose any vendor’s hardware, plug it into their network, and have the hardware 100 percent self-discovered by their SCADA system and every application in the enterprise. True vendor interoperability for both data producers and data consumers is the vision, and new open-source technology may be the answer.

These days, everything relates back to digital transformation. Nipper writes, “Digital transformation requires devices in the field to be connected, with data made available that can speak the language of both OT and IT for improved business intelligence. In order for this type of digital transformation to be successful, data must be decoupled from a single application so it can flow to enterprise applications in a one-to-many approach.”

From the first time I met Nipper, he has evangelized MQTT—a protocol he helped write—as an IT-friendly messaging protocol. It is lightweight. It is a publish-subscribe network protocol allowing for multiple data consumers.
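The one-to-many decoupling that publish-subscribe enables can be illustrated with a toy in-memory broker. This is only a sketch of the pattern—a real deployment would use an actual MQTT broker such as Mosquitto or HiveMQ, and the topic name here is invented—but it shows how a device publishes once while any number of consumers receive the data without the publisher knowing about them.

```python
from collections import defaultdict

class ToyBroker:
    """A minimal in-memory stand-in for an MQTT broker."""

    def __init__(self):
        self.subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, payload):
        # Deliver one published message to every subscriber of the topic.
        for callback in self.subscribers[topic]:
            callback(topic, payload)

broker = ToyBroker()
received = []

# Two independent consumers (say, a SCADA system and an analytics app)
# subscribe to the same hypothetical topic.
broker.subscribe("plant1/flow", lambda t, p: received.append(("scada", p)))
broker.subscribe("plant1/flow", lambda t, p: received.append(("analytics", p)))

# A single publish from the device reaches both consumers.
broker.publish("plant1/flow", 42.5)
print(received)  # → [('scada', 42.5), ('analytics', 42.5)]
```

The device never addresses the SCADA system or the analytics app directly, which is exactly the decoupling that lets OT data flow to many enterprise applications.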

MQTT is a messaging protocol. It does not describe the data traversing the wire (or air). While it provides an excellent engine for delivering IIoT data, MQTT doesn’t make the data interoperable across the enterprise. Thus, a new open source standard has been created and the IIoT industry should understand its importance for bridging the gap from OT to IT.

Nipper explains the next step:

The Internet expanded rapidly thanks to two open technologies – first HTTP, a data exchange protocol, and then HTML, which was used to define the data sent by HTTP. Both were needed. MQTT has needed its “HTML” for years in order for IIoT to explode in growth and adoption. In order to solve this problem of OT-IT interoperability, the Eclipse Sparkplug working group was launched in February 2020 to bring device communications standardization to IIoT.

The Eclipse Foundation states, “The Sparkplug Working Group was established to ‘improve the interoperability and scalability of IIoT solutions, and provide an overall framework for supporting Industry 4.0 for oil and gas, energy, manufacturing, smart cities, and other related industries.’ ”

Sparkplug is an open source software specification that provides MQTT clients with a framework to integrate data. The specification articulates three goals:

1. Define an MQTT Topic Namespace optimized for IIoT.

2. Define MQTT State Management to take advantage of continuous session awareness.

3. Define the MQTT Payload.

Sparkplug adds features including birth certificate and death certificate (session awareness) to help with contextualization of data.
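The first goal above, a defined MQTT Topic Namespace, can be made concrete with a small parsing sketch. Sparkplug B topics follow the pattern `spBv1.0/<group_id>/<message_type>/<edge_node_id>[/<device_id>]`, with message types covering the birth/death certificates, data, and command messages. The helper below is illustrative only and not part of any official Sparkplug SDK.

```python
# Message types defined by the Sparkplug B specification.
MESSAGE_TYPES = {
    "NBIRTH", "NDEATH",  # edge node birth/death certificates
    "DBIRTH", "DDEATH",  # device birth/death certificates
    "NDATA", "DDATA",    # node and device data
    "NCMD", "DCMD",      # commands to nodes and devices
}

def parse_sparkplug_topic(topic):
    """Break a Sparkplug B MQTT topic into its named parts."""
    parts = topic.split("/")
    if len(parts) not in (4, 5) or parts[0] != "spBv1.0":
        raise ValueError(f"not a Sparkplug B topic: {topic}")
    if parts[2] not in MESSAGE_TYPES:
        raise ValueError(f"unknown message type: {parts[2]}")
    return {
        "namespace": parts[0],
        "group_id": parts[1],
        "message_type": parts[2],
        "edge_node_id": parts[3],
        "device_id": parts[4] if len(parts) == 5 else None,
    }

info = parse_sparkplug_topic("spBv1.0/Plant1/DDATA/Line3Gateway/FlowMeter7")
print(info["message_type"], info["device_id"])  # → DDATA FlowMeter7
```

Because every participant knows this topic structure, a subscriber can discover which node and device produced a message from the topic alone—part of what makes Sparkplug data self-describing across the enterprise.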

Sparkplug makes this process fast and secure, and as an open standard anyone can make use of the framework for MQTT interoperability. Many device manufacturers are supporting Sparkplug, which means it is built in natively on the device on the OT floor.

Nipper concludes:

With Sparkplug, machine learning and artificial intelligence applications can utilize the same standard interface for data without having to know and understand the entire OT environment. They can subscribe to the OT data, and use it immediately for IT functions.

Follow this blog

Get a weekly email of all new posts.