Edge as a Service?

The “Edge” is a hot space right now, although sometimes I’m not sure that everyone agrees what “Edge” is as they develop products and solutions. However, first thing this morning I saw this tweet from Tom Bradicich of Hewlett Packard Enterprise (@HPE) referring to an article that mentions him in ComputerWeekly.com. I’ve written about HPE at the edge and with IoT before. Looks like something’s up.

Tweet from @TomBradicichPhD Not only computing at the #edge, but also a new product category of “converging IT with #OT” systems (such as controls, DAQ, industrial protocols). Watch this space, my team’s next innovation is all this as-a-service. #aaS

Here is the rationale from the Computer Weekly article. “The benefits of edge computing have the potential to help businesses dramatically speed up their data analysis time while cutting down costs. @HPE’s Mark Potter and @TomBradicichPhD share how we can make this possible.”

In the past, all data processing was run locally on the industrial control system. But while there is industry consensus that real-time data processing for decision-making, such as the data processing needed in an industrial control system, should be run at the edge and not in the public cloud, there are many benefits in using the public cloud or an on-premise datacentre to assimilate data across installations of internet of things (IoT)-connected machines. Such data aggregation can be used to improve machine learning algorithms.

It is fascinating to see our environment described by an enterprise IT writer. The truth is that following the Purdue Model, suppliers tried to make PLCs and DCSs part of the information infrastructure in parallel to supervising or executing control functions. That proved too unwieldy for control engineers to manage within the programming tools used. It was also too slow and not really optimized for the task.

Along came IT companies. I have followed a few over the past five years. They have had trouble with figuring out how to make a business out of edge compute, gateways, networking, and the like.

In the past, data acquisition and control systems were considered operational technology, and so were outside the remit of enterprise IT. But, as Tom Bradicich, global head of the edge and IoT labs at HPE, explains, IT has a role to play in edge computing.

Bradicich’s argument is that edge computing can provide a converged system, removing the need for standalone devices that were previously managed by those people in the organisation responsible for operational technology (OT). According to Bradicich, convergence is a good thing for the industry because it is convenient, makes it easy to buy devices, lowers cost, improves reliability, and offers better power consumption because all the disparate functions required by an industrial system are integrated in one device.

Bradicich believes convergence in IoT will be as big as the convergence of camera and music players into a device like the iPhone, which made Apple the biggest music and camera company in the world. For Bradicich, convergence at the edge will lead to industry disruption, similar to what happened when smartphones integrated several bits of functionality that were previously only available as separate devices. “The reason Uber exists is because there is a convergence of GPS, the phone and the maps,” he says. “This disrupts the whole industry.”

I get this analogy to converging technologies into a device such as the iPhone. I don’t know if we want to cede control over to an HPE compute platform (although it has plenty of horsepower), but the idea is tempting. And it would be thoroughly disruptive.

Forrester has forecast that the edge cloud service market will grow by at least 50%. Its Predictions 2020 report notes that public cloud providers such as Amazon Web Services (AWS) and Microsoft; telecommunication companies such as AT&T, Telstra and Vodafone Group; platform software providers such as Red Hat and VMware; content delivery networks including Akamai Technologies; and datacentre colocation providers such as Digital Realty are all developing basic infrastructure-as-a-service (IaaS) and advanced cloud-native programming services on distributed edge computing infrastructure.

HPE has also invested in a new company called Pensando, which recently emerged from stealth mode and is founded and staffed by former Cisco technologists, with former Cisco CEO John Chambers installed as chairman. The bet is that new categories of device aimed at edge computing will come to market, perhaps leading to a plethora of new devices that perform data acquisition and real-time data processing.

Mark Potter recently wrote in a blog post: By becoming the first solutions provider to deliver software-defined compute, networking, storage and security services to where data is generated, HPE and Pensando will enable our customers to dramatically accelerate the analysis and time-to-insight of their data in a way that is completely air-gapped from the core system.

These are critically-important requirements in our hyper-connected, edge-centric, cloud-enabled and data-driven world – where billions of people and trillions of things interact.

This convergence is generating unimaginable amounts of data from which enterprises seek to unearth industry-shaping insights. And as emerging technologies like edge computing, AI and 5G become even more mainstream, enterprises have an ever-growing need to harness the power of that data. But moving data from its point of generation to a central data center for processing presents major challenges — from substantial delays in analysis to security, governance and compliance risks.

That’s where Pensando and HPE are making an industry-defining difference. By moving the traditionally data center-bound network, storage and security services to the server processing the data, we will eliminate the need for round-trip data transfer to centralized network and security appliances – and at a lower cost, with more efficiency and higher performance.

Here are benefits that Potter listed:

  • Lower latency compared to competitive solutions, as operations will be carried out at 100Gbps network line-rate speed;
  • A controller management framework to scale across thousands of nodes, with a federation of controllers allowing scale to 1M+ endpoints; and
  • Security, governance and compliance policies that are consistently applied at the edge.

Open, 5G, Edge Dominate ARC Forum Conversations

Announcements and discussions at this year’s iteration of the Industry Forum sponsored by ARC Advisory Group were amazingly diverse. Another IT supplier appeared. Security remained an issue. Most conversations revolved around open (open source and open interoperability), edge, 5G, collaboration/partnerships, software-defined, machine learning, MQTT and its companion Sparkplug, and most importantly, solving problems for end users.

Following is a brief recap. Follow the links for in-depth information. Of course, many company announcements fit into more than one bucket.

Open

Examples of the variety of “open” include the Eclipse Foundation and the Open Group Open Process Automation Forum, with the IT technology of Kubernetes thrown in.

The Eclipse Foundation launched the Sparkplug Working Group. Founding members Chevron, Canary Labs, Cirrus Link Solutions, HiveMQ, Inductive Automation, and ORing are defining an open standard specification to create interoperable MQTT IIoT solutions.

The Working Group will encourage the definition of technical specifications and associated implementations that rationalize access to industrial data, improve the interoperability and scalability of IIoT solutions, and provide an overall framework for supporting Industry 4.0 for oil and gas, energy, manufacturing, smart cities and other related industries.

Sparkplug is relatively new. Without it, each supplier and end user must create all the definitions of the data, which leads to interoperability problems; success of this Working Group is essential for any widespread adoption of the specification. The Eclipse Foundation pointed out that the intent and purpose of the Sparkplug specification is to define an MQTT topic namespace, payload, and session state management that can be applied generically. By meeting the operational requirements for these systems, Sparkplug will enable MQTT-based infrastructures to provide more valuable real-time information to business stakeholders as well.
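
To make the namespace idea concrete, here is a minimal sketch of how an edge node might publish under Sparkplug’s topic structure, using the widely used paho-mqtt Python library. The broker address and the group, node, and device IDs are hypothetical, and real Sparkplug B payloads are Protobuf-encoded; JSON stands in here purely for readability.

```python
# Minimal sketch of the Sparkplug topic namespace using paho-mqtt.
# Assumptions: the broker "broker.example.com" and the group/node/device
# IDs are hypothetical. Real Sparkplug B payloads are Protobuf-encoded;
# JSON is used here only to keep the illustration readable.
import json
import time

import paho.mqtt.client as mqtt

# Sparkplug topics follow:
#   spBv1.0/<group_id>/<message_type>/<edge_node_id>[/<device_id>]
GROUP_ID = "Energy"           # logical grouping of edge nodes
EDGE_NODE_ID = "Compressor7"  # this edge node
DEVICE_ID = "FlowMeter1"      # a device attached to the node

client = mqtt.Client()
client.connect("broker.example.com", 1883)

# NBIRTH announces the edge node and its metrics when it comes online.
nbirth_topic = f"spBv1.0/{GROUP_ID}/NBIRTH/{EDGE_NODE_ID}"
client.publish(nbirth_topic, json.dumps({
    "timestamp": int(time.time() * 1000),
    "metrics": [{"name": "flow_rate", "dataType": "Double", "value": 0.0}],
    "seq": 0,
}))

# DDATA reports a changed metric value for a device under the node.
ddata_topic = f"spBv1.0/{GROUP_ID}/DDATA/{EDGE_NODE_ID}/{DEVICE_ID}"
client.publish(ddata_topic, json.dumps({
    "timestamp": int(time.time() * 1000),
    "metrics": [{"name": "flow_rate", "dataType": "Double", "value": 42.7}],
    "seq": 1,
}))

client.disconnect()
```

Because the topic structure and the birth/death announcements are standardized, any Sparkplug-aware consumer can discover nodes and metrics by subscription alone, which is exactly the interoperability gain the Working Group is chartered to deliver.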

The Open Group Open Process Automation Forum (OPAF) progresses. This topic touches on both open and software-defined control. With the first major update to the standard for open process automation systems since February 2019, the Forum has progressed perhaps more than I would have predicted after its unveiling only a few years ago at the ARC Forum.

Its first release focused on interoperability while the O-PAS Standard Version 2.0 provides a vendor-neutral Reference Architecture which enables the construction of scalable, reliable, interoperable, and secure process automation systems. The latest release, which is a Preliminary Standard of The Open Group, has further emphasis on standardized system configuration portability to significantly reduce capital cost and time investment for end-users. With these capabilities, end-users can easily exchange equipment without being tied to a single vendor or requiring individual configuration parameters to be written in different operating languages.

With their standard moving from interoperability to portable configurations, leaders told me that the next release will expand on this portability theme.

Bedrock Automation integrates Flow-Cal Flow Measurement into Open Secure Automation (OSA) Platform.

Speaking of both software-defined and open, Bedrock Automation Founder, CEO, and CTO Albert Rooyakkers explained the extension of its Open Secure Automation (OSA) platform with the addition of Flow-Cal algorithms, “bringing oil and gas measurement and custody transfer securely into the digital age.” This was essentially a software addition to the platform to bring a new twist on flow computer functionality.

The new OSA +Flow family embeds industry-leading Flow-Cal measurement applications. Flow-Cal’s software has long been the industry’s choice for flow measurement and production-accounting data. Affirming Flow-Cal’s stature is the fact that the American Petroleum Institute (API) has selected it to develop, support, and distribute its standard flow measurement calculations.

The OSA +Flow software has been incorporated across all Bedrock controllers providing scalability for PLC, RTU, or DCS flow control requirements at custody transfer stations, separators, and other oil and gas production facilities. These solutions include full support of multi-drop serial, Ethernet, and HART for Coriolis, ultrasonic, and smart transmitters.

The system supports the API-compliant calculation library, OPC UA, Inductive Automation software, and MQTT, as well as software-defined I/O.

Diamanti Accelerates Energy and Service Organizations’ Adoption of AI/ML

AI and ML applications often leverage GPU processing for training models, and they benefit from containers and Kubernetes—the open source container orchestration project. However, these processes are often complicated to adopt and run at scale. With the recent announcement of GPU support in the Diamanti AI/ML platform, enterprises have an easier on-ramp to managing large-scale containerized workloads under Kubernetes.

“We’re pleased to share the early customer traction we are seeing on our newest solutions in a wide range of industries including energy, services and more,” said Tom Barton, CEO of Diamanti. “These customers are validating state-of-the-art technologies internally while also benefiting from the reduced physical footprint and cost-savings that come with the Diamanti AI/ML platform.”

The new solution, announced in late 2019, is in early access today and fully supports Nvidia’s NVLink technology for higher performing workloads, as well as Kubeflow, an open source machine learning framework for Kubernetes that provides highly available Jupyter notebooks and ML pipelines. Combined with Diamanti’s Kubernetes control plane, this allows customers to deliver highly scalable environments for performance-intensive AI/ML workloads, accelerating model development and training.
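
For readers less familiar with the Kubernetes side of this, the sketch below shows the generic mechanism such platforms build on: a pod that requests a GPU through the standard nvidia.com/gpu resource, written with the official Kubernetes Python client. This is not Diamanti’s code; the image, names, and namespace are hypothetical.

```python
# Generic sketch of scheduling a GPU-backed training pod on Kubernetes,
# using the official kubernetes Python client. The "nvidia.com/gpu"
# resource is advertised by the NVIDIA device plugin; requesting one
# pins the pod to a GPU-equipped node. Image and names are hypothetical.
from kubernetes import client, config

config.load_kube_config()  # use the current kubeconfig context

pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="ocr-training"),
    spec=client.V1PodSpec(
        restart_policy="Never",
        containers=[
            client.V1Container(
                name="trainer",
                image="registry.example.com/ocr-train:latest",  # hypothetical image
                command=["python", "train.py"],
                resources=client.V1ResourceRequirements(
                    # Ask the scheduler for exactly one GPU.
                    limits={"nvidia.com/gpu": "1"},
                ),
            )
        ],
    ),
)

client.CoreV1Api().create_namespaced_pod(namespace="ml", body=pod)
```

The scheduler then places the pod only on nodes with a free GPU; this same building block underlies Kubeflow’s notebooks and ML pipelines.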

A major energy company turned to Diamanti for a new workload leveraging AI/ML for optical character recognition (OCR) to scan invoices. The customer needed to scan more than 15,000 invoices a day. The legacy infrastructure could not keep up with the demand and eventually accrued a backlog of more than 200,000 invoices. Deploying the Diamanti solution with GPU support eliminated that backlog within hours.

Edge – 5G

As other influencers at an HPE event once told me, “Gary, everything you do is the edge.” So it is not surprising that I had many conversations about the Edge. But 5G technology was also on many minds. The consensus opinion: 5G will drive decision-making to the edge.

As an example of edge at the Forum, here is an announcement from Opto 22. For as long as I’ve known the company, it continues to push the latest IT technologies mashed up with control and automation. This product release highlights its pioneering role in IoT.

Industrial automation manufacturer and industrial internet of things (IIoT) developer Opto 22 announced groov RIO, a family of intelligent, distributed input/output (I/O) for IIoT and automation applications. groov RIO represents a first-in-class solution for its ability to quickly connect traditional wired switches and sensors directly to Ethernet networks, software applications, and cloud platforms without intermediary control or communication hardware, such as PLCs, PACs, or PCs.

The first shipping version of groov RIO is the GRV-R7-MM1001-10, a standalone, 10-channel, multi-signal, multifunction I/O unit for signals including thermocouples (TCs), integrated circuit temperature devices (ICTDs), voltage inputs, current inputs, millivolt inputs, discrete DC inputs, self-wetting discrete inputs, discrete DC sinking outputs, and Form C mechanical relays. In addition, two channels provide special features like pulse counting, on- and off-time totalization, software latching, frequency measurement, and more. GRV-R7-MM1001-10 is completely standalone and software-configurable through a browser-based interface.

“When we designed groov RIO, we were looking for ways to democratize I/O data, because that’s what the IIoT is all about,” said Vice President of Product Strategy at Opto 22, Benson Hougland. “Although groov RIO can be used as remote I/O with our groov EPIC system or another control system, we also wanted it to operate autonomously, facilitating direct connection between I/O signals and databases, business software, or cloud IoT platforms.”

GRV-R7-MM1001-10 supports 12 different types of field I/O circuits. It also provides no-hassle, enclosure-free installation with multiple power options, including standard 802.3af Power-over-Ethernet (PoE); an extended operating temperature range; and UL Hazardous Locations and ATEX approvals.

Once installed, groov RIO can be independently managed and configured through browser-based tools. Per-channel I/O type and signal-processing options through groov Manage eliminate the need for a master control unit, and support for standard enterprise network services like DNS, DHCP, and VPN facilitates network connectivity. Embedded communication options range from efficient data publishing with MQTT Sparkplug to advanced signal processing, data aggregation, and transactions with databases and web services, using the low-code Node-RED environment and runtime.

Data → Action

It’s all about data, they all say. But when I talked with Mike Brooks, who is now advising at AspenTech, he counseled, “Not too much data.” The action is in using data, not collecting it. Hence the drawback (indeed, failure?) of data lakes: too much storage, not enough usability. AspenTech exemplifies using machine learning not just to say it is in AI, but to find usable information that companies can use to improve operations.

Collaboration – Partnerships

The Eclipse Foundation and OPAF exemplify collaboration and partnerships. Inductive Automation has community as a strategic initiative. Both founder Steve Hechtman and chief strategy officer Don Pearson highlighted it at last year’s Ignition Community Conference.

This announcement highlights community along with edge and other trends. Inductive Automation announced improvements to three products and a new development resource within Ignition by Inductive Automation. Ignition is an industrial application platform with tools for building solutions in human-machine interface (HMI), supervisory control and data acquisition (SCADA), and the Industrial Internet of Things (IIoT).

The solutions include:

  • New and improved products for Ignition Edge.
  • An expansion of the Ignition Onboard program.
  • Improvements to the Ignition Perspective Module.
  • A new, free resource for developers: Ignition Exchange.

Ignition Edge will soon have three new products. Ignition Edge is a line of lightweight, limited, low-cost Ignition software solutions designed for embedding into field and OEM devices at the edge. They allow organizations to extend data collection, visualization, and system management to the edge of the network. With the new products coming soon, the lineup will include Ignition Edge Panel, Ignition Edge Compute, Ignition Edge Sync Services, Ignition Edge EAM (Enterprise Administration Module), and Ignition Edge IIoT.

The Ignition Onboard program now provides easier access to industrial hardware that comes with Ignition already installed, configured, and licensed. Numerous device manufacturers are embedding Ignition and Ignition Edge into their devices — including Advantech, Moxa, OnLogic, Opto 22, and ORing.

The Ignition Perspective Module lets users easily build mobile industrial applications in HTML5 for monitoring and control of their processes directly from their mobile phones.

A significant part of the Inductive Automation strategy is to promote community among its customers and partners. The development has been ongoing for some time, culminating in Ignition Exchange — a new, online space where developers can get free Ignition resources provided by Inductive Automation and the Ignition community. These resources can save time for developers.

Software Defined

OPAF and Bedrock Automation (as in, take a hardware platform and add flow metering) exemplify the trend toward software-defined hardware.

Machine Learning

I discussed ML in relation to AspenTech for decision-making. Perhaps the industry is moving past the sci-fi “artificial intelligence” part of the technology to emphasize real use cases deployed today.

Operations

To name a trend “operations” may sound archaic, but many conversations moved from technology to solving real problems for customers. This announcement from AVEVA exemplifies that trend.

AVEVA unveiled its new Discrete Lean Management software. The new offering improves operational efficiency through the digitalization of lean work management for both manual and automated production lines. AVEVA’s quick-to-deploy and easy to use digital tools enable access to production information, KPIs and notifications on dashboards, workstations and mobile devices to improve overall equipment and labor effectiveness, and to facilitate data-driven continuous improvement.

AVEVA Discrete Lean Management is designed to address the issues faced by operating manufacturing plants still using paper-based systems for lean and work order management, work instructions and data collection procedures. It enables physical records to be replaced with digital tools that mitigate the risk of manual processes and provide real-time visibility into production performance, allowing team collaboration in response to production issues.

The AVEVA Discrete Lean Management software solution is used in Schneider Electric’s manufacturing plants and has been successfully deployed in more than 70 smart factories globally, resulting in a 10% productivity increase due to downtime mitigation and a 70% improvement in response time due to automated escalation of production issues.

I actually visited one of the plants in the deployment—one in Lexington, KY. It was an excellent example of using software tools to enhance a lean process rather than getting in the way.

MQTT

MQTT was mentioned all over the conference. It is a data transport technology, usable both for OPC UA and for Sparkplug. Some companies touting their use of the technology include:

  • Eclipse
  • Inductive
  • Opto 22 — also Node-RED
  • Cirrus Link
  • Bedrock

Security

I didn’t have as many security conversations as the past few years, but I did chat with some PAS Global executives, and the company announced several new products, along with some new branding.

PAS, now PAS Global, keeps building on its platform of alarm management and safety, and on its ability to see what is on the process plant’s network, assuring the integrity of the process control system.

While at ARC Forum, company executives stressed industrial operations must increase focus on cybersecurity while maintaining continuous vigilance on safety. Stated simply, organizations need to ensure OT integrity in the face of unprecedented opportunity and risk. PAS has introduced new and updated products to optimize the integrity of industrial assets and reduce cyber risk, improve process safety and reliability, and ensure OT data health.

PAS Cyber Integrity prevents, detects, and remediates industrial cyber threats. Version 6.5 introduces an enhanced user experience for OT asset inventory information and data collection and transfer. This release also provides support for multiple integration methods (REST API, Syslog, SQL, CSV, SDK), integration with Darktrace, and Microsoft Windows event analytics.

PAS PlantState Integrity Version 8.7 introduces enhancements to Independent Protection Layer (IPL) Assurance that include sensor monitoring and voting, analysis filtering, and process trip reporting.

PAS Decision Integrity enables trusted data for decision-making. Version 1.0 leverages capabilities from PAS Automation Integrity and adds support for OT data health monitoring (data chain accuracy and visualization) and data lake enrichment.

These new product releases will be generally available by the end of March.

Podcast 202 Industrial Challenges 2020 Edition

I have released a new podcast.

In the late 1970s I worked in an engineering department where one of my responsibilities was serving as custodian and distributor of all engineering data. In addition, I did all the corporate new product quoting, such things as UPS truck bodies and the bodies for the original Atlanta Airport People Movers. Everything was paper and manual: drawings to bills of material to routings to costing.

Today we do the same tasks, except that everything is digital. The drawings are all digital files, the BOM–digital, sorting/costing/checking all faster and digital. We adapt and adopt technology to do things better.

The problem remains–leadership and management of the systems to implement all these technologies in order to reap the rewards.

That–is the challenge before us.

http://traffic.libsyn.com/automation/202_Industrial_Challenge_for_2020.mp3

Report Identifies 4 Changes CEOs Must Implement To Maximize Digitization

Digitization is on everyone’s lips these days. If you have not taken steps to implement and improve digital data flow, you are probably already behind. I receive information regularly from PwC, and here is a new report on how digitization is reshaping the manufacturing industry. The report takes a look at eight companies and showcases how they improved their efficiency, productivity and customer experience by ensuring they have the right capabilities central to their operating model and by matching them with strong skill sets in analytics and IT.

Pressure from consumers, new regulations and advances in information technology are all pushing manufacturing organizations to digitize so they can avoid falling behind the new breed of market-leading ‘digital champions.’ The report identifies 4 significant changes CEOs must implement to maximize the benefits of digitization.

1. Drive organizational changes that address new digital capabilities and digitalized processes – e.g., product and process design and engineering, end-to-end procurement, supply chain/distribution and after-sales – right from the top, because these are so new and different

2. Hire more software and Internet of Things (IoT) engineers and data scientists, while training the wider workforce in digital skills

3. Learn from software businesses, which have the ability to develop use cases rapidly and turn them into software products

4. Extend digitalization beyond IT to include significant operational technologies (OT) such as track and trace solutions and digital twinning

From the report: “Already, digitally ‘smart’ manufacturers are gaining a competitive advantage by exploiting emerging technologies and trends such as digital twinning, predictive maintenance, track and trace, and modular design. These companies have dramatically improved their efficiency, productivity, and customer experience by ensuring these capabilities are central to their operating models and by matching them with strong skill sets in analytics and IT.”

During 2018 and early 2019, PwC conducted in-depth digitisation case studies of eight industrial and manufacturing organisations in Germany, the US, India, Japan and the Middle East. Drawing on discussions and interviews with CEOs and division heads, the firm explored the key triggers for change these companies faced, assessed how digital solutions are being implemented and how digitisation is affecting key aspects of their operating models. PwC also compared the eight organisations with other publicly cited digitisation case studies, and leveraged its 2018 study Digital Champions: How industry leaders build integrated operations ecosystems to deliver end-to-end customer solutions and other ongoing PwC research.

This paper is the result of ongoing collaboration between PwC and the Global Manufacturing and Industrialisation Summit (GMIS). GMIS provides a forum for industry leaders to interact with governments, technologists and academia in order to navigate the challenges and opportunities brought about by the digital technologies of the Fourth Industrial Revolution. PwC has been a knowledge partner with GMIS since 2016.

The eight case studies in this report make clear how far the role of digital technology goes beyond traditional IT systems. It also encompasses OT and data and analytics technologies. Full integration and linkage among these different technologies, and the ecosystems they are part of, are essential to a successful digital transformation. Yet success is impossible without a digitally smart workforce that is familiar with Industry 4.0 skills and tools.

These challenges are the subject of the second part of the report Digital Champions: How industry leaders build integrated operations ecosystems to deliver end-to-end customer solutions, which will be published in January 2020.

The report will elaborate further on the emerging theory of digital manufacturing and operations, in which successful, digitised industrial organisations will increasingly have to act like software companies in response to four key factors:

  • The connected customer seeks a batch size of one, necessitating greater customisation of products and delivery time, improved customer experience, use of online channels and outcome-based business models.
  • Digital operations require both engineering and software abilities to enable extensive data analysis and IoT-based integration, as well as digitisation of products and services.
  • Organisations need augmented automation, in which machines become part of the organisation via closely connected machine–worker tasks and integrated IT and OT.
  • Future employees will be ‘system-savvy craftspeople’ with the skills to use sensors in order to collect and analyse accurate data, as well as design and manage connected processes.

About the authors

Anil Khurana is PwC’s global industrial, manufacturing and automotive industry leader. He is a principal with PwC US.

Reinhard Geissbauer is a partner with PwC Germany based in Munich. He is the global lead for PwC’s Digital Operations Impact Center.

Steve Pillsbury is a principal with PwC US and the US lead for PwC’s Digital Operations Impact Center.

Digital Infrastructure and Solutions Company Expands and Focuses

In brief: During its brief history as a collection of Hitachi Ltd. data properties, Hitachi Vantara continues to grow and remake itself. It has now added Hitachi Consulting and intelligent data cataloging company Waterline Data. The new company combines IT infrastructure, data management and analytics.

The first news is the combination of Hitachi Vantara with Hitachi Consulting as one company to create a new digital infrastructure and solutions company.

The new Hitachi Vantara aims to become the world’s preferred digital innovation partner by unlocking the “good” in data that benefits customers, raises the quality of people’s lives and builds a sustainable society. Hitachi Vantara will specifically bring a competitive edge to the digital domains that matter most – the data center, data operations, and enterprise digital transformation.

The new Hitachi Vantara combines the best consulting-led digital solutions and vertical industry expertise of Hitachi Consulting with Hitachi Vantara’s IT domain expertise. Going forward, the integrated company will help customers develop practical, scalable digital strategies and solutions that transform operational processes, improve customer experiences and create new business models to drive innovation and growth.

For example, the new company will offer a holistic manufacturing industry practice as one of several vertical industry practices. The manufacturing practice will integrate consulting methodologies for addressing quality, customization, sustainability and new business models with data-driven solutions such as Lumada Manufacturing Insights from Hitachi Vantara, which integrates silos of manufacturing data and applies AI and machine learning to evaluate and enhance overall equipment effectiveness (OEE).

“A barrage of data and technology is disrupting enterprises and industries the world over,” said Toshiaki Tokunaga, chief executive officer and chairman of the board, Hitachi Vantara. “Through the integration of Hitachi Consulting, the new Hitachi Vantara will be uniquely equipped with the capabilities our customers need to guide them on their digital journeys. We’re going to be the company that helps customers navigate from what’s now to what’s next.”

The Hitachi Vantara portfolio is built upon a foundation of world-class edge-to-core-to-cloud infrastructure offerings, including the recently introduced Hitachi Virtual Storage Platform (VSP) 5000 series, the world’s fastest data storage array. The portfolio further features AI and analytics solutions, cloud services for application modernization, systems integration and change management services for SaaS-based ERP implementations and migrations, and Lumada-based digital industrial solutions. Hitachi Vantara’s offerings are all backed by world-class business consulting, deep experience in improving organization effectiveness, co-development capabilities and global delivery services.

With its expanded capabilities, the new Hitachi Vantara will play a key role in advancing Hitachi’s 2021 Mid-term Management Plan, which aims to make the company a global leader through “Social Innovation Business.” The Social Innovation Business strategy centers on combining Hitachi’s industrial and IT expertise and products to create new value and resolve social issues.

Hitachi Vantara will help advance the plan by expanding revenues from digital business, by digitally transforming Hitachi’s industrial businesses, by fueling international growth, and by delivering social, environmental and economic value which helps customers contribute to the attainment of United Nations’ Sustainable Development Goals.

As announced in September 2019, Toshiaki Tokunaga, a 30-year Hitachi veteran who has successfully transformed several Hitachi businesses, will serve in the dual role of chief executive officer and chairman of the board of Hitachi Vantara.

The company’s two business units, Digital Infrastructure and Digital Solutions, will be led by Presidents Brian Householder and Brad Surak, respectively. Hitachi Vantara today also announced details of other appointments to its executive leadership team.

Hitachi Vantara Will Integrate Advanced Data Cataloging Technology Into Lumada Data Services Portfolio

In further news, Hitachi Vantara announced the acquisition of the business of Waterline Data, which is headquartered in Mountain View, CA. Waterline Data provides intelligent data cataloging solutions for DataOps that help customers more easily gain actionable insights from large datasets and comply with data regulations such as GDPR.

Waterline Data delivers catalog technology enabled by machine learning (ML) that automates metadata discovery to solve modern data challenges for analytics and governance across edge-to-core-to-cloud environments. Waterline Data’s technology has been adopted by customers in the financial services, healthcare and pharmaceuticals industries to support analytics and data science projects, pinpoint compliance-sensitive data and improve data governance. It can be applied on-premises or in the cloud to large volumes of data in Hadoop, SQL, Amazon Web Services (AWS), Microsoft Azure and Google Cloud environments.

Waterline Data’s patented “fingerprinting” technology is the cornerstone of its solutions, removing one of the biggest obstacles to data lake success. Fingerprinting uses AI- and rule-based systems to automate the discovery, classification and analysis of distributed and diverse data assets to accurately and efficiently tag large volumes of data based on common characteristics.
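
Waterline’s patented fingerprinting itself is not public code, but a toy rule-based sketch conveys the general idea of automated discovery and classification: sample a column’s values, test them against known patterns, and tag the column when enough values match. The rules and threshold below are illustrative only, not Waterline Data’s actual method.

```python
# Toy sketch of rule-based data "fingerprinting": tagging columns by
# matching value patterns. Waterline's patented approach combines ML
# with rules and is far more sophisticated; this only illustrates the
# general idea of discovering and classifying compliance-sensitive data.
import re

RULES = {
    "email":  re.compile(r"^[^@\s]+@[^@\s]+\.[A-Za-z]{2,}$"),
    "us_ssn": re.compile(r"^\d{3}-\d{2}-\d{4}$"),
    "phone":  re.compile(r"^\+?[\d\-\s()]{7,15}$"),
}

def fingerprint_column(values, threshold=0.8):
    """Return the tag whose pattern matches at least `threshold` of the
    non-empty sample values, or None if no rule is confident enough."""
    sample = [v for v in values if v]
    if not sample:
        return None
    for tag, pattern in RULES.items():
        hits = sum(1 for v in sample if pattern.match(v))
        if hits / len(sample) >= threshold:
            return tag
    return None

column = ["alice@example.com", "bob@corp.org", "carol@mail.net", ""]
print(fingerprint_column(column))  # -> "email"
```

Scaled across millions of columns, tagging of this kind is what lets a catalog surface, say, every dataset containing personal data subject to GDPR without anyone labeling it by hand.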

Integrating Waterline Data technology with Hitachi Vantara’s Lumada Data Services portfolio will provide a common metadata framework to help customers break down data silos distributed across the cloud, the data center, and the machines and devices at the edges of their networks. By applying DataOps methodologies to the unified datasets, customers can more rapidly gain insights and drive innovation.

“Our research illustrates that almost half of enterprise data practitioners are spending more than 50% of their time simply trying to find and prepare data for analysis. Data catalog products have emerged in recent years as strategic imperatives for enterprises seeking to address this challenge while also improving data governance,” said Matt Aslett, research vice president, 451 Research. “This acquisition is logical and strategic: Waterline Data’s capabilities are a complementary fit for Hitachi Vantara and its Lumada Data Services portfolio. Adding Waterline Data furthers the company’s ability to address growing demand for products and services that deliver more agile and automated approaches to data management via DataOps: helping enterprise consumers of data ultimately leverage information in a fluid, yet governed way.”

“Hitachi Vantara provides customers with the digital building blocks, DataOps approaches and industry solutions they need to transform their organizations through data-driven insights,” said Brad Surak, president, Digital Solutions, Hitachi Vantara. “Waterline Data technologies complement Hitachi Vantara’s DataOps expertise and will become key offerings in the Lumada Data Services portfolio, bringing our customers greater visibility, tighter quality control, improved compliance and better management of their data.”

Financial terms of the transaction were not disclosed. The acquisition of Waterline Data is subject to customary closing conditions and it is expected to close in the fourth quarter of Hitachi’s fiscal year 2019 (ending March 31, 2020).

Upon completion of the acquisition, Hitachi Vantara will make Waterline Data technologies available as standalone solutions as well as integrated components of the Lumada Data Services portfolio.

HighByte Announces Availability of Intelligence Hub DataOps for the Industrial Market

HighByte, an industrial software company, announced that HighByte Intelligence Hub Version 1.0 is now available. HighByte Intelligence Hub is the first DataOps solution purpose-built for industrial environments. DataOps is a new approach to data integration and security that aims to improve data quality, reduce time spent preparing data for analysis, and encourage cross-functional collaboration within data-driven organizations.

When HighByte emerged from stealth mode last September, I wrote about it here.

I am partial toward startups that are not trying to boil the ocean but are instead focused on solving a problem. Take, for example, ThingWorx, which enhanced PTC’s business through acquisition and then in turn impacted Rockwell Automation’s business when Rockwell adopted ThingWorx technology rather than trying to re-invent it internally.

Or Hitachi Vantara, which is an integration of several smaller companies and which has a thriving DataOps group that I wrote more about during my visit to its conference.

HighByte Intelligence Hub enables Operations to securely connect, model, and flow valuable industrial data to the users and systems that require this valuable information throughout the extended enterprise. The platform-agnostic software solution runs on-premises at the Edge, securely connects devices and applications via OPC UA and MQTT, is built for scale, and offers a codeless user interface. HighByte has positioned the software solution as the missing data infrastructure link to achieving the vision of Smart Manufacturing and Industry 4.0.

“As the number of applications that need to turn raw data into usable information increases, the customer is faced with having to recreate models in every application or develop their own solutions that integrate with the various APIs. Either choice slows down the initial deployment of Industry 4.0 initiatives, inhibits the ability to scale, and places a huge maintainability problem on the customer,” said HighByte CEO Tony Paine. “With HighByte Intelligence Hub, customers can standardize and maintain their data models in a single location, securely streamline information flows, and accelerate time to value for their Industry 4.0 investments.”

Many analysts write as if data in manufacturing is a new thing. That is not true. What is new are tools to obtain better data and transmit it faster to ever more robust databases. We’re sort of doing the same thing that I was trying to do in 1978, only better, faster, cheaper. DataOps is acknowledged as part of the IT technology stack. HighByte is filling a gap in the OT stack.

This will result in such benefits as predicting machine failure and therefore improving uptime, improving product quality, providing better service to customers, and strengthening the supply chain, all of which makes customers happier, leading to a more robust and profitable company. Of course, this presumes that management figures out how to implement all this.

General availability of HighByte Intelligence Hub was preceded by a global beta program that launched in September 2019. HighByte leveraged the program to collect technical and business feedback from nearly forty (40) manufacturers, distributors, and system integrators representing fifteen (15) countries in preparation for the company’s first commercial launch.

HighByte Intelligence Hub is available as a site license with an annual subscription.
