IIoT It’s All About Data—Machine-Learning Data Discovery

Manufacturing technology professionals have been working with data of many types for years. Our sensors, instrumentation, and control systems yield terabytes of data. Then we bury them in historians or other databases on servers we know not where.

Companies are popping up like mushrooms after a spring rain with a variety of approaches for handling, using, analyzing, and finding all this data. Try on this one.

Io-Tahoe LLC, a pioneer in machine learning-driven smart data discovery products that span a wide range of heterogeneous technology platforms, from traditional databases and data warehouses to data lakes and other modern repositories, announced the General Availability (GA) launch of the Io-Tahoe smart data discovery platform.

The GA version adds Data Catalog, a new feature that lets data owners and data stewards use a machine learning-based smart catalog to create, maintain, and search business rules; define policies; and run governance workflows. Io-Tahoe’s data discovery capability provides complete business rule management and enrichment. It enables a business user to govern the rules and define policies for critical data elements, and it allows data-driven enterprises to enrich information about data automatically, regardless of the underlying technology, and to build a data catalog.

“Today’s digital business is driving new requirements for data discovery,” said Stewart Bond, Director Data Integration and Integrity Software Research, IDC. “Now more than ever enterprises are demanding effective, and comprehensive, access to their data – regardless of where it is retained – with a clear view into more than its metadata, but its contents as well. Io-Tahoe is delivering a robust platform for data discovery to empower governance and compliance with a deeper view and understanding into data and its relationships.”

“Io-Tahoe is unique as it allows the organization to conduct data discovery across heterogeneous enterprise landscapes, ranging from databases, data warehouses and data lakes, bringing disparate data worlds together into a common view which will lead to a universal metadata store,” said Oksana Sokolovsky, CEO, Io-Tahoe. “This enables organizations to have full insight into their data, in order to better achieve their business goals, drive data analytics, enhance data governance and meet regulatory demands required in advance of regulations such as GDPR.”

Increasing governance and compliance demands have created a dramatic opportunity for data discovery. According to MarketsandMarkets, the data discovery market is estimated to grow from $4.33 billion in 2016 to $10.66 billion in 2021, driven by the increasing importance of data-driven decision making and self-service business intelligence (BI) tools. However, the challenge of integrating the growing number of disparate platforms, databases, data lakes, and other data silos has prevented the comprehensive governance and use of enterprise data.

Io-Tahoe’s smart data discovery platform features a unique algorithmic approach to auto-discovering rich information about data and data relationships. Its machine learning technology looks beyond metadata, at the data itself, for greater insight and visibility into complex data sets across the enterprise. Built to scale for even the largest of enterprises, Io-Tahoe makes data available to everyone in the organization, untangling the complex maze of data relationships and enabling applications such as data science, data analytics, data governance, and data management.
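
To make the idea of "looking at the data itself" concrete, here is a minimal, hypothetical sketch of content-based discovery. It is not Io-Tahoe's algorithm (which is proprietary); it simply illustrates classifying columns by their values rather than their names, and flagging a possible cross-silo relationship from value overlap.

```python
# Toy content-based discovery: classify columns by their values, not their
# metadata, and flag likely cross-silo relationships from value overlap.
# Illustrative only; not Io-Tahoe's actual algorithm.
import re

PATTERNS = {
    "email": re.compile(r"^[^@\s]+@[^@\s]+\.[A-Za-z]{2,}$"),
    "date": re.compile(r"^\d{4}-\d{2}-\d{2}$"),
    "phone": re.compile(r"^\+?\d[\d\s\-]{7,14}$"),
}

def classify_column(values, threshold=0.8):
    """Guess a column's semantic type from the share of matching values."""
    for label, pattern in PATTERNS.items():
        hits = sum(1 for v in values if pattern.match(str(v)))
        if hits / max(len(values), 1) >= threshold:
            return label
    return "unknown"

def looks_like_join(col_a, col_b, threshold=0.5):
    """Flag a possible relationship when two columns share many values."""
    a, b = set(col_a), set(col_b)
    return len(a & b) / max(min(len(a), len(b)), 1) >= threshold

# Two "silos" that never declared a foreign key between them.
crm_contacts = ["ann@acme.com", "bob@acme.com", "cz@plant.io"]
erp_contacts = ["bob@acme.com", "cz@plant.io", "dd@plant.io"]

print(classify_column(crm_contacts))                # email
print(looks_like_join(crm_contacts, erp_contacts))  # True
```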

The technology-agnostic platform spans silos of data and creates a centralized repository of discovered data, on top of which users can run Io-Tahoe’s Data Catalog to search and govern. Through convenient self-service features, users can bolster team engagement by sharing data knowledge, business rules, and reports simply and accurately. Users then have a greater ability to analyze, visualize, and apply business intelligence and other tools, all of which have become foundational to data-driven processes.

Alliances Advance Edge to Cloud Analytics and Computing

Much of the interesting activity in the Industrial Internet of Things (IIoT) space lately happens at the edge of the network. IT companies such as Dell Technologies and Hewlett Packard Enterprise have built upon their core technologies to develop powerful edge computing devices. On the OT side, Bedrock Automation and Opto 22 have recently built interesting edge devices as well.

I’ve long maintained that all this technology—from intelligent sensing to cloud databases—means little without ways to make sense of the data. One company I rarely hear from is FogHorn Systems. This developer of edge intelligence software has recently been quite active on the partnership front. One announcement regards Wind River and the other Google.

FogHorn and Wind River (an Intel company) have teamed to integrate FogHorn’s Lightning edge analytics and machine learning platform with Wind River’s software, including Wind River Helix Device Cloud, Wind River Titanium Control, and Wind River Linux. This offering is said to accelerate harnessing the power of IIoT data. Specifically, FogHorn enables organizations to place data analytics and machine learning as close to the data source as possible; Wind River provides the technology to support manageability of edge devices across their lifecycle, virtualization for workload consolidation, and software portability via containerization.

“Wind River’s collaboration with FogHorn will solve two big challenges in Industrial IoT today, getting analytics and machine learning close to the devices generating the data, and managing thousands to hundreds of thousands of endpoints across their product lifecycle,” said Michael Krutz, Chief Product Officer at Wind River. “We’re very excited about this integrated solution, and the significant value it will deliver to our joint customers globally.”

FogHorn’s Lightning product portfolio embeds edge intelligence directly into small-footprint IoT devices. By enabling data processing at or near the source of sensor data, FogHorn eliminates the need to send terabytes of data to the cloud for processing.
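
To illustrate the pattern (this is not FogHorn's API, just a minimal sketch of the general technique), an edge node can score each reading against a rolling window and forward only the exceptions, so the bulk of the stream never leaves the device:

```python
# Minimal edge-analytics sketch: evaluate readings locally against a rolling
# window and forward only anomalies to the cloud. Not FogHorn's API.
from collections import deque

class EdgeFilter:
    def __init__(self, window=50, sigma=3.0, warmup=10):
        self.readings = deque(maxlen=window)
        self.sigma = sigma
        self.warmup = warmup

    def process(self, value):
        """Return the value if it is anomalous vs. the window, else None."""
        anomaly = None
        if len(self.readings) >= self.warmup:
            mean = sum(self.readings) / len(self.readings)
            var = sum((x - mean) ** 2 for x in self.readings) / len(self.readings)
            std = var ** 0.5
            if std > 0 and abs(value - mean) > self.sigma * std:
                anomaly = value
        self.readings.append(value)
        return anomaly

edge = EdgeFilter()
stream = [20.1, 20.2, 19.9, 20.0] * 5 + [35.7]  # 20 normal readings, 1 spike
for reading in stream:
    event = edge.process(reading)
    if event is not None:
        print(f"forward to cloud: {event}")      # only the spike is sent
```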

“Large organizations with complex, multi-site IoT deployments are faced with the challenge of not only pushing advanced analytics and machine learning close to the source of the data, but also the provisioning and maintenance of a high volume and variety of edge devices,” said Kevin Duffy, VP of Business Development at FogHorn. “FogHorn and Wind River together deliver the industry’s most comprehensive solution to addressing both sides of this complex IoT device equation.”

Meanwhile, FogHorn Systems also announced a collaboration with Google Cloud IoT Core to simplify the deployment and maximize the business impact of Industrial IoT (IIoT) applications.

The companies have teamed up to integrate FogHorn’s Lightning edge analytics and machine learning platform with Cloud IoT Core.

“Cloud IoT Core simply and securely brings the power of Google Cloud’s world-class data infrastructure capabilities to the IIoT market,” said Antony Passemard, Head of IoT Product Management at Google Cloud. “By combining industry-leading edge intelligence from FogHorn, we’ve created a fully-integrated edge and cloud solution that maximizes the insights gained from every IoT device. We think it’s a very powerful combination at exactly the right time.”

Device data captured by Cloud IoT Core gets published to Cloud Pub/Sub for downstream analytics. Businesses can conduct ad hoc analysis using Google BigQuery, run advanced analytics, and apply machine learning with Cloud Machine Learning Engine, or visualize IoT data results with rich reports and dashboards in Google Data Studio.
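
As a hedged sketch of the downstream side (the project ID, subscription name, and payload shape below are invented for illustration; the code requires the google-cloud-pubsub package), a consumer might pull device telemetry off the Pub/Sub topic like this:

```python
# Hypothetical consumer for telemetry that Cloud IoT Core publishes to
# Pub/Sub. Project ID, subscription name, and payload shape are invented.
import json
from google.cloud import pubsub_v1

PROJECT_ID = "my-iiot-project"     # hypothetical
SUBSCRIPTION_ID = "telemetry-sub"  # hypothetical

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)

def callback(message):
    payload = json.loads(message.data.decode("utf-8"))
    device = message.attributes.get("deviceId", "unknown")  # set by IoT Core
    print(f"{device}: {payload}")
    message.ack()

streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
try:
    streaming_pull.result(timeout=30)  # listen for 30 seconds in this demo
except Exception:
    streaming_pull.cancel()
```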

“Our integration with Google Cloud harmonizes the workload and creates new efficiencies from the edge to the cloud across a range of dimensions,” said David King, CEO at FogHorn. “This approach simplifies the rollout of innovative, outcome-based IIoT initiatives to improve organizations’ competitive edge globally, and we are thrilled to bring this collaboration to market with Google Cloud.”

Digital Transformation Council

Digital Transformation has generated so much news that company executives have begun ordering up projects and task forces within their companies to begin that transformation. The pressure on engineers and IT people increases with each new directive. To help clients deal with these new directives, ARC Advisory Group launched the Digital Transformation Council (DTC) at its 2018 Forum.

The council is a member community for industry, energy, and public-sector professionals. Membership is by invitation only and restricted to end users of digital transformation technology, such as professionals working for manufacturers, utilities, and municipalities. There is no fee to join.

“As data-driven market disruption grows, professionals across similar industries need to connect and learn from one another,” according to Jesus Flores-Cerrillo, Associate R&D Director at Praxair, one of the world’s largest providers of industrial gases. He added, “It’s becoming mission-critical to understand how to use data to develop services and products and optimize operations and assets. That can only be accomplished by understanding the possibilities provided by modern data tools such as artificial intelligence, machine learning, and digital twins.”

“We are delighted to support the Digital Transformation Council by bringing members together in person and online,” commented Greg Gorbach, Vice President at ARC Advisory Group. “This community will enable individuals and companies to get up to speed quickly on digital transformation innovations and share ideas about what provides value and what doesn’t.”

Each February, a member-only meeting, anchored to the annual ARC Industry Forum, will bring the Council together to set the focus and agenda for the coming year. Members will also gather via virtual quarterly meetings to discuss research findings, activities, and other topics.

In addition to annual in-person meetings and quarterly virtual meetings, Digital Transformation Council members will have year-round access to research and fellow members via an online community. ARC Advisory Group’s role will be to conduct research, organize meetings, provide venues, and facilitate peer-to-peer discussions. ARC will also deliver technical support for the group’s online presence.

The DTC will address topics such as analytics, industrial Internet of Things (IIoT), artificial intelligence and machine learning, cybersecurity, and additive manufacturing.

Open Platform and Digital Twin

I am a sucker for open platforms. When the PR agency wrote with a teaser about discussing open platforms with Marc Lind, SVP Strategy at Aras, a PLM supplier, I bit. They threw in “digital twin” and “digital thread” as the topping and cherry atop the sundae, and the appointment was made.

We talked just before Christmas, but I’ve had such a crazy January that I’ve just now gotten to this in my pile of things to write.

PLM is often thought of as an enterprise application and is covered by analysts who also watch areas such as ERP. I’ve talked with PLM suppliers for years as a magazine editor, but they never fit well within the magazines and were most likely not advertising prospects, so there wasn’t much pressure to write about them. All of which is to say I’m not an expert in the area like some of my friends.

But I’ve followed the technology for many years. I’ve seen it coming—this coordination of digital and physical. As soon as the digital folks could get it all together—especially better databases and interfaces—then I knew we’d be much closer to the realization of digital manufacturing.

Lind told me something about the Aras platform. First, he said, it attempts to do away with silos, where you might have your mechanical CAD here, your electrical CAD there, then perhaps your MES and your ERP. And the problem isn’t only within manufacturing; think about the next step, say connected cars and other systems of systems, where things really need to interact across boundaries.

Check out the Aras platform. It’s interesting. And once again as I’m seeing more often, it is exploring a different business model that can make its platform and products available to a wider customer base. For other writing I’ve done on open platforms, click the small “ad” on my site to download the MIMOSA white paper.

Digital Twin

We also talked digital twin, one of the foundation concepts for digital manufacturing.

He said the term Digital Twin was coined back in 2002 by Dr. Michael Grieves while at the University of Michigan. Effectively, the Digital Twin is an exact virtual representation of a physical thing. It’s as if the physical product or system were looking in a virtual mirror.

Grieves describes it as a mirroring (or twinning) of what exists in the real world and what exists in the virtual world. It contains all the informational sets of the physical ‘thing,’ meaning it’s cross-discipline: not just a mechanical/geometric representation, but one that also includes the electronics, wiring, software, firmware, etc.

Many people talk about Digital Twins in the context of monitoring, simulation, and predictive maintenance, all of which are incredibly valuable and potentially transformative in their own right. However, there would seem to be much more to it.

“As products of all types move to include connectivity, sensors, and intelligence we can’t just think about the data streaming back from the field.”

Without accurate “Context” – Digital Twin – time series data generated during production and ongoing operation is difficult or even impossible to understand and analyze.

In addition, the ability to interpret and act upon these data often requires traceability to prior information from related revisions – the Digital Thread.
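
As a toy illustration of what that context buys you (all column names, values, and limits here are invented), joining raw telemetry to digital twin metadata is what turns bare numbers into something interpretable:

```python
# Invented data: a raw time series is just numbers until the twin supplies
# context (which signal, what units, what design limit, which revision).
import pandas as pd

telemetry = pd.DataFrame({
    "sensor_id": ["S1", "S2", "S1"],
    "ts": pd.to_datetime(["2018-01-05 08:00", "2018-01-05 08:00",
                          "2018-01-05 08:01"]),
    "value": [82.0, 3.1, 97.5],
})

# Digital-twin metadata: cross-discipline description of each signal.
twin = pd.DataFrame({
    "sensor_id": ["S1", "S2"],
    "signal": ["bearing_temp", "vibration"],
    "units": ["degC", "mm/s"],
    "design_limit": [95.0, 7.0],
    "hw_revision": ["B", "A"],
})

contextualized = telemetry.merge(twin, on="sensor_id")
alerts = contextualized[contextualized["value"] > contextualized["design_limit"]]
print(alerts[["ts", "signal", "value", "units", "hw_revision"]])
```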

“To complicate matters further as artificial intelligence / cognitive computing is introduced the necessity for the Digital Twin becomes even greater. If Knowledge = Information in Context, then without a Digital Twin, machine learning won’t work as intended, will be rendered ineffective or worse… potentially leading to risky misinterpretations or misdirected actions.”

Finally, Lind warns, “Because without Context – Digital Twin – the IoT-enabled value proposition is severely limited and could introduce real liability.”

Report Applications–What is the Market and Where Do They Fit in the Industrial Software Ecosystem

I don’t really think about report tools that much, to be honest. Maybe because most people seem to default to Microsoft Excel to draw information from their operations information system.

Roy Kok has been VP of sales and marketing for Ocean Data Systems / Dream Report for some time now. He’s an industry veteran whom I’ve known for probably 20 years. We’ve had occasion to chat about his product several times over the past few months. So, I had to ask, just what do you do and what kind of market is there—really?

[Note: He’s a new sponsor of the site, trying us out for a while. I don’t actively go out and sell ads, but I certainly appreciate the companies that do—hint. I actually probed about the market before he decided to buy.]

I guess I never thought about a custom report writer in the same genre as dashboards and other visibility tools. I stand corrected.

Kok tells me, “I believe Dream Report to be the number one product of this type in the world.  We are currently shipping in the thousands per year. As you can imagine, this is giving us great market penetration and visibility, but Dream Report is not a very expensive product, so our company is still relatively small at 18 employees. Dream Report is all that we do, so from that perspective, we are a significant scale for a single product company. I believe we hold 5% or so market share. 85% market share is held by business products being applied to industrial applications.  These products include Crystal Reports, Microsoft SSRS and Excel. That would leave 10% for the plethora of other tools, vertical market solutions, and smaller competitors.”

Why did this market become so dominated by business tools? “One simple word – History,” added Kok. “In the late 80s and early 90s, HMI/SCADA was still in its infancy and competition was tremendous. Vendor focus was on reliability and capability of HMI/SCADA. There was another invention at that same time – ODBC – in the business world. ODBC was the way third-party products could interact with databases of all types. Also in the late 80s, Crystal Reports came on the market, and in the early 90s, Microsoft delivered SSRS (SQL Server Reporting Services). Excel was also available and leveraged ODBC. The result of all these developments was that HMI/SCADA vendors chose to enhance their products with ODBC and could thus leverage the variety of business tools on the market. That set the path for most of the industrial market. To be fair, some HMI/SCADA vendors dabbled in report generation, typically focused on connectivity to their own products.”
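
For flavor, the ODBC route those business tools rely on looks roughly like this today (the DSN, table, and column names below are hypothetical, and the sketch requires the pyodbc package):

```python
# Hypothetical ODBC pull: the same pattern Crystal Reports, SSRS, and Excel
# use. DSN, credentials, table, and column names are invented.
import pyodbc

conn = pyodbc.connect("DSN=PlantHistorian;UID=report;PWD=secret")
cursor = conn.cursor()
cursor.execute(
    "SELECT tag_name, AVG(value) AS avg_value "
    "FROM process_data "
    "WHERE ts >= ? AND ts < ? "
    "GROUP BY tag_name",
    ("2018-01-01", "2018-02-01"),
)
for tag, avg_value in cursor.fetchall():
    print(f"{tag}: {avg_value:.2f}")  # one summary row per tag for the report
conn.close()
```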

This becomes part of the IT/OT situation. Part of the continual divide between the organizations lies in the tools each side uses. Both thrive on information, but the type of information and its format differ between the two areas. Business tools are often much more expensive, since their cost can be spread over a much larger application base. Kok posits that OT people have been hesitant to look at report applications out of fear of cost and complexity. Therein lies the benefit of his product.

Adds Kok, “Our challenge is to re-educate the world that reports are easy to create and are actually your shortest path to continuous improvement. A report (or dashboard) can be fast and easy to create. It can then be scheduled for automatic generation and delivery. We argue that the step after installing a historian should be the installation of a report and dashboard solution like Dream Report. Then, when KPIs go askew, you can use Dream Report analytics or other advanced tools to better understand the root cause. Dream Report can actually bridge the gap and help to justify advanced analytics like those from Seeq, Falkonry, TrendMiner, Tableau, Pentaho, and others.”

And that, I think, is the next step forward—how to integrate advanced analytics into OT in a sensible and useful way. We keep talking about predictive maintenance. And have been for many years. Maybe that one really begins hitting. And we’ll watch for which application strikes next.

Ocean Data Systems has been growing organically since 2006. Major OEMs private labeling and reselling Dream Report include Schneider Electric, Wonderware, GE, Eurotherm, and InduSoft. Here is a short video.


Compute Power Meets IoT at HPE Discover 2017 in Madrid

Hewlett Packard Enterprise (HPE) held its European customer conference, Discover, in Madrid this week. Points of emphasis that impacted me included Internet of Things business, high power computing, a view of the changes going on at HPE, and a look at the future of IT and the plant.

Bloggers and influencers

I was here as part of a blogger program separate from press and analysts. Bloggers are a livelier group than the press. I think I am the only independent blogger in manufacturing in the US (everyone else works for a magazine or analyst firm). There were 25 bloggers at Discover from countries as diverse as Denmark, Belgium, Germany, Italy, Spain, New Zealand, Canada, and the USA. Rather than attending press conferences, our program included “coffee talks” that were live-streamed on the Web. These were informal presentations plus question-and-answer sessions.

There was one press conference I attended—the announcement of the partnership with ABB on the mini-data center product. Instead of conversation and give-and-take, one or two journalists asked questions in a challenging manner that seldom seemed designed to elicit more information. Note: I wrote about the partnership earlier this week.

Retiring CEO Meg Whitman used a quote from Gartner in her remarks: “The Edge will eat the cloud.” HPE has developed edge computing devices called Edgeline, which I discussed in August after my first meeting with the company. These are powerful computing devices based on PXI platform technology from National Instruments. The blogger group devoted some time to discussing how valid that comment was.

We concluded that you will need both. Take an example from a conversation I had with Rod Anilker, a technologist in HPE’s OEM group. Imagine taking the computing power and openness of the HPE platform to replace proprietary controllers such as CNCs, PLCs, and DCSs. These devices at the edge would handle many control and other edge applications, with the additional capability of sending data to the cloud.

Now imagine HPE’s storage and computing power accumulating vast amounts of data—maybe from a power generation fleet or a company’s many refineries—at a scale sufficient to do some pretty cool pattern recognition. The predictive, prescriptive, and planning possibilities would be awesome.

Pieces of the HPE Corporate Puzzle

Antonio Neri, President and COO and next year’s CEO, led the general session with these main points:
• Intelligent Edge (where the action is; hence the importance of the Aruba acquisition)
• Secure platforms from edge to core to cloud
• Memory-driven computing (the acquisition of Silicon Graphics is another important piece)
• Artificial Intelligence (inside everything)
• Software-defined and multi-cloud platforms
• Partner ecosystem
• Flexible consumption (scale up, scale back)
• Advise and transform (HPE PointNext)
• Outcome-as-a-Service, the future of enterprise computing


PointNext is the services arm introducing the concept of Edge-as-a-Service. In fact, HPE features “as-a-service” in many guises.

This concept seems to be modeled on ideas emanating from GE’s consumption-of-services model. Capturing and processing data at the source, where action needs to happen, forms the foundation of the model. Then you provide IT in a way that scales: a pay-as-you-go, subscription-based concept. The customer therefore gets flexibility and reduced risk.

Take the expertise from a data center that runs 24/7 and put it at the edge. Then it’s all about extracting data and taking it into machine learning. This starts to morph into the concept from the OEM group.

The model architecture takes HPE’s new GreenLake plus EaaS. GreenLake edge compute includes design services, information services, operational and support services, and pay-as-you-go pricing. Upfront consulting helps evaluate the client’s requirements and business processes and recommends solution packages.

Technology overview

David Chalmers, research and development executive, briefed us with a business and technology overview.

Hewlett Packard Enterprise is one of the two businesses left standing after Meg Whitman (and the board) split the company following some bad years of planning and leadership. Following the split, several businesses were divested.

Chalmers related that HPE has been changing fast into an infrastructure/solutions company (the world’s largest, he said). “The strength of our portfolio is the best in 10 years, much of it from organic development. The SGI acquisition yielded more compute options, including low-power, high-performance computing. By 2022, 60 percent of data will never get to a data center; it’ll reside at the edge. Therefore the intelligent edge is important. SGI brought high-performance analytics.”

Another couple of tidbits: at the new HPE, people bring business cases first, then talk about the technology solution. And the OT world is an order of magnitude larger than the IT world. (Hmmm.)

Oh, and there were many new products. They don’t all apply to my areas of coverage. But the engineers have been busy.


I just realized I made it through the entire discussion without mentioning the technology that brought me to HPE Discover—the Internet of Things. Much of that relates to the edge and devices such as Edgeline. Obviously important, IoT garnered significant floor space in the exhibition area.

There will be more in another post.

