Browsing LinkedIn, something I seldom do, I saw this image from a company called Seebo. “Where IoT Projects Fail.” Interesting, but can’t these be summed up in a word or two?
Try “management” or “leadership”.
The recurring theme I’ve found in my consulting and client-qualification work is that companies don’t really understand what the Internet of Things (IoT) means. Nor do they always understand realistically what benefits could accrue, or which technologies fit.
One client hired me to justify a decision already made (in their minds at least) about acquisitions that would take them into the IoT market. Another looked for use cases and settled on one without understanding its complexity.
On the other hand, a wise CTO once explained to me how his company chose themes for its annual conference. One year it might be IoT, another digitalization. They looked at the current themes in the market, figured out how their products fit, and presto: a theme.
If you are in an IoT project or contemplating one, whether as a user or as a supplier planning a product and service strategy, step back and apply good basic management first: organizing, defining, staffing.
Here is the list from the image:
- Failure to capture business opportunities
- Unclear and incomplete use cases
- Systems are too complex to communicate
- Missing critical data
- Unable to extract actionable insights
- Unable to identify root cause of product malfunctions
- Ensuring market-fit and early buy-in
- High cost of mistakes
- Prototyping products not technically or financially feasible
- Skills or capacity gap
- Aligning and syncing teams
- Detailed and complete spec docs and keeping them up-to-date
Antonio Neri, CEO and President of Hewlett Packard Enterprise (HPE), used the phrase “Data is the new currency, memory the new gold” in his keynote at Discover, the company’s annual US customer conference, held in Las Vegas in June. Just one of the many places I’ve been lately.
If you haven’t planned for data in your machine and process control designs, you had best begin. The race for improved operations performance is on now.
We talk often of “edge” in the world of Internet of Things or Industrial Internet of Things. The edge has many definitions, but it can be defined as any place outside a data center. PLCs, for example, not only perform logic control, but they also aggregate data from perhaps thousands of sensors. SCADA devices and industrial computers also collect and channel data from a few to many sensors and data sources.
Business operations managers are hungry for this data to feed their information systems that in turn fuel their business decisions. Data in context is information. Information correctly presented to decision makers leads to better, faster decisions—and a competitive edge.
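The idea that “data in context is information” can be sketched in a few lines. This is an illustrative example only; the tag names, asset names, and structure are invented, not any vendor’s actual schema.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# A raw value read from a PLC register: just a number, meaningless on its own.
raw_value = 87.4

@dataclass
class ContextualizedReading:
    """A sensor value wrapped with the context that turns data into information."""
    value: float
    units: str
    tag: str        # hypothetical transmitter tag, for illustration
    asset: str
    timestamp: str

reading = ContextualizedReading(
    value=raw_value,
    units="degC",
    tag="TT-101",                  # invented tag name
    asset="Pump P-12 bearing",     # invented asset name
    timestamp=datetime.now(timezone.utc).isoformat(),
)

# With context attached, a decision maker can act: 87.4 degC on a pump
# bearing is information; 87.4 by itself is just data.
print(f"{reading.asset}: {reading.value} {reading.units} ({reading.tag})")
```

The same principle applies whether the context is added in the PLC, at an edge gateway, or in the historian: the earlier the context is attached, the less guesswork downstream.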
This search for competitive edge has moved my emphasis from control and automation (something we still need to do well) to the Industrial Internet of Things. IIoT is taken by many as a strategy similar to Industrie 4.0, Smart Manufacturing, or whatever different countries call their initiatives. This means I’m looking at a new generation of edge computing, enhanced networking standards, human-centered design for mobile data visualization, and even augmented reality (AR) and artificial intelligence (AI). These are no longer far-out technologies. They are here, and applications are growing.
Neri talked about the future as edge-centric, cloud-enabled, data-driven. He said the edge is where the action is, where the data is created. HPE is going to invest $4 billion in the intelligent edge over the next 4 years.
The company announced a new edge computing device with enterprise-grade computing power (far beyond a PC) plus up to 48 TB (yes, that’s tera, not giga) of memory. Oh, and it also comes in an environmentally hardened package. The CTO of Murphy Oil talked of using these on offshore oil rigs.
Texmark Chemicals is a Houston, Texas-based petrochemical refiner. I had several opportunities to talk with them about their IoT projects. They orchestrated an ecosystem of 12 suppliers, initially to instrument critical pumps in their process in order to achieve predictive maintenance. This potentially saves the company millions of dollars by avoiding catastrophic failure. (Note: I previously wrote about the Texmark use case here, and expect more to come.)
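To make the predictive maintenance idea concrete, here is a minimal sketch of one common approach: flag a pump when vibration readings drift well beyond a healthy baseline. The readings, thresholds, and method are invented for illustration; this is not Texmark’s or any supplier’s actual analytics.

```python
import statistics

# Illustrative vibration readings (mm/s RMS) from an instrumented pump.
baseline = [2.1, 2.3, 2.2, 2.4, 2.2, 2.3, 2.1, 2.2]   # healthy operation
recent = [2.2, 2.5, 3.1, 3.8, 4.6]                     # drifting upward

mean = statistics.mean(baseline)
stdev = statistics.stdev(baseline)

def needs_maintenance(readings, mean, stdev, sigma=3.0):
    """Flag the asset when any recent reading deviates more than `sigma`
    standard deviations from the healthy baseline (a simple drift detector)."""
    return any(abs(r - mean) > sigma * stdev for r in readings)

print(needs_maintenance(recent, mean, stdev))
```

The point is the workflow, not the statistics: instrumenting the pump creates the baseline, and the deviation check triggers maintenance before a catastrophic failure rather than after.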
Back to the announcement from HPE about the new edge product—a family of edge-to-cloud solutions enabled by HPE Edgeline Converged Edge Systems to help organizations simplify their hybrid IT environment. By running the same enterprise applications at the edge, in data centers and in the cloud, the solutions allow organizations to more efficiently capitalize on the vast amounts of data created in remote and distributed locations like factories, oil rigs or energy grids.
(Dr. Tom Bradicich wrote a blog post you can find here.)
HPE’s new edge-to-cloud solutions operate unmodified enterprise software from partners Citrix, GE Digital, Microsoft, PTC, SAP and SparkCognition, both on HPE Edgeline Converged Edge Systems – rugged, compact systems delivering immediate insight from data at the edge – and on data center and cloud platforms. This capability enables customers to harness the value of the data generated at the edge to increase operational efficiency, create new customer experiences and introduce new revenue streams. At the same time, edge-to-cloud solutions enabled by HPE Edgeline simplify the management of the hybrid IT environment, as the same application and management software can be used from edge to cloud.
“The edge is increasingly becoming a centerpiece of the digital enterprise where things and people generate and act on massive amounts of data,” said Dr. Tom Bradicich, Vice President and General Manager, IoT and Converged Edge Systems, HPE. “Our edge-to-cloud solutions help bring enterprise-class IT capabilities from the data center to the edge. This reduces software and IT administration costs, while accelerating insight and control across the organization and supply chain.”
HPE also announced the HPE Edgeline Extended Storage Adapter option kit, adding up to 48 terabytes of software-defined storage to HPE Edgeline Converged Edge Systems. This enhancement enables storage-intensive use cases like artificial intelligence (AI), video analytics or databases at the edge, while leveraging industry-standard storage management tools such as Microsoft Storage Spaces, HPE StoreVirtual VSA, and VMware vSAN.
Data are worthless without the ability to put them into context and present them simply to decision makers, no matter where in the company those decision makers reside. Enter Ocean Data Systems (ODS) and its product Dream Report.
Note: Dream Report obviously is an advertiser on this blog. However, I’m not a trade magazine, so writing about advertisers isn’t mandatory. I’ve known the Marketing VP for almost 20 years, and he has always represented quality products. I pass this along not as a reviewer but because I think it’s useful.
Ocean Data Systems (ODS) announced that Dream Report version 4.82 is posted and available for download. Along with a broad array of customer Software Change Requests (SCRs), this release delivers on a theme of partner connectivity. Dream Report is now the official reporting solution for AVEVA ClearSCADA, and version 4.82 includes drivers to access historical values, historical messages, and real-time values. This release also includes new connectivity for the OSIsoft PI Historian and Asset Framework. Enhanced connectivity is also delivered for the AVEVA Wonderware Online InSight product and for the Dream Report Advanced ODBC Driver for time-series data in SQL databases.
General feature and benefit improvements include an enhanced Automatic Statistic Table for more flexible ad-hoc analysis, new access to table footer data (enabling report calculations to use footer results), and support for the .xlsm Excel file format to support macros.
“Each new ‘Purpose Built’ Driver added to Dream Report enables Dream Report to ideally support a new market segment and lets those customers benefit from Dream Report – Compliance Reports, Performance Dashboards and Ad-hoc Analysis,” explained Roy Kok, VP Sales and Marketing for Ocean Data Systems. “We always include enhancements that benefit our entire installed base and this release is no different. The Dream Report release notes, in the documentation directory, will detail all changes.”
Founded in 2004, Ocean Data Systems develops software solutions for industrial compliance and performance; reports, dashboards, and ad-hoc analysis and troubleshooting. Dream Report delivers both local and Internet connectivity to all major HMI/SCADA, Historian and business data sources through either proprietary or industry standard drivers. Dream Report’s markets include process, hybrid and discrete; with special functionality for Life Sciences (Pharmaceutical and Biotech), Water, Wastewater, Heat Treat, Building Automation, Energy Management and Manufacturing Operations.
These are the main features of Dream Report:
1- DATA COLLECTION
Dream Report integrates a robust communication kernel to collect data and alarms from multiple real-time and historian sources. It uses OPC, OLE, and ODBC standards to connect to and collect data from different suppliers. Moreover, ODS develops custom drivers to leverage native history from SCADA systems, DCSs, RTUs, and more.
2- DATA LOGGING
Dream Report integrates a powerful historian module to log clean and accurate data in any standard database such as SQL Server, Oracle, MySQL, or Access.
This unique feature positions Dream Report not only as a reporting tool, but also as an ideal solution for field data integration into enterprise applications.
3- DATA EXTRACTION & ANALYSIS
Dream Report integrates a user-friendly object library to extract data statistics and analysis for display in multiple views such as tables, bars, pies, charts, and more.
4- REPORT DESIGN
Dream Report’s studio integrates an intuitive graphical editor to create and save state-of-the-art reports as templates.
5- REPORT GENERATION & DISTRIBUTION
Dream Report can generate reports manually or automatically. The automatic mode executes reports on event or on schedule. When ready, reports can be automatically printed, emailed, stored, and published over the web.
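The data logging and extraction steps above boil down to a familiar pattern: write time-stamped values into a standard SQL database, then compute report statistics over a time range. This sketch uses an in-memory SQLite database to stand in for SQL Server, Oracle, or MySQL; the table layout and tag names are invented and are not Dream Report’s actual schema.

```python
import sqlite3

# In-memory SQLite stands in for "any standard database".
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE history (tag TEXT, ts INTEGER, value REAL)")

# Data logging: time-stamped samples for a (hypothetical) flow transmitter.
samples = [("FT-201", t, 10.0 + 0.5 * t) for t in range(5)]
conn.executemany("INSERT INTO history VALUES (?, ?, ?)", samples)

# Data extraction & analysis: the kind of statistics a report object
# would compute over a time range for a table, bar, or chart view.
row = conn.execute(
    "SELECT MIN(value), MAX(value), AVG(value) FROM history WHERE tag = ?",
    ("FT-201",),
).fetchone()
print(row)  # (10.0, 12.0, 11.0)
```

Because the logged history lives in a standard database, enterprise applications can query it directly, which is exactly what positions such a tool as a field-data integration point and not merely a report generator.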
Most of my time involves Hewlett Packard Enterprise (HPE), where I am devoting about 2.5 hours a day to interviews. As one person asked, what does HPE have to offer? Briefly described, HPE has a variety of compute devices, services, and partnerships.
One application was a prescriptive maintenance solution in which IoT data is analyzed and the CMMS is notified to initiate a work order. We are not yet in the era of self-healing machines, but we are one step closer: the machine can begin a maintenance workflow with information about what to repair.
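The prescriptive part is what distinguishes this from a plain alarm: the analysis produces a work order that says what to repair, not just that something is wrong. Here is a minimal sketch of that flow; the limit, asset names, and work-order fields are all invented for illustration and are not the demo’s actual integration.

```python
VIBRATION_LIMIT = 4.0  # mm/s RMS, an illustrative alarm limit

def analyze_and_prescribe(asset_id, vibration):
    """Analyze an IoT reading; when it breaches the limit, return the
    work-order payload a CMMS integration would submit, else None."""
    if vibration <= VIBRATION_LIMIT:
        return None
    return {
        "asset": asset_id,
        "priority": "high",
        # The prescriptive detail: what to repair, not just "alarm".
        "task": "Inspect and replace pump bearing",
        "reading": {"vibration_mm_s": vibration},
    }

order = analyze_and_prescribe("P-12", 4.6)
print(order["task"])
```

In a real deployment the returned payload would be posted to the CMMS API, which then schedules the technician, orders parts, and tracks completion.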
The SecureEdge Data Center combines enclosures from Rittal, I/O from ABB, and Edgeline edge computing hardware from HPE into a scalable industrial center to provide IIoT data to the enterprise from ABB robotics and automation.
As a former machine vision integrator, I loved the video analytics demo application showing Relimetrics software to analyze servers in manufacturing. In this case, the application read a 2D barcode to determine the build, discovered the bill of materials, and then checked that all the proper components were in the assembly, that everything was properly installed, and there were no other defects.
One application of Edgeline edge compute devices, for example, is in partnership with National Instruments to accomplish complex testing at the edge with communication to the cloud as necessary. Edge compute is also important in autonomous vehicles where decisions must be quickly executed locally, but large amounts of data must also be communicated to the cloud for further analysis.
Speaking of partnerships, HPE has forged significant partnerships in the industrial world with ABB, GE Digital, OSIsoft, PTC (Kepware and ThingWorx), Rittal, and Schneider Electric. Most of these involve a significant IT infrastructure including power at the Edge from HPE along with data and connectivity plus solutions targeted to various industrial applications.
Manufacturing technology professionals have been working with data of many types for years. Our sensors, instrumentation, and control systems yield terabytes of data. Then we bury them in historians or other databases on servers we know not where.
Companies are popping up like mushrooms after a spring rain with a variety of approaches for handling, using, analyzing, and finding all this data. Try on this one.
Io-Tahoe LLC, a pioneer in machine learning-driven smart data discovery products that span a wide range of heterogeneous technology platforms, from traditional databases and data warehouses to data lakes and other modern repositories, announced the General Availability (GA) launch of the Io-Tahoe smart data discovery platform.
The GA version includes the addition of Data Catalog, a new feature that allows data owners and data stewards to use a machine learning-based smart catalog to create, maintain, and search business rules, define policies, and provide governance workflow functionality. Io-Tahoe’s data discovery capability provides complete business rule management and enrichment. It enables a business user to govern the rules and define policies for critical data elements, and it allows data-driven enterprises to automatically enrich information about data, regardless of the underlying technology, and build a data catalog.
“Today’s digital business is driving new requirements for data discovery,” said Stewart Bond, Director Data Integration and Integrity Software Research, IDC. “Now more than ever enterprises are demanding effective, and comprehensive, access to their data – regardless of where it is retained – with a clear view into more than its metadata, but its contents as well. Io-Tahoe is delivering a robust platform for data discovery to empower governance and compliance with a deeper view and understanding into data and its relationships.”
“Io-Tahoe is unique as it allows the organization to conduct data discovery across heterogeneous enterprise landscapes, ranging from databases, data warehouses and data lakes, bringing disparate data worlds together into a common view which will lead to a universal metadata store,” said Oksana Sokolovsky, CEO, Io-Tahoe. “This enables organizations to have full insight into their data, in order to better achieve their business goals, drive data analytics, enhance data governance and meet regulatory demands required in advance of regulations such as GDPR.”
Increasing governance and compliance demands have created a dramatic opportunity for data discovery. According to MarketsandMarkets, the data discovery market is estimated to grow from $4.33 billion USD in 2016 to $10.66 billion USD in 2021. This is driven by the increasing importance of data-driven decision making and self-service business intelligence (BI) tools. However, the challenge of integrating the growing number of disparate platforms, databases, data lakes and other silos of data has prevented the comprehensive governance, and use, of enterprise data.
Io-Tahoe’s smart data discovery platform features a unique algorithmic approach to auto-discover rich information about data and data relationships. Its machine learning technology looks beyond metadata, at the data itself for greater insight and visibility into complex data sets, across the enterprise. Built to scale for even the largest of enterprises, Io-Tahoe makes data available to everyone in the organization, untangling the complex maze of data relationships and enabling applications such as data science, data analytics, data governance and data management.
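“Looking beyond metadata, at the data itself” can be illustrated with a toy example: classify a column by sampling its values rather than trusting its (possibly meaningless) column name. The rules and sample data below are invented to show the concept; Io-Tahoe’s actual algorithms are far richer and are not described in detail here.

```python
import re

# Simple value-based classifiers; real discovery tools use many more signals.
EMAIL = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
DATE = re.compile(r"^\d{4}-\d{2}-\d{2}$")

def infer_semantic_type(values):
    """Guess a semantic type from the data itself, ignoring column names."""
    if values and all(EMAIL.match(v) for v in values):
        return "email"
    if values and all(DATE.match(v) for v in values):
        return "date"
    return "unknown"

# A column the metadata calls "field_7" turns out to hold email addresses.
print(infer_semantic_type(["ann@acme.example", "bo@io-tahoe.example"]))
print(infer_semantic_type(["2018-06-19", "2018-06-20"]))
```

Recognizing that two differently named columns in two silos both hold, say, customer emails is what lets a discovery platform link disparate data worlds into a common view, and it is also why such tools matter for GDPR-style compliance.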
The technology-agnostic platform spans silos of data and creates a centralized repository of discovered data upon which users can enable Io-Tahoe’s Data Catalog to search and govern. Through convenient self-service features, users can bolster team engagement through the simplified and accurate sharing of data knowledge, business rules and reports. Here users have a greater ability to analyze, visualize and leverage business intelligence and other tools, all of which have become the foundation to power data processes.