I had been talking Open Source with Bill Lydon, who, like me, has been around many places in manufacturing and media in his career, most recently as editor of InTech. He referred me to an article he wrote on automation.com.
This release actually points to two important technologies that will combine to improve management’s ability to operate a more profitable and efficient plant. One is open source. The other is DataOps. I had begun hearing about the latter from IT companies and was just beginning to wonder about applications specific to industry when I was approached by John Harrington, one of the founders, with the initial story of HighByte.
I have written several things about DataOps. Here is one from my visit last year to the Hitachi Vantara conference, and here is another reporting on the widespread failure to leverage data.
On to the actual news:
HighByte announced the release of HighByte Intelligence Hub version 1.2. The release expands the software’s support for open standards and open platforms, including MQTT, Sparkplug, and OpenJDK. Open standards and platforms simplify industrial data integrations and accelerate the time-to-value for Industry 4.0 solutions.
“The MQTT Sparkplug specification is critical to ensuring information interoperability and governance in the plant and throughout the enterprise,” said HighByte CEO Tony Paine. “By deploying HighByte Intelligence Hub with Sparkplug support, our customers are able to standardize and streamline their data infrastructures from factory to cloud.”
HighByte Intelligence Hub is the first DataOps solution purpose-built for industrial environments. DataOps is a new approach to data integration and security that aims to improve data quality, reduce time spent preparing data for analysis, and encourage cross-functional collaboration within data-driven organizations.
In addition to MQTT Sparkplug and OpenJDK support, HighByte Intelligence Hub version 1.2 includes model and instance categorization, enhanced security, and more flexible presentation of information models and output formats.
The software is available as an annual subscription and is priced per instance or per site. highbyte.com
From The HighByte Blog
The future of Industry 4.0 is open: open standards, open platforms, and open thinking. In today’s ecosystem, realizing the full potential of Industry 4.0 requires a mesh of products working together to fulfill each layer of the technology stack. Open standards and platforms simplify these integrations and speed up the time-to-value for Industry 4.0 solutions.
Open Standards. This release adds Sparkplug support over MQTT. Sparkplug is an open standard built on top of MQTT that defines the message format for Sparkplug-enabled applications. Many industrial sensors and systems have adopted the Sparkplug specification with MQTT as a means of integrating systems due to Sparkplug’s prescriptive topic namespace, payload definition, and state management. Using Sparkplug, customers can instantly consume and publish data models to and from other Sparkplug-enabled systems.
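To make the “prescriptive topic namespace” concrete, here is a minimal, hypothetical Python sketch that builds Sparkplug B topic strings. It only illustrates the topic structure; real Sparkplug payloads are protobuf-encoded, and the function name here is my own, not HighByte’s API.

```python
# Hypothetical sketch of the Sparkplug B topic namespace:
#   spBv1.0/<group_id>/<message_type>/<edge_node_id>[/<device_id>]
# Real Sparkplug payloads are protobuf-encoded; this only builds topics.

SPARKPLUG_NAMESPACE = "spBv1.0"

# Node- and device-level message types defined by the specification.
# (STATE messages use a different topic form and are omitted here.)
MESSAGE_TYPES = {"NBIRTH", "NDEATH", "DBIRTH", "DDEATH",
                 "NDATA", "DDATA", "NCMD", "DCMD"}

def sparkplug_topic(group_id, message_type, edge_node_id, device_id=None):
    """Build a Sparkplug B topic string for a node or device message."""
    if message_type not in MESSAGE_TYPES:
        raise ValueError(f"unknown Sparkplug message type: {message_type}")
    parts = [SPARKPLUG_NAMESPACE, group_id, message_type, edge_node_id]
    if device_id is not None:
        parts.append(device_id)
    return "/".join(parts)
```

For example, `sparkplug_topic("PlantA", "DDATA", "Edge01", "Press7")` yields `"spBv1.0/PlantA/DDATA/Edge01/Press7"` — it is this fixed structure that lets Sparkplug-enabled systems discover and consume each other’s data without custom mapping.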
Open Platform. This release also supports OpenJDK v14, a free and open-source implementation of Java, extending the reach of HighByte Intelligence Hub to any OpenJDK-enabled platform. OpenJDK support for the underlying Java virtual machine ensures the solution’s longevity and reduces its cost of ownership for our customers.
HighByte Intelligence Hub version 1.2 offers the following new features:
- Enhanced security. Securely connect and configure HighByte Intelligence Hub using HTTPS.
- More flexible output formats. JSON output can now be further customized, allowing users to flatten or expand the hierarchy. Flexible presentation of information models is essential when supporting multiple use cases and target applications. While MQTT is becoming the de facto protocol for IoT data and many applications support it, each application has nuances in how it expects JSON to be structured. Applications and data storage engines also have unique needs regarding update frequency and how much information is included in the update. Flexible presentation of information models addresses these interoperability challenges.
- Publish only data that changes. Enable a flow to only publish model data that has changed, reducing the amount of data sent to applications.
- Easily organize models and instances. Models and model instances can now be organized into categories, making it easier to manage hundreds or thousands of models across your enterprise. The organization of models and instances is critical as companies scale the size of their deployments.
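The “flatten or expand” and “publish only on change” ideas above can be illustrated with a short, hypothetical Python sketch (the function names are mine, not HighByte’s API):

```python
def flatten(model, parent="", sep="."):
    """Flatten a nested model, e.g. {"line1": {"temp": 72}} -> {"line1.temp": 72}."""
    flat = {}
    for key, value in model.items():
        path = f"{parent}{sep}{key}" if parent else key
        if isinstance(value, dict):
            flat.update(flatten(value, path, sep))  # recurse into sub-models
        else:
            flat[path] = value
    return flat

def changed_only(previous, current):
    """Report-by-exception: keep only keys whose values differ from the last publish."""
    return {k: v for k, v in current.items() if previous.get(k) != v}
```

So `flatten({"line1": {"temp": 72, "speed": 100}})` produces `{"line1.temp": 72, "line1.speed": 100}` for consumers that want flat key/value pairs, and `changed_only` trims an update down to just the values that moved since the last publish.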
Tony Paine Blog Post
Communication within a start-up is pretty straightforward. If you have a question about a new product launch, you go directly to the owner or CEO. Problems with a design flaw? Talk to your lead engineer. As that business scales, your lines of communication become more complex. You may need to send information through multiple channels to get an answer. Without an easy way to send or retrieve information, it might get lost or misinterpreted or you may wait days for an answer. Anyone who has worked in that environment knows the inherent challenges.
Similarly, when organizations implement new industrial IoT solutions, they may work fine at first but become less effective as the project or company scales. The more capabilities you add, the more connections you create throughout your data systems.
For example, today you might need one or two pieces of production data, such as downtime or line speed, from a machine that feeds information into a business intelligence system and an analytics software package from different vendors. As your organization grows, accessing this information becomes more complex because you now have thousands of connection points. Each time you add an application to the system, you need to build connections between the new software and the other systems with which it must communicate. This dramatically increases integration costs and slows deployments.
To scale your industrial IoT implementation, you need a unified namespace. A unified namespace is a software solution that acts as a centralized repository of data, information, and context where any application or device can consume or publish data needed for a specific action. Without a centralized data repository, it could take months to deploy a new analytics application across the entire enterprise versus hours with a unified namespace.
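As a toy illustration of the idea (not any vendor’s implementation), a unified namespace behaves like a retained last-value store with pub/sub over a hierarchical topic space — in practice this is usually an MQTT broker with retained messages:

```python
class UnifiedNamespace:
    """Toy unified namespace: a retained last-value store plus pub/sub.
    Illustrative only; real deployments typically use an MQTT broker."""

    def __init__(self):
        self._values = {}       # topic -> latest payload (retained state)
        self._subscribers = []  # (topic_prefix, callback) pairs

    def publish(self, topic, payload):
        self._values[topic] = payload  # retain the latest value
        for prefix, callback in self._subscribers:
            if topic.startswith(prefix):
                callback(topic, payload)

    def subscribe(self, prefix, callback):
        self._subscribers.append((prefix, callback))

    def snapshot(self, prefix=""):
        """Current state of the namespace under a topic prefix."""
        return {t: v for t, v in self._values.items() if t.startswith(prefix)}
```

The point of the pattern: a new analytics application subscribes once to, say, everything under `"siteA/"` instead of building point-to-point connections to every machine, which is why deployment drops from months to hours.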
For nearly two decades, MQTT has served as an effective messaging protocol that allows any program or device to publish data but doesn’t offer interoperability between third-party devices and applications. Technology companies have brought data interoperability to MQTT devices and applications through the development of the Sparkplug specification.
At HighByte, we view our unified namespace as a middleware solution that allows users to collect data from various sources, add context so there’s meaning to it, and transform it to a format that other systems can understand. That is why we are adding support for Sparkplug in the upcoming October release of HighByte Intelligence Hub.
This is where you begin to unlock the real value of machine learning because you now have the connectivity you need to optimize your systems, devices, and processes in real time and scale your IoT capabilities without costly, time-consuming implementations.
Just to keep my perspective in balance, I attended a national soccer referee instructor virtual training. It was terrible. The presenters were not familiar with the technology and the presentation was incoherent.
Therefore, what a joy to attend another well-done industrial technology conference. The Ignition Community Conference sponsored by Inductive Automation was held Tuesday, Sept. 15, but you can click the link and see the presentations for a while. I attended two other conferences (unfortunately not the Apple one) and therefore ran out of time to watch more today. But I’ll be back to catch other presentations.
Chief Marketing Officer Don Pearson always has a relevant quote to serve as a theme for the conference. “Let’s stop trying to predict the future; let’s build it!” Stop and ponder.
The Ignition platform has greatly expanded over the years without losing the core ideas founder Steve Hechtman first explained to me in 2003. Built upon IT-friendly technology with a core strategy of unlimited licensing, the result is a robust HMI/SCADA offering that is affordable. [Note: Inductive Automation is a sponsor, but I’ve been a fan since long before that happened.]
They call it a “community conference” because they consider themselves, their customers, and their partners as a community. And when they gather physically (and even virtually), the people all act like a community. I always enjoy the conference feeling.
Hechtman discussed how the executive team met as soon as they began hearing about the pandemic to start preparing for remote work. Among other things, they bought a lot of laptop computers. As things went along, they discovered that overall everyone was more productive working remotely. This will become a new way of life for most at the company.
Preparing for business in the Covid environment, they improved the support function and added more people. They improved remote training, and a significant number of customers would prefer to continue with remote training, although many still wish to return to in-person classes when possible.
Speakers extolled the stories coming back about how people are using the Maker edition unveiled in June. Ignition Perspective is another key new product that serves as a base for the main product, Ignition 8.1.
Key themes included filling out the promise of IoT broached back in 2016, mobility, remote control, large enterprise solutions, and working with AWS and Azure. Inductive has come a long way.
Ignition 8.1 is a release that will be supported for five years. Its development theme was refining version 8.0, released last year. Its vision is based on Perspective, which they worked to make the best visualization platform. Perspective makes it easier to create for a variety of platforms, including the new Workstation edition: it’s now not only Web-based but also has a Workstation edition to enhance speed. They have developed an easier-to-use, improved symbol library and added first-party Docker support, plus a Quick Start application to help people new to the platform get configured and ready to use.
Be sure to visit the ICC website and examine all the use cases and partners.
Helps enable seamless test operations and data management across an entire organization
It begins with data and it’s all about data. “It” being improving production, profitability, and safety. It doesn’t end with data, though. A system is required, and part of the system is software that gathers, analyzes, and visualizes data.
NI began with data—test and measurement. It just kept growing. This week it announced the enterprise version of SystemLink software. By standardizing the way data is shared and analyzed, the new enterprise version enables increased visibility and control of test systems across an entire organization. In this way, SystemLink software serves as an important bridge between engineering and manufacturing departments in their efforts to improve overall operational efficiencies and drive digital transformation initiatives.
NI, in its release information, acknowledges data can be an organization’s greatest asset when used to make more informed decisions. It can also be a drain on time and resources when it creates incompatible silos. SystemLink software connects test workflows to business performance, linking people, processes and technology across the enterprise, from engineering to production to the field.
Engineers save time by focusing on quickly spotting patterns and proactively addressing issues before they become problems. “Freeing up engineers to focus on the work that has the largest impact for their organization is smart business,” said Josh Mueller, VP of Experience at NI. “But it is also one of the core components of our company mission — elevating and empowering the engineer.”
Cree Lighting, a leading manufacturer of indoor, outdoor and consumer lighting, has implemented real-time data monitoring and display, post-test analysis and factory management tools using SystemLink software. “SystemLink enables our production floor to step into the future. It equips us with the visibility to respond to market conditions more quickly while optimizing our team’s production efforts all around the world,” said Ian Yeager, test engineering manager at Cree Lighting. “Now, I spend less time managing deployments and post-processing data and more time using the built-in tools to take care of low-hanging opportunities and improve efficiency for my team.”
The new version of SystemLink software is NI’s first hardware-agnostic systems and data management tool. The announcement of enterprise support underscores NI’s enterprise software strategy to help customers accelerate digital transformation efforts by coupling test operations with advanced product analytics enabled through its recent acquisition of OptimalPlus. By unlocking the value of test data and allowing more groups across an enterprise to work together, NI is helping connect the bold people, ideas and technologies required to push our world forward.
In the beginning were data. Increasing amounts of data. But what good is a storehouse filled with data? Especially when you have to build additional storehouses to hold more data?
We all know that this is just a hole to throw good money into unless we can reap benefits from the data.
John Burton, CEO of UrsaLeo, recently gave me some time to explain what this new company is up to. And it’s pretty cool. It begins with data formed into a digital twin of a factory, oil rig, piece of equipment, and the like. But they asked: how can company managers and operators reap benefits from their data? The company began by focusing on visual representations of the data. It hired engineers from the gaming industry to create photo-realistic reproductions of the plant, equipment, building, and so on. Think Augmented Reality and Virtual Reality brought to life by the latest in gaming technology.
They develop on the Unity gaming engine, which is designed to run on just about anything (a legacy of its gaming-industry roots). So, for example, you can get a bird’s-eye view of the factory, with sensors shown perhaps as green dots. Zoom in to see the sensors; zoom in again for ever greater levels of detail, including sensor data. It is a simple screen showing IoT data from sensors and asset databases, all spatially related. The model resides in the cloud, so the same view and information is available to everyone, making it great for remote maintenance support, as well as training and remote control.
Visualization includes x-ray vision, rewind/replay of events, making pieces of equipment fly apart, visualizing fluid flow, and heat maps (showing the temperature of a gas, for example).
Recently, the company announced that it has closed its Seed Round B and raised $725K. The round was led by Keiretsu Forum and Bay Angels with additional private investors. As UrsaLeo continues to grow, the money will be used to expand operations into manufacturing and building management, further product development with collaboration and remote control modes, and enhance the company’s new social distancing technology.
With the closing of the Seed B, the company also announced the hiring of Michael Uribe and Alina Verdiyan as Senior Vice Presidents. Michael will lead the Building Information Management Division and work with developers and building operators to bring cutting edge 3D technology to public spaces. Alina will lead the Manufacturing Division and have worldwide responsibility for dedicated sales, marketing, and operations.
“The UrsaLeo team and traction they’ve achieved with early customers signals an opportunity for rapid growth in valuation,” said Chandika Mendis of Bay Angels. “With the adoption of social distancing technology and digital twinning, one of the hottest sectors at the moment, enterprise software that helps to simplify problem solving in challenging times is an attractive investment. We look forward to tracking their success.”
I first started hearing seriously about DataOps last year at a couple of IT conferences. Then a group of former Kepware executives founded HighByte to point DataOps specifically at the manufacturing industry. I told them I thought they had something there.
A representative of Seagate Technology sent me information about a study done with IDC about data in organizations. I haven’t had a relationship with Seagate for many years, but this is a timely report about enterprise data pointing out that 68% of data available goes unleveraged and that manufacturing is a laggard in this arena.
As enterprise data proliferates at an unprecedented pace – set to grow at a 42.2% annual rate over the next two years – a new report from Seagate and IDC has revealed that the majority (68%) of data available to enterprises goes unleveraged, meaning data management has become more important than ever.
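For a sense of scale, compounding the cited 42.2% annual rate over two years roughly doubles the data volume:

```python
# Compound growth implied by the cited 42.2% annual rate over two years.
cagr = 0.422
growth_factor = (1 + cagr) ** 2
print(round(growth_factor, 2))  # -> 2.02, i.e. about 2x in two years
```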
Furthermore, and somewhat surprisingly, the manufacturing sector shows the lowest level of task automation in data management, the lowest rate of full integration of data management functions, and low adoption of both multicloud and hybrid cloud infrastructures.
The report also identifies the missing link of data management—DataOps—which can help organizations harness more of their data’s value and lead to better business outcomes.
The report, Rethink Data: Put More of Your Data to Work—From Edge to Cloud, is based on a survey of 1,500 global enterprise leaders commissioned by Seagate and conducted by the research firm IDC.
“The report and the survey make clear that winning businesses must have strong mass data operations,” says Seagate CEO Dave Mosley. “The value that a company derives from data directly affects its success.”
Some additional findings include:
- The top five barriers to putting data to work are: 1) making collected data usable, 2) managing the storage of collected data, 3) ensuring that needed data is collected, 4) ensuring the security of collected data, and 5) making the different silos of collected data available.
- Managing data in the multicloud and hybrid cloud are top data management challenges expected by businesses over the next two years.
- Two thirds of survey respondents report insufficient data security, making data security an essential element of any discussion of efficient data management.
The missing link of data management is reported to be data operations, or DataOps. IDC defines DataOps as “the discipline connecting data creators with data consumers.” While the majority of respondents say that DataOps is “very” or “extremely” important, only 10% of organizations report having implemented DataOps fully. The survey demonstrated that, along with other data management solutions, DataOps leads to measurably better business outcomes: it boosts customer loyalty, revenue, profit, and cost savings, among other benefits.
“The findings of this study illustrating that more than two-thirds of available data lies fallow in organizations may seem like disturbing news,” said Phil Goodwin, research director, IDC and principal analyst on the study. “But in truth, it shows how much opportunity and potential organizations already have at their fingertips. Organizations that can harness the value of their data wherever it resides—core, cloud or edge—can generate significant competitive advantage in the marketplace.”
HPE Discover Virtual Experience wrapped up last week, but I have much to think about and report. The HPE team did an excellent job pulling together a conference where we saw many different living rooms and home offices. Tough job; well done.
The release of a new software portfolio from HPE may sound more to the interest of enterprise architects, but I have already seen demos showing how it also aids the coming together of OT and IT, bringing the production side of an enterprise into greater value for the business. This matters as a counterweight to recent enterprise history, in which production was a “black box” that corporate financial geniuses viewed as something that could be moved around chasing low cost.
From the blog of Kumar Sreekanti, CTO and head of software at HPE, we learn about Ezmeral, the brand name for the unified software portfolio.
Digital transformation is being amplified by an order of magnitude. In fact, many business leaders that I’ve spoken with are now embracing a digital-first strategy—to compete and thrive in the midst of a global pandemic. And the enterprises that use data and artificial intelligence effectively are better equipped to evolve rapidly in this dynamic environment. Now these data-driven transformation initiatives are being accelerated to enable faster time-to-market, increased innovation, and greater responsiveness to the business and their customers.
As CTO and head of software at HPE, my focus is on delivering against our edge-to-cloud strategy and vision of providing everything as a service. Software is a critical component of this strategy. It’s also essential to helping our customers succeed in their data-driven digital transformation journeys, now more than ever.
We’re committed to providing a differentiated portfolio of enterprise software to help modernize your applications, unlock insights from your data, and automate your operations—from edge to cloud. Today, we announced that we’ve unified our software portfolio with a new brand: HPE Ezmeral.
The HPE Ezmeral portfolio allows you to:
- Run containers and Kubernetes at scale to modernize apps, from edge to cloud
- Manage your apps, data, and ops – leveraging AI and analytics for faster time-to-insights
- Ensure control for governance, compliance, and lower costs
- Provide enterprise-grade security and authentication to reduce risk
Business innovation relies on applications and data. The apps and data running the enterprise now live everywhere—in data centers, in colocation centers, at the edge, and in the cloud. Most of the applications running businesses today are non-cloud-native, and data is everywhere, with more and more of it being generated at the edge. Our customers are having real issues with non-cloud-native systems that will not or cannot move to the public cloud due to data gravity, latency, application dependency, and regulatory compliance reasons. Data has gravity, so our customers want to bring compute to the data, not data to the compute. And because data is exploding, it’s driving the need for AI and machine learning at enterprise scale—with the ability to harness and leverage petabytes of data.
Our customers want flexibility and openness; they want to eliminate lock-in. They want pay-per-use consumption in an as-a-service model. They want open solutions that give them the best of both worlds—with a modern cloud experience in any location, from edge to cloud. We address these needs by providing HPE GreenLake in the environment of your choice, with a consistent operating model, and with visibility and governance across all enterprise applications and data. Our software provides differentiated IP to deliver these cloud services through HPE GreenLake.
And in today’s news, we announced new cloud services from HPE GreenLake. This includes new HPE GreenLake cloud services for containers and machine learning operations—powered by our HPE Ezmeral Container Platform software to run containerized applications with open source Kubernetes, and HPE Ezmeral ML Ops software to operationalize the machine learning model lifecycle.