HighByte Releases Industrial DataOps Solution with Complete Data Engineering Toolset for Global Manufacturers  

Unified Namespace architectures are the hot IT item right now. HighByte, a pioneer in Industrial DataOps, now supports this architecture in its product.

HighByte, an industrial software company, announced the release of HighByte Intelligence Hub version 3.2, which fully supports Unified Namespace (UNS) architectures and expands data pipeline capabilities, providing a complete data engineering platform for the industrial market. Pipelines are further integrated into the core of the Intelligence Hub and enhanced to provide highly configurable, complex, and sequenced data processing.

Pipelines in the Intelligence Hub can now seamlessly interface with complex systems hosting transactional and historical data, and can sequence dependent transactions with MES and ERP systems. This sequencing, along with new compression capabilities, can tackle “ETL” use cases such as backfilling historical data from time-series databases to the cloud. Pipelines can also now maintain state from one run to the next, which is ideal for capturing the long-running, conditional events used to track machine status and production.
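To make the "maintain state from one run to the next" idea concrete, here is a minimal sketch of a stateful pipeline stage. This is a hypothetical illustration, not HighByte's actual API: state is persisted to disk between runs so a long-running condition, such as a machine staying down across many polls, is reported as a single event rather than rediscovered every run.

```python
import json
from pathlib import Path
from typing import Optional

class StatefulStage:
    """Hypothetical pipeline stage that keeps state across separate runs."""

    def __init__(self, state_file: Path):
        self.state_file = state_file

    def _load(self) -> dict:
        # State survives between runs because it lives on disk, not in memory.
        if self.state_file.exists():
            return json.loads(self.state_file.read_text())
        return {"status": None, "since": None}

    def run(self, sample: dict) -> Optional[dict]:
        """Process one poll of machine data; emit an event only when status changes."""
        state = self._load()
        event = None
        if sample["status"] != state["status"]:
            if state["status"] is not None:
                # Close out the previous condition as one event spanning its duration.
                event = {"status": state["status"],
                         "start": state["since"],
                         "end": sample["ts"]}
            self.state_file.write_text(
                json.dumps({"status": sample["status"], "since": sample["ts"]}))
        return event
```

Feeding three polls — RUNNING at t=0, RUNNING at t=10, DOWN at t=20 — yields one RUNNING event spanning t=0 to t=20, even if each poll was a separate pipeline run.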

Version 3.2 also includes a new UNS Client that provides visualization of the UNS, allowing users to view the contents of any MQTT broker. HighByte Intelligence Hub users can now see the results of their MQTT-based Industrial DataOps workloads in a single, unified engineering environment, eliminating the need for external testing clients.
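The tree-style view such a client presents is derived from the broker's hierarchical topic namespace (conventionally something like enterprise/site/area/line/cell). As a rough illustration of the idea — not HighByte's implementation, and with made-up topic names — flat MQTT topic strings can be folded into a nested tree for browsing:

```python
def build_topic_tree(topics):
    """Fold flat MQTT topic strings into a nested dict a UNS browser could render."""
    tree = {}
    for topic in topics:
        node = tree
        for level in topic.split("/"):
            node = node.setdefault(level, {})  # descend, creating levels as needed
    return tree
```

For example, `build_topic_tree(["acme/plant1/line1/temp", "acme/plant1/line2/temp"])` groups both lines under a shared `acme/plant1` branch, which is exactly the expandable hierarchy a UNS visualization shows.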

The Intelligence Hub now provides a complete UNS infrastructure solution with an integrated data engineering platform, embedded MQTT broker, and visual client. Designed for interoperability and architectural flexibility, the Intelligence Hub can serve as an organization’s sole UNS infrastructure or as a complementary solution among other third-party UNS components. Other features in version 3.2 include a new SQLite connector, enhanced store-and-forward capabilities, and improvements to many existing connectors, including AWS IoT SiteWise, Sparkplug, and PI System.

Ignition Cloud Edition Now Available in Microsoft Azure Marketplace

Pay-As-You-Go Solution Provides a Simpler Way to Leverage More Cloud Capabilities

Inductive Automation recently moved its flagship Ignition software to the cloud. Now it’s available on the Microsoft Azure Marketplace on a pay-as-you-go plan.

Ignition Cloud Edition works together with the standard Ignition platform and Ignition Edge, creating a fully integrated and scalable system that connects people, processes, and programs for true digital transformation.

“Ignition Cloud Edition on Azure combines the vast power of the Ignition platform with Azure’s extensive cloud computing services. Adding this listing to Azure’s marketplace provides an additional avenue for access, bringing additional flexibility to architectures and opening up even more possibilities for innovation,” says Kevin McClusky, Chief Technology Architect, Inductive Automation.

Ignition Cloud Edition comes packaged with Ignition Core Modules: Perspective, Reporting, SQL Bridge, OPC UA, Enterprise Administration Module (EAM), Tag Historian, and Alarm Notification. It also includes the Web Development, Twilio Notification, and Voice Notification modules, as well as the MQTT Engine, MQTT Distributor, and MQTT Transmission modules from Cirrus Link Solutions. A cloud connector module called the MongoDB Module is also included, and Cloud Edition users will get new cloud connector modules as they become available.

Ignition Cloud Edition’s pay-as-you-go purchasing model opens up budget options, allowing companies to use part of their operating expenditure budget allotted for cloud services instead of capital expenditures. It is important to note that Cloud Edition is not SaaS, and users are responsible for configuration, backup, and upgrades of the software. Cloud Edition is backed by TotalCare Support, which includes unlimited phone and web support, upgrade protection, discounts on training courses, and more.

A Better Way To Control Process Quality

Two conversations occurred this week involving variations in feedstock that affect the process and the quality of the end product. One conversation involved coffee beans. The other, steel.

The farmer fortunate enough to sell directly to the roaster (Direct Trade Coffee) can afford to pick ripe berries more often during the season than the farmer selling through a large broker. This reduction in the variability of the coffee bean feedstock allows the roaster to achieve tighter control in the process. Coffee beans purchased through a “middle man” have more variability, requiring the roaster to over-roast (burn) the beans during the process (think large chain coffee). The coffee I’m drinking this morning has only the roaster between me and Diego Chavarria, who farmed those beans in Nicaragua. It’s delicious.

Ah, but what about steel, you wonder? The philosophy is similar, at least through the process. The other conversation involved Fero Labs’ CEO and co-founder, Berk Birand. Fero Labs has developed a software product to help operators and engineers reduce process volatility, gain more control over that process, and consistently produce higher quality product. The two main markets the company serves are steel and cement.

Birand explained the company’s focus by beginning with the main enemy of manufacturing—volatility. There are simply too many variations for manufacturing software to account for. These include feedstock variability, operator variations from shift to shift, and many other environmental factors. This leads process engineers to over-design in order to allow for worst-case scenarios. In fact, he states, the average factory uses 9% more resources than necessary to account for these variations.

Birand and his co-founder are machine learning experts. “Machine Learning works better today because we have ten years of improved data infrastructure, along with much statistical modeling thanks to such initiatives as Six Sigma. Our premise is to reduce overdosing by building a statistical model that can capture and process hundreds of parameters.”

The software builds real-time operations atop its modeling and machine learning engine. One example is steel processing. This type of manufacturing involves a feedstock of scrap steel. I had a customer once who was in this market. The pile of scrap I saw was amazing. Anyway, operators cannot really know the composition of the feedstock going into the furnace. Fero Labs’ software lets operators input composition analysis in real time so they know what to add to a particular batch to bring the steel to spec.
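As a back-of-the-envelope illustration of the kind of calculation involved — Fero Labs' actual model is far richer, capturing hundreds of parameters — a simple mass balance gives the amount of a pure alloying element needed to bring a melt of known composition up to spec:

```python
# Illustrative mass balance, not Fero Labs' model: solve
# (current_kg + addition) / (batch_mass + addition) = target fraction
# for the mass of pure alloying element to add.

def alloy_addition_kg(batch_mass_kg, current_pct, target_pct):
    if target_pct <= current_pct:
        return 0.0  # melt is already at or above spec
    return batch_mass_kg * (target_pct - current_pct) / (100.0 - target_pct)
```

For a hypothetical 100-tonne heat measured at 0.8% manganese against a 1.2% spec, this calls for roughly 405 kg of manganese — and with real, variable scrap, that answer changes batch by batch, which is why real-time analysis matters.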

Trust in “artificial intelligence” or machine learning models is hard to build because it is almost always a “black box.” Engineers cannot peer inside to understand what is going on. Birand describes Fero’s product as a “white box.” Engineers do have the ability to peer inside and see what’s happening. Building trust at this level makes it more likely that the customer will actually use the product.

I seldom have CEO interviews with the depth of detail of this conversation. It has been many years since I actually worked in manufacturing. These conversations bring back the excitement of finding a better way to make things.

Measure and Manage Greenhouse Gas Emissions

Honeywell engineers have been busy with a variety of sustainability technology applications. Beyond what I’ve covered previously, here is another initiative.

Honeywell announced an expansion to its Honeywell Forge Sustainability+ for Industrials | Emissions Management software application that allows industrial companies to measure and monitor both direct and indirect greenhouse gas (GHG) emissions of their operations.

The software application can collect data from Honeywell’s leading sensors and gas-cloud imaging cameras to measure direct GHG emissions, also known as Scope 1 emissions. The new capability aggregates data from additional sources to measure indirect GHG emissions from the purchase of energy, known as Scope 2 emissions.
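The Scope 1 / Scope 2 distinction boils down to a roll-up like the following sketch. This is illustrative only, with made-up numbers and an assumed grid emission factor — not Honeywell's implementation:

```python
# Illustrative GHG roll-up: Scope 1 sums directly measured emissions;
# Scope 2 converts purchased energy to CO2e using a grid emission factor.

def total_co2e_tonnes(measured_scope1_kg, purchased_kwh, grid_factor_kg_per_kwh):
    scope1 = sum(measured_scope1_kg)                 # direct emissions, metered at source
    scope2 = purchased_kwh * grid_factor_kg_per_kwh  # indirect emissions from energy purchases
    return (scope1 + scope2) / 1000.0                # kg CO2e -> tonnes CO2e
```

For example, 2,000 kg of metered Scope 1 emissions plus 100,000 kWh of purchased electricity at an assumed 0.4 kg CO2e/kWh totals 42 tonnes CO2e.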

Despite global efforts in decarbonizing the power sector, electricity and heat generation are responsible for over 40% of global CO2 emissions, one of the main types of greenhouse gas. Measuring, calculating, and accounting for emissions are key steps toward abating them and are incentivized by the recently enacted Inflation Reduction Act in the U.S. GHG emissions are also regulated around the world.

“Honeywell’s newly expanded solution provides customers with a more comprehensive view of their emissions and a critical tool toward meeting their sustainability goals,” said Ravikrishnan Srinivasan, vice president and general manager of Emissions Management at Honeywell Connected Enterprise. “Honeywell is uniquely positioned with its ready-now technology and experience to be the transformational partner that helps organizations accelerate their progress in achieving sustainability outcomes.”

Digital Twin Consortium Publishes Platform Stack Architectural Framework

Are you interested in building digital twins? Here is a potential source for ideas. Also, it can be a place to contribute to the growth of the technology.

The Digital Twin Consortium (DTC) announced the Platform Stack Architectural Framework: An Introductory Guide. The guide, designed for C-Suite and business leaders, provides foundational building blocks and central concepts of a digital twin system. System architects can use it to guide technology selection through development.

“Digital twins and enabling technologies are revolutionizing how we approach even the simplest of tasks, from managing the flow of stock in a warehouse to designing, deploying and maintaining a fleet of aircraft,” said Dan Isaacs, GM & CTO, DTC. “Digital twin systems accelerate digitization as they provide organizations the means to operate more efficiently, effectively and adhere to best practices and guidelines.”

The guide discusses the IT/OT infrastructure, virtual representation, service interfaces, applications, and mechanisms for synchronizing real-world data. The guide reviews commonly adopted technological approaches and standards and emphasizes the importance of security, trustworthiness, and governance.

“The Platform Stack Architectural Framework: An Introductory Guide answers fundamental questions such as ‘What are the critical constituent parts of a digital twin system?’ and ‘What elements take a solution from being a great model or simulation to qualifying as a digital twin?’” said Dr. David McKee, Entrepreneur and Portfolio CTO at Counterpoint Technologies, and Co-Chair of the Capabilities and Technology Working Group, DTC. “The guide also helps business leaders and developers understand how to design and architect digital twin systems with best practices for scalability, interoperability, and composability to realize their transformative value.”

The guide discusses five use cases of varying maturity levels with examples of how designers can use the architecture in practice. The use cases include buildings as batteries, emergency communication services, manufacturing quality control via remote operator, scope 3 carbon reporting emissions, and infectious disease management. Technology Readiness Levels (outlined in the guide) help designers understand the technical maturity of a system as it moves through the following stages:

  • Technical modeling and simulation, starting with theoretical models and improving to models based on real-world data
  • Digital twins as individual components, based on actual data and validated in the real world through synchronization
  • Digital twin systems in production and operational environments, with system integration and clearly defined synchronization at a specified frequency

The guide is the first in a series of digital twin publications that OMG consortia will publish in the coming months. For more information, please download the Platform Stack Architectural Framework: An Introductory Guide from the DTC website.

About Digital Twin Consortium

Digital Twin Consortium is The Authority in Digital Twin. It coalesces industry, government, and academia to drive consistency in vocabulary, architecture, security, and interoperability of digital twin technology. It advances digital twin technology in many industries, from aerospace to natural resources. Digital Twin Consortium is a program of Object Management Group.

Authentise Releases Threads to Spur Agile Engineering Collaboration

I’ve discussed digital thread technology and its use a couple of times this month. Here is a company, Authentise, for which this is a specialty. The company says its new Authentise Threads product provides unique work thread collaboration that empowers R&D and industrial engineering teams to flexibly speed up, track, and integrate product development. I named my website The Manufacturing Connection 10 years ago because I saw that the future was not simply automation but connections of many types. Here is an example.

Andre Wegner, CEO of Authentise, notes that “despite the noise about the need to be more agile, it’s clear there’s a relative lack of software solutions available today to support R&D, industrial engineering and manufacturing to actually accomplish this. If the definition and simulation of product and process is digital, then there’s no reason we cannot adopt similar processes to those pioneered in software and move at digital development speeds.”

Authentise Threads sits alongside existing engineering and project management systems to provide key features such as:

  • Cross-Functional Work Thread Collaboration. Create, search, follow & link shared work threads across engineering teams and partners with real-time structured communication, chat & notifications.
  • Shared Information, Knowledge, Experience, Resources and Context. A shared repository of all the key data, resources, goals, and metrics needed for work thread execution.
  • Collaborative Digital Decision Making. Formally track and manage work thread efforts, insights, actions, decisions, resolutions, and more.
  • Continuous Learning & Improvement. Share full history & traceability of work, discussions, issues, decisions, actions, and metrics, all with full context.

Authentise Threads delivers value immediately. The R&D organisation of a leading surgical robotics company was up to speed in less than 30 minutes. Within 2 weeks they were seeing a 1.5x ROI on their investment, tracking 100% of their R&D decisions digitally, while saving 150 hours and 20 meetings across a distributed team including external partners. They doubled the effective size of their team.

Since starting at Singularity University in 2012, Authentise has focused on providing flexible, data-driven workflows in the most agile manufacturing and engineering settings. Its tools help manage the order-to-part process by connecting to machines and providing operators with digital tools to enable traceability, repeatability, and efficiency on the shop floor. Initially focused on the additive manufacturing sector, it now has clients such as Boeing, 3M, and Danfoss, who have seen savings of up to 95% with 6x ROI in the first year.
