My first Hannover interview was with Tobias Steinhäußer of the Fraunhofer Institute. I have just begun learning about the Dataspace Connector and the organizations behind it, and Tobias mentioned its relevance to the work this particular Fraunhofer center is doing. The organization sits at the cutting edge of technology and standards for Industrie 4.0. Following is its news release.
The Fraunhofer Cluster of Excellence Cognitive Internet Technologies CCIT will showcase key technologies for an agile, innovative and competitive industry at the digital Hannover Messe (#HM21) from April 12 to 16, 2021. The scientists will present cognitive systems for future-proof assembly and production, the Dataspace Connector as a basis for sovereign data exchange through participation in GAIA-X and International Data Spaces, and transparent machine learning methods for industry.
Cognitive Internet technologies combine the physical world of things with the digital world of data and learning algorithms to create highly intelligent applications. They collect data from different sources, learn to understand them and optimize existing applications or open the door for new business models.
“At Fraunhofer CCIT, more than 20 Fraunhofer institutes are pooling their expertise to develop cognitive Internet technologies for the industry of the future. In doing so, the researchers are taking the next step in the development of Industrie 4.0 and the Internet of Things: they are combining the strengths of people and digital technology. We look forward to presenting specific application scenarios and solutions at the Hannover Messe 2021,” says Christian Banse, head of the Fraunhofer CCIT office.
Trustworthy electronics and intelligent sensor technology
In agile, flexible industrial production, trustworthy electronics and intelligent sensor technology record and communicate data in real time. Fraunhofer CCIT shows examples at its digital #HM21 booth: the intelligent screw connection, which permanently monitors hard-to-reach places and areas by means of a thin-film sensor system, allows screw connections (e.g. on bridges, wind turbines, or machines in production lines) to be monitored wirelessly and energy-autonomously. This combination of different Fraunhofer technologies and their integration into a screw/clamp connection is already available as a product-ready technology solution.
Fraunhofer CCIT is also developing a scalable sensor network that can integrate different sensors and the associated data communication for condition monitoring tasks, e.g. even directly in rolling bearings or in machine-to-machine communication. This means these systems can be adapted to each individual case, with different measurement variables covering a wide range of meaningful damage characteristics. With intelligent sensor technology, radio technology, and autonomous energy supply, the condition of machine tools can also be continuously monitored. This is a key element for optimizing processes in forming as well as in machining. Among the indicators are the tightening forces and temperatures on the tool surface.
The self-measuring localization system FlexLoc allows mobile machine tools to be positioned and networked easily, flexibly, and in real time within the hall infrastructure, and supports tool tracking in human-machine interaction. For example, by attaching sensor nodes, automated guided vehicles (AGVs) can interact with a person carrying an associated tag.
Secure data spaces
A controllable, sovereign and traceable exchange of data across company boundaries is essential in order to survive or even lead the market in digital competition with agile, intelligent service offerings. This requires software solutions that enable the safeguarding of complex processing chains by covering a wide range of security and interoperability requirements while being easy to integrate into existing IT infrastructures.
The Dataspace Connector (DSC), which will be presented at the Fraunhofer CCIT booth, meets these requirements: It has an extensible architecture that enables companies to exchange data in a sovereign manner and can be adapted flexibly, easily and comprehensibly to the companies’ individual business processes and requirements through appropriate configurations.
Companies can integrate the Connector solution, which is available as open source, into an existing container infrastructure, for example. Because data is protected and provided with usage rules before it is exchanged via connectors, the data provider retains control over its data even after the exchange, and thus an overview of what happens to it. Through appropriate extensions, the DSC also supports the three security levels specified in DIN SPEC 27070. This means that companies can easily and efficiently start using the European GAIA-X data infrastructure. The DSC is licensed under Apache 2.0 and freely available.
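The usage-rule idea is easiest to see in miniature. The sketch below is not the DSC's actual API; it is an illustrative Python model of a usage policy (the expiry date and access cap are invented for this example) that a data provider could enforce before releasing data to a consumer:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class UsagePolicy:
    """Illustrative usage rules attached to a shared dataset (not DSC code)."""
    not_after: datetime   # data may not be accessed past this time
    max_accesses: int     # cap on the number of reads
    accesses: int = 0

    def may_access(self, now: datetime) -> bool:
        return now <= self.not_after and self.accesses < self.max_accesses

    def record_access(self, now: datetime) -> bool:
        """Grant and count an access, or refuse if the policy forbids it."""
        if not self.may_access(now):
            return False
        self.accesses += 1
        return True

policy = UsagePolicy(
    not_after=datetime(2021, 12, 31, tzinfo=timezone.utc),
    max_accesses=2,
)
now = datetime(2021, 6, 1, tzinfo=timezone.utc)
print(policy.record_access(now))  # True
print(policy.record_access(now))  # True
print(policy.record_access(now))  # False: access cap reached
```

A real connector enforces such rules on both sides of the exchange, which is what lets the provider keep control after the data has left its premises.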
Fraunhofer CCIT is continuously developing offers for securing business processes so that conditions of use can be enforced in a comprehensible manner. These solutions enable trust in distributed business processes even with increasing complexity of processing chains and thus address the requirements of industry.
Machine Learning for Industry 4.0
Industrial companies can apply the data gained through digitization in innovative artificial intelligence solutions. “Informed Machine Learning”, which was also shaped by the Fraunhofer CCIT, enables companies to transparently track the decision-making processes of learning systems and to intervene at the right points – an important prerequisite for assessing quality, reliability and risks. The research teams will be demonstrating examples of specific application scenarios in industrial production, including quality control at their digital trade fair stand.
An intelligent system consisting of a conventional image recognition process and innovative AI methods detects damage and defects such as paint inclusions or hail damage on reflective surfaces (e.g. car body parts) and automatically categorizes the defects found.

Another technology combines voice and gesture control to create a multimodal dialog assistant for industry. The system enables completely digital defect documentation in production, directly on the component, with very little time expenditure. Users can choose between intuitive pointing gestures and a laser pointer as input methods and mark defect locations on a component quickly, precisely, and intuitively.

A third exhibit shows cognitive tools that digitally assist workers, for example during maintenance on a forming press. A module attached to the screwdriver recognizes individual work steps and compares them with the target process. An additional localization system (QR code tracking) ensures that the work steps are performed at the correct location. The worker receives live feedback on the progress of their work via an app.
Here are the results of a recent survey. Where do you fit on their maturity index? Hopefully you are using these technologies and obtaining positive results.
Analog Devices, Inc. announced a newly commissioned study, conducted by Forrester Consulting on behalf of Analog Devices, which shows that industrial manufacturers who have made investments in connectivity technologies (“high maturity”) are better positioned to drive innovation and gain a competitive advantage than firms that have been slower to implement connectivity (“low maturity”) across the factory floor.
The study, based on a survey of more than 300 manufacturing, operations, and connectivity executives across the globe, found that 85% of high maturity firms are currently using Industrial Internet of Things (IIoT) technologies across much of the factory floor, compared to 17% of low maturity organizations. Over half (53%) of low maturity organizations report that their legacy equipment is unable to communicate with other assets.
“This past year was a true catalyst for digital transformation and many businesses needed to navigate and adopt connectivity strategies that helped them to become more agile and lay the groundwork for future innovation,” said Martin Cotter, SVP Industrial, Consumer & Multi-Markets at Analog Devices. “We see significant opportunity in the adoption of connectivity solutions, including 5G, to help organizations get data more quickly, enabling end applications.”
Findings from the research include:
Connected firms believe that improving network reliability (including adding 5G networks) will create significant opportunity: 68% of high maturity firms say this will enable them to make better use of existing cloud infrastructure and 66% believe their industrial data and IP will be more secure. Conversely, only 21% of low maturity firms believe that improving network reliability will help improve security. However, all respondents agree that improving network reliability will improve efficiency by freeing up employees who are constantly resolving downtime issues.
Low maturity firms struggle with security risk: 54% say that their lack of sophisticated cybersecurity strategy puts their business, customer, and employee safety at risk.
The human element continues to pose challenges: Almost half (47%) of low maturity firms say they lack the expertise to understand which connectivity technologies to invest in, indicating a skills gap. Even high maturity firms report that it is not easy for them to access the insights they need to make labor planning and safety decisions.
Real-time monitoring of equipment and productivity demonstrates an acute awareness of the high cost of unscheduled downtime: High (5%) and medium (17%) maturity firms reported a much lower occurrence of unscheduled downtime of their industrial technology or equipment each week than low maturity companies (53%). These interruptions lead to higher inventory-holding and per-unit labor costs, loss of production and customer confidence, and decreased work capacity.
This research shows us that while many firms are benefiting from the promise of industrial connectivity, others have significant legacy and talent-related hurdles to overcome. Innovation is hindered by both a shortage of in-house expertise and poor interoperability of systems and data, two major obstacles to manufacturing modernization.
Methodology: For this study, “Seamless Connectivity Fuels Industrial Innovation” (March 2021) – commissioned by Analog Devices – Forrester Consulting conducted a global online survey of 312 industrial connectivity strategy leaders. Survey participants included decision-makers in IT, operations, cybersecurity, and general management manufacturing roles. The study was conducted in October 2020.
Looks like we have a serious new entrant in cloud services for manufacturing software—Google Cloud. For many months, AWS and Azure have been the cloud infrastructure partners of choice. Google Cloud has begun making some inroads. Any thoughts on how this will shake out?
I thought I was getting a press release. This looks to be more like a blog post by Dominik Wee, Managing Director Manufacturing and Industrial of Google Cloud.
Across the industrial sector, many digital enablers and disruptive technologies are transforming businesses to be more efficient, profitable, nimble, and secure: smart factories with connected machines, the rise of the industrial internet of things (IIoT), proliferation of sensors and data, and new cloud strategies, to name a few.
Our work with the PI System is designed to help industrial companies modernize their data and take it beyond the operational space to Google Cloud, deriving more insights and business value. Thanks to our partnership with AMD and OSIsoft, now part of AVEVA, customers can effectively, safely, and easily deploy a fully functioning PI System to Google Cloud. Together, we assist customers along that transformational journey through the launch of GCP deployment scripts for PI Core.
The PI System is the market-leading data management platform for industrial operations in essential industries such as energy, mining, oil and gas, utilities, pharmaceutical, facilities, manufacturing, and more. The PI System automatically captures sensor data from every relevant source, refines and contextualizes the data, and delivers it to a broad ecosystem of people, tools, and applications for analysis and strategic planning. It also serves as a single source of truth for consistent decision making.
With the PI System, industrial companies can generate insights from their data, including:
Analyzing seasonal trends
Determining if utilities are meeting the demands of production
Comparing the performance of different lots of raw material
Determining when maintenance is required on equipment
Optimizing the utilization or performance of a production line
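The "determining when maintenance is required" item above can be reduced to a simple heuristic once sensor data is captured. The sketch below is illustrative Python, not PI System code; the window size, threshold, and vibration values are all invented for the example:

```python
def maintenance_due(readings, window=3, threshold=5.0):
    """Flag maintenance when the rolling mean of the last `window`
    sensor readings exceeds `threshold` (an illustrative heuristic)."""
    if len(readings) < window:
        return False  # not enough data yet to judge
    recent = readings[-window:]
    return sum(recent) / window > threshold

# Made-up vibration readings (mm/s) trending upward on a machine tool
vibration = [2.1, 2.3, 2.2, 4.9, 5.6, 6.2]
print(maintenance_due(vibration))  # True: recent average exceeds the limit
```

Production systems would use richer models, but the shape of the decision (contextualized history in, maintenance flag out) is the same.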
The new deployment scripts are a natural addition to PI Integrator for Business Analytics, which integrates Google Cloud with PI Core, the on-premises portion of the PI System. PI Integrator for Business Analytics can deliver critical operational data directly to Google Cloud destinations like Cloud Storage, BigQuery, and Pub/Sub. It can also be deployed either on-prem or on Google Cloud.
The solution allows customers to take time-series data from one or many of their PI Core deployments and move it onto BigQuery, Google Cloud’s enterprise data warehouse for analytics. Customers can take advantage of exabyte-scale storage and petabyte-scale SQL queries in this high-speed, in-memory BI engine, enabling cloud-based artificial intelligence (AI) and machine learning (ML) tools that provide deep insights like anomaly detection and predictive maintenance from that data.
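Anomaly detection of the kind mentioned here can be as simple as a z-score test once the time-series data is in the warehouse. The following is a generic illustrative sketch in Python, not a Google Cloud or PI System API call; the temperature values are invented:

```python
from statistics import mean, stdev

def zscore_anomalies(values, z=2.0):
    """Return the indices of points whose deviation from the mean
    exceeds `z` sample standard deviations."""
    mu, sigma = mean(values), stdev(values)
    return [i for i, v in enumerate(values) if abs(v - mu) > z * sigma]

# Made-up process temperatures with one obvious outlier
temps = [70.1, 70.4, 69.8, 70.2, 70.0, 84.5, 70.3]
print(zscore_anomalies(temps))  # [5]: the 84.5 reading stands out
```

In practice this kind of screen would run as a SQL query or an ML model over the warehoused data, but the statistical idea is the same.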
Through our partnership, PI Core-based solutions for industrial customers are further protected by Google Cloud’s robust security features, like encrypting cloud data at rest by default, that keep data safe and secure. What’s more, PI Core is aligned with real-time mission-critical industrial applications, avoiding delays in time-sensitive process automation or discrete manufacturing processes.
Serving a new slice of PI (System) on Google Cloud
GCP deployment scripts for PI Core are the latest achievement in our partnership. We developed these deployment scripts for PI Core on Google Cloud using Infrastructure as Code (IaC) with PowerShell and Terraform technologies. With PI Core on Google Cloud, customers can generate business insights through the power of Google Cloud’s Smart Data Platform with AI/ML combined with other contextualized data. The scripts accelerate an industrial customer’s digital transformation and support quick and iterative testing and prototyping. They’re also designed for customers who are considering moving their on-prem system to Google Cloud, or customers who would like to deploy PI Core as an aggregation system to bring diverse on-prem data from multiple PI Systems into one place.
These deployment scripts automate the provisioning of the PI Core components running on Google Cloud N2D VM instances powered by hardware such as 2nd Gen AMD EPYC processors. GCP deployment scripts for PI Core include recommended practices for both platforms by delivering automation to deploy test and development environments.
The scripts are offered in non-high availability (non-HA) configuration, with an HA version providing failover capabilities for PI Core coming soon. These scripts are open-sourced, and the non-HA version is now available on Google’s GitHub.
By running PI Core on Google Cloud, the industrial sector has an easy and efficient path to business insights, generating more value from their collected data in a safe and managed environment. Enabling PI Core deployment on Google Cloud through scripts is just one of the solutions that we’re building together. Stay tuned for more updates.
The team that started HighByte picked an open niche in the OT/IT pantheon called DataOps, which I felt was today’s sweet spot. The IT companies introduced me to DataOps (the next thing past DevOps, I guess). The application looked, shall we say, useful. HighByte is up to version 1.4, just released, and already has many applications in the wild. It is well worth your time to check it out.
HighByte announced the release of HighByte Intelligence Hub version 1.4. HighByte Intelligence Hub is an Industrial DataOps solution enabling manufacturers to streamline their data architecture and reduce time to deploy new systems. The software supports open standards and provides Operations Technology (OT) teams with the flexibility to merge, prepare, and deliver industrial information to and from IT systems without writing code. With its latest release, HighByte Intelligence Hub now supports complex, multi-value sets of data, a broader range of disparate data sources, flow consolidation, and additional security features.
“HighByte Intelligence Hub enables me to easily model data—adding context, segmenting, and publishing data to the unified namespace,” said Tim Burris, a software architect at Ectobox. Ectobox is an industrial intelligence solutions company specializing in data-driven manufacturing, ISA-95, and software consulting and development for their clients. “The time HighByte Intelligence Hub has saved me is incredible. I’m using the flow capabilities, updated SQL connector, and REST input to reduce custom code and deliver a maintainable integration solution to our clients. It’s a game changer.”
HighByte works with a network of system integrators and international distributors to take HighByte Intelligence Hub to market, serving customers in a wide variety of vertical markets including biotechnology, packaging, industrial products, and academic labs. “This release focused on meeting the needs of our customers and partners with a more sophisticated solution that can handle complex data,” said HighByte CEO Tony Paine. “HighByte Intelligence Hub is ready to support Edge-to-Cloud and Cloud-to-Edge use cases.”
HighByte Intelligence Hub is an Industrial DataOps application designed for scale. The latest release includes support for complex data sets that can be processed through HighByte Intelligence Hub’s modeling engine, transforming and contextualizing information at high volumes. The release also includes new codeless connections for CSV files and JDBC drivers, significant improvements to existing SQL, MQTT, and OPC UA connectors, multiple flow inputs and outputs, array handling, and REST templates. Security enhancements include secure connections over MQTT with SSL encryption as well as payload encryption and usernames and passwords for OPC UA connections.
I copied the information below from the HighByte blog written by Aron Semle to add more context and user ideas to the product news: https://www.highbyte.com/blog/new-use-cases-highbyte-intelligence-hub-version-1-4-is-here
Highlights & Use Cases
Use Multiple Flow Inputs and Outputs. Let’s say you’ve created a “boiler” model representing the ten boilers you have at your facility. Now you can create a single flow that sends data for all ten boilers to an AWS IoT Core topic, or sends all the boiler data to multiple outputs, like a REST endpoint for debugging and Azure IoT Hub for production.
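A flow with multiple outputs is essentially a fan-out. The sketch below is not HighByte's implementation, just an illustrative Python model (the output names and payload fields are invented) of one modeled payload delivered to several configured sinks:

```python
def fan_out(payload, outputs):
    """Deliver one modeled payload to every configured output sink
    and collect each sink's result (illustrative fan-out)."""
    return {name: send(payload) for name, send in outputs.items()}

# Hypothetical sinks: a REST endpoint for debugging, an IoT hub for production
outputs = {
    "rest_debug": lambda p: f"POST {p['id']}",
    "iot_hub":    lambda p: f"publish {p['id']}",
}
result = fan_out({"id": "boiler-07", "temp_c": 88.0}, outputs)
print(result)  # every sink received the same modeled payload
```

The value of doing this in the hub is that the model is transformed once and routed everywhere, instead of each destination getting its own custom integration code.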
See Your Data. Now you can easily create and test any input in the UI, allowing you to see what data and schema the input returns. Not sure where that “amps” attribute was in that JSON payload? Simply expand the input in the expression builder and see all the data and schema right in the UI.
Work With Arrays. Arrays are everywhere, from OPC UA arrays representing set points in the PLC, to an array of JSON objects representing motor running hours from a maintenance system. HighByte Intelligence Hub now supports inputting and outputting simple and complex array types. You can index arrays in expressions, slice them, dice them, and even build arrays out of primitive OPC UA tags.
Read CSV Files. Many older industrial devices, especially testing equipment, write their data to a CSV file. This new input makes it easy to read one or more CSV files from a directory. Files can be static, or the input can index files, reading them in chunks and moving them to a completion directory when done.
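Chunked CSV ingestion with a completion directory can be sketched in a few lines. This is illustrative Python, not HighByte code; the directory names, file name, and columns are invented for the example:

```python
import csv
import shutil
import tempfile
from pathlib import Path

def ingest_csv_dir(src: Path, done: Path, chunk_size: int = 2):
    """Read every CSV in `src` in fixed-size chunks of rows, then move
    each finished file to the `done` directory (illustrative sketch)."""
    done.mkdir(exist_ok=True)
    chunks = []
    for f in sorted(src.glob("*.csv")):
        with f.open(newline="") as fh:
            rows = list(csv.DictReader(fh))
        for i in range(0, len(rows), chunk_size):
            chunks.append(rows[i:i + chunk_size])
        shutil.move(str(f), done / f.name)  # mark the file as processed
    return chunks

# Demo with a throwaway directory and one small made-up test file
root = Path(tempfile.mkdtemp())
src, done = root / "incoming", root / "completed"
src.mkdir()
(src / "rig1.csv").write_text("ts,psi\n1,30\n2,31\n3,29\n")
chunks = ingest_csv_dir(src, done)
print(chunks)
print(list(src.glob("*.csv")))  # empty: the file was moved to `completed`
```

Moving finished files out of the watch directory is what makes the pattern safe to rerun: a restart never re-reads data that was already ingested.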
Get More From Your OPC UA Connections. OPC UA now supports encryption and username and password. The connector also supports outputting a model to a tag group (i.e., multiple tags), making getting modeled data into an OPC UA server even easier.
Move SQL data Into and Out of Your UNS. You can now read multiple rows from a SQL table, run them through a model to do things like change the column name or add in new data, and output them to MQTT as an array of JSON objects representing each row. This makes getting SQL data into your UNS easy. The SQL connector output also now supports table creation, so you can connect an MQTT input into the UNS, subscribe to a topic, and output the data model directly to a SQL table to review data locally. SQL outputs can also call stored procedures or log data as JSON in cases where the model schema isn’t finalized.
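The row-to-JSON-array transformation described above looks roughly like this. It is an illustrative Python sketch, not HighByte's modeling engine; the column names and rename map are invented:

```python
import json

def rows_to_json(rows, rename):
    """Rename columns and serialize SQL rows as a JSON array of objects,
    ready to publish to an MQTT topic in the UNS (illustrative)."""
    return json.dumps(
        [{rename.get(k, k): v for k, v in row.items()} for row in rows]
    )

# Made-up rows as a SQL driver might return them, with terse column names
rows = [{"mach_id": 7, "tmp": 71.5}, {"mach_id": 8, "tmp": 69.9}]
payload = rows_to_json(rows, {"mach_id": "machine", "tmp": "temperature_f"})
print(payload)
```

The renaming step is the "model" in miniature: consumers of the UNS topic see meaningful field names rather than the source database's column abbreviations.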
Leverage REST Templates. Do you need to get data into InfluxDB using their line protocol? Or Elastic Search? The REST output now allows you to transform the default JSON output to any format you need, including XML. This makes integration with REST based interfaces easy and flexible.
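Transforming a JSON model into InfluxDB line protocol is a small templating exercise. This is an illustrative Python sketch of the line-protocol shape (measurement, tag set, field set, timestamp), not HighByte's REST template feature; the measurement, tag, and field names are invented:

```python
def to_line_protocol(measurement, tags, fields, ts):
    """Render one data point in InfluxDB line-protocol style:
    measurement,tag=value field=value timestamp (illustrative)."""
    tag_str = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    field_str = ",".join(f"{k}={v}" for k, v in sorted(fields.items()))
    return f"{measurement},{tag_str} {field_str} {ts}"

line = to_line_protocol("boiler", {"site": "plant1"}, {"temp_c": 88.0}, 1617235200)
print(line)  # boiler,site=plant1 temp_c=88.0 1617235200
```

A template mechanism generalizes this idea: the same modeled payload can be rendered as line protocol, XML, or whatever shape the receiving REST interface expects.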
Enable Cloud-to-Edge with MQTT. MQTT now supports inputs and secure (TLS) connections, allowing you to connect to MQTT brokers like AWS IoT Core. This enables “Cloud-to-Edge” use cases, where machine learning alerts generated in AWS can be sent back down to HighByte Intelligence Hub, transformed via modeling, and then output to protocols like Sparkplug, making the data immediately available in any Sparkplug-enabled SCADA system like Ignition. Another common use case with a similar data flow is streaming sensor data to AWS IoT Core from third-party sensor providers (i.e. LoRa sensors). This data can be pulled in, modeled, and output to the SCADA system using HighByte Intelligence Hub.
Here’s some news from a new SaaS company led by people I’ve known from other places in my past. They have an interesting take on manufacturing information systems.
ThinkIQ announced a suite of new solutions under its SaaS manufacturing platform, which features four new areas of data functionality, including ThinkIQ’s Visualize, Insight, Transform and Enterprise solutions.
This expanded functionality enables manufacturers to make sense of their data, surfacing actions that enhance safety, reliability, and efficiency by leveraging a fact-based, granular, data-centric view of material flows and related provenance attribute data.
New platform components integrate with existing IoT infrastructure to help manage everything from supply chains to manufacturing processes and beyond. These added capabilities will continue to build upon ThinkIQ’s unprecedented material traceability and insight which helps manufacturers improve yield, quality, safety, compliance and brand confidence while reducing waste and environmental impact.
“The addition of the newest solutions within our platform will help manage the manufacturing process from supply chain to customer,” said Niels Andersen, CTO and CPO of ThinkIQ. “Having this truly transformative intelligence and insight into your supply chain helps organizations make smarter decisions about their processes which in turn makes them become more profitable and more competitive.”
The latest solutions offered on the ThinkIQ platform include:
ThinkIQ Visualize – This functionality moves companies past raw data, enabling them to explore, compare, and stay aware of the data, with standardized metrics and views that bring wide visibility and context to it. ThinkIQ Visualize takes the existing data stream and adds on-premise gateways and connectors to centralize the data. Organizations will be able to see a view of all their data on one screen, and across multiple locations.
ThinkIQ Insight – This new feature uses advanced analytics to enable a material-centric view of operations. Deliverables include advanced visualizations, initial cause & effect identification, industry benchmarking, and cross-plant KPIs. Alerts and notifications bring problems to immediate attention, mitigate recall risks, and highlight potential yield improvements, surfacing insights that could not be seen before.
ThinkIQ Transform – This feature utilizes the results of the earlier steps to supply transformational intelligence and uncover root causes and effects. Data is correlated to the most important metrics. Everyone from the process engineer through the plant manager to the CEO has an instant, intelligent view of operations, one that stretches from the beginning of the supply chain through the plant and beyond.
ThinkIQ Enterprise – This functionality of the platform offers the ongoing benefits of Industry 4.0 manufacturing. The manufacturing process will now include traceability from raw materials to product delivery, as well as optimized supply chains and real-time contextualized data.
ThinkIQ’s SaaS Manufacturing cloud-based platform simplifies the creation of web-based applications and leverages the strengths of the Internet of Things, Big Data, Data Science, Semantic Modeling and Machine Learning. The platform is able to collect data inputs across supply chain (existing and IIoT sensors) and analyze with AI, ML to identify correlations and root causes. It creates a new set of value-added traceability data which is delivered with actionable insights and decisions to guide systems across the supply chain.
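Correlation-based root-cause hints of the sort described here can be illustrated with a plain Pearson coefficient. This is a generic sketch, not ThinkIQ's analytics; the process variable, yield figures, and the relationship between them are invented for the example:

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Made-up data: oven temperature vs. batch yield across five runs
oven_temp = [180, 182, 185, 190, 195]
yield_pct = [97.1, 96.8, 95.9, 94.2, 92.5]
r = pearson(oven_temp, yield_pct)
print(round(r, 3))  # strongly negative: higher temps track lower yield
```

A platform like this would automate such scans across many variables and sites; a strong correlation is a candidate root cause, not proof of one.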
ThinkIQ, a pioneer of Digital Manufacturing Transformation SaaS, delivers unprecedented material traceability and insight into ways to improve yield, quality, safety, compliance and brand confidence. Our transformational intelligence platform delivers a fact-based, granular, data-centric, contextualized view of material flows and related provenance attribute data that integrates into existing IoT infrastructures and crosses supply chains into Smart Manufacturing processes and beyond. Our customers have saved tens of millions of dollars by identifying waste and underperforming assets, as well as by reducing warranty reserves for quality and safety issues. ThinkIQ is a privately held company headquartered in Aliso Viejo, CA.
Open Source, standards-based data platform to stimulate innovation, industrialize data management, and reduce time to market for new solutions in the energy industry.
Readers here know The Open Group perhaps mainly through the Open Process Automation Forum. But this technology consortium has many open-source projects in the works. Today’s news shows the continued vitality of the open-source movement.
The Open Group announced the OSDU Data Platform Mercury Release. Developed by The Open Group OSDU Forum, the OSDU Data Platform is an Open Source, standards-based, and technology-agnostic data platform for the energy industry that stimulates innovation, industrializes data management, and reduces time to market for new solutions.
The OSDU Data Platform will, over time, provide access to a vast portfolio of open and proven vendor-developed applications from a broad range of energy sources. By accessing this ecosystem, developers no longer have to develop and maintain the monolithic architecture needed to deliver unique value-add services. Now, with a single set of well-defined and industry-specific APIs, organizations can easily accelerate platform design and develop proprietary applications on top of the OSDU Data Platform.
With an open-source approach, any company – from established corporations to start-up challenger companies – can contribute new features to the platform, supporting a variety of business workflows. All work is validated by the OSDU Program Management Committee (PMC) to ensure it is aligned with the overall direction of the Forum.
With a single view of industry data, the OSDU Data Platform can be harnessed for innovative business applications. The Mercury Release of the OSDU Data Platform is now available to Operators and Software Developers who want to:
● Liberate data from traditional silos and make all data discoverable and usable in a single data platform
● Enable new integrated exploration and development workflows that reduce overall cycle time
● Take advantage of emerging digital solutions to provide scalability and accelerate decision making
Steve Nunn, President and CEO of The Open Group, commented: “The OSDU Data Platform Mercury Release represents an important achievement by the OSDU Forum in a very short space of time. Established in 2018, the OSDU Forum has accumulated over 185 Member organizations who are collaborating to accelerate innovation and reduce costs in the energy sector. With a standard data platform, energy companies will be able to drive innovation by integrating digital technologies and utilizing open standards for better decision making. Looking ahead, this will be imperative to meet the world’s increasing energy demands while reducing greenhouse gas emissions.”
Johan Krebbers, GM Emerging Digital Technologies / VP IT Innovation, Shell, commented: “At the heart of most energy companies’ strategies is embracing the transformational technologies taking us forward in today’s digital era. This makes the need for a common architectural design clear, one that underpins how our industry works with its data.”
David Eyton, EVP Innovation & Engineering, bp, commented: “Data is at the heart of bp’s transformation into an integrated energy company. We believe that the future of the energy industry will be data driven and dependent on its ability to manage data in a manner that promotes data sharing with partners, innovation through data science, and rapid decision making throughout the lifecycle of the energy value chains. Being a founding member of the OSDU, bp has had an opportunity to be part of an organization that is fundamentally changing the data landscape for our industry. By integrating energy organizations, cloud services providers and software vendors the OSDU is providing an opportunity for collaboration that will be beneficial for all involved. We are very excited about the release of OSDU Mercury and look forward to expanding this approach into engineering, emissions, and new energy.”
To learn more about how to develop applications on the OSDU Data Platform, visit the application developer community page here.
The current OSDU Forum Member List is available here.