Is anyone reading this still curious about Microsoft appointing Satya Nadella, who rose through its web services business (Azure), as CEO, or Amazon appointing Andy Jassy, who led Amazon Web Services (AWS)? Those services continue to grow in importance to industrial software solutions. Case in point: this announcement from Seeq.
Seeq Corporation announced a new offering on Amazon Web Services (AWS) to accelerate access to manufacturing data and enable AWS analytics on industrial data. The Seeq AWS Glue integration for Enterprise Historians, available on AWS Marketplace, simplifies industrial data discovery and migration to the AWS cloud using Seeq’s proven historian data access architecture.
Seeq also announced the completion of its SOC 2 Type 1 certification. SOC 2 compliance is a critical consideration for companies evaluating SaaS applications to ensure vendors have the appropriate controls to protect data handled on their systems.
Seeq enables engineers and scientists in process manufacturing organizations to rapidly analyze, predict, collaborate, and share insights to improve production and business outcomes. Seeq customers include companies in the oil and gas, pharmaceutical, chemical, energy, mining, food and beverage, and other process industries.
AWS Glue is a serverless data integration service that makes it easy to discover, prepare, and combine data for analytics, machine learning, and application development. The integration of Seeq and AWS Glue creates a secure and virtualized connection to on-premises historian databases, manages data discoverability, and enables data science teams to access this data for machine learning in AWS. The result is simplified access to machine data, process data, and contextual data stored in historian databases, including the OSIsoft PI System, OSIsoft PI Asset Framework (AF), and AspenTech IP21, along with other data historians.
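To make the cataloging idea concrete, here is a minimal sketch of how a historian tag's time series might be described as an AWS Glue `TableInput` structure (the shape accepted by Glue's `create_table` call). The tag name, column layout, and S3 location are illustrative assumptions, not details of Seeq's actual integration.

```python
# Illustrative only: map historian tag metadata to an AWS Glue TableInput
# dict (the structure glue.create_table accepts). The tag name, columns,
# and S3 bucket are hypothetical.

def historian_tag_to_table_input(tag_name: str, s3_bucket: str) -> dict:
    """Build a Glue TableInput dict for one historian tag's time series."""
    return {
        "Name": tag_name.lower().replace(".", "_"),
        "StorageDescriptor": {
            "Columns": [
                {"Name": "timestamp", "Type": "timestamp"},
                {"Name": "value", "Type": "double"},
                {"Name": "quality", "Type": "string"},  # e.g. Good/Bad/Uncertain
            ],
            "Location": f"s3://{s3_bucket}/{tag_name}/",
        },
        "Parameters": {"classification": "csv"},
    }

table = historian_tag_to_table_input("Reactor1.Temperature", "plant-history")
print(table["Name"])  # reactor1_temperature
```

Once tables like this exist in the Glue Data Catalog, AWS analytics services can query the historian data without each team rebuilding the connection to the plant.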
“Analytics software for manufacturing customers is an area long overdue for innovation,” says Megan Buntain, Director of Cloud Partnerships at Seeq. “By choosing Seeq SaaS to democratize innovations in big data, machine learning, and computer science, industrial organizations can easily access new capabilities to improve production and business success.”
In addition to Seeq integration with AWS, Seeq connects to an extensive set of data storage platforms from vendors including OSIsoft, Siemens, GE, Honeywell, Inductive Automation, AVEVA, AspenTech, Yokogawa, InfluxDB, Snowflake, and others. Seeq is available worldwide through a global partner network of system integrators, which provides training and resale support for Seeq in over 40 countries, in addition to its direct sales organization in North America and Europe.
The action occurs at the edge these days. At least, edge as defined by the IT groups as they reach out toward the plant or factory with networks, compute, and other technologies as IT searches for more and better data to feed their decision-making systems. This release comes from ZEDEDA, a company relatively new to me that is quickly filling a space in the system.
- Direct integration simplifies secure deployment and management of Kubernetes clusters and hardware at the distributed edge at scale, without requiring specialized IT skills
- Collaboration with SUSE enables fully automated deployment of K3s clusters in the field, directly from ZEDEDA’s built-in app marketplace
- Open foundation and additional support for native Docker containers and VMs on edge hardware prevents lock-in and enables any brownfield or greenfield application
Following are the details:
ZEDEDA has announced direct integration with Kubernetes to simplify remote deployment and management of Kubernetes clusters on edge nodes at scale.
There is a clear trend for the majority of workloads to be containerized and to increasingly leverage Kubernetes for standardization, redundancy, and scale-out. However, this presents users with a challenge because the same tools developed for orchestrating and deploying Kubernetes in centralized data centers or the public cloud do not scale down to constrained edge nodes in the field. Organizations also face the challenge of transitioning to cloud-native development principles at the edge while accommodating their legacy software investments.
ZEDEDA’s cloud-based orchestration solution has a simple and intuitive UI along with comprehensive APIs that abstract all the complexities of provisioning Kubernetes clusters at the distributed edge, automating cluster bring-up on target edge nodes within minutes. With a few clicks or API calls, administrators without specialized IT skills can deploy Kubernetes clusters in the field and remotely manage the entire lifecycle of both their Kubernetes runtime and the underlying hardware at scale.
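ZEDEDA has not published its API in this announcement, so the following is a purely hypothetical sketch of the declarative pattern described above: an administrator submits a small cluster spec, and the orchestrator handles bring-up on the target nodes. Every name here is invented for illustration.

```python
# Hypothetical sketch of declarative edge-cluster provisioning. ZEDEDA's
# real API will differ; all class and method names here are invented.

from dataclasses import dataclass, field

@dataclass
class ClusterSpec:
    name: str
    runtime: str                      # e.g. "k3s"
    node_ids: list = field(default_factory=list)

class EdgeOrchestrator:
    """Stand-in for a cloud orchestrator that pushes cluster specs to nodes."""
    def __init__(self):
        self.deployed = {}

    def deploy(self, spec: ClusterSpec) -> str:
        # In a real system this would trigger cluster bring-up on each
        # target edge node and track lifecycle state.
        self.deployed[spec.name] = spec
        return f"cluster '{spec.name}' ({spec.runtime}) on {len(spec.node_ids)} nodes"

orch = EdgeOrchestrator()
print(orch.deploy(ClusterSpec("line-3", "k3s", ["node-a", "node-b"])))
```

The point of the pattern is that the administrator only declares *what* should run; the orchestrator owns the *how*, which is what removes the need for specialized IT skills in the field.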
“Adding Kubernetes support on top of our already industry-leading distributed edge orchestration solution gives our customers unmatched flexibility,” said ZEDEDA founder and CEO Said Ouissal. “By collaborating with SUSE, we’ve teamed up with the leader in Kubernetes to put these types of advanced deployments within reach for anyone.”
The solution integrates with customers’ existing CI/CD workflow, features a robust and unique Zero Trust security architecture and supports any combination of virtual machines (VMs) and native Docker containers. In addition to supporting the choice of Kubernetes runtime distribution, ZEDEDA’s unique enablement of VMs on edge nodes enables deployment of any combination of legacy Windows-based applications (e.g., SCADA, HMI, Historian, VMS, POS), monolithic Linux-based images, and other popular container runtimes such as Docker/Moby, Azure IoT Edge and AWS Greengrass.
Optimized for the Unique Needs of the Distributed Edge
Orchestration solutions initially built for the data center cannot seamlessly transition to meet the unique needs of distributed edge computing in areas of available compute footprint, autonomy, security and deployment scale. ZEDEDA’s solution is purpose-built to address these needs by starting with a lowest-common-denominator foundation that scales up to create a bridge to the data center paradigm.
Distributed edge nodes under management can range from IoT gateways to managed telco and universal enterprise CPE infrastructure to small remote server clusters. The solution supports any vertical use case including IoT, edge AI, virtualized networking and security technologies in deployments spanning the factory floor, retail stores, oil rigs, wind turbines, transportation systems and beyond. The full-stack edge orchestration approach is based on an eventual consistency model to ensure that edge nodes will run autonomously in these environments, even if they periodically lose connectivity to the ZEDEDA cloud orchestrator.
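The eventual-consistency model described above can be sketched in a few lines: the node caches the last desired state it received from the orchestrator and keeps converging toward it while offline, reconciling whenever connectivity returns. This is an illustration of the general pattern, not ZEDEDA's implementation.

```python
# Minimal sketch of eventual consistency for an autonomous edge node:
# cache the last desired state from the cloud and keep applying it
# while disconnected. Not ZEDEDA's actual code.

class EdgeNode:
    def __init__(self):
        self.cached_desired_state = None   # last state pulled from the cloud
        self.running_state = None

    def sync(self, orchestrator_state=None):
        """orchestrator_state is None when connectivity is lost."""
        if orchestrator_state is not None:
            self.cached_desired_state = orchestrator_state  # refresh cache
        # Converge toward the last known desired state either way.
        self.running_state = self.cached_desired_state
        return self.running_state

node = EdgeNode()
node.sync({"app": "scada", "version": 2})   # connected: cache and apply
print(node.sync(None))                      # offline: keeps running version 2
```

A data-center orchestrator typically assumes it can reach its nodes; flipping that assumption, as above, is what lets an oil rig or wind turbine keep running through a connectivity outage.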
ZEDEDA’s Zero Trust security architecture assumes that edge nodes distributed in the field are physically accessible, in addition to not having a defined network perimeter. Features include support for silicon-based root of trust, measured boot, remote attestation, crypto-based ID (eliminating local device login), full disk encryption, remote port blocking, distributed firewall and more. Distributed firewall capability enables secure routing of data between edge applications and both on-prem and cloud resources based on network-wide policies.
Collaboration with SUSE
Working with a leading oil and gas services company, ZEDEDA and SUSE have recently enabled an edge solution that consolidates existing SCADA software with NFV capability and other functionality on compute clusters within their trucks to optimize monitoring of oil wells through wireline analytics.
SUSE’s K3s runtime is now available in ZEDEDA’s built-in app marketplace, ready for bulk deployment on edge computing clusters with a few clicks.
“Edge is about management at scale, and our customers need a solution that provides low-level hardware management and visibility as they increasingly look to deploy Kubernetes clusters outside of centralized data centers,” said Keith Basil, vice president of Cloud Native Infrastructure at SUSE. “We think it’s great that ZEDEDA has integrated with Kubernetes and that K3s is now available in their marketplace. We are excited for their contribution to the rapidly growing Kubernetes ecosystem.”
Built on an Open Foundation to Prevent Lock-in and Facilitate Ecosystem Growth
ZEDEDA’s subscription-based cloud orchestration solution leverages the bare metal EVE-OS deployed on edge nodes. EVE-OS is an open, secure and universal operating system for distributed edge computing with vendor-neutral APIs, hosted within Project EVE in the Linux Foundation’s LF Edge organization. In addition to preventing vendor lock-in, EVE-OS provides an anchor point to unify an ecosystem of edge computing hardware and software, similar to what Android does for the mobile market.
The company’s growing open edge ecosystem provides end users, solution OEMs and system integrators with choice of hardware, applications and services, thereby greatly reducing the complexity of dealing with the inherently diverse edge landscape. The app marketplace that is part of ZEDEDA’s cloud orchestrator features popular edge applications spanning industrial connectivity, edge application frameworks, security, networking, analytics, data management and cloud connectivity. Customers can also create private marketplaces with their own curated and in-house developed content.
ZEDEDA’s strategic integration with Kubernetes follows their recent announcement of seamless integration with Microsoft Azure IoT to simplify the deployment and scalability of solutions leveraging Azure IoT. Additional integrations that further simplify distributed edge computing deployments are in development.
ThinkIQ introduced itself to me just a couple of months ago, and I set aside some time to talk with Niels Andersen, an acquaintance from the industrial software market. https://themanufacturingconnection.com/2021/02/manufacturing-saas-platform-tracks-material-flow-not-assets/ The company published a couple of news items this month. One announces an extension to the product line; the second announces a partnership with CESMII for a smart manufacturing initiative in the poultry processing industry.
ThinkIQ Adds VisualOps to Its Suite of Products
ThinkIQ announced VisualOps, which is designed to give organizations easy access to data from a material view, new visibility into operations, and a path toward Industry 4.0 manufacturing.
ThinkIQ VisualOps was created as a second step for companies on the path to Industry 4.0 manufacturing. Its benefits include standardized data available in one location, which empowers manufacturing leaders, plant managers, process and data engineers, and operators to explore their manufacturing and supply chain data within the context of their business. The new function can also start the process of creating alerts and notifications that bring problems to immediate attention.
“The addition of VisualOps allows customers to start the journey of monetizing their manufacturing and supply chain data using an Industry 4.0 Platform that will help them achieve their digital transformation goals,” said Niels Andersen, CTO and CPO of ThinkIQ. “This product will help organizations obtain the benefits of Industry 4.0 and lead them on the path to Smart Manufacturing.”
Some of the additional benefits of ThinkIQ VisualOps include:
- Moves companies past raw data to being able to explore, compare, and monitor it, with standardized metrics and views that bring wide visibility and context to what is currently just digital bits.
- Harnesses the power of mostly disconnected existing data streams from IoT, IIoT, HMIs, PLCs, CRM, MES, digitized manual data, and partner data, bringing them into one single location.
- Includes on-premises gateways and connectors that centralize the data and securely send it to the cloud; most clients don’t need to add any new hardware or software to their existing environment.
- Sources existing data from automation, IoT and IIoT, CRM, and other digital captures, and includes an equipment profile library, equipment modeling, manufacturing process layout, trending, standardized dashboards, and basic limits and notifications.
ThinkIQ’s SaaS Manufacturing cloud-based platform simplifies the creation of web-based applications and leverages the strengths of the Internet of Things, Big Data, Data Science, Semantic Modeling and Machine Learning. The platform collects data across the operation (existing and IIoT sensors) and leverages AI, ML to provide actionable real time insights (e.g., identify correlations and root causes, traceability and yield issues, etc.). It creates a new level of capability beyond what independent disconnected operating environments can provide today.
CESMII and ThinkIQ To Transform Global Food Leader’s Poultry Processing
CESMII selected ThinkIQ for inclusion in its co-funded Smart Manufacturing Innovation Projects. This project aims to transform poultry processing operations at one of the world’s largest food companies, a recognized leader in protein. The project seeks to quantify the impact of variability in the supply chain and in the processing of chicken by-products, and to understand the resulting impact on yield for four product streams: bone meal, feather meal, chicken meal, and blood meal.
ThinkIQ will utilize the CESMII Smart Manufacturing Innovation Platform and create Profiles that optimize yield and material utilization on the food company’s poultry processing line. This will enable decisions based on real-time constraints in material flows, manufacturing operations, and energy consumption in a protein-based food processing environment. This project will demonstrate increased operational efficiencies that can be extended to other food processing and energy-intensive industries.
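Quantifying variability across the four by-product streams comes down to tracking yield per batch and its spread. Here is an illustrative calculation using Python's standard `statistics` module; all the yield figures are made up for the example.

```python
# Illustrative yield-variability calculation for the four by-product
# streams named in the project. All figures are hypothetical.

from statistics import mean, pstdev

# kg of finished product per kg of by-product input, by batch (invented)
yields = {
    "bone meal":    [0.24, 0.27, 0.22, 0.25],
    "feather meal": [0.31, 0.30, 0.33, 0.29],
    "chicken meal": [0.55, 0.49, 0.58, 0.52],
    "blood meal":   [0.18, 0.21, 0.17, 0.20],
}

for stream, batches in yields.items():
    avg, sd = mean(batches), pstdev(batches)
    # Coefficient of variation (sd / mean) makes streams comparable
    # even though their absolute yields differ.
    print(f"{stream:12s} mean={avg:.3f}  stdev={sd:.3f}  cv={sd/avg:.1%}")
```

A stream with a high coefficient of variation is where supply-chain or process variability is costing the most yield, and therefore where real-time constraint data pays off first.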
“Delivering value through interoperability and scalability is essential,” says Doug Lawson, CEO of ThinkIQ. “We are leveraging the CESMII SM Innovation Platform to avoid implementing yet another information island on the plant floor, thereby reducing cost and complexity, and facilitating rapid return on investment.”
ThinkIQ’s ability to track material flow through the entire process, from “farm to fork,” will give the company a unique view of its entire poultry processing operation, with visibility into real-time variance in the manufacturing process. That lets it quantify and reduce the impact of variance to improve yield, provide feedback in “plant time” to reduce off-spec product and waste, and surface profit-enhancement opportunities throughout the project.
Out of a conversation I had during Hannover Messe, I received this news of an application using an AI platform for optimization. You can head over to the company’s website for more information.
Canvass Analytics Inc. (“Canvass AI”), a supplier of AI for the industrial sector, announced that a major food ingredient manufacturer is using the Canvass AI platform to significantly reduce its plant’s energy costs and greenhouse gas emissions. In just one year of using the platform, the industrial company has reduced its annual energy costs by 4% and the plant’s carbon emissions by more than 9 million pounds.
As part of the North American ingredient company’s drive to achieve cost efficiencies and reduce the environmental footprint of its operations, the manufacturer implemented the Canvass AI Platform. Canvass AI empowers industrial engineers to harness the power of AI to make data-driven decisions in their day-to-day operations 12x faster than other platforms.
“This world leading manufacturer is using Canvass AI to address three of their top operational challenges: the need to reduce costs, optimize OEE, and reduce their environmental footprint. With Canvass AI, the operations team are able to use AI first-hand to make data-driven decisions that keep their operations within specification, ensuring throughput is maintained while reducing waste,” commented Humera Malik, CEO of Canvass AI.
“What we are seeing is that AI is ready and industrial companies are driving impact with AI today. However, in order to scale AI, companies need to have a clear business outcome in mind, automate a process that is standardized with good quality data, and leverage a platform that is purpose-built and secure. AI, when combined with a culture dedicated to continuous improvement, will enable the fourth industrial revolution driving zero-touch operations and zero defects,” commented Sachin Lulla, Global Digital Strategy and Transformation Leader, Ernst & Young.
Back in the 90s, I used to haul around a $25,000 vision system in the trunk of my car to perform demonstrations of machine vision technology applications.
Today, there is more video power in my smartphone than in that entire system.
Just like all the technologies we use in manufacturing, vision systems and video have become more powerful and useful, most often leveraging consumer electronics or IT innovations. I visited a small chemical refinery that installed streaming video into its operator interface for a unique, but essential, personnel safety/security application. Located in a rural area of Texas, the refinery operators periodically opened the gates to allow railway cars into the facility or to let the filled cars leave. The open gates became a welcome invitation to the local coyote population. Of course, these guys were not wanted wandering around the facility. The video system watched for incursions and alerted personnel.
Not too long ago, the bandwidth required by that streaming video would have been too expensive or awkward to be economical. Now, it’s just another sensor.
Intelligent Video for Health and Safety
These Covid pandemic days have led to new use cases for video. AT&T identifies a few key examples on their video intelligence page:
- Ensuring social distancing
- Counting people to maintain safe capacity
Infrared thermal imaging has progressed to the point that strategically placed thermal imaging cameras can monitor personnel for fevers—an outward sign of potential Covid infection. We can potentially stop the spread of the virus at the plant entrance.
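The screening logic itself is simple once the thermal camera produces a temperature per person; here is a sketch of the threshold check. The 38.0 °C cutoff and badge IDs are illustrative assumptions, not a clinical or vendor standard.

```python
# Sketch of a fever-screening threshold check over thermal-camera
# readings. The 38.0 degree C cutoff and the readings are illustrative.

FEVER_THRESHOLD_C = 38.0

def screen(readings: dict) -> list:
    """Return the IDs whose measured temperature meets or exceeds the cutoff."""
    return [pid for pid, temp in readings.items() if temp >= FEVER_THRESHOLD_C]

entrance_scan = {"badge-101": 36.6, "badge-102": 38.4, "badge-103": 37.1}
print(screen(entrance_scan))   # ['badge-102']
```

In practice the hard part is upstream of this check: locating faces in the thermal image and compensating for ambient conditions, which is where the sophisticated analysis software comes in.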
Another Covid-related application involves contact tracing and social-distancing assurance. These applications require high bandwidth along with sophisticated analysis software—both now readily available. And, both technologies are poised for improvement. We will see 5G installations before long that will improve bandwidth, speed, and latency for video applications.
Outside of these pandemic applications, process plants with hazardous areas have found video sensors to be a perfect solution to determining personnel safety during an incident. Rescue teams need to know who is in the area and where they are. Security teams can be alerted if someone wanders into a hazardous or restricted area.
Intelligent Video for Quality Control
Then we return to the application I once tried to solve: product quality. While it is best practice to fix the process such that defects are not produced, vision inspection is another step in assuring products that fail to meet specification are not shipped to customers. Taking a feedback loop from inspection information provides a pathway to solving the process problem. As network bandwidth improves and video sensors become smaller, cheaper, and faster, these video IoT solutions become more attractive.
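The inspect-then-feed-back loop described above can be sketched in a few lines: gate out-of-spec parts, and report the defect rate back to the process as the feedback signal. The spec limits and measurements are invented for illustration.

```python
# Minimal sketch of vision-inspection gating with a feedback signal.
# Spec limits and measurements are hypothetical.

SPEC_LOW, SPEC_HIGH = 9.8, 10.2   # e.g. a part width in millimetres

def inspect(measurements):
    passed = [m for m in measurements if SPEC_LOW <= m <= SPEC_HIGH]
    failed = [m for m in measurements if not (SPEC_LOW <= m <= SPEC_HIGH)]
    defect_rate = len(failed) / len(measurements)
    # defect_rate is the feedback signal to the process engineers:
    # a rising rate points at a drifting process, not just bad parts.
    return passed, defect_rate

parts = [10.0, 10.1, 10.4, 9.9, 9.6]
shipped, rate = inspect(parts)
print(f"shipped {len(shipped)} of {len(parts)}, defect rate {rate:.0%}")
```

Gating alone only protects the customer; the defect-rate feedback is what eventually fixes the process so the defects stop being produced.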
5G is the Foundation
Apple released its latest iPhone (one of which is lying on my desk) with great hoopla about 5G. Apple pundits were originally less than enthusiastic about the 5G bandwidth. I have been advising them, along with clients and readers, about the tremendous value that will be unlocked by 5G. It may not be as apparent in an individual iPhone, but we will see a massive shift in business and manufacturing applications.
5G skeptics do exist, but most technologists are decidedly bullish on the possibilities. I think that manufacturers of many varieties will begin deploying the networks for one or two of the reasons that fit them, and then discover that they’ve received more benefit than they expected. Then managers and engineers will have difficulty remembering why there was any debate over moving from LTE to 5G.
As the AT&T Business team puts it in their “Agility Refined” white paper:
5G is the next generation of wireless communications technology. In essence, 5G will put the network edge closer to users and devices. It uses mid-band frequencies and millimeter wave (mmWave) to help accomplish this.
5G offers significantly larger spectrum allocations and enables exponentially increased data rates. It has a reduced range compared to today’s 4G frequencies—but the antennae needed for 5G are much smaller. This will allow for a dense network of small cells, enhancing the current user experience.
As you lay out your 5-year-and-beyond scenarios, this intelligent video powered by 5G will be a technology to keep in the narrative.
This post was sponsored by AT&T Business, but the opinions are my own and don’t necessarily represent AT&T Business’s positions or strategies.
Work targeted at fleshing out the Edge continues. This news from LF Edge and the Digital Twin Consortium hits one of my keywords—interoperability. Industry does progress.
The Digital Twin Consortium, which coalesces industry, government, and academia to advance digital twin technology, announced a partnership with LF Edge, an umbrella organization within the Linux Foundation, that aims to establish an open, interoperable framework for edge computing independent of hardware, silicon, cloud, or operating system. Through the liaison, Digital Twin Consortium will work closely with LF Edge’s EdgeX Foundry, an open source, loosely coupled microservices framework. The two organizations will identify and solve common problems in the establishment, management, and operation of digital twins through edge computing platforms.
The liaison has been established to:
- Showcase how a common approach to digital twin technology can allow edge platforms connected to real-world entities to interoperate with virtual representations easily and flexibly.
- Accelerate EdgeX Foundry’s adoption of digital-twin-enabling technology and techniques. Specifically, explore EdgeX’s adoption of the Digital Twin Consortium digital twin reference architecture patterns to demonstrate interoperability.
- Collaborate on open-source projects to facilitate the implementation and consumption of Digital Twin Consortium platform stack reference architecture, guidelines, and related deliverables. Collaborate with LF Edge on the language, definitions, and taxonomy used to discuss digital twin technology.
“To advance edge computing, we need a global ecosystem that supports interoperability,” said Arpit Joshipura, general manager, Networking, Automation, Edge and IoT, the Linux Foundation. “With this collaboration, EdgeX Foundry aims to adopt and showcase the Digital Twin Consortium reference architecture and make it easier for developers to connect any digital twin to physical devices/sensors via EdgeX Foundry in their edge solutions.”
“Edge computing enables a new wave of applications and capabilities in many industries, especially when joined with other technologies, such as 5G, IoT, and digital twin,” said Dr. Said Tabet, Chief Architect, Office of the CTO, Dell Technologies, and Digital Twin Consortium Steering Committee member. “The agreement between LF Edge and Digital Twin Consortium sets the stage for a collaboration that will facilitate the integration of edge platforms and digital twin technologies.”
“We are excited about our collaboration with LF Edge’s EdgeX Foundry,” said Dan Isaacs, Chief Technical Officer, Digital Twin Consortium. “Their knowledge and experience in the integration of edge platforms within organizations will be invaluable as our organizations collaborate to advance the use of digital twin technology.”