ZEDEDA Introduces New Certification to Support Growing Use of Edge Computing

Companies are trying several different strategies to assemble a coalition or collaboration in pursuit of building market momentum. Many of these niche software categories are interesting. I wonder whether the end game is the typical one among technology startups these days—eventual acquisition. Sometimes these work out and sometimes they don’t. But Zededa (I don’t like the all-caps marketing ploy) does have some interesting technology. Here is its latest strategy.

  • Zededa Certified Edge Computing Associate (ZCEA) is a free, vendor-agnostic certification providing fundamental knowledge of the benefits and solutions of edge computing
  • ZCEA is the latest addition to the Zededa Certification Program and part of Zededa Edge Academy, the industry’s first online educational hub
  • Edge computing is expected to be a cornerstone of ongoing digital transformation projects within all types of organizations

The Zededa Certified Edge Computing Associate (ZCEA) certification is available to anyone interested in learning more about the fundamentals of the edge computing industry. The course is free and vendor-agnostic, and the certificate can be shared on resumes, online profiles and job boards.

“As a leader in the industry, ZEDEDA is well-positioned to educate people in the value, challenges, and key industries poised to benefit,” said Kevin Freitas, Senior Director, Learning and Edge Academy. “We’re excited to launch this new certification to share edge computing essential knowledge as this aligns perfectly with our commitment to being leaders in the Edge Management and Orchestration market.”

Seeq, Databricks Partner to Accelerate IT-OT Data Convergence Across the Enterprise

What can we say? It’s still all about data—collect, analyze, present, use.

Seeq Corporation, a leader in advanced analytics for manufacturing, announced a partnership with Databricks, the data and AI company, that brings a native integration between each company’s platform to simplify access to high quality asset and process data, unify IT and OT data processing, and accelerate AI/ML adoption across the industrial manufacturing, pharmaceuticals, chemicals, and energy sectors.

This bi-directional integration enables users to seamlessly combine contextualized time series data from industrial assets with a vast array of enterprise data sources to deliver more robust reporting, analytics, and predictions in their business. Databricks customers can now take advantage of Seeq’s extensive connectivity to time series data sources and power a wide range of analytical use cases across the enterprise. Insights developed in Databricks Lakehouse Platform can be operationalized in Seeq, introducing new opportunities for process experts and data teams to deliver data-driven solutions to increase industrial productivity, improve operational reliability, enhance safety, and accelerate progress towards sustainability goals.

For petrochemical manufacturer Chevron Phillips Chemical Company, the Seeq and Databricks integration accelerated the company’s ability to scale data science and machine learning efforts across multiple digital initiatives involving process and laboratory data.

Brent Railey, Chief Data & Analytics Officer of Chevron Phillips Chemical Company comments, “We are very excited about this partnership, as it will be mutually beneficial for Databricks, Seeq, and their shared customers. Seeq brings key time-series functionality that just isn’t available in other solutions. Seeq also simplifies the complexities of connecting to various types of process data sources. Databricks brings scalable, elastic data engineering and data science capabilities at an affordable price. Seeq can bring data to Databricks for complementary analytic purposes within Databricks. Databricks can serve cleansed and refined IIoT data to Seeq for self-service analytics. This partnership should make this one-two punch even more powerful!”

“Our collaboration with Seeq unlocks tremendous value for customers, making it simpler for organizations to operationalize and democratize IoT datasets by leveraging the open and secure foundation of the Databricks Lakehouse Platform. This significantly lowers the barrier to data-driven innovation in the industry,” says Shiv Trisal, Global Manufacturing Industry Leader at Databricks.

“The Seeq and Databricks integration is a critical step toward bridging the communication gap between operations technology and information technology personnel, which will drive increased machine learning value across the enterprise,” says Megan Buntain, VP of Global Partnerships and Ecosystem at Seeq. “We’re thrilled to add Databricks to the Seeq partner ecosystem and look forward to continuing to innovate with their team to improve outcomes for manufacturers.”

Yokogawa to Release Collaborative Information Server Upgrade

More robust alarm management, improved access to maintenance information, and enhanced support of international standards.

I once knew many people at Yokogawa and visited their offices outside Atlanta and Houston a few times. These days I see Dave Emerson and Penny Chen a couple of times a year but haven’t heard anything from marketing for years. Now I’m back on Yokogawa’s radar.

This news is interesting because data has become such an important topic. It reveals updates to the Collaborative Information Server (CI Server), part of the OpreX Control and Safety System family. The update provides more robust alarm management than before, improved access to maintenance information, and expanded support for international communications standards.

This solution will bring together large volumes of data from various plant equipment and systems to enable the optimized management of production activities across an entire enterprise, and provide the environment needed to remotely monitor and control operations from any location and make swift decisions.

Yokogawa came out with the first release of CI Server in 2021. By integrating the handling of all kinds of data from plant equipment, devices, and systems, this solution facilitates the optimized management of production activities across an entire enterprise. With this latest upgrade to CI Server, the company has introduced new functions that enhance its connectivity with plant systems, equipment, and devices, and enable the integrated management of alarms and the use of field device data for such purposes as routine maintenance.

1. OPC UA A&C message reception function

CI Server now supports the client function of OPC Unified Architecture Alarms & Conditions (UA A&C). This enables CI Server to safely and securely receive alarms, system state information, and other kinds of data from any device that supports the OPC UA A&C server function, including third-party products.
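For a sense of what an A&C client does with incoming events: OPC UA assigns every event a severity from 1 (lowest) to 1000 (highest), and a receiving client typically maps that range onto its own alarm priorities before logging or display. Here is a minimal Python sketch of that normalization step; the priority bands, field names, and formatting are illustrative assumptions, not CI Server’s actual behavior.

```python
# Hypothetical sketch of how an OPC UA A&C client might normalize incoming
# alarm events. Priority bands and field names are illustrative assumptions,
# not CI Server's actual interface.

# OPC UA event severity is an integer from 1 (lowest) to 1000 (highest).
PRIORITY_BANDS = [  # (minimum severity, display priority) -- assumed mapping
    (801, "URGENT"),
    (601, "HIGH"),
    (401, "MEDIUM"),
    (1, "LOW"),
]

def classify(severity: int) -> str:
    """Map an OPC UA severity (1-1000) onto a display priority."""
    for floor, priority in PRIORITY_BANDS:
        if severity >= floor:
            return priority
    raise ValueError(f"severity out of range: {severity}")

def format_alarm(event: dict) -> str:
    """Render one received A&C event as a single log line."""
    return f"[{classify(event['Severity'])}] {event['SourceName']}: {event['Message']}"

print(format_alarm({"Severity": 900, "SourceName": "Reactor-1",
                    "Message": "High temperature"}))
# prints: [URGENT] Reactor-1: High temperature
```

A real client would receive these events over a subscription from the server rather than from a local dictionary; the mapping step, however, looks much the same.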

2. CENTUM VP and ProSafe-RS alarm receiving function through V net

In addition to the monitoring of plant operations, CI Server can now directly receive CENTUM VP and ProSafe-RS alarms through V net. With the addition of this function, linkage with Yokogawa’s CENTUM VP distributed control system and ProSafe-RS safety instrumented system has been enhanced.

3. Linkage with Plant Resource Manager (PRM)

CI Server can now link with Yokogawa’s PRM (R4.06 or later) and is thus able to handle and provide to higher-level systems all the information that is managed using this software package. By linking with data from multiple application programs, CI Server enables the comprehensive monitoring of field device status information, parameters, and other kinds of data.

CI Server alarm integration and linkage with PRM

ZEDEDA Launches Edge Computing Application Services Suite

I’ve not seen much in the way of investment in traditional automation products. The larger companies now all call themselves software companies with investments devoted to acquisitions. Many smaller companies and startups either have a software niche or are working on a variety of edge applications.

A long-time contact from the IT world introduced me to ZEDEDA a few years ago and even asked me to appear on a couple of their webcasts. Its niche is called edge orchestration, and it has some significant investors. This new introduction is called Edge Application Services. This platform includes granular edge application controls and configuration services. The initial component of the platform, Edge Access, provides secure access, control and audit tracing for edge deployments.

Edge computing is required to manage and process the data generated by distributed devices, but the complexity of distributed environments can make it difficult for customers to get started quickly. Enabling access to core services can provide an on-ramp for organizations to benefit from an initial edge use case while also establishing a foundation for future growth, just as was seen previously with cloud adoption.

“Just as we saw occur with the cloud providers in the early days, it is time for the edge market to evolve beyond just infrastructure and begin to offer value-added services in addition,” said Said Ouissal, founder and CEO of ZEDEDA. “Now, with ZEDEDA Edge Application Services, we are able to offer our customers the ability to manage, configure and control their edge applications simply by leveraging the ZEDEDA ecosystem.”

The first service in the suite, ZEDEDA Edge Access, enables IT administrators and platform operations teams to instantly access any remote device from any location at any time. It is a simple solution that provides secure access, control and audit tracing for edge deployments.

ZEDEDA’s open, distributed, cloud-native edge management and orchestration solution has attracted strategic OEM and customer relationships with Global 500 companies, including Emerson, Rockwell Automation, and VMware. The company continues to quadruple the number of edge nodes it has under management annually, scaling toward a hundred thousand edge nodes. It has raised more than $55 million in capital from investors including Coast Range Capital, Lux Capital, Energize Ventures, Porsche Ventures, Chevron Technology Ventures, Emerson Ventures, Juniper Networks, Rockwell Automation, Samsung Next, and EDF North America Ventures.

OMAC and ei3 Release New Manufacturing Data Governance Guide 

OMAC was born about the same time that I left manufacturing to become an editor at Control Engineering. Originally the Open Modular Architecture Controller initiative, it held center stage at the annual ARC Industry Forum for many years. The open computer as a PLC replacement, designed to drive down the high costs of controllers, never really made it. The organization did drive a number of initiatives benefiting users, especially PackML, a uniform way of describing the processes of a machine to an operator.

Now called The Organization for Machine Automation and Control, it has found a home within the Packaging Machinery Manufacturers Institute (PMMI), which also was the temporary home of my old magazine, Automation World.

I haven’t seen or heard much from OMAC for quite some time. I’m glad to see some news with the release of some useful information. 

OMAC, the leading industry association for machine automation and control, in collaboration with ei3, has released a new guide on industrial data sharing titled “Practical Considerations for Data Governance.” The guide provides expert insights and actionable recommendations to help organizations establish effective data governance and sharing practices.

The document covers several critical topics, including the importance of data governance, the need for a common language to facilitate data exchange across systems, and the various applications that use plant floor data. It examines the key components for plant floor data, the significance of data governance standards and organizations in manufacturing, and establishes a structure for data access and integration from multiple sources.

Spencer Cramer, OMAC’s Chairman, and the Founder and CEO of ei3, emphasized the critical need for organizations to collect data and transform it into actionable insights to optimize production and efficiency in the constantly evolving manufacturing landscape. “We are excited to launch this guide and provide the industry with a resource that outlines practical considerations for facilitating data sharing within and across organizations to improve processes, save costs, and mitigate errors,” he said.

Glad to see Mark Fondl still active and looking at how to use new technologies. Back when I was a new technology editor, Fondl and I had long discussions about the future of Ethernet as an industrial network. That was not a foregone conclusion in the late 90s. Now it’s standard.

Mark Fondl, OMAC’s Digital Transformation Workgroup Leader and Vice President of Product Management at ei3, explained that plant floor data can be valuable for different user groups, and data scientists can help maximize its potential. He added, “To efficiently use plant floor data, an assessment should be made, and a team should be created to ensure all stakeholders are coordinated. The data governance policy should include both IT and OT, and guidelines should be provided for internal and external companies. The plan should be adaptable to changing capabilities to ensure its long-term success.”

OPC, MQTT, IoT, Edge, Power Future Manufacturing Technology

There was a time when I would take information from OPC Foundation and chat with the MQTT people and then return the favor. It was much like being in the midst of a religious war.

My response was (is) that the market will decide. Individual engineers will choose the solution that best fits their needs at the time. If both technologies have sufficient benefit to enough engineers to form a market, then both will survive. I think there is room in the market for both; they sort of do the same thing, yet each provides unique benefits.

I’ve been thinking about this for a while, since I’ve had so many other things to digest. The impetus came from a couple of directions—OPC Foundation President Stefan Hoppe’s editorial in the June newsletter and a recent issue of Stacey Higginbotham’s IoT newsletter that discussed the edge.

Hoppe wrote, “Still to this day people only think of OPC UA merely as a secure protocol to move information. It is so much more than that. It is a modeling language in cloud applications and digital twins. It is capable of file transport (since 2009). Most people know that OPC UA started as an initiative in the OT world and expanded from the PLC control plane to SCADA and later to MES and ERP. More and more people are realizing that OPC UA via MQTT is the bridge between OT and IT and is able to push information directly into Microsoft and AWS cloud dashboards without the need for an adapter.”
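To make Hoppe’s OT-to-IT bridge concrete: OPC UA PubSub (Part 14 of the specification) defines a JSON encoding whose NetworkMessages can be published straight to an MQTT topic that a cloud dashboard subscribes to. The sketch below builds such a message with only the Python standard library; the publisher ID, writer ID, and payload values are invented for illustration, and a real deployment would hand the resulting string to an MQTT client.

```python
# Minimal sketch of an OPC UA PubSub JSON NetworkMessage, the encoding OPC UA
# uses over MQTT (OPC UA Part 14). Publisher ID, writer ID, and payload values
# are invented for illustration; actual publishing would use an MQTT client.
import json
import uuid
from datetime import datetime, timezone

def build_network_message(publisher_id: str, writer_id: int, values: dict) -> str:
    """Wrap a set of field values in a JSON NetworkMessage envelope."""
    message = {
        "MessageId": str(uuid.uuid4()),      # unique per message
        "MessageType": "ua-data",            # identifies a data NetworkMessage
        "PublisherId": publisher_id,
        "Messages": [
            {
                "DataSetWriterId": writer_id,
                "Timestamp": datetime.now(timezone.utc).isoformat(),
                "Payload": values,           # field name -> current value
            }
        ],
    }
    return json.dumps(message)

# A dashboard subscribed to the MQTT topic would receive JSON like this:
payload = build_network_message("line-1-plc", 42,
                                {"Temperature": 71.3, "Pressure": 2.5})
print(payload)
```

Because the payload is plain JSON, cloud services can consume it directly, which is the “no adapter needed” point Hoppe makes above.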

From Data to Data Sources

Stacey Higginbotham, writing in Stacey on IoT: “Bringing AI to the farthest edge requires new computing.”

Stacey writes about IoT generally. Most of her topics are commercial/consumer and chips (her reporting background). She does follow the IoT trail into manufacturing at times. In this newsletter she broaches something I’ve been expounding for a long time: edge devices have become smarter, with better communications. Then the IT world came up with the term Edge, which, of course, covers everything in manufacturing.

We’re in the midst of a computing shift that’s turning the back-and-forth between cloud and edge computing on its head. This new form of computing has been creeping to the forefront for the last few years, driven by digital transformations and complicated connected devices such as cars.

But the more recent hype around AI is providing the richest examples of this shift. And it will ultimately require new forms of computing in more places, changing both how we think about the edge and the types of computing we do there. In short, the rise of AI everywhere will lead to new forms of computing specialized for different aspects of the edge. I’m calling this concept the complex edge.

As part of this shift in computing, we have to become more nuanced about what we mean when we talk about the edge. I like to think of it as a continuum moving from the most compute- and power-constrained devices, such as sensors, to the most powerful servers that happen to be located on premises in a factory. In the middle are devices such as tablets, smartphones, programmable logic controllers (PLCs), and gateways that might handle incoming data from PLCs or sensors.

Moreover, each of these devices along the continuum might run their own AI models and require their own specialized type of computing to compare the data coming into those models. For example, I’ve written about the need for sensors to get smarter and process more information directly.

Smart sensors turn to analog compute

Cameras or image sensors are popular examples of such devices. This vision sensor from Useful Sensors, which can do person detection on a $10 device, runs a simple algorithm that looks for people and counts them. At a higher level, which requires more processing power, sensors from Sony or chips from CEVA are able to detect specific movements, faces, or other options.

A few weeks ago at the Sensors Converge event, a company called Polyn Technology showed off a version of a chip designed to take raw data and quickly convert it into an insight. To quickly process analog signals from the environment (such as vibrations or sound), the Polyn chip uses analog processing to process the signal and then sends the “insight” to another computer for more processing.

Not only do we have cameras shooting pictures for QA purposes, they are also streaming video for applications from industrial engineering to surveillance to predictive maintenance. That is a vast amount of data.

We have tools, but we will need more. Chips with built-in communication and analytics are a start.
