Pay-As-You-Go Solution Provides a Simpler Way to Leverage More Cloud Capabilities
Inductive Automation recently moved its flagship Ignition software to the cloud. Now it’s available on the Microsoft Azure Marketplace on a pay-as-you-go plan.
Ignition Cloud Edition works together with the standard Ignition platform and Ignition Edge, creating a fully integrated and scalable system that connects people, processes, and programs for true digital transformation.
“Ignition Cloud Edition on Azure combines the vast power of the Ignition platform with Azure’s extensive cloud computing services. Adding this listing to Azure’s marketplace provides an additional avenue for access, bringing additional flexibility to architectures and opening up even more possibilities for innovation,” says Kevin McClusky, Chief Technology Architect, Inductive Automation.
Ignition Cloud Edition comes packaged with Ignition Core Modules: Perspective, Reporting, SQL Bridge, OPC UA, Enterprise Administration Module (EAM), Tag Historian, and Alarm Notification. It also includes the Web Development, Twilio Notification, and Voice Notification modules, as well as the MQTT Engine, MQTT Distributor, and MQTT Transmission modules from Cirrus Link Solutions. A cloud connector module called the MongoDB Module is also included, and Cloud Edition users will get new cloud connector modules as they become available.
Ignition Cloud Edition’s pay-as-you-go purchasing model opens up budget options, allowing companies to use part of their operating expenditure budget allotted for cloud services instead of capital expenditures. It is important to note that Cloud Edition is not SaaS, and users are responsible for configuration, backup, and upgrades of the software. Cloud Edition is backed by TotalCare Support, which includes unlimited phone and web support, upgrade protection, discounts on training courses, and more.
What can we say? It’s still all about data—collect, analyze, present, use.
Seeq Corporation, a leader in advanced analytics for manufacturing, announced a partnership with Databricks, the data and AI company, that brings a native integration between the two companies' platforms. The integration aims to simplify access to high-quality asset and process data, unify IT and OT data processing, and accelerate AI/ML adoption across the industrial manufacturing, pharmaceuticals, chemicals, and energy sectors.
This bi-directional integration enables users to seamlessly combine contextualized time series data from industrial assets with a vast array of enterprise data sources to deliver more robust reporting, analytics, and predictions in their business. Databricks customers can now take advantage of Seeq’s extensive connectivity to time series data sources and power a wide range of analytical use cases across the enterprise. Insights developed in the Databricks Lakehouse Platform can be operationalized in Seeq, introducing new opportunities for process experts and data teams to deliver data-driven solutions to increase industrial productivity, improve operational reliability, enhance safety, and accelerate progress toward sustainability goals.
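The core technical idea here, contextualizing time series data against enterprise records, usually comes down to an "as-of" join: each sensor reading is tagged with whichever business record (a production batch, say) was active at that timestamp. As a minimal sketch of the pattern (the readings, batch IDs, and join logic are invented for illustration, not Seeq's or Databricks' actual API):

```python
import bisect
from datetime import datetime

# Hypothetical sensor readings (timestamp, temperature) from a process historian
readings = [
    (datetime(2023, 7, 1, 8, 0), 71.2),
    (datetime(2023, 7, 1, 9, 0), 74.8),
    (datetime(2023, 7, 1, 10, 0), 79.5),
]

# Hypothetical enterprise records: production batches and their start times
batches = [
    (datetime(2023, 7, 1, 7, 30), "BATCH-001"),
    (datetime(2023, 7, 1, 9, 30), "BATCH-002"),
]

def contextualize(readings, batches):
    """As-of join: tag each reading with the batch running at that time."""
    starts = [t for t, _ in batches]  # batch start times, already sorted
    out = []
    for ts, value in readings:
        # Find the most recent batch start at or before this reading
        i = bisect.bisect_right(starts, ts) - 1
        batch = batches[i][1] if i >= 0 else None
        out.append((ts, value, batch))
    return out

for row in contextualize(readings, batches):
    print(row)
```

In practice the same join runs at far larger scale inside the platforms themselves; the sketch just shows why contextualization needs time-aware matching rather than a plain key join.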
For petrochemical manufacturer Chevron Phillips Chemical Company, the Seeq and Databricks integration accelerated the company’s ability to scale data science and machine learning efforts across multiple digital initiatives involving process and laboratory data.
Brent Railey, Chief Data & Analytics Officer of Chevron Phillips Chemical Company comments, “We are very excited about this partnership, as it will be mutually beneficial for Databricks, Seeq, and their shared customers. Seeq brings key time-series functionality that just isn’t available in other solutions. Seeq also simplifies the complexities of connecting to various types of process data sources. Databricks brings scalable, elastic data engineering and data science capabilities at an affordable price. Seeq can bring data to Databricks for complementary analytic purposes within Databricks. Databricks can serve cleansed and refined IIoT data to Seeq for self-service analytics. This partnership should make this one-two punch even more powerful!”
“Our collaboration with Seeq unlocks tremendous value for customers, making it simpler for organizations to operationalize and democratize IoT datasets by leveraging the open and secure foundation of the Databricks Lakehouse Platform. This significantly lowers the barrier to data-driven innovation in the industry,” says Shiv Trisal, Global Manufacturing Industry Leader at Databricks.
“The Seeq and Databricks integration is a critical step toward bridging the communication gap between operations technology and information technology personnel, which will drive increased machine learning value across the enterprise,” says Megan Buntain, VP of Global Partnerships and Ecosystem at Seeq. “We’re thrilled to add Databricks to the Seeq partner ecosystem and look forward to continuing to innovate with their team to improve outcomes for manufacturers.”
Honeywell engineers have been busy with a variety of sustainability technology applications. Beyond what I’ve written about previously, here is another initiative.
Honeywell announced an expansion to its Honeywell Forge Sustainability+ for Industrials | Emissions Management software application that allows industrial companies to measure and monitor both direct and indirect greenhouse gas (GHG) emissions of their operations.
The software application can collect data from Honeywell’s leading sensors and gas-cloud imaging cameras to measure direct GHG emissions, also known as Scope 1 emissions. The new capability aggregates data from additional sources to measure indirect GHG emissions from the purchase of energy, known as Scope 2 emissions.
Despite global efforts to decarbonize the power sector, electricity and heat generation are responsible for over 40% of global CO2 emissions, one of the main types of greenhouse gas. Measuring, calculating, and accounting for emissions are key steps toward abating them and are incentivized by the recently enacted Inflation Reduction Act in the U.S. GHG emissions are also regulated around the world.
“Honeywell’s newly expanded solution provides customers with a more comprehensive view of their emissions and a critical tool toward meeting their sustainability goals,” said Ravikrishnan Srinivasan, vice president and general manager of Emissions Management at Honeywell Connected Enterprise. “Honeywell is uniquely positioned with its ready-now technology and experience to be the transformational partner that helps organizations accelerate their progress in achieving sustainability outcomes.”
OMAC was born about the same time that I left manufacturing to become an editor at Control Engineering. Originally known as Open Modular Architecture Computer, it held center stage at the annual ARC Industry Forum for many years. The open computer as a PLC replacement, designed to drive down the high costs of controllers, never really made it. The organization did drive a number of initiatives benefiting users, especially PackML, a uniform way of describing the processes of a machine to an operator.
Now called The Organization for Machine Automation and Control, it has found a home within the Packaging Machinery Manufacturers Institute (PMMI), which was also the temporary home of my old magazine, Automation World.
I haven’t seen or heard much from OMAC for quite some time. I’m glad to see some news with the release of some useful information.
OMAC, the leading industry association for machine automation and control, in collaboration with ei3, has released a new guide on industrial data sharing titled “Practical Considerations for Data Governance.” The guide provides expert insights and actionable recommendations to help organizations establish effective data governance and sharing practices.
The document covers several critical topics, including the importance of data governance, the need for a common language to facilitate data exchange across systems, and the various applications that use plant floor data. It examines the key components for plant floor data, the significance of data governance standards and organizations in manufacturing, and establishes a structure for data access and integration from multiple sources.
Spencer Cramer, OMAC’s Chairman, and the Founder and CEO of ei3, emphasized the critical need for organizations to collect data and transform it into actionable insights to optimize production and efficiency in the constantly evolving manufacturing landscape. “We are excited to launch this guide and provide the industry with a resource that outlines practical considerations for facilitating data sharing within and across organizations to improve processes, save costs, and mitigate errors,” he said.
Glad to see Mark Fondl still active and looking at how to use new technologies. Back when I was a new technology editor, Fondl and I had long discussions about the future of Ethernet as an industrial network. That was not a foregone conclusion in the late 90s. Now, it’s standard.
Mark Fondl, OMAC’s Digital Transformation Workgroup Leader and Vice President of Product Management at ei3, explained that plant floor data can be valuable for different user groups, and data scientists can help maximize its potential. He added, “To efficiently use plant floor data, an assessment should be made, and a team should be created to ensure all stakeholders are coordinated. The data governance policy should include both IT and OT, and guidelines should be provided for internal and external companies. The plan should be adaptable to changing capabilities to ensure its long-term success.”
There was a time when I would take information from OPC Foundation and chat with the MQTT people and then return the favor. It was much like being in the midst of a religious war.
My response was (is) that the market will decide. Individual engineers will choose the solution that best fits their needs at the time. If both technologies have sufficient benefit to enough engineers to form a market, then both will survive. I think there is room in the market for both, since they sort of do the same thing, but actually each provides unique benefits.
I’ve been thinking about this for a while since I’ve had so many other things to digest. The impetus came from a couple of directions—OPC Foundation President Stefan Hoppe’s editorial in the June newsletter and from Stacey Higginbotham’s IoT Newsletter recently that discussed edge.
Hoppe wrote, “Still to this day people only think of OPC UA merely as a secure protocol to move information. It is so much more than that. It is a modeling language in cloud applications and digital twins. It is capable of file transport (since 2009). Most people know that OPC UA started as an initiative in the OT world and expanded from the PLC control plane to SCADA and later to MES and ERP. More and more people are realizing that OPC UA via MQTT is the bridge between OT and IT and is able to push information directly into Microsoft and AWS cloud dashboards without the need for an adapter.”
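The "OPC UA via MQTT" bridge Hoppe describes is OPC UA PubSub with a JSON message mapping published over an MQTT broker. As a rough sketch of what such a payload can look like (field names follow the spirit of the PubSub JSON NetworkMessage; the publisher ID, writer ID, and values are invented, and a real implementation would use an OPC UA stack plus an MQTT client rather than hand-built JSON):

```python
import json
from datetime import datetime, timezone

def build_network_message(publisher_id, dataset_writer_id, fields):
    """Assemble a minimal OPC UA PubSub-style JSON NetworkMessage."""
    return {
        "MessageId": "1",
        "MessageType": "ua-data",
        "PublisherId": publisher_id,
        "Messages": [
            {
                "DataSetWriterId": dataset_writer_id,
                "Timestamp": datetime.now(timezone.utc).isoformat(),
                "Payload": fields,  # the actual process values
            }
        ],
    }

msg = build_network_message("plc-42", 1001, {"Temperature": 73.4, "Pressure": 2.1})
payload = json.dumps(msg)  # this string would be published to an MQTT topic
print(payload[:60])
```

Because the payload is self-describing JSON on a standard broker, a cloud dashboard can subscribe and parse it directly, which is exactly the "no adapter needed" point Hoppe is making.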
From Data to Data Sources
Stacey Higginbotham, writing in Stacey on IoT: “Bringing AI to the farthest edge requires new computing.”
Stacey writes about IoT generally. Most of her topics are commercial/consumer and chips (her reporting background). She does follow the IoT trail into manufacturing at times. In this newsletter she broaches something I’ve been expounding for a long time: how edge devices have become smarter with better communications. Then the IT world came up with the term Edge, which is, of course, everything in manufacturing.
We’re in the midst of a computing shift that’s turning the back-and-forth between cloud and edge computing on its head. This new form of computing has been creeping to the forefront for the last few years, driven by digital transformations and complicated connected devices such as cars.
But the more recent hype around AI is providing the richest examples of this shift. And it will ultimately require new forms of computing in more places, changing both how we think about the edge and the types of computing we do there. In short, the rise of AI everywhere will lead to new forms of computing specialized for different aspects of the edge. I’m calling this concept the complex edge.
As part of this shift in computing, we have to become more nuanced about what we mean when we talk about the edge. I like to think of it as a continuum moving from the most compute and power-constrained devices such as sensors to the most powerful servers that happen to be located on premise in a factory. In the middle are devices such as tablets, smartphones, programmable logic controllers (PLCs), and gateways that might handle incoming data from PLCs or sensors.
Moreover, each of these devices along the continuum might run their own AI models and require their own specialized type of computing to compare the data coming into those models. For example, I’ve written about the need for sensors to get smarter and process more information directly.
Smart sensors turn to analog compute
Cameras or image sensors are popular examples of such devices. This vision sensor from Useful Sensors, which can do person detection on a $10 device, runs a simple algorithm that looks for people and counts them. At a higher level, which requires more processing power, sensors from Sony or chips from CEVA are able to detect specific movements, faces, or other options.
A few weeks ago at the Sensors Converge event, a company called Polyn Technology showed off a version of a chip designed to take raw data and quickly convert it into an insight. To quickly process analog signals from the environment (such as vibrations or sound), the Polyn chip uses analog processing to process the signal and then sends the “insight” to another computer for more processing.
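The pattern Polyn's chip embodies, reduce a raw signal to a compact "insight" at the edge instead of streaming every sample upstream, can be sketched in software (Polyn does this in analog hardware; the RMS metric and alarm threshold below are invented for illustration):

```python
import math

def vibration_insight(samples, alarm_rms=0.5):
    """Condense raw vibration samples into an RMS figure plus a verdict.

    Only this small dict would be sent upstream, not the raw waveform.
    """
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return {"rms": round(rms, 3), "alarm": rms > alarm_rms}

# A quiet signal and a rough one
print(vibration_insight([0.01, -0.02, 0.015, -0.01]))  # no alarm
print(vibration_insight([0.6, -0.7, 0.65, -0.55]))     # alarm
```

Whether done in analog silicon or a microcontroller, the payoff is the same: a few bytes of insight travel the network instead of a continuous raw stream.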
Cameras are not only shooting pictures for QA purposes but also streaming video for applications ranging from industrial engineering to surveillance to predictive maintenance. That is a vast amount of data.
We have tools, but we will need more. Chips with built-in communication and analytics are a start.
Data management and analytics are two consistent trends in the industrial/manufacturing technology market. This announcement from Emerson, now calling itself “a global software and engineering leader,” discusses how it brought both to asset management.
AMS Device Manager Data Server securely extends intelligent field device data to outside systems to make it easier for reliability and maintenance teams to further capitalize on modern advanced analytics software, providing a step change in operational efficiency and smart manufacturing.
AMS Device Manager Data Server publishes intelligent field device data nearly instantaneously to industrial software analytics solutions already in use by customers, eliminating the need for complex custom data integration and manual workarounds that often cause delayed results and siloed data. This data is relayed via secure industry protocols.
AMS Device Manager Data Server makes it easy to import critical instrument and valve data into common dashboarding tools and applications like Microsoft Power BI; Emerson software tools such as the Plantweb Optics platform, Plantweb Insight, Aspen MTell, and AspenTech Inmation; plant historians; and others.