Industrial Internet of Things Interoperability with OPC

This essay is the third in a series on moving data on the Industrial Internet of Things. This time we'll take a look at OPC UA.

First we looked at Opto 22's new product, which provides RESTful APIs as a way to open up information so it can move from platform to platform. This approach is relatively simple and understood by almost all recent college graduates. It does not model data, but it can move vast amounts of it.
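As a rough sketch of what such a RESTful tag read looks like, here is a minimal Python example. The base URL, endpoint path, and JSON shape are hypothetical, since real controller APIs vary by vendor; a real call would use an HTTP client where the simulated response appears.

```python
import json
from urllib.parse import urljoin

# Hypothetical base URL; real controller endpoints differ.
BASE_URL = "https://controller.example.com/api/v1/"

def tag_url(tag_name):
    """Build the URL for reading a single tag value."""
    return urljoin(BASE_URL, "tags/" + tag_name)

def parse_tag_response(body):
    """Parse an assumed JSON response of the form {"name": ..., "value": ...}."""
    doc = json.loads(body)
    return doc["name"], doc["value"]

# Simulated response body; a real call would fetch tag_url("TankLevel") over HTTPS.
sample = '{"name": "TankLevel", "value": 72.5}'
name, value = parse_tag_response(sample)
```

The point is how little there is to it: a URL per tag and a JSON body, which is why the approach is so approachable for IT-trained developers.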

MQTT is a transport technology that provides low-bandwidth middleware. It is agnostic as to the message: you can move data from all manner of sources, including OPC, using this transport. As people began thinking there should be a way to describe the message within this ecosystem, developers recently added a data model, MQTT Sparkplug.

The Grandfather of them all

I reached out to Tom Burke, president of the OPC Foundation, for more information about how OPC UA fits into this picture.

So what is OPC UA? "OPC UA is about multivendor secure reliable interoperability for data and information integration from the embedded world to the cloud," says Burke.

The key to OPC is interoperability. It allows a client from one supplier to import data from another supplier's products. Many years ago, for example, I was in the Wonderware labs. They had PLCs from almost every known supplier. For its HMI/SCADA software application to work, developers had to write drivers for every one of those PLCs. With OPC, it could communicate with every OPC-enabled PLC. And so could every other supplier. It opened up competition, the very thing that customers want.

That was more than 20 years ago. Today there are over 4,200 vendors who have OPC products in the marketplace. Or, as Burke puts it, “The days are over where the end users are willing to pay for multivendor interoperability by developing custom Software Solutions to integrate products from multiple vendors together.”

The OPC UA working group has already developed a simple HTTPS interface for UA that will meet and exceed the needs of many applications.

The biggest value is the data modeling/semantics OPC UA provides, followed by the security mechanisms.
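To make "data modeling/semantics" concrete, here is a toy Python sketch of the idea behind an OPC UA-style information model: a variable carries not just a value but a browse name, a data type, and engineering units, so a client can discover what the data means. The class names and fields are illustrative, not the actual OPC UA node classes.

```python
from dataclasses import dataclass, field

@dataclass
class VariableNode:
    """Illustrative stand-in for a typed, unit-annotated variable."""
    browse_name: str
    data_type: str
    value: object = None
    units: str = ""

@dataclass
class ObjectNode:
    """Illustrative stand-in for an object that groups related variables."""
    browse_name: str
    variables: dict = field(default_factory=dict)

    def add_variable(self, var):
        self.variables[var.browse_name] = var

# Model a pump as an object with typed, unit-annotated variables.
pump = ObjectNode("Pump01")
pump.add_variable(VariableNode("FlowRate", "Double", 12.3, "l/min"))
pump.add_variable(VariableNode("Running", "Boolean", True))
```

Contrast this with a raw register read: the receiver learns not just "12.3" but that it is a flow rate in liters per minute on Pump01, which is what makes multivendor integration tractable.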

I received this from the OPC Foundation:

So for any IoT application, one should ask:

  1. How do you model/describe your data?
  2. How do you store your data?
  3. How do you authenticate?
  4. How do you encrypt?
  5. Where do you store your secrets?
  6. What do you use for transport?
  7. Who/what can you connect to?

OPC UA can answer all these questions.

Choose wisely

So, you can see that there are several options. It really depends upon your data needs. OPC UA guarantees interoperability. REST and MQTT are standards in themselves, but as a user you'd have to ask your supplier whether they are using them in an open manner. It's possible that they are using an open standard in a proprietary way that promotes additional system-integration business.

It all sounds confusing on the surface. Like any project you’re beginning, be clear about expectations and specifications up front. And, of course, maybe a blend is just right for you.

Talking MQTT For Industrial Data Exchange

I ran a brief series on industrial data, interoperability, and the Purdue Model (see this one, for example, and others about that time). It's about how data is becoming decoupled from the application. Rather than flowing hierarchically, data seeks out the applications that need it.

This week I took a look at Opto 22’s latest innovation—use of RESTful APIs in an industrial controller. The next step seemed to be looking at MQTT. This is another IT-friendly technology that also serves as an open, standardized method of transporting data—and more.

Then I’ll follow up on a deeper discussion of OPC and where that may be fitting in within the new enterprise data architecture.

I’ll finish the brief series with an application of (perhaps) Big Data and IIoT. It’s not an open standard, but it shows where enterprises could be going.

MQTT and Sparkplug

Inductive Automation has been around for about 13 years, but it has shown rapid growth over the past 5. It is a cloud-based HMI/SCADA and IIoT platform. I finally made it to the user conference last September and was amazed at the turnout—and at the companies represented. Its product is targeted at the market dominated in the past by Wonderware, Rockwell Automation RS View, and GE Proficy (Intellution iFix in a former life). It’s a private company, but I’ve been trying to assemble some competitive market share guesses. My guess is that Inductive ranks very well with the old guard. Part of the reason is its business model that seems friendly to users.

Just as Opto 22 was an early strong supporter of OPC (and still supports it), so also is Inductive Automation a strong OPC shop. However, just as Opto 22 sees opportunities for better cloud and IT interoperability with REST, Inductive Automation has seen the same with MQTT. In fact, it just pulled off its own Webinar on the subject.

I put in a call and got into a conversation with Don Pearson and Travis Cox. Following is a synopsis of the conversation. It is also a preview of the ICC user conference in Folsom, CA, Sept. 19-21. At the conference you can talk to both Arlen Nipper, president and CTO of Cirrus Link and co-developer of MQTT, and Tom Burke, president of the OPC Foundation.

Don and Travis explained that MQTT itself is a middleware “broker” technology. It describes a lightweight, publish/subscribe transport mechanism that is completely agnostic as to the message contained in the communication. So, you could send OPC UA information over MQTT or other types of data. The caveat, as always, is that the application on the receiving end must speak the same “language.”
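The broker idea Don and Travis describe can be sketched in a few lines of Python. This in-memory stand-in is illustrative only; real MQTT adds QoS levels, topic wildcards, sessions, and a network protocol. What it shows is the key property: the broker routes opaque byte payloads by topic and never interprets them.

```python
from collections import defaultdict

class TinyBroker:
    """Minimal sketch of a publish/subscribe broker (not real MQTT)."""

    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, payload: bytes):
        # The broker is message-agnostic: payload bytes pass through untouched.
        for callback in self.subscribers[topic]:
            callback(topic, payload)

broker = TinyBroker()
received = []
broker.subscribe("plant1/line2/temp", lambda t, p: received.append((t, p)))

# The same transport carries JSON, plain text, or any serialized model,
# which is why OPC UA data (or anything else) can ride over MQTT.
broker.publish("plant1/line2/temp", b'{"value": 71.4}')
```

Because publisher and subscriber never talk directly, either side can be replaced without touching the other, which is the decoupling trend the next paragraph describes.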

They see apps talking directly to PLCs/PACs/controllers as going away. We are in the midst of a trend of decoupling data from the application or device.

MQTT is “stateful”: it can report the last state of a device. It rides on TCP/IP, uses TLS security, and reports by exception.
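Two of those properties, report by exception and retained last state, can be sketched in plain Python. This is a toy model of the ideas, not MQTT itself; the deadband logic and field names are illustrative.

```python
class ExceptionReporter:
    """Publish only when a value changes meaningfully; keep the last state."""

    def __init__(self, deadband=0.0):
        self.deadband = deadband
        self.last_published = None
        self.retained = None  # stands in for the broker's retained last state

    def report(self, value):
        """Return True (published) only if the value moved beyond the deadband."""
        if self.last_published is None or abs(value - self.last_published) > self.deadband:
            self.last_published = value
            self.retained = value
            return True
        return False  # suppressed: no change worth reporting

sensor = ExceptionReporter(deadband=0.5)
results = [sensor.report(v) for v in [20.0, 20.1, 20.2, 21.0, 21.1]]
```

Of five readings, only two cross the deadband and get published, which is how report by exception keeps bandwidth low, while the retained value means a late subscriber still sees the device's last known state.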

Describing the message

MQTT is, in itself, agnostic as to the message. However, to be truly useful it needs a message specification. Enter Sparkplug. This technology describes the payload, so it is needed on both sides of the communication. It doesn’t need to know the device itself, as it is all about information. Sparkplug is a GitHub project and, like MQTT, part of the Eclipse Foundation.
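A brief sketch of what Sparkplug standardizes: the "spBv1.0" topic namespace and message types such as NBIRTH, DBIRTH, and DDATA come from the Sparkplug B specification, but the dict payload below is a readable stand-in, since real Sparkplug B payloads are Google Protocol Buffers.

```python
def sparkplug_topic(group_id, message_type, edge_node_id, device_id=None):
    """Build a Sparkplug B topic: spBv1.0/group/type/edge_node[/device]."""
    parts = ["spBv1.0", group_id, message_type, edge_node_id]
    if device_id:
        parts.append(device_id)
    return "/".join(parts)

topic = sparkplug_topic("PlantA", "DDATA", "Edge01", "Pump01")

# Stand-in payload (real Sparkplug B encodes this as Protocol Buffers):
# every metric is named, typed, and timestamped, which is what lets a
# receiver interpret the message without knowing the device.
payload = {
    "timestamp": 1694000000000,
    "metrics": [{"name": "FlowRate", "value": 12.3, "type": "Double"}],
}
```

The birth/death message types also give Sparkplug its state awareness: a subscriber learns when an edge node appears or drops offline, not just what its values are.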

I have known Don and Travis for years. I have never heard them as passionate about technology as they were during our conversation.

If you are coming to Folsom, CA for the conference, you’ll hear more. I will be there and would love to have a breakfast or dinner with a group and dive into a deep discussion about all this. Let me know.

Model-Driven Operations Management User Experience

Tim Sowell always packs many operations management ideas into a brief blog post. Sowell is a VP and Fellow at Schneider Electric Software (Wonderware). I’ve looked at his posts before. He is always thinking out in front of most people.

His Feb. 14 post, Composite Frameworks What Are They, the Shift to Model Driven vs. Custom: How Do They Play?, takes a look at moving the user experience of operations management software into newer territory.

He probably says much more, but this is the take I’m going to analyze. He’s pointing out the difficulties of using traditional approaches to programming and presenting User Interfaces in a way that keeps pace with today’s expectations.

“Traditionally companies have built User Interfaces to an API, with the calls needed to execution actions and transactions; these have worked well especially within a plant. But a key to operational systems being effective and agile is their ability to adapt on a regular basis. This requires a sustainable and evolving system. This is especially important in form/ transaction activities where information is provided and where actions/ data input, and procedures need to be carried out.”

He does not stop there but proceeds to enumerate some challenges:

  1. Operational processes cross over functional domains and applications
  2. Lack of governance
  3. Agility/responsive manufacturing business processes
  4. Increase the performance of their people assets
  5. Too much custom code, making it unmanageable and hard to evolve

He wonders why we can’t use techniques gleaned from Business Process Modeling. That’s a good question! He notes that some people will say that BPM is not real-time like manufacturing/industrial applications are. But he rebuts that “this also aligns with what the industrial world is very comfortable with—that of ‘stable in control loops’.”

Operations management solutions

Here are some proposed solutions:

  • Providing a graphical configuration environment for capturing and defining operational processes, including the validation of data input and guiding actions, working inline with the user interface/forms, etc.
  • Providing a framework for building reusable forms and reusable procedures that can be managed as templates and standards to enforce consistent operational practices.
  • Empowering the operational domain people to develop, evolve and manage their procedures.
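The "reusable forms as templates" idea above can be sketched in Python: a form is data, not custom code, and a generic engine validates input against the template. That way operations people can evolve the form without reprogramming. The field names and rules here are illustrative, not from any Wonderware product.

```python
# Illustrative form template: the form definition is a data structure.
FORM_TEMPLATE = {
    "title": "Batch Sign-off",
    "fields": [
        {"name": "batch_id", "type": "str", "required": True},
        {"name": "temperature", "type": "float", "min": 0.0, "max": 150.0},
    ],
}

def validate(form, data):
    """Generic engine: return a list of validation errors for `data`."""
    errors = []
    for f in form["fields"]:
        value = data.get(f["name"])
        if value is None:
            if f.get("required"):
                errors.append(f"{f['name']} is required")
            continue
        if f["type"] == "float":
            if not isinstance(value, (int, float)):
                errors.append(f"{f['name']} must be a number")
            elif not (f.get("min", value) <= value <= f.get("max", value)):
                errors.append(f"{f['name']} out of range")
    return errors

ok = validate(FORM_TEMPLATE, {"batch_id": "B-100", "temperature": 72.0})
bad = validate(FORM_TEMPLATE, {"temperature": 500.0})
```

Changing a range or adding a field means editing the template, not shipping new code, which is the manageability gain Sowell is after.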

“Most of all empowering the different roles in the plant, that operational close loop moving to an “activity” centric system where information, and action is driven from a consistent operational model and practices.”

This is a consistent Sowell message. Let’s see what we can template-ize or project as a model rather than custom code everything.

More and more owner/operators and users I talk to are getting tired of the expense and lead-time for custom coded projects. They need the speed and flexibility of using models and standards for application implementation. I think this is where Sowell was headed (if not, he’ll correct me, I’m sure). This will serve to move industry forward as a more profitable contributor to enterprise health.

Updated News from the Wonderware Software Conference

I have heard from friends (non-editors) who attended the Wonderware software conference. I have edited their news items into what I hope is a coherent post. This news includes some significant information that I have not seen elsewhere, yet. I should get my friends to report more. Getting the point of view of users is valuable. I hope you agree.

This also shows what I was talking about Monday, where an industry leader under attack can push the envelope in a different direction in order to remain fresh and offer customers more.

1) “It’s getting really cloudy.”

They showed off a number of tools, some old, some new, that are all based around a cloud platform or at a minimum communicate with the cloud. They are continuing to push the Wonderware Online and Smart Glance products really hard. Talking with Saadi Kermani, the main evangelist for them, they are in continual release mode and have a long list of planned features that will roll out over the coming months, designed to keep them up with, or ahead of, competitive offerings.

“Also the new version of Smart Glance is really nice and has a very modern look and feel.” The best part about something like Smart Glance is that it’s a relatively simple product to get up and running for your org. And pretty cost effective to boot. “My personal analysis is that what will make Wonderware Online super valuable is an ecosystem of partners building sophisticated apps in cloud platforms like Azure and others.”

Some other big cloud stuff: they previewed Wonderware Development Studio online. Now you can log on with a web browser and stand up an environment, picking and choosing which machines you want, what software is installed on each, how the redundancy is configured, etc. Hit go, and in some amount of time, maybe an hour or so, the machines are ready for you to log in via remote desktop running on Azure. This seems like a really awesome setup for integrators that don’t want to maintain a bunch of different environments and versions. Supposedly there was no cost to have them configured. “My only concern is the pricing and how easy it might be to run up a $1K bill before you know it. I did ask about the idea of buying the IP and underlying system that did the automation so you can run it on premise. If I remember right the person I was talking to said it might be possible but wasn’t on the current roadmap.”

2) Hello IOT

A guy from Microsoft had a nice presentation on Monday afternoon talking about Azure and how Azure fits into the IOT space. It was a bit high level but I think we need all the education we can get right now.

During the intro sessions on Tuesday John Krajewski discussed a little more about IOT and specifically talked about their new MQTT OI Server. OI Server is the new name they have given to IO drivers. They are doing some different things with packaging and scalability but that’s incidental to the IOT discussion. Back to IOT, they have written a server that can talk MQTT natively so you can seamlessly push and pull data to/from System Platform with other devices talking MQTT just like you would a PLC.

Alvaro Martiniez, the product manager, admitted it was at an early Alpha stage for the product, but I hope it opened some people’s eyes to possibilities. “I can’t wait to get my hands on it.”

An integrator presentation to a standing room only crowd discussed multiple aspects of IOT. It was really broken into three parts. The first was an overview of 5 of the major protocols one might come across when getting into the IOT space. Next was a technical deep dive on one of the protocols, MQTT. [Gary’s note: At the Inductive conference, I attended a session by one of the developers of MQTT. This is an important protocol, riding atop TCP/IP stack, that can standardize IoT messaging.] Next were 4 or 5 demos showing the use of MQTT inside or integrated with System Platform and the Historian. “My personal favorite was showing how you could build a simple set of code that read MQTT data and sent it directly to the Historian… which in theory means you could have 100% cloud infrastructure for piping device data straight over to the Wonderware Online historian.”

3) Next Gen

They have been working on revamping the System Platform look, feel, and function for the last few years. The 2014 releases made some pretty big shifts in the functionality space, but nothing major is expected until the next release. Tim Sowell’s comments about a unified operations center are dead on with the vision.

The biggest change you are going to see, and they talked about this very publicly so I don’t think I’m giving anything away, is that InTouch is no longer the shell for visualization. It is now essentially a compiled stand-alone app that is built from many panes that you create and configure from within the System Platform graphics platform. The visualization engine will use the context of the graphics and objects to make navigation easier. “But the bigger concept is that this is now an operations center, not a simple HMI.”

Longer term there should be a play for higher-end integrators and pure software plays to develop apps that plug into the operations center and provide additional value and context.

Lots and lots of discussion of context. When these apps are run inside the operations center they will automatically know what’s being displayed on the screens and can automatically adjust. A simple example is that you pull up a graphic with a tank farm. Maybe you have a video feed app with live cameras on the tanks, and it automatically swaps to the correct camera view. Or you have another pane with customer orders from the ERP so you know what tanker trucks you should be expecting in the next few hours. “While this has technically been possible before I think the key is that they are going to be adding a lot of functionality to make these otherwise standalone functions a first class citizen within the Operations view.”

They are also making big changes in the development environment, trying to make the engineering experience more user-friendly. To me this is a clear play to make System Platform look a lot less intimidating and make the 15-minute demo a lot easier and more obvious. For high-end engineers and integrators this new layout and development method will probably be a turnoff, but I think it will help Wonderware tell the System Platform story more easily and get newer, less sophisticated, customers on board.

HMI SCADA At the High End

I’m still pondering the whole HMI/SCADA market and technologies. I’m still getting a few updates after the Inductive Automation conference I attended in California and the Wonderware conference in Dallas that I missed.

The two have traditionally been referred to together in trade publications.

Today, I think three or four things are blending. Things are getting interesting.

SCADA is “supervisory control and data acquisition.” The supervisory control part has blended into the higher ends of human-machine interface. Data Acquisition software technology is a key platform for what we are today calling the “Industrial Internet of Things.” I’ve heard one technologist predict that soon we’ll just say “Internet.”

Data acquisition itself is a system that involves a variety of inputs including sensors, signal analyzers, and networks. The software part brings it all under control and provides a format for passing data to the next level.
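The paragraph above can be sketched as a tiny pipeline: heterogeneous inputs are normalized into one record format before being passed up to the next level (historian, HMI, or cloud). The source names and record fields are illustrative, not from any particular product.

```python
def normalize(source, raw):
    """Wrap a raw reading in a common record format for the next level."""
    return {"source": source, "value": float(raw), "quality": "good"}

# Inputs arrive in different shapes from different devices.
readings = [
    ("sensor/temp1", "71.4"),   # e.g., a serial sensor reporting text
    ("analyzer/ch3", 0.982),    # e.g., a signal analyzer reporting floats
]

records = [normalize(src, raw) for src, raw in readings]
```

The software's job, as the paragraph says, is exactly this normalization step: whatever the input, what moves up is a consistent, quality-stamped record.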

HMI also involves a system these days, evolving from operator interface into sophisticated software that includes the “supervisory control” part of the system.

Some applications also blend in MES and Manufacturing Intelligence. These applications, often engineered solutions atop the software platforms, strive to make sense of the data moving from HMI/SCADA either using it for manufacturing control or as a feed to enterprise systems.

Wonderware has been an historical force in these areas. Its original competitor was Intellution which is now subsumed into GE’s Proficy suite. The other strong competitor is Rockwell Automation. All three sell on a traditional sales model of “seats” and/or “tags.”

Inductive Automation built its product from enterprise-grade database technology and has a completely different sales model. It is driving the cost of HMI/SCADA, and in some ways MES, down.

Competitors can meet that competition either by pursuing a race to the bottom or by redefining a higher niche. The winner of a race to the bottom is usually the company built from the ground up for a low individual sales price.

Wonderware announcements

All of that was just an analyst prologue to a couple of items that have popped up from Schneider Electric Software (Wonderware) over the past couple of days.

To my mind, Tim Sowell is addressing how some customers are taking these platforms to a new level. Writing in his blog last weekend, Sowell notes, “For the last couple of years we have seen the changing supervisory solutions emerging, that will require a rethink of the underlying systems, and how they implemented and the traditional HMI, Control architectures will not satisfy! Certainly in upstream Oil and Gas, Power, Mining, Water and Smart Cities we have seen a significant growth in the Integrated Operational Center (IOC) concept. Where multiple sites control comes back into one room, where planning and operations can collaborate in real-time.”

I have seen examples of this Integrated Operations Center featuring such roles as operations, planning, engineering, and maintenance. But this is more than technology—it requires organizing, training, and equipping humans.

Sowell, “When you start peeling back the ‘day in the life of operations’ the IOC is only the ‘quarterback’ in a flexible operational team of different roles, contributing different levels of operational. Combined with dynamic operational landscape, where the operational span of control of operational assets, is dynamically changing all the time. The question is what does the system look like, do the traditional approaches apply?”

Tying things together, Sowell writes, “Traditionally companies have used isolated (siloed) HMI, DCS workstation controls at the facilities, and then others at the regional operational centers and then others at the central IOC, and stitched them together. Now you add the dynamic nature of the business with changing assets, and now a mobile workforce we have addition operational stations that of the mobile (roaming worker). All must see the same state, with scope to their span of control, and accountability to control.”

The initial conclusion, “We need one system, but multiple operational points, and layouts, awareness so the OPERATIONAL TEAM can operate in unison, enabling effective operational work.”

Intelligence

Here is a little more detail about the latest revision of Wonderware Intelligence, to which I referred last week and above. The newest version collects, calculates and contextualizes data and metrics from multiple sources across the manufacturing operation, puts it into centralized storage and updates it all in near-real time. Because it is optimized for retrieval, the information can then be used to monitor KPIs via customizable dashboards, as well as for drill-down analysis and insights into operating and overall business performance.

“Wonderware Intelligence is an easy-to-use, non-disruptive solution that improves how our customers visualize and analyze industrial Big Data,” said Graeme Welton, director of Advansys (Pty) Ltd., a South African company that provides specialized industrial automation, manufacturing systems and business intelligence consulting and project implementation services. “It allows our customers to build their own interactive dashboards that can capture, visualize and analyze key performance indicators and other operating data. Not only is it more user-friendly, it has better query cycle times, it’s faster and it has simpler administration rights. It’s an innovative tool that continues to drive quality and value.”

Wonderware Intelligence visual analytics and dashboards allow everyone in the operation to see the same version of the truth drawn from a single data warehouse. The interactive and visual nature of the dashboards significantly increases the speed and confidence of the users’ decision making.
