Interoperability and the Development of JSON

Interoperability enables the growth of an industry, innovation, and great benefits for users. We see it broadly in the Web and more specifically in industry with OPC. It is a topic to which I return frequently. We can talk about all the components of the “Industrial Internet of Things,” whether devices, databases, big data analytics, or visualization, but without interoperability the IoT will be severely hampered.

Dave Winer developed outlining applications to help writers of prose and code organize their thoughts. He also developed RSS and knows something about interoperability and the politics of standards.

In this podcast, Winer talks with Allen Wirfs-Brock about how JSON came to be and the back story about how Tim Bray (a developer of XML) came to be interested in its evolution. “Along the way we get a lot of interesting tidbits about how JavaScript and JSON evolved,” says Winer.

Data is like air

This all reminded me of some previous blog posts about how data wants to be free. Moira Gunn, host of TechNation, an NPR show and also a podcast, discussed this topic in her opening “Take Five” essay in that podcast. She said, “Like air, data just flows. The power of data lies in its being replicated over and over.” She was thinking about Google and the attempts to have your past eradicated. But the concept also works for us.

Interoperability

I was thinking more about the point I made yesterday about the use of open technology. Arlen Nipper, co-developer of MQTT, likes to tout that his middleware powers the Internet of Things. He says this because MQTT is the backbone of Facebook Messenger.

Ah, there is my point about the use of open technologies. Messenger is a closed silo. Try to move your data. Try to use your data in another application. Try to text someone from another app to an address in Messenger. Nope. Can’t do anything. Facebook wants you captured completely within its silo.

What’s that old phrase? Buyer Beware?

Industrial Internet of Things Interoperability with OPC

This essay is the third in a series on moving data on the Industrial Internet of Things. We’ll take a look at OPC UA.

First we looked at Opto 22’s new product that provides RESTful APIs as a way to open up information to move from platform to platform. This approach is relatively simple and understood by almost all recent college graduates. It does not model data, but it can move vast amounts of it.
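
To make the idea concrete, here is a minimal sketch of what reading a value over a RESTful API can look like. The endpoint path, tag name, and header below are hypothetical placeholders, not Opto 22’s documented API; the point is simply that a plain HTTPS GET returning JSON is something any IT developer already knows how to handle.

```python
# Hypothetical sketch of reading a controller value over a RESTful API.
# The base URL, endpoint, tag name, and auth header are illustrative only.
import requests

BASE = "https://controller.example.local/api/v1"  # hypothetical base URL
API_KEY = "replace-with-your-key"                 # placeholder credential

# Read the current value of a single tag as JSON.
resp = requests.get(
    f"{BASE}/tags/TankLevel",          # hypothetical endpoint and tag name
    headers={"apiKey": API_KEY},       # hypothetical auth header
    timeout=5,
)
resp.raise_for_status()
print(resp.json())                     # e.g. {"name": "TankLevel", "value": 72.4}
```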

MQTT is a transport technology that provides low-bandwidth middleware. However, developers have recently added a data model, MQTT Sparkplug. MQTT itself is agnostic as to the message: you can move data from all manner of sources, including OPC, using this transport. People began thinking there should be a way to describe the message within this ecosystem.
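
Here is a minimal sketch of that idea using the open-source Eclipse Paho client for Python (my choice for the example; any MQTT client would do). The broker address, topic, and JSON payload are placeholders; the broker just moves the bytes and never inspects them.

```python
# Publish a JSON-encoded value over MQTT. The broker does not care whether the
# value came from OPC, a PLC driver, or a database; it simply forwards bytes.
import json
import paho.mqtt.client as mqtt

client = mqtt.Client()  # paho-mqtt 1.x style; 2.x also takes a CallbackAPIVersion
client.connect("broker.example.local", 1883)  # placeholder broker address

payload = json.dumps({"source": "OPC", "tag": "TankLevel", "value": 72.4})
client.publish("plant1/area2/tanklevel", payload, qos=1)
client.disconnect()
```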

The Grandfather of them all

I reached out to Tom Burke, president of the OPC Foundation, for more information about how OPC UA fits into this picture.

So what is OPC UA? “OPC UA is about multivendor secure reliable interoperability for data and information integration from the embedded world to the cloud.”

The key to OPC is interoperability. It allows clients from one supplier to import data from another. Many years ago, for example, I was in the Wonderware labs. They had PLCs from almost all known suppliers. For its HMI/SCADA software application to work, developers had to write drivers for every one of those PLCs. With OPC, it could communicate with every OPC-enabled PLC. And so could every other supplier. It opened up competition, the very thing customers want.
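
As a concrete illustration, here is a short sketch using the open-source python-opcua client library (my choice for the example; it is not mentioned in the article). The endpoint URL and node id are placeholders; the point is that one generic client can read from any vendor’s OPC UA server without a vendor-specific driver.

```python
# Sketch with the open-source python-opcua client (an assumption; any compliant
# OPC UA client works the same way). Endpoint URL and node id are placeholders.
from opcua import Client

client = Client("opc.tcp://plc.example.local:4840")  # placeholder OPC UA endpoint
client.connect()
try:
    node = client.get_node("ns=2;s=TankLevel")       # placeholder node id
    print("TankLevel =", node.get_value())
finally:
    client.disconnect()
```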

That was more than 20 years ago. Today there are over 4,200 vendors who have OPC products in the marketplace. Or, as Burke puts it, “The days are over where the end users are willing to pay for multivendor interoperability by developing custom Software Solutions to integrate products from multiple vendors together.”

The OPC UA working group has already developed a simple HTTPS interface for UA that will meet and exceed the needs of many applications.

The biggest value is the data modeling/semantics OPC UA provides, followed by the security mechanisms.

I received this from the OPC Foundation:

So for any IoT application, one should ask:

  1. How do you model/describe your data?
  2. How do you store your data?
  3. How do you authenticate?
  4. How do you encrypt?
  5. Where do you store your secrets?
  6. What do you use for transport?
  7. Who/what can you connect to?

OPC UA can answer all these questions.

Choose wisely

So, you can see that there are several options. It really depends upon your data needs. OPC UA guarantees interoperability. REST and MQTT are standards in themselves, but as a user you’d have to ask your supplier whether they are using them in an open manner. It’s possible that they are using an open standard in a proprietary way that promotes additional system integration business.

It all sounds confusing on the surface. Like any project you’re beginning, be clear about expectations and specifications up front. And, of course, maybe a blend is just right for you.

Talking MQTT For Industrial Data Exchange

I ran a brief series on industrial data, interoperability, and the Purdue Model (see this one, for example, and others from about that time). It’s about how data is becoming decoupled from the application. Data flow is no longer hierarchical; data seeks out the applications that need it.

This week I took a look at Opto 22’s latest innovation—use of RESTful APIs in an industrial controller. The next step seemed to be looking at MQTT. This is another IT-friendly technology that also serves as an open, standardized method of transporting data—and more.

Then I’ll follow up with a deeper discussion of OPC and where it may fit within the new enterprise data architecture.

I’ll finish the brief series with an application of (perhaps) Big Data and IIoT. It’s not an open standard, but it shows where enterprises could be going.

MQTT and Sparkplug

Inductive Automation has been around for about 13 years, but it has shown rapid growth over the past five. It offers a cloud-based HMI/SCADA and IIoT platform. I finally made it to the user conference last September and was amazed at the turnout, and at the companies represented. Its product is targeted at the market dominated in the past by Wonderware, Rockwell Automation RSView, and GE Proficy (Intellution iFix in a former life). It’s a private company, but I’ve been trying to assemble some competitive market share guesses. My guess is that Inductive ranks very well against the old guard. Part of the reason is its business model, which seems friendly to users.

Just as Opto 22 was an early strong supporter of OPC (and still supports it), so also is Inductive Automation a strong OPC shop. However, just as Opto 22 sees opportunities for better cloud and IT interoperability with REST, Inductive Automation has seen the same with MQTT. In fact, it just pulled off its own webinar on the subject.

I put in a call and got into a conversation with Don Pearson and Travis Cox. Following is a synopsis of the conversation. It is also a preview of the ICC user conference in Folsom, CA, Sept. 19-21. At the conference you can talk with both Arlen Nipper, president and CTO of Cirrus Link and co-developer of MQTT, and Tom Burke, president of the OPC Foundation.

Don and Travis explained that MQTT itself is a middleware “broker” technology. It describes a lightweight, publish/subscribe transport mechanism that is completely agnostic as to the message contained in the communication. So, you could send OPC UA information, or any other type of data, over MQTT. The caveat, as always, is that the application on the receiving end must speak the same “language.”
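
A short sketch of the receiving side makes that caveat visible. Again using the Paho Python client with placeholder broker and topic names: the subscriber gets raw bytes from MQTT and only recovers meaning because it assumes, by prior agreement, that the publisher sent JSON.

```python
# Subscriber side of the earlier publish example. MQTT delivers raw bytes;
# decoding them as JSON here is purely our own convention between both ends.
import json
import paho.mqtt.client as mqtt

def on_message(client, userdata, msg):
    data = json.loads(msg.payload)   # works only because the publisher sent JSON
    print(msg.topic, "->", data["value"])

client = mqtt.Client()               # paho-mqtt 1.x style constructor
client.on_message = on_message
client.connect("broker.example.local", 1883)  # placeholder broker
client.subscribe("plant1/area2/tanklevel", qos=1)
client.loop_forever()
```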

They see apps talking directly to PLCs/PACs/controllers as going away. We are in the midst of a trend of decoupling data from the application or device.

MQTT is “stateful”; it can report the last state of a device. It rides on TCP/IP, uses TLS for security, and reports by exception.
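
Those behaviors map onto standard MQTT features. Here is a rough sketch, with placeholder broker, topics, and values: a retained message keeps the last reported state on the broker, a Last Will announces an unexpected disconnect, and publishing only when a value changes gives report by exception.

```python
# Sketch of MQTT's "stateful" behaviors: retained last state, Last Will, and
# report by exception. Broker address, topics, and values are placeholders.
import json
import paho.mqtt.client as mqtt

client = mqtt.Client()  # paho-mqtt 1.x style constructor
client.will_set("plant1/edge1/status", "offline", qos=1, retain=True)  # Last Will
client.connect("broker.example.local", 1883)
client.publish("plant1/edge1/status", "online", qos=1, retain=True)    # retained state

last_value = None

def report(value):
    """Report by exception: publish only when the value actually changes."""
    global last_value
    if value != last_value:
        client.publish("plant1/area2/tanklevel",
                       json.dumps({"value": value}), qos=1, retain=True)
        last_value = value

report(72.4)  # first call publishes
report(72.4)  # no change, so nothing is published
```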

Describing the message

MQTT is, in itself, agnostic as to the message. However, to be truly useful it needs a message specification. Enter Sparkplug. This technology describes the payload, so it is needed on both sides of the communication. It doesn’t need to know the device itself, as it is all about information. It is a GitHub project and, like MQTT, part of the Eclipse Foundation.
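
A simplified illustration of what Sparkplug adds: a well-defined topic namespace and a described payload. The topic below follows the published Sparkplug B convention; the payload is shown as a plain dictionary only for readability, since the actual specification encodes metrics with Google Protocol Buffers (see the Eclipse Tahu reference implementation).

```python
# Simplified illustration of Sparkplug B concepts; group, node, and device
# names are placeholders, and the payload dict is a stand-in for the real
# Protocol Buffers encoding defined by the specification.
group_id, edge_node, device = "Plant1", "Edge1", "Tank3"

# Sparkplug B topic: spBv1.0/<group>/<message_type>/<edge_node>[/<device>]
topic = f"spBv1.0/{group_id}/DDATA/{edge_node}/{device}"

payload_description = {
    "timestamp": 1700000000000,   # milliseconds since epoch
    "metrics": [
        {"name": "TankLevel", "type": "Float", "value": 72.4},
    ],
    "seq": 7,                     # sequence number for detecting lost messages
}
print(topic, payload_description)
```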

I have known Don and Travis for years. I have never heard them as passionate about technology as they were during our conversation.

If you are coming to Folsom, CA for the conference, you’ll hear more. I will be there and would love to have breakfast or dinner with a group and dive into a deep discussion about all this. Let me know.

OPC UA Publish Subscribe Protocol

I’ve decided not to try to publish bunches of news in bunches of posts per day. It’s not quantity I’m looking for, and I get the feeling neither are you. Besides, according to some of my sources of statistics, readership of this blog is about half that of the big magazines in the space. Not bad considering that half or more of their views come via search, and they have tons more stuff to search.

Tom Burke, president of the OPC Foundation, sent this article describing the latest protocol advancement the Foundation is working on. It is a publish/subscribe protocol that lets OPC UA play nicely with messaging protocols such as AMQP and MQTT. This gives users the benefits of both types of messaging. Certainly a winning proposition. In fact, I had been wondering about the OPC response to MQTT, which is popping up more frequently.

OPC UA PubSub: Bringing the Power of the Cloud to Industrial Automation

Many OPC systems today have a small number of HMIs or SCADA applications which manage a larger number of devices. In some cases, MES and ERP systems are part of the picture and use OPC interfaces to collect data from the factory and send it on to enterprise applications. This model works well for many applications and will continue to be a mainstay for many industrial automation users who have a lot of equipment installed in a single location that needs to be managed. However, the widespread deployment of cloud-based solutions has many factory operators wondering how they can take advantage of it to streamline their operations. The needs of these users have been the driving force behind the new OPC UA PubSub Specification. This specification layers OPC UA on top of message-based middleware such as AMQP or MQTT in a way that allows users to take advantage of OPC UA features such as the robust information modelling framework while adapting to the message-centered communication paradigm of the middleware.

Use OPC UA PubSub to Broadcast Data and Events to the Cloud

OPC UA PubSub defines a loosely coupled message protocol that can be used with multiple encodings (e.g. JSON, UA Binary, or XML) and multiple transports (e.g. AMQP, MQTT, XMPP, et al.). Applications which publish information create data or event subscriptions as they would for normal OPC UA communications and forward the notifications produced to the Message Oriented Middleware. Applications which consume information create subscriptions with the Message Oriented Middleware, which forwards the messages to them as they arrive. The OPC UA PubSub Specification defines a format for these messages that allows them to be consumed by subscribers who have no knowledge of OPC UA and no ability to connect to the publisher (Figure 1).

Figure 1 – OPC UA PubSub
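
To give a feel for what such a message can look like, here is a rough sketch of a JSON-encoded network message of the kind described above. The field names approximate the specification’s JSON mapping and the values are invented; the point is that a subscriber with no OPC UA stack can still parse it as ordinary JSON.

```python
# Illustrative only: an approximate JSON-encoded PubSub network message.
# Field names and values are a sketch, not the normative specification.
import json

network_message = {
    "MessageId": "32235546-05d9-4fd7-97df-ea3ff3408574",
    "MessageType": "ua-data",
    "PublisherId": "Line4-Controller",
    "Messages": [
        {
            "DataSetWriterId": 62541,
            "SequenceNumber": 1017,
            "Payload": {"TankLevel": 72.4, "PumpRunning": True},
        }
    ],
}
print(json.dumps(network_message, indent=2))
```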

The middleware in these cases may support durable queuing, multicast, and/or filtering, which allows OPC UA data or events to reach a much larger variety of applications, including big data applications that depend on a supply of real-time data from the factory.

Data is not Enough: OPC UA Extends its Information Model to the Cloud

The raw data in messages produced by publishers can have a structure which can be understood by subscribers that have no access to information other than the message. However, the metadata associated with the message can provide important additional context which allows the subscribers to properly interpret the message. To facilitate this, OPC UA PubSub defines a metadata message that can be delivered using the same middleware broker infrastructure. These messages also allow publishers to instantly report changes to their configuration that affect the content of the messages. Each message published includes an identifier for the metadata version that applies to the message, which ensures that subscribers can easily detect and manage changes to the metadata.

End to End Security: Cloud Services run by Third Parties may not be Secure Enough

The Cloud relies on infrastructure provided by vendors that specialize in providing large, scalable systems. However, the nature of the Cloud means these third parties will have access to the data even if the communication between the application and the broker is secure. OPC UA PubSub provides for end-to-end security which ensures that only applications authorized by the operators will be able to view or modify the data, no matter how many intermediaries are required to deliver the data. OPC UA PubSub includes a key distribution model that allows loosely coupled applications to share keys as needed (Figure 2). Access to the Security Key Servers is controlled using web-based standards for federated identity management such as OAuth2. For example, a factory owner can use the OAuth2 support built into Active Directory to provide authorization services for their Security Key Servers. This access is independent of the middleware used to deliver the messages to their intended recipients and allows access to be granted or revoked as needs evolve.


Figure 2 – OPC UA PubSub End-to-End Security Model

OPC UA PubSub: A Flexible solution that can Evolve

Different middleware vendors want operators to commit to using their protocol for their operations. OPC UA PubSub provides a framework for simultaneously supporting multiple protocols as the needs of factory owners evolve while providing a standard architecture for describing complex information. Figure 3 illustrates how this works in practice, where a Machine Vendor uses MQTT to communicate with its machines deployed in a customer’s factory while the Factory Operator uses AMQP to capture analytics. In both cases, the data being sent to the Cloud is based on OPC UA PubSub and conforms to an OPC UA Information Model. The bottom line: OPC UA reduces costs and provides greater flexibility by allowing factory operators to focus on the information their enterprise needs instead of on the protocol needed to move that information between systems.


Figure 3 – OPC UA PubSub a Flexible Framework that Evolves as Needs Change

 

Dell World Features IoT, Cloud, Analytics

I received an invitation to Dell World that seemed like a great opportunity to broaden my horizons and dig deeper into the technologies that will provide the platform for Industrial Internet of Things applications and benefits.

When one of the Dell people asked me how it went, I told them that learning about Dell’s technologies helped fill a gap in my coverage of the whole “connected manufacturing” space. As perhaps the only manufacturing-focused writer attending, I certainly received attention.

The ecosystem that many refer to as the Internet of Things, or IoT, includes connected things, database and storage (cloud), analytics, and visualization. Dell does not play in the “things” space as defined by the end devices, but it has significant data center, software, and analytics plays. Two items announced at Dell World expanded the offering.

The first, which Michael Dell, CEO of Dell, announced during his keynote, was an IoT product called the Edge Gateway 5000. This industrialized, intelligent, connected device serves to gather inputs from the “things” of the system, perform some analytics, and serve the results to the cloud. The second was announced jointly with Satya Nadella, Microsoft CEO. This is a cloud partnership in which Dell will be supporting Microsoft Azure.

Some excerpts of the announcements are below, but first an observation. In the industry I cover, the CEO will usually appear for a few minutes at the keynote and talk a little about financials or the theme of the week. Then there is a motivational speaker who goes for 45 minutes. Sometimes there is a product speaker who will do 30 minutes of product introductions.

Dell held the stage for most of the 90+ minutes. He gave an outline of the new, private company, discussed the industry, interviewed several customers, yielded the floor to the CMO to talk about Dell’s company support for entrepreneurship, then sat for a 30-minute conversation with Nadella. He showed intelligence, grace, and humor.

Here are excerpts from the product announcements.

Wednesday at Dell World, Dell and Microsoft Corp. announced a new cloud solution and program that enable organizations of all sizes to use the Microsoft cloud platform to transform their business. A new, Microsoft Azure-consistent, integrated system for hybrid cloud and extended program offerings will help more customers benefit from Azure and Dell to drive greater agility and increased time to value, whether they choose on-premises or public cloud solutions.

Dell today announced the launch of the new Edge Gateway 5000 Series, purpose-built for the building and factory automation sectors. Featuring an industrial-grade form factor, expanded input and output interfaces, and wide operating temperature ranges, the Edge Gateway 5000, combined with Dell’s data analytics capabilities, promises to give companies an edge computing alternative to today’s costly, proprietary IoT offerings.

The Dell Edge Gateway sits at the edge of the network (near the devices and sensors) with local analytics and other middleware to receive, aggregate, analyze, and relay data, minimizing expensive bandwidth by relaying only meaningful data to the cloud or datacenter. Thanks to new Dell Statistica data analytics, also announced today, Dell is expanding analytics capabilities out to the gateway. This means companies can now extend the benefits of cloud computing to their network edge for faster and more secure business insights while saving on the costly transfer of data to and from the cloud.
