Control and Industrial Internet of Things Get RESTful

“It is the next big thing [in the Industrial Internet of Things].”

I have been waiting for quite some time for Opto 22's next move. The company has always been an early, if not the first, mover in adopting IT-friendly technologies for OT. This next big thing, according to Marketing VP Benson Hougland, is a controller with a RESTful API.

Let’s look at a couple of big reasons. HMI/SCADA software is rapidly moving toward cloud-based apps with HTML5 clients. Getting to the cloud means getting through firewalls, and REST helps. Then consider that recent graduates and current students are studying and experimenting with technologies such as REST and MQTT on their Arduinos and Raspberry Pis, rather than with the specific industrial technologies and protocols. They will be right at home programming HMI or database applications with technologies such as REST.

The Announcement

Industrial automation manufacturer Opto 22 has announced immediate availability of version 9.5 of PAC Project, a Microsoft Windows-based integrated software development suite for industrial automation, process control, remote monitoring, and Internet of Things applications.

The most significant addition in this version is new firmware for Opto 22 programmable automation controllers (PACs) that includes an HTTP/S server with a RESTful API, providing developers with secure, programmatic access to control variables and I/O data using any programming language that supports JavaScript Object Notation (JSON).

This new capability closes the IT/OT gap, allows for rapid Industrial Internet of Things (IIoT) application development, provides for secure data exchange using open Internet standards, and reduces time to market in machine and system design.

The addition of a secure RESTful server and an open, documented API to a programmable automation controller (PAC) is a significant, ground-breaking industry innovation, because REST architecture and associated technology are intrinsic to the Internet of Things and paramount to web and mobile-based application development.

Opto 22’s implementation of REST directly into a commercially available, off-the-shelf industrial PAC is unique in the market and places the company as the first and only industrial automation and controls manufacturer to offer this industry-changing technology.

Other features found in this new version include new tools to develop modular control applications with nested subroutines, new debugging tools to reduce development time, support for a worldwide installed base of legacy Optomux I/O systems, and integration of third-party systems and protocols with the IIoT.

To provide enhanced security and auditing for HMI access, PAC Project now offers sophisticated user groups and data rights, as well as the ability to embed video directly into HMI windows.

PAC Project 9.5 provides updated firmware for Opto 22 SNAP PAC S-series and R-series controllers that enables a secure HTTPS server on PAC controllers. Combined with an open, documented RESTful API, this new version allows developers to write applications that access data on the PAC in the programming language of their choice, using the well-known and widely supported JSON data format. This new capability allows software and IoT application developers to decrease time to market, reduce the development learning curve, and eliminate layers of middleware for secure Industrial Internet of Things (IIoT) applications.

Firmware version 9.5 for SNAP PAC R-series and S-series controllers enables REST endpoints for both analog and digital I/O points as well as control program variables, including strings, floats, timers, integers, and tables. REST endpoints are securely accessed using the new, fully documented RESTful API for SNAP PACs. Names of RESTful endpoints are derived from a configured PAC Control strategy file and are therefore unique to each PAC’s program and I/O configuration. Client data requests are returned in JavaScript Object Notation (JSON) format, enabling PAC controllers and I/O to be used with virtually any software development language that supports JSON, including C, C++, C#, Java, JavaScript, Node.js, Python, PHP, Ruby, and many more.
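As a sketch of what consuming such an API could look like, here is a minimal Python example. The host name, endpoint path, point name, and credentials are all hypothetical illustrations, not Opto 22's documented values; the published RESTful API defines the real endpoint names, which derive from your PAC Control strategy.

```python
import json

PAC_HOST = "https://pac.example.local"  # hypothetical controller address

def endpoint_url(host, point_name):
    # Illustrative path only -- consult the vendor's published REST API
    # for actual endpoint names, which derive from the strategy file.
    return f"{host}/api/v1/device/strategy/ios/analogInputs/{point_name}"

def parse_reading(body):
    # Decode a JSON response body into a float engineering-unit value.
    return float(json.loads(body)["value"])

# An actual fetch over HTTPS might look like this (requires a reachable PAC):
#   import requests
#   r = requests.get(endpoint_url(PAC_HOST, "Tank_Level"),
#                    auth=("apiuser", "apikey"), verify="pac-ca.pem")
#   level = parse_reading(r.text)

print(parse_reading('{"value": 72.5}'))
```

Because the response is plain JSON over HTTP/S, the same few lines translate directly to any language with an HTTP client and a JSON parser.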

Support is also available for database tools that work with JSON, such as MongoDB, MySQL, and Microsoft SQL Server.

With the release of PAC Project 9.5, developers are no longer tied to a specific manufacturer’s software development environment. They can use the development environment and language of their choosing to write new software, create web services, and build Internet of Things applications.

RESTful data from PACs is secured using TLS encryption over HTTPS connections, authenticated with basic access authentication (Basic Auth). RESTful data access can be restricted to read-only use, or allow reading and writing to I/O and strategy variables. The HTTP/S server is disabled by default and must be configured and enabled to operate, preventing unwanted or unauthorized access to the controller over HTTP.
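Basic Auth itself is simple: the client sends an Authorization header containing the base64 encoding of "username:password". Because that encoding is trivially reversible, it is only safe inside a TLS-encrypted connection, which is why it is paired with HTTPS here. A quick sketch, with made-up credentials:

```python
import base64

def basic_auth_header(username, password):
    # Basic Auth is just base64("user:password") -- not encryption,
    # which is why it must ride inside a TLS-protected HTTPS connection.
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return {"Authorization": f"Basic {token}"}

print(basic_auth_header("admin", "secret"))
# This header would accompany every request to the controller's HTTPS server;
# most HTTP libraries build it for you from an (user, password) pair.
```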

Also included in this release are two Node-RED nodes for communicating with SNAP PAC controllers through the RESTful API. Node-RED is an open-source, graphical, flow-based application development tool created by IBM’s Emerging Technology organization that makes wiring together APIs, represented as “nodes,” simple. Node-RED is particularly useful for developing IoT applications that interact with cloud-based platforms and APIs such as IBM Bluemix, IBM Watson, Amazon’s AWS IoT, AT&T M2X, Microsoft Azure, and Google Cloud Platform.

In contrast to OT, IT enterprise networks use the same open standards and protocols found on the Internet. The Internet was founded on open communication standards like TCP/IP. Application-specific protocols are layered on top: HTTP/S, SMTP, SNMP, MQTT, and so on.

The Internet uses programming languages like JavaScript, Java, and Python and presents information using technologies like HTML5 and CSS, all of which are open.


Definitions:


  • MQTT (Message Queuing Telemetry Transport)—A lightweight publish/subscribe protocol used to collect device data and communicate it to servers.
  • XMPP (Extensible Messaging and Presence Protocol)—Enables the near-real-time exchange of structured yet extensible data between two or more devices on the network.
  • DDS (Data Distribution Service)—A fast data bus for integrating intelligent machines.
  • AMQP (Advanced Message Queuing Protocol)—A queuing system designed to connect servers to each other.
  • API (Application Programming Interface)—A set of routines, protocols, and tools that applications can use to communicate with other applications, including over the web.
  • JSON (JavaScript Object Notation)—A lightweight, text-based data format and the primary format used for asynchronous communication between web browsers and web servers, where it has largely replaced XML.
  • REST (Representational State Transfer)—A set of architectural constraints used to develop web applications. Designed as a common approach for applications on the Internet, REST confines developers to a specific set of rules.
  • RESTful Architecture—When a website or API conforms to the constraints of the REST architecture, it is said to be a RESTful system.
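To make the JSON definition concrete, here is a small round trip in Python. The payload and its field names are invented for illustration; any language with a JSON library can do the same.

```python
import json

# A made-up JSON payload of the kind a RESTful device server might return.
payload = '{"name": "Pump_Speed", "value": 1450, "units": "rpm"}'

data = json.loads(payload)            # JSON text -> native dict
print(data["name"], data["value"], data["units"])

text = json.dumps(data)               # native dict -> JSON text
print(text)
```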


Bots, Messaging, and Interface Visibility


Apps are so last year. Now the topic of the future appears to be bots and conversational interfaces (Siri, etc.). Many automation and control suppliers have added apps for smart phones. I have a bunch loaded on my iPhone. How many do you have? Do you use them? What if there were another interface?

I’ve run across two articles lately that deal with a coming new interface. Check them out and let me know what you think about these in the context of the next HMI/automation/control/MES generations.

Sam Lessin wrote a good overview at The Information (a subscription website, but as a subscriber I can unlock some articles): “On Bots, Conversational Apps, and Fin.”

Lessin looks at the history of personal computing from shrink-wrapped applications to the web to apps to bots. Another way to look at it is client side to server side to client side, and now back to server side. Server side is easier for developers and removes some power from vertical companies.

Lessin also notes a certain “app fatigue”: we have loaded up on apps on our phones only to discover we use just a fraction of them.

I spotted this on Medium–a new “blogging” platform for non-serious bloggers.

It was written by Ryan Block–former editor-in-chief of Engadget, founder of gdgt (both of which sold to AOL), and now a serial entrepreneur.

He looks at human/computer interfaces, “People who’ve been around technology a while have a tendency to think of human-computer interfaces as phases in some kind of a Jobsian linear evolution, starting with encoded punch cards, evolving into command lines, then graphical interfaces, and eventually touch.”

Continuing, “Well, the first step is to stop thinking of human computer interaction as a linear progression. A better metaphor might be to think of interfaces as existing on a scale, ranging from visible to invisible.”

Examples of visible interfaces would include the punchcard, many command-line interfaces, and quite a few very useful, but ultimately shoddy, pieces of software.

Completely invisible interfaces, on the other hand, would be characterized by frictionless, low cognitive load usage with little to no (apparent) training necessary. Invisibility doesn’t necessarily mean that you can’t physically see the interface (although some invisible interfaces may actually be invisible); instead, think of it as a measure of how fast and how much you can forget that the tool is there at all, even while you’re using it.

Examples of interfaces that approach invisibility include many forms of messaging, the Amazon Echo, the proximity-sensing / auto-locking doors on the Tesla Model S, and especially the ship computer in Star Trek (the voice interface, that is — not the LCARS GUI, which is a highly visible interface. Ahem!).

Conversation-driven product design is still nascent, but messaging-driven products still represent massive growth and opportunity, expected to add another billion users in the next two years alone.

For the next generation, Snapchat is the interface for communicating with friends visually, iMessage and Messenger are the interfaces for communicating with friends textually, and Slack is (or soon will be) the interface for communicating with colleagues about work. And that’s to say nothing of the nearly two billion users currently on WhatsApp, WeChat, and Line.

As we move to increasingly invisible interfaces, I believe we’ll see a new class of messaging-centric platforms emerge alongside existing platforms in mobile, cloud, etc.

As with every platform and interface paradigm, messaging has its own unique set of capabilities, limitations, and opportunities. That’s where bots come in. In the context of a conversation, bots are the primary mode for manifesting a machine interface.

Organizations will soon discover — yet again — that teams want to work the way they live, and we all live in messaging. Workflows will be retooled from the bottom up to optimize around real-time, channel-based, searchable, conversational interfaces.

Humans will always be the entities we desire talking to and collaborating with. But in the not too distant future, bots will be how things actually get done.


Cisco Helps Customers Address Shadow IT Challenge

Do you know which clouds your company’s files are stored on? Did you even know you had a “shadow IT” problem? This sounds like the problem of the 80s and early 90s when forward thinkers brought PCs into the office because IT took so long to provide information or update reports. Or the past five years as people brought in their tablets and smart phones.

In this case, it’s either the Rolling Stones singing, “Hey! You! Get off of my cloud.” Or maybe Joni Mitchell, “I’ve looked at clouds from both sides now.”

According to Cisco, over the past year unauthorized use of public cloud services has grown 112 percent, with the average large enterprise now using 1,220 individual cloud services.

So, Cisco has developed and released Cloud Consumption as a Service. The software discovers and continually monitors cloud usage across an organization and helps organizations reduce cloud risks and costs. It also delivers detailed analysis and benchmarking that helps the IT team proactively manage cloud usage across the organization.

Employees and business groups are increasingly going around IT to get the cloud services they need to do their job. The phenomenon, dubbed shadow IT, is growing exponentially and becoming a major headache for IT leaders. Recent analysis by Cisco reveals the average large enterprise now uses 1,220 individual cloud services. That’s up to 25 times more than estimated by IT. And the average number of cloud services has grown 112 percent over the past year, and 67 percent over the past six months.

Using the new software-as-a-service product, customers can reduce financial and security exposure by identifying risky cloud services, cloud-use anomalies, and compliance issues. They can also cut cloud costs by finding ways to consolidate redundant services. Perhaps most importantly, the ability to discover and monitor cloud usage helps IT teams better understand the needs of employees and internal teams. Combined with industry benchmarking, this critical information helps IT teams formulate a more strategic cloud plan for their organization.

Making Educated Decisions on Cloud Services

CityMD is a fast-growing urgent care organization with 50 facilities across New York City and New Jersey. After using Cloud Consumption, CityMD was surprised to discover employees were using nearly 400 cloud services, with IT formally supporting only 15-20.

“Now that we have full visibility into our cloud usage we can make educated decisions about the services that are right for the business and get a better idea of what risks we may face,” said Robert Florescu, vice president of Information Technology at CityMD. “Our company was founded by ER doctors, so they want cloud services fast. With this information at our fingertips we can partner more effectively with our business groups. We can also proactively ensure we have the appropriate security and compliance measures in place.”

From IT Consultant to Business Consultant

To help Cisco improve the new product, a number of Cisco partners participated in an early adopter program. Executives at Aqueduct Technologies and World Wide Technology describe the reaction of customers and the potential impact on their business.

“I recently had a conversation with a VP of IT at one of my larger customers. I was able to show him data in the Cloud Consumption dashboard, and he was amazed at services he didn’t realize he was using,” said Chris Jennings, president and chief strategy officer at Aqueduct Technologies—a regional partner based in Boston, Massachusetts. “We then had the opportunity to help them identify public, private, and hybrid solutions that can impact their business. Cloud Consumption gives us the insight to transform our relationship from an IT consultant to a business consultant.”

“We want to empower customers to monitor and manage cloud services,” said Jim Melton, the technical architect of the Cloud Practice at World Wide Technology—a national partner headquartered in St. Louis, Missouri. “Cisco Cloud Consumption not only helps with this goal, it also provides insight to help define the next steps in a customer’s cloud journey, and concrete data to build the business case for their cloud initiatives. Cloud Consumption is helping us move from a product-centric conversation to a solutions-oriented, business outcome approach with our customers.”

Price and Availability

The new Cisco Cloud Consumption as a Service is now available globally via qualified Cisco channel partners. The new service costs approximately $1-2 per employee per month, depending on the size of the business.

Year End Internet of Things Acquisition

I’ve taken some time during the holidays to get off the daily posting gerbil wheel and dig even more deeply into the Industrial Internet of Things.

You may ask why. Every analyst firm now has an IoT practice. They do consulting of one sort or another. But, many are constrained by their models. I’ve seen some of the analyses. I think I can contribute.

Last week, just before Christmas, PTC announced the acquisition of Kepware to deepen its Internet of Things offering. I’ll have a longer analysis to kick off the New Year on Monday.

The Internet of Things is a strategy, not a thing. It is described by an ecosystem, not a product.

A look at the 30-year history of the company reveals that it has grown by acquisition. First within its (then) core technology from CAD to modeling. Then into PLM. Then a Retail practice. Then it developed a services platform. None of these were core to my coverage, so PTC is not a company I’ve followed closely.

Then Jim Heppelmann, CEO, caught the Internet of Things virus. Meanwhile, the ThingWorx developers had a cool technology and were casting about for a focus. It fit perfectly within the Internet of Things, and an acquisition was consummated.

Following ThingWorx (2014) was Axeda–a company that itself had undergone more than one transformation. Then ColdLight helped complete the portfolio with its analytics engine.

I’ve consulted with a few companies and talked with others who wanted to jump into the Internet of Things by simply bolting on a product acquisition. They thought that just by adding sensors (the “things” of the IoT, right?) they would be an IoT company. Or maybe a networking company.

No, PTC has the right idea. It remains to be seen if it bought the best technologies and if they can make them work together. I’ve seen companies fail at that point.

More later.

Meanwhile, I hope your 2015 was successful and that your 2016 will be one of personal growth and success.


Dell World Features IoT, Cloud, Analytics

I received an invitation to Dell World that seemed like a great opportunity to broaden my horizons and dig deeper into the technologies that will provide the platform for Industrial Internet of Things applications and benefits.

When one of the Dell people asked me how it went, I told them that learning about Dell’s technologies helped fill a gap in my coverage of the whole “connected manufacturing” space. As perhaps the only manufacturing-focused writer attending, I certainly received attention.

The ecosystem that many refer to as the Internet of Things, or IoT, includes connected things, database and storage (cloud), analytics, and visualization. Dell does not play in the “things” space as defined by the end devices, but it has significant data center, software, and analytics plays. Two items announced at Dell World expanded the offering.

The first, which Michael Dell, CEO of Dell, announced during his keynote, was an IoT product called the Edge Gateway 5000. This industrialized, intelligent, connected device gathers inputs from the “things” of the system, performs some analytics, and serves the results to the cloud. The second was announced jointly with Microsoft CEO Satya Nadella: a cloud partnership in which Dell will support Microsoft Azure.

Some excerpts of the announcements are below, but first an observation. In the industry I cover, the CEO will usually appear for a few minutes at the keynote and talk a little about financials or the theme of the week. Then a motivational speaker goes on for 45 minutes. Sometimes a product speaker does 30 minutes of product introductions.

Dell held the stage for most of the 90-plus minutes. He gave an outline of the new, private company, discussed the industry, interviewed several customers, yielded the floor to the CMO to talk about Dell’s support for entrepreneurship, then sat for a 30-minute conversation with Nadella. He showed intelligence, grace, and humor.

Here are excerpts from the product announcements.

Wednesday at Dell World, Dell and Microsoft Corp. announced a new cloud solution and program that enable organizations of all sizes to use the Microsoft cloud platform to transform their business. A new Microsoft Azure-consistent integrated system for hybrid cloud, along with extended program offerings, will help more customers benefit from Azure and Dell, driving greater agility and faster time to value whether they choose on-premises or public cloud solutions.

Dell today announced the launch of the new Edge Gateway 5000 Series, purpose-built for the building and factory automation sectors. With an industrial-grade form factor, expanded input and output interfaces, and a wide operating temperature range, the Edge Gateway 5000, combined with Dell’s data analytics capabilities, promises to give companies an edge computing alternative to today’s costly, proprietary IoT offerings.

The Dell Edge Gateway sits at the edge of the network (near the devices and sensors) with local analytics and other middleware to receive, aggregate, analyze, and relay data, minimizing expensive bandwidth use by relaying only meaningful data to the cloud or datacenter. Thanks to the new Dell Statistica data analytics also announced today, Dell is extending analytics capabilities out to the gateway. This means companies can now extend the benefits of cloud computing to their network edge for faster and more secure business insights while saving on the costly transfer of data to and from the cloud.
