Alliances Advance Edge to Cloud Analytics and Computing

Much of the interesting activity in the Industrial Internet of Things (IIoT) space lately happens at the edge of the network. IT companies such as Dell Technologies and Hewlett Packard Enterprise have built upon their core technologies to develop powerful edge computing devices. On the OT side, Bedrock Automation and Opto 22 have recently built interesting edge devices of their own.

I’ve long maintained that all this technology—from intelligent sensing to cloud databases—means little without ways to make sense of the data. One company I rarely hear from is FogHorn Systems. This developer of edge intelligence software has recently been quite active on the partnership front. One announcement regards Wind River and the other Google.

FogHorn and Wind River (an Intel company) have teamed to integrate FogHorn’s Lightning edge analytics and machine learning platform with Wind River’s software, including Wind River Helix Device Cloud, Wind River Titanium Control, and Wind River Linux. The offering is said to help organizations harness the power of IIoT data more quickly. Specifically, FogHorn enables organizations to place data analytics and machine learning as close to the data source as possible; Wind River provides the technology to support manageability of edge devices across their lifecycle, virtualization for workload consolidation, and software portability via containerization.

“Wind River’s collaboration with FogHorn will solve two big challenges in Industrial IoT today: getting analytics and machine learning close to the devices generating the data, and managing thousands to hundreds of thousands of endpoints across their product lifecycle,” said Michael Krutz, Chief Product Officer at Wind River. “We’re very excited about this integrated solution, and the significant value it will deliver to our joint customers globally.”

FogHorn’s Lightning product portfolio embeds edge intelligence directly into small-footprint IoT devices. By enabling data processing at or near the source of sensor data, FogHorn eliminates the need to send terabytes of data to the cloud for processing.
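
To make that concrete, here is a minimal sketch of processing at the source, assuming a generic publish callback rather than FogHorn’s proprietary API (the window size and threshold are hypothetical): a window of raw readings is reduced to a small summary at the edge, and only summaries and alerts travel upstream.

```python
from statistics import mean, stdev
from collections import deque

WINDOW = 60            # readings per summary window (hypothetical)
ALARM_THRESHOLD = 3.0  # z-score that triggers an immediate alert (hypothetical)

window = deque(maxlen=WINDOW)

def on_sensor_reading(value, publish):
    """Process one raw reading at the edge; publish only summaries and alerts."""
    window.append(value)
    if len(window) == WINDOW:
        mu, sigma = mean(window), stdev(window)
        # Forward one small summary instead of 60 raw readings.
        publish({"type": "summary", "mean": mu, "stdev": sigma})
        # Flag an outlier locally rather than waiting on a cloud round-trip.
        if sigma > 0 and abs(value - mu) / sigma > ALARM_THRESHOLD:
            publish({"type": "alert", "value": value,
                     "zscore": (value - mu) / sigma})
        window.clear()
```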

“Large organizations with complex, multi-site IoT deployments are faced with the challenge of not only pushing advanced analytics and machine learning close to the source of the data, but also the provisioning and maintenance of a high volume and variety of edge devices,” said Kevin Duffy, VP of Business Development at FogHorn. “FogHorn and Wind River together deliver the industry’s most comprehensive solution to addressing both sides of this complex IoT device equation.”

Meanwhile, FogHorn Systems also announced a collaboration with Google around Cloud IoT Core to simplify the deployment and maximize the business impact of IIoT applications.

The companies have teamed up to integrate FogHorn’s Lightning edge analytics and machine learning platform with Cloud IoT Core.

“Cloud IoT Core simply and securely brings the power of Google Cloud’s world-class data infrastructure capabilities to the IIoT market,” said Antony Passemard, Head of IoT Product Management at Google Cloud. “By combining industry-leading edge intelligence from FogHorn, we’ve created a fully-integrated edge and cloud solution that maximizes the insights gained from every IoT device. We think it’s a very powerful combination at exactly the right time.”

Device data captured by Cloud IoT Core gets published to Cloud Pub/Sub for downstream analytics. Businesses can conduct ad hoc analysis using Google BigQuery, run advanced analytics and apply machine learning with Cloud Machine Learning Engine, or visualize IoT data with rich reports and dashboards in Google Data Studio.
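
A rough sketch of the downstream half of that pipeline (the project ID, subscription name, and BigQuery table are hypothetical) might pull telemetry from Cloud Pub/Sub and run an ad hoc BigQuery query over the accumulated data:

```python
from google.cloud import pubsub_v1, bigquery

PROJECT = "my-iiot-project"            # hypothetical project ID
SUBSCRIPTION = "device-telemetry-sub"  # subscription fed by Cloud IoT Core

subscriber = pubsub_v1.SubscriberClient()
sub_path = subscriber.subscription_path(PROJECT, SUBSCRIPTION)

def callback(message):
    # Each message is one telemetry payload published by Cloud IoT Core.
    print("telemetry:", message.data)
    message.ack()

# Begin a streaming pull of device telemetry.
streaming_pull = subscriber.subscribe(sub_path, callback=callback)

# Ad hoc analysis over telemetry already landed in BigQuery (hypothetical table).
bq = bigquery.Client(project=PROJECT)
query = """
    SELECT device_id, AVG(temperature) AS avg_temp
    FROM `my-iiot-project.telemetry.readings`
    GROUP BY device_id
"""
for row in bq.query(query).result():
    print(row.device_id, row.avg_temp)
```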

“Our integration with Google Cloud harmonizes the workload and creates new efficiencies from the edge to the cloud across a range of dimensions,” said David King, CEO at FogHorn. “This approach simplifies the rollout of innovative, outcome-based IIoT initiatives to improve organizations’ competitive edge globally, and we are thrilled to bring this collaboration to market with Google Cloud.”

Industrial Internet of Things Integral Part of Industry of Things Conference

Now in its third year, the Industry of Things World USA conference in San Diego is becoming a premier Internet of Things (IoT) event in the US. Organized by weConnect of Berlin, Germany, it attracts a few hundred attendees, excellent speakers, and me (of course). The organizers leverage worldwide contacts, running similar events in Berlin and Singapore as well as events in other technology areas.

Topics cover a range of IT and OT subjects. I make sure to get to the OT people who are here. This is a quick recap of what I’ve seen so far.

Charlie Gifford spoke at a breakout session on ISA95, updating us on the latest changes proposed to the standard. His other focus was promoting event-driven architecture. He suggested building a library of operations events so that when an event occurs, information about the change, along with the updated data, is broadcast to subscribers. This saves considerable bandwidth compared with continuous point-to-point connections. He is also concerned with how to interconnect the many existing databases within a plant or production location.
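
A minimal sketch of that report-by-exception pattern, using MQTT as one possible transport (the broker address, topic names, and deadband are my own hypothetical illustrations, not anything from Gifford’s proposed library):

```python
import json
import paho.mqtt.client as mqtt

DEADBAND = 0.5  # publish only when a value moves at least this much (hypothetical)

client = mqtt.Client()
client.connect("broker.plant.local")  # hypothetical broker address
client.loop_start()

last_published = {}

def on_new_value(tag, value):
    """Broadcast an operations event only when a tag actually changes."""
    previous = last_published.get(tag)
    if previous is None or abs(value - previous) >= DEADBAND:
        event = {"tag": tag, "value": value, "previous": previous}
        # Subscribers receive the change event with the updated data;
        # no bandwidth is spent re-sending unchanged values.
        client.publish(f"plant/events/{tag}", json.dumps(event))
        last_published[tag] = value
```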

Jagannath Rao, SVP of IoT and MindSphere at Siemens, discussed the evolution of MindSphere and its latest incarnation. The key point: Siemens has committed to openness, providing open APIs, especially in its MindSphere platform, and adopting open technologies such as OPC UA.

MindSphere v2 enabled customers to run proof-of-concept (PoC) projects. From these, Siemens could determine what customers were interested in and what problems they were trying to solve. This fed back into the product development process, leading to the recent release of v3.

V3, now a product, builds on open technologies, open being the key word. The platform moved from SAP Leonardo to Amazon Web Services (AWS), providing a more robust cloud experience. AWS is an Infrastructure as a Service, while MindSphere is a Platform as a Service containing open APIs and data models. The next step on the journey is for Siemens to build out an ecosystem of third-party applications.

When asked about TSN, Rao also brought up 5G; both point to the importance of the edge for initial processing of IoT data. Siemens is preparing for this next step. Its Sinumerik Edge, for example, contains considerable analytics power, plus the ability to communicate information rather than just vast streams of raw data.

OPC vice president of marketing Stefan Hoppe, during his breakout session, discussed the acceptance of OPC UA in industry and the power of the publish/subscribe release for OPC UA. His strongest point was that OPC UA is not a protocol. It is an information model. It uses underlying protocols (AMQP, MQTT, DDS, whatever) to communicate the information from one device to another (or to many). Proponents who argue that some protocol is superior to OPC UA miss the point: OPC UA is not a competing protocol but an information model.
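
The distinction is easy to see in code. Here is a minimal sketch using the open-source python-opcua library (the Pump42 object, its variables, and the namespace URI are hypothetical): you define typed, browsable information in a server’s address space, and which wire protocol carries it is a separate, swappable concern.

```python
from opcua import Server

server = Server()
server.set_endpoint("opc.tcp://0.0.0.0:4840/")  # the transport is configurable
idx = server.register_namespace("http://example.com/pump")  # hypothetical namespace

# The information model: a typed, browsable object, not a byte-level protocol.
pump = server.get_objects_node().add_object(idx, "Pump42")
temp = pump.add_variable(idx, "Temperature", 20.0)
rpm = pump.add_variable(idx, "Speed", 0.0)
temp.set_writable()

server.start()
# Any OPC UA client can now browse Pump42 and read Temperature in context,
# regardless of which underlying transport carries the bytes.
```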

One key potential misunderstanding: Hoppe’s presentation made OPC appear German-centric and tied to Germany’s Industrie 4.0. Keep in mind that the OPC Foundation board is only 33% German, and that OPC UA lends itself to the digitalization efforts of any country developing standards. It has become the official communication technology for many standardization efforts, including the Open Process Automation Forum. It is truly global.

Lin Nease, IoT technologist at Hewlett Packard Enterprise, chatted with me in a one-on-one meeting about the edge, the power of Xeon server technology in HPE's edge devices, and software-defined control. I think I’ll be seeing more from HPE as it builds out its IoT infrastructure.

Schneider Electric Asset Performance Management Offering Shows Growth

So last week I shared an update on Schneider Electric from the ARC Forum, mostly on cybersecurity. A helpful marketing person guided me to the press release containing the data that updated the software side of the week’s news, specifically asset performance management (APM). For the most part the discussion did not center on product updates but on “increasing momentum surrounding customer adoption.” In other words, Schneider wanted to highlight an area of software not often brought to center stage and show that it is a growth area.

Kim Custeau (whose name I believe I misspelled in my last post; thank you, autocorrect), Asset Performance Management Business Lead, shared how investments in the cloud, advanced machine learning, and augmented reality, coupled with new partnerships, have empowered customers.

“Defining and executing an asset performance strategy is a critical component to improving productivity while safeguarding business continuity,” she said. “We have been delivering proven, industry leading asset performance solutions for nearly 30 years, and continue to invest in a long-term strategy to drive innovation in this area. Our focus is to provide real value to our customers by empowering them to maximize return on capital investment and improve profitability. We are proud to see our customer results speak for themselves with significant savings.”

Machine learning and prescriptive analytics:

  • Duke Energy avoided an estimated $35 million in costs through early-warning detection of a steam turbine problem
  • Ascend Performance Materials now responds faster to alerts, saving an estimated $2 million through avoided plant shutdowns

Augmented Reality:

  • BASF is implementing AR to improve asset performance, reliability, and utilization while increasing production efficiency and safety, because technicians can work from an augmented digital representation of the asset.

Cloud and Hybrid Deployment:

  • WaterForce partnered with Schneider Electric to develop an IIoT remote monitoring and control system in the cloud that allows farmers to operate irrigation pivots with greater agility, efficiency, and sustainability.

New Partnerships:

  • MaxGrip and Schneider Electric announced a partnership to expand APM consulting and add Risk-based Maintenance capabilities. The APM Assessment is a first step for industrial companies to evaluate asset reliability and digital transformation strategy.
  • Schneider Electric and Accenture completed development of a Digital Services Factory to rapidly build and scale new predictive maintenance, asset monitoring, and energy optimization offerings. As a result, a large food and beverage company saved over $1 million in maintenance costs.

OPC Foundation Cites Advancements at Recent ARC Forum

The OPC Foundation was active during the recent ARC Industry Forum in Orlando, both as a Platinum Sponsor and in holding a press conference. With OPC UA released and in use, and publish/subscribe about to be released, the Foundation’s emphasis has been on companion specifications. It held a joint press conference with the FieldComm Group to discuss their joint working group, then announced a companion specification with Ethernet POWERLINK. The final item, which I was able to review before release, concerns a study with ARC Advisory Group on adoption of the UA specification.

Below are some details. More at the Foundation website.

OPC and FieldComm

The OPC Foundation and FieldComm Group announced an alliance to advance process automation system multi-vendor interoperability and simplified integration by developing a standardized process automation device information model.

A joint working group between OPC Foundation and FieldComm Group, tasked with developing a protocol-independent companion specification for process automation devices, was formed in late 2017. The goal of the working group is to leverage FieldComm Group’s extensive experience with the HART and FOUNDATION Fieldbus communication protocols to standardize data, information, and methods for all process automation devices through FDI using OPC UA. The OPC UA base information model and the companion Device Integration (DI) specification will be extended to include the generic definition and information associated with process automation devices.

The OPC Foundation and FieldComm Group have worked together for over a decade, initially working on the development of the EDDL specification and most recently on the creation of FDI technology.

“FDI provides the new standard for device integration to deliver a protocol independent path to configuration, diagnostics and runtime operation for process devices,” states Ted Masters, President and CEO of FieldComm Group. “The partnership between OPC Foundation and FieldComm Group further builds upon the common information model of both to deliver process automation data in context which is the key to enabling value from enterprise systems and analytics. The 350+ suppliers of devices and applications that are members of FieldComm Group have an opportunity to benefit from the key initiative to develop a standard process automation information model by their adoption of FDI and OPC UA technologies.”

“I’m excited that the OPC Foundation and FieldComm Group are working together on this important initiative, and will be partnering with other organizations, end-users and suppliers to make the dream of a standardized process automation device information model a reality. This is truly a breakthrough in our industry that will provide significant operational benefits across all points of the value chain,” states Thomas J. Burke, OPC Foundation President and Executive Director.

“This important collaboration will provide a solid foundation for standardization of devices that will serve as the base infrastructure for the numerous other collaborations that the OPC Foundation is doing across international boundaries,” says Stefan Hoppe, OPC Foundation Global Vice President.

The joint working group plans to release an extensible, future-proof process automation information model specification during the first quarter of 2019.

OPC and Powerlink

An OPC UA companion specification is now available for POWERLINK according to a joint announcement by the OPC Foundation and the Ethernet POWERLINK Standardization Group (EPSG). The companion specification describes how payload data is exchanged between POWERLINK and any OPC UA platform. The result is integrated communication from the sensor to the cloud.

“As technologies, OPC UA and POWERLINK complement each other perfectly,” emphasized Thomas Burke, President of the OPC Foundation, in his announcement. “POWERLINK is among the leading real-time bus systems used in plants and machinery. Together with OPC UA, POWERLINK networks can now communicate seamlessly and securely with the IT environment and into the cloud.”

“This specification allows OPC UA and POWERLINK to fuse into a single network,” added Stefan Schönegger, Managing Director of the EPSG. “We’re then able to join devices from different manufacturers and across different levels of the automation pyramid into a single, cohesive system.”

A joint working group between the OPC Foundation and the EPSG had been working on the specification since 2016. The document can be downloaded from the OPC Foundation website.

OPC UA Adoption

OPC Foundation announced today the release of an in-depth ARC Advisory Group report on the important role the OPC data connectivity standards play in control automation today and in future IIoT and Industrie 4.0 based solutions.

Key findings of the ARC report confirmed that, with an estimated global installed base of over 45 million units, OPC is the de facto standard for open data connectivity, and that OPC UA is well positioned to serve as the next data connectivity foundation for control automation applications, both in traditional industrial settings and in new ones like building automation and transportation. Key contributing factors to the continued success of OPC UA include the scalability, performance, and robustness of the technology, and the large community of end users, vendors, and other standards bodies actively working with the OPC Foundation to best utilize OPC UA in their applications.

According to Thomas Burke, OPC Foundation president, “the [ARC report] findings accurately reflect what we [OPC Foundation] have been seeing from an adoption and collaboration point of view. I highly recommend reading this ARC report for a high-level perspective of what OPC UA is doing in the market and the future of data connectivity.”

Commenting on the popularity of the OPC UA standard, Mr. Burke explained, “OPC UA has something to offer everyone, from end users and product vendors to other standards bodies. Once people look at what is really out there as far as a single standard that has the scalability, performance, and flexibility to meet the challenges of modern data connectivity and interoperability, and has the reputation and a large enough adoption base to make it a safe investment, they come to realize OPC UA is the real deal.”

“OPC technology has become a de facto global standard for moving data from industrial controls to visualization up to MES/ERP and IT cloud levels”, according to Craig Resnick, Vice President, ARC Advisory Group. “The rapid expansion of OPC UA in automation, IIoT, and into new, non-industrial markets suggests that OPC will remain an important technology for multivendor secured interoperability, plant floor-to-enterprise information integration, and a host of other applications yet to be envisioned.”

Compute Power Meets IoT at HPE Discover 2017 in Madrid

Hewlett Packard Enterprise (HPE) held its European customer conference, Discover, in Madrid this week. Points of emphasis for me included the Internet of Things business, high-performance computing, a view of the changes underway at HPE, and a look at the future of IT and the plant.

Bloggers and influencers

I was here as part of a blogger program separate from press and analysts. Bloggers are a livelier group than the press. I think I am the only independent blogger covering manufacturing in the US (everyone else works for a magazine or analyst firm). There were 25 bloggers at Discover, from countries as diverse as Denmark, Belgium, Germany, Italy, Spain, New Zealand, Canada, and the USA. Rather than attending press conferences, our program included “coffee talks” that were live-streamed on the Web: informal presentations plus question-and-answer sessions.

There was one press conference I attended: the announcement of the partnership with ABB on the mini-data-center product. Instead of conversation and give-and-take, one or two journalists asked questions in a challenging manner that seldom seemed designed to elicit more information. Note: I wrote about the partnership earlier this week.

Retiring CEO Meg Whitman used a quote from Gartner in her remarks: “The edge will eat the cloud.” HPE has developed edge computing devices called Edgeline, which I discussed in August after my first meeting with the company. These are powerful computing devices based on PXI platform technology from National Instruments. The blogger group devoted some time to discussing how valid that claim is.

We concluded that you will need both. Here is an example from a conversation I had with Rod Anilker, a technologist in HPE's OEM group: imagine taking the computing power and openness of the HPE platform to replace proprietary controllers such as CNCs, PLCs, and DCSs. Such devices at the edge could handle many control and other edge applications, with the additional capability of sending data to the cloud.

Now imagine HPE’s storage and computing power accumulating vast amounts of data, perhaps from a power generation fleet or a company’s many refineries, at a scale sufficient to do some pretty cool pattern recognition. The predictive, prescriptive, and planning possibilities would be awesome.
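
As a toy illustration of the kind of fleet-scale pattern recognition I mean (the data and the model choice are mine, not HPE’s), an unsupervised detector trained on sensor readings pooled from many sites can flag units drifting away from the fleet’s normal behavior:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Pretend rows are hourly (temperature, vibration, load) readings pooled
# from turbines across an entire power-generation fleet (synthetic data).
rng = np.random.default_rng(0)
fleet = rng.normal(loc=[75.0, 0.2, 0.8], scale=[2.0, 0.05, 0.1], size=(10_000, 3))

model = IsolationForest(contamination=0.01, random_state=0).fit(fleet)

# Score today's readings from one unit; -1 marks behavior unlike the fleet.
today = np.array([[76.1, 0.21, 0.79],   # normal
                  [91.4, 0.55, 0.40]])  # drifting unit worth a work order
print(model.predict(today))             # e.g. [ 1 -1]
```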

Pieces of the HPE Corporate Puzzle

Antonio Neri, President and COO and next year’s CEO, led the general session with these main points:
• Intelligent Edge (where the action is; hence the importance of the Aruba acquisition)
• Secure platforms from edge to core to cloud
• Memory-driven computing (the Silicon Graphics acquisition is another important piece)
• Artificial intelligence (inside everything)
• Software-defined and multi-cloud platforms
• Partner ecosystem
• Flexible consumption (scale up, scale back)
• Advise and transform via HPE PointNext
• Outcome-as-a-Service, the future of enterprise computing


PointNext is the services arm, introducing the concept of Edge-as-a-Service. In fact, HPE features “as-a-service” in many guises.

This concept seems modeled on ideas emanating from GE’s consumption-of-services model. The foundation is capturing and processing data at the source, where action needs to happen. IT is then provided in a way that scales: pay-as-you-go, subscription-based. The customer gains flexibility and reduced risk.

Take the expertise of a data center that runs 24/7 and put it at the edge. Then it’s all about extracting data and feeding it into machine learning. This starts to morph into the concept from the OEM group.

The model architecture combines HPE’s new GreenLake with Edge-as-a-Service. GreenLake edge compute includes design services, information services, operational and support services, and pay-as-you-go pricing. Upfront consulting helps evaluate the client’s requirements and business processes and recommends solution packages.

Technology overview

David Chalmers, research and development executive, briefed us with a business and technology overview.

Hewlett Packard Enterprise is one of the two businesses left standing after Meg Whitman (and the board) split the company following some bad years of planning and leadership. Following the split, several businesses were divested.

Chalmers related that HPE has been changing fast into an infrastructure/solutions company (the world’s largest, he said). “The strength of our portfolio is the best in 10 years, much of it from organic development. The SGI acquisition yielded more compute options, including low-power, high-performance computing. By 2022, 60% of data will never get to a data center; it’ll reside at the edge. Therefore the intelligent edge is important. SGI brought high-performance analytics.”

A couple of other tidbits: at the new HPE, people bring business cases first, then talk about the technology solution. And the OT world is an order of magnitude larger than the IT world. (Hmmm.)

Oh, and there were many new products. They don’t all apply to my areas of coverage. But the engineers have been busy.


I just realized I made it through the entire discussion without mentioning the technology that brought me to HPE Discover: the Internet of Things. Much of that story relates to the edge and devices such as Edgeline. Obviously important, IoT garnered significant floor space in the exhibition area.

There will be more in another post.

