Software Is Center Stage at Rockwell Automation Event

Data is the new currency.

I heard that somewhere. There is much truth buried in the thought. That makes software and connectivity key technologies. I hear this everywhere. I am thinking through what I learned at the Rockwell Automation event while at an enterprise computing event in Spain. Enterprise IT has discovered Industrial Internet of Things (IIoT). Silos are collapsing everywhere.

Still, it is surprising that Rockwell Automation, the quintessential hardware company, emphasizes software. This has become the key component of the Connected Enterprise. There must be sales dollars here, also. Theory is nice, but sales are nicer.

By the way, here is proof I was there. A “Robot Selfie” from the Innovation Booth.

The Rockwell software portfolio has been growing a step at a time. This year it looks like most of the pieces are assembled for a full manufacturing software suite. And this is not only MES. That is a component, for sure. But there are also connectivity, historians, and databases. And now what appears to be a robust analytics application.

John Genovesi, Vice President of Information Software, told me during our interview that the company had made a couple of small acquisitions (in Silicon Valley they call it “acqui-hiring”) last March, and already the new team has written an analytics engine that forms the guts of the new application.

Project Scio (pronounced “see-oh,” from the Latin for “to know”) is the next step. To support decisions when and where they matter most, new capabilities offered through Project Scio reduce hurdles to unleashing information. These capabilities open up access to ad-hoc analytics and perform advanced analysis by pulling structured and unstructured data from virtually any existing source in the enterprise. Project Scio can also intelligently fuse related data, delivering analytics in intuitive dashboards, called storyboards, that users can share and view. Users then have the ability to perform self-serve drill-downs to make better decisions, dramatically reducing the time to value.

“Providing analytics at all levels of the enterprise – on the edge, on-premises or in the cloud – helps users have the ability to gain insights not possible before,” said Genovesi. “When users gain the ability to fuse multiple data sources and add machine learning, their systems could become more predictive and intelligent. Scio puts analytics to work for everyone. By its addition to the scalable and open FactoryTalk Analytics Platform, Project Scio gives users secure, persona-based access to all data sources, structured or unstructured. And a configurable, easy-to-use interface means that all users can become self-serving data scientists to solve problems and drive tangible business outcomes.”

Key attributes of Project Scio include the following:

  • Device Auto-Discovery: Manually mapping software to each plant-floor device can be a time-consuming and error-prone process. Project Scio can auto-discover Rockwell Automation devices and tags, as well as third-party device data, to save time and help reduce risk. Additionally, the auto-discovery process gives users access to more detailed information than is typically available through manual mapping, such as device name, line location and plant location.
  • Leave Isolated Analytics Behind: Rather than leave data at its source and take database snapshots, Project Scio brings data into a centralized location and can continually refresh that data. Additionally, connections to data sources only need to be established once. This connection allows users to create custom analytics and refresh them at their preferred rate without the support of a data scientist.
  • Flexible Machine Learning (ML): Use the right ML algorithm for the right use case. Project Scio is configurable to support many industry-leading algorithms and frameworks, including Spark ML, MLlib and Python.
  • Closed-Loop Analytics: Using either ML or predefined settings, Project Scio includes capabilities that can monitor operations and automatically trigger control adjustments if processes start to fall outside allowable parameters. This can help users optimize control, improve product quality and consistency, and reduce scrap and waste.
  • Applications Marketplace: Rockwell Automation will introduce an applications marketplace for applications developed in-house and by third parties. The ability to access any data source and create custom analytics for each user’s application is a central feature. However, users can also take advantage of pre-engineered FactoryTalk Analytics applications from Rockwell Automation. These applications allow users to monitor common KPIs, such as OEE and quality, in a standardized way and without any configuration.
  • Open Architecture: Industrial producers cannot be expected to rip and replace all their legacy control and information systems before gaining value from analytics. These scalable and open-architecture capabilities are designed to be extended to a full ecosystem of IIoT data sources. The quick connection to the full range of systems that feed data into a Connected Enterprise includes controllers, MES software and edge devices.
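To make the closed-loop idea concrete, here is a minimal sketch in plain Python of the pattern described above: monitor a process value against allowable limits and emit a corrective adjustment when it drifts out of range. The names here (`check_reading`, `Adjustment`, the tag string) are my own illustration, not Rockwell or FactoryTalk APIs.

```python
# Illustrative sketch of closed-loop analytics: watch a process variable
# and produce a corrective adjustment when it leaves allowable bounds.
# All names are hypothetical stand-ins, not vendor APIs.

from dataclasses import dataclass

@dataclass
class Adjustment:
    tag: str      # controller tag the correction applies to
    delta: float  # correction to apply to the setpoint

def check_reading(tag: str, value: float, low: float, high: float):
    """Return an Adjustment if the value is outside allowable limits."""
    if value < low:
        return Adjustment(tag, low - value)   # nudge back up into range
    if value > high:
        return Adjustment(tag, high - value)  # nudge back down into range
    return None                               # in range: no action needed

# Example: a temperature allowed between 70.0 and 80.0
print(check_reading("Line1.Temp", 83.5, 70.0, 80.0))
# → Adjustment(tag='Line1.Temp', delta=-3.5)
```

A real implementation would write the adjustment back to the controller and could derive the limits from an ML model instead of fixed settings.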

In addition to these Information Solutions, Rockwell Automation offers a range of Connected Services that help customers ensure network integrity and security, design and maintain infrastructure, and remotely monitor equipment, including predictive maintenance. These services can help customers with every aspect of their Connected Enterprise journey, including developing an IIoT infrastructure and strategy and providing remote monitoring and analytics.

New OPC UA Support

Rockwell spokespeople made sure that I understood two things in this year’s message: scalable and open. The company is adopting open, interoperable communications. Notice above that auto-discovery covers not only Rockwell’s products but also those from other companies.

Another interoperable standard, one that Rockwell has not supported much for years, is OPC Unified Architecture (UA).

Interesting quote from the news release, “We actually helped develop the OPC UA specification, and we’re now adding OPC UA support into our portfolio.”

The initial offering on the software side includes OPC UA client/server functionality in the FactoryTalk Linx software, which the company will launch in early 2018. There are also future product-line extensions planned for both the software and hardware portfolios. In addition, the FactoryTalk Linx Gateway provides an OPC UA server interface to deliver information collected by FactoryTalk Linx from Logix 5000 and other Allen-Bradley controllers to external OPC UA clients. This permits third-party software to coexist with FactoryTalk software.

For example, custom-built MES applications can interact directly with the control layer to better coordinate production. The FactoryTalk Linx Gateway also will include a new FactoryTalk Linx Data Bridge software service that will transfer sets of tag data from one data source to another at a user-defined rate. This permits movement of data between servers and, more importantly, enables Logix 5000 controllers to indirectly interface with OPC UA servers. Among its many uses, this software could allow Logix 5000 controllers to interact with and control a robot, weight scale or similar automation device using OPC UA.
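To picture what a tag-to-tag bridge does, here is a plain-Python sketch of the general pattern: poll a set of tags from a source and mirror them to a destination at a user-defined rate. The callables and tag names are stand-ins of my own invention; this is not the FactoryTalk Linx Data Bridge API.

```python
# Conceptual sketch of a tag "data bridge": copy a set of tags from one
# data source to another at a user-defined rate. The read/write callables
# are hypothetical stand-ins, not FactoryTalk or OPC UA interfaces.

import time

def bridge_tags(read_tag, write_tag, tags, period_s, cycles):
    """Poll each tag from the source and mirror it to the destination."""
    for _ in range(cycles):
        for tag in tags:
            write_tag(tag, read_tag(tag))
        time.sleep(period_s)  # user-defined transfer rate

# Demo with in-memory dictionaries standing in for two tag servers.
source = {"Motor1.Speed": 1480.0, "Scale1.Weight": 12.7}
dest = {}
bridge_tags(source.get, dest.__setitem__, list(source), period_s=0.01, cycles=1)
print(dest)  # → {'Motor1.Speed': 1480.0, 'Scale1.Weight': 12.7}
```

In a real deployment the read side would be a controller connection and the write side an OPC UA server, but the polling loop is the essence of the idea.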

Canvass Analytics, OSIsoft to Deliver Predictive Insights

Whether you call it the Industrial Internet of Things, Industry 4.0 or Smart Manufacturing, no benefits are garnered in the end without a superb analytics engine. Recently I talked with Humera Malik, CEO of Canvass Analytics, about a new analytics company and product that brings Artificial Intelligence (AI) and Machine Learning (ML) to the field.

Early in my management career we called accounting “ancient historians” because reports only came out 10 days following a month end. That is too late to be what we call these days “actionable information.”

Turns out that a similar problem has existed in the predictive analytics field. OSIsoft and others have provided tools to capture huge amounts of industrial and manufacturing data. To get anything out of it you needed to establish a project, bring in a bunch of data scientists, and try to glean some trends or fit some models.

What was needed was a powerful engine that can use this data closer to real time, fit it to a model (selecting one from among several), and give operators, maintenance technicians, engineers, and others information in a usable time frame without bringing in a bunch of data scientists. The data scientists, it turns out, need to reside in the software. The entire process must be transparent to the user.
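The “data scientist in the software” idea boils down to automated model selection. As a purely illustrative sketch (not Canvass’s actual engine), the following Python fits two candidate models to recent data and keeps the one with the lowest mean squared error:

```python
# Illustrative sketch of automated model selection: fit several candidate
# models and keep the best one, so no human has to choose by hand.
# This is my own toy example, not any vendor's engine.

def fit_mean(xs, ys):
    """Baseline model: always predict the historical mean."""
    m = sum(ys) / len(ys)
    return lambda x: m

def fit_linear(xs, ys):
    """Least-squares straight line y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs) or 1.0
    a = num / den
    b = my - a * mx
    return lambda x: a * x + b

def select_model(xs, ys, fitters):
    """Return the fitted model with the lowest mean squared error."""
    def mse(model):
        return sum((model(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)
    return min((fit(xs, ys) for fit in fitters), key=mse)

xs = [0, 1, 2, 3, 4]
ys = [1, 3, 5, 7, 9]  # clearly linear data: y = 2x + 1
best = select_model(xs, ys, [fit_mean, fit_linear])
print(best(5))  # → 11.0
```

Retraining continually, as described below, amounts to rerunning this selection as each new window of data arrives.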

Enter Canvass Analytics, a provider of AI-enabled predictive analytics for the Industrial IoT, which just announced a partnership with OSIsoft, a global leader in operational intelligence. The partnership will enable industrial companies to accelerate the return on investment of their IoT initiatives.

Malik commented, “Predictive and automated analytics gives operations teams the insights to answer questions such as, how can I increase yield, how can I reduce downtime and how can I reduce my maintenance costs? Canvass’ AI-enabled analytics platform accelerates the delivery of predictive insights by automating data analysis and leveraging machine learning technologies to adapt to data changes in real-time. For operations teams, this means they have the latest intelligence in order to make critical operational decisions.”

The combination of OSIsoft’s methodology to collect, store and stream data from any Industrial IoT source with Canvass’ AI-enabled automated analytics platform brings a new approach to creating predictive models that continually retrain themselves. With the resulting insight, Industrial companies have the potential to reduce plant maintenance by up to 50 percent and optimize plant operations by 30 percent.

“We are enthusiastic about the value that we see companies like Canvass Analytics extracting from the vast amounts of IIoT and other streaming data that we collect in our role as the single source of the truth,” said J. Patrick Kennedy, founder and CEO of OSIsoft.

Preparing Organizations and Workforce for the Future

Some interesting technology trends are developing concerning their impact on the future workforce. Liam Quinn, senior vice president and CTO of Dell Technologies, talked with me a couple of weeks ago about the results of research and a report with the Institute for the Future detailing these trends. The report is not specifically about manufacturing, but the ideas were all applicable.

Part of the discussion focused on a future workforce—what it will look like, what tools it’ll use, how it’ll use the tools, and so forth. That started a thought process about how we would train people for this new technological world. How do we get the maximum number of people (students) involved so that they do not get left behind?

Those thoughts led to my Dell Technologies contact sending me a link to a blog by Ari Lightman, Distinguished Service Professor, Digital Media and Marketing at Carnegie Mellon University’s Heinz College. He co-leads the Chief Information Security Officer Executive Education and Certification Program and is Commercialization Advisor for the Center for Machine Learning and Health. Ari is also Director of the CIO Institute at Carnegie Mellon University. He teaches classes focused on assessing and measuring the impact of emerging technologies, including Digital Transformation.

How influenced are you by dystopian science fiction? Lightman begins his essay by referring to how sci-fi movies often depict future technology that is opposed to humans. Elon Musk is garnering headlines by talking about the evils of artificial intelligence.

Quinn discussed the future workforce of human-machine partnership in particularly glowing terms, with obvious excitement about the possibilities. I was curious about preparing for that positive future. So, we’ll dive into Lightman’s thoughts.

Says Lightman, “Questions we all should be thinking about: How will we interface with these emerging technologies? What are potential hurdles that need to be addressed? In the future, where human and machine interaction is seamless, does everyone benefit or are parts of society left behind? These secondary and tertiary impacts of an increasingly digitized world need to be examined and developed along with input not just from consumers and technologists but also from economists, regulators, ethicists, etc.”

Discussing conferences and colloquia at Carnegie Mellon, he states, “The question is, what do organizations need to do today? When I look around the audience at these talks and get a chance to speak with participants, their enthusiasm often gives way to confusion and disillusionment when they try to reconcile where they currently are with this idealized future portrayed by the speaker.”

How do we cope with these thoughts, fears, and expectations in order to prepare for the future? Lightman suggests taking a pragmatic view.

The first step is to develop resiliency. Remember the old saying, “You have to get back on the horse that threw you.” Shocks will occur, and they will become more frequent, so how do organizations adapt and learn to minimize these disruptive shocks, that is, become more resilient? Those who are complacent will become disenfranchised. The organizations my institution has worked with, and who we consider ahead of the curve, have put some of the following into place:

Build a Culture around Data

At Carnegie Mellon University, courses in Deep Learning, Machine Learning, Artificial Intelligence and other ways to interpret data have exploded and will continue to grow in popularity. From an organizational perspective: How do we operationalize data-driven discoveries? What are the appropriate governance structures? How do we prepare (understand, predict and organize) around new regulations concerning privacy (e.g., GDPR)? How do we incorporate security (proactive and reactive) into the beginning of any development effort (not simply as a bolt-on)? How do we provide appropriate levels of education on different data types, stores, analytical methods and interpretation?

Simplify Complexity

There is an explosion of data, coming from a multitude of different sources in a variety of forms resulting in a slew of possible interpretations. How do we explain this to various stakeholders with different objectives and levels of expertise? Hint: Learn how to tell effective stories.

Build a Safe Place for Experimentation

If disruptive shocks and complexity are increasing and the pace of technological innovation is accelerating, organizations will need to learn how to experiment. Too many unknowns leading to indecisiveness can be addressed through experimentation.

Embrace Uncomfortable Discussions
There is no covering up that disruption will lead to displacement. An open dialogue on how technological advance will impact industries, companies and employees needs to occur to level set expectations and prepare the workforce.

Understanding Employees

We spend an inordinate amount of time and money on understanding consumers and little on our own employees. Effective use of technology is predicated on understanding motivation and incentives, utilization requirements, and adoption patterns. We are approaching the most inter-generational workforce ever, resulting in different behavior patterns, learning modalities and preferred ways of working. Knowing your employees can help with smoother technology adoption, understanding of consumer behavior and the four other initiatives mentioned.

I suggest reading the entire essay, but I like these ideas. A lot of wisdom there.


Let’s Tour Dell EMC IoT Booth At Its User Conference

The Dell EMC Internet of Things (IoT) group assembled a mini supply chain as its booth at the Dell EMC World user conference in Las Vegas in May. At the October Dell EMC World in Austin, the exhibits were put together as an ice cream factory and distribution chain, and the booth featured an ice cream machine. I sure could have used an ice cream by the time I got through all the exhibits.

The Dell IoT Gateway was the common denominator of the exhibit tying everything together.

The first station features construction. Here are a couple of guys trying out the DAQRI augmented reality helmets. I had the opportunity to try these in Hannover. A really cool application of AR.

They are looking at a combination of the construction (see the red “steel” framework) and drawings that show the layout of electrical conduit, HVAC ducting, and other details. Construction workers can get a feel for where things go, as well as spot interferences the designer missed.

This station showed product on its way to market through sensing and communication from Nokia.

Below is a layout of the Emerson process manufacturing system.

They brought an actual pipe, pump, motor, instruments, and WirelessHART communication. No, it didn’t make ice cream.

This station featured IMS Evolve, an application that brings sensor data into the cloud and provides track and trace, as well as other analytics, assuring the safety of the food product through the supply chain by monitoring for proper temperature.

Don’t forget security! Here is a photo of a physical security video system from V5.

The Dell Gateway is an edge device capable of accumulating data from disparate sources, performing storage and analysis at the edge, then sending information to the cloud for further analysis. It seems that everywhere I go, the “edge” is the place where innovation is centered right now. This simple demo showed the power of the edge.
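The edge pattern in this demo is simple to sketch. In the illustrative Python below (my own stand-in, not Dell Gateway software), raw sensor readings accumulate locally and only a compact summary is forwarded to the “cloud”:

```python
# Sketch of edge-side aggregation: keep raw readings local, send only a
# compact summary upstream. The "cloud" here is just a list standing in
# for an ingestion endpoint; nothing vendor-specific is used.

def summarize(readings):
    """Reduce a window of raw readings to a compact summary record."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

cloud = []                          # stand-in for a cloud endpoint
window = [21.4, 21.9, 22.1, 21.7]   # raw temperature samples at the edge
cloud.append(summarize(window))     # only the summary leaves the edge
print(cloud[0])
```

The payoff is bandwidth and latency: four (or four thousand) raw samples become one small record, while the detail stays available at the edge for local analysis.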

MIMOSA Asset Lifecycle Information Model Open Meeting Set

Manufacturing and production information is rapidly moving to the cloud. I wrote yesterday about what all the companies are trying to do to bring information into their ecosystems. Not all the efforts promote interoperability. Dell is open source, coming the closest to the ideal. Microsoft and Siemens are closest for individual companies.

What they are all lacking is bringing in asset lifecycle information.

Enter MIMOSA, developer and proponent of the most complete asset lifecycle information model. Its CCOM model has been publicly proven in the Oil & Gas Pilot Demo Project and in several private company instances.

Another drawback to these systems occurs when a company implements more than one. Let’s suppose that a company installs both SAP and Microsoft. And then maybe GE Predix. How are these proprietary systems all going to get along together?

MIMOSA has a solution: the web-service-based Information Service Bus Model, the heart of the Open Industrial Interoperability Ecosystem (OIIE). These open standards describe how to tie all the parts together into an interoperable industrial system.

These standards, plus current efforts to define an Industry Standard Datasheet Definition and a joint working group writing a companion specification with OPC UA, will be discussed at the open meeting.

There will be a MIMOSA meeting on Sept. 28-29 at the BP Helios Center, 201 Helios Way, Houston, Texas 77079. All are invited to attend.

More information coming.


