by Gary Mintchell | May 6, 2015 | Automation, Internet of Things, Manufacturing IT, News, Operations Management, Technology
Amid a rush of news, with a vacation thrown in, I’m still digesting news from Hannover Messe in April. Microsoft had called and asked if I could stop by for an interview, but unfortunately I was not at Hannover.
Below is a Microsoft blog post. The writer posits three industrial ages, and then he surprises us by announcing the arrival of a fourth. Interestingly, it was at Hannover two years ago that Industrie 4.0 sprang into our consciousness. Here is Microsoft’s take on the fourth generation of manufacturing, along with a few specific examples of what it means in practice.
I think this is a good, though not necessarily complete, look at aspects of Industry 4.0.
From the blog
When we think about what it takes to build a successful business, there have been three main eras that characterized important shifts in the global marketplace. The first was the industrial revolution, when people began to mass-produce and distribute goods with tremendous scale and efficiency. Since everyone received information at the same time and speed was not an issue, change wasn’t particularly fast.
What followed was the Information Age where people weren’t just using technology to drive production efficiencies; they were using it to drive information efficiencies. During this time, competitive advantage began to shift to our access to information.
Today, information and data are ubiquitous which has had a tremendous effect on both our digital work and life experiences. The world has formed a giant network where everyone has access to anyone and everything. Some people refer to this as the Connected Age.
However, the ubiquity of data and connected devices, coupled with important advances in machine learning, is powering a new set of capabilities called the Internet of Things (IoT). IoT is now at the forefront of a fourth era in business productivity. With IoT, companies worldwide are transforming the way they plant crops, assemble goods and maintain machinery. Now, several Microsoft customers and partners, including Fujitsu, KUKA Robotics, and Miele, are announcing IoT initiatives that will change the way people live and work.
IoT’s influence on those companies and many others is on display this week at the large industry fair Hannover Messe, where the term “Industry 4.0” was first coined. Everywhere we look there are examples of physical assets integrated with processes, systems and people, and exciting possibilities are being fueled by this transformation.
At this event, Microsoft is showing how we’re helping manufacturers innovate, bring products to market more quickly and transform into digital businesses. Aided by unlimited compute power and rich data platforms, the creation of “systems of intelligence” that enable reasoning over vast amounts of data is empowering individuals and organizations with actionable insights.
Blending physical with digital
Fujitsu is bringing together its Eco-Management Dashboard, IoT/M2MP platform, Microsoft cloud services, and Windows tablets in a way that can enable managers, engineers, and scientists to improve product quality, streamline systems, and enhance functionality while reducing costs. For example, at its facility in Aizu Wakamatsu, Japan, Fujitsu is able to grow lettuce that is both delicious and low in potassium so that it can be consumed by dialysis patients and people with chronic kidney disease. They can track all of the plant info from their Windows tablets through the cloud. These solutions will also be able to help other agriculture and manufacturing companies transform their businesses through innovation.
Artificial intelligence is no longer a fantastic vision for the future—it is happening today. KUKA, a manufacturer of industrial robots and automation solutions, is using the Microsoft IoT platform to create one of the world’s first showcases that blends IT with robotic technologies into a smart manufacturing solution with new capabilities.
The LBR iiwa (intelligent industrial work assistant), a sensitive and safe lightweight robot, uses precise movements and sensor technology to perceive its surroundings while performing a complex task, such as the delicate action of threading a tube into a small hole in the back of a dishwasher. Errors in the supply chain are addressed in real time through Windows tablets, making the automated process faster and easier. Through this demonstration, KUKA is highlighting how the LBR iiwa can collaborate with humans as peers, jointly performing the task without being controlled by a human or relying on a vision system.
Eyeing physical assets through a digital lens
Companies trying to understand how this approach can help should look at the infrastructure they already have. How can these assets become connected and intelligent? What kind of data would help reduce cost or increase agility? How can you use insights to grow revenue in existing operations, or offer those insights to customers and create new revenue streams?
The focus here is on transforming existing business models and adding cloud-connected services. In the age of Industry 4.0, manufacturing and resource companies will no longer compete over the products and features they offer, but on new business models they can either pursue themselves or offer to customers.
by Gary Mintchell | May 4, 2015 | Automation, Data Management, Internet of Things, Manufacturing IT, Networking, News, Operations Management, Technology
For being so quiet for so long, the OPC Foundation is certainly hitting the news often lately. There was news about a couple of open-source initiatives. Then the Foundation itself opened up a little with an “open-shared” program.
Then it was announced as the communication platform of Industry 4.0 in Germany.
Now a couple European automation rivals—Beckhoff Automation (Germany) and B&R Automation (Austria)—have made OPC news.
Taken together, these announcements plus the earlier ones reveal the importance of OPC to industrial communication. It became the standard for moving important data from control systems to human-machine interface systems and then on to SCADA and MES systems.
With the introduction of OPC UA, built on modern software technologies including built-in security and an embeddable format, the technology that everyone used but everyone also dissed finds itself on the cutting edge of modern connected industrial Internet strategies.
OPC and Beckhoff
News coming from last month’s Hannover Messe included this joint announcement from OPC and Beckhoff.
OPC UA is about scalable communication with integrated security by design, up to MES/ERP systems and into the cloud; EtherCAT is about hard real-time capability in machines and factory control systems. The two technologies complement each other perfectly.
Industrie 4.0 and Internet of Things (IoT) architectures require consistent communication across all levels using Internet technologies, both inside and outside the factory, for example to cloud-based services. That is exactly what the OPC Foundation and the EtherCAT Technology Group (ETG) want to account for by jointly defining open interfaces between their respective technologies.
At the Hannover Fair, Thomas J. Burke, President and Executive Director of the OPC Foundation, and Martin Rostan, Executive Director of the ETG, signed a Memorandum of Understanding in which both organizations agree to cooperate closely in developing these interfaces.
OPC and B&R
Not to be outdone, B&R Automation issued a press release announcing it will support the OPC Foundation’s new real-time technology working groups, whose goal is to add real-time capability to the OPC UA communication standard. This will involve two key additions to the OPC UA standard. The first is a publisher-subscriber model; the other is utilization of the IEEE 802.1 standards for time-sensitive networking (TSN).
B&R will be contributing its real-time expertise to the working groups. “The updates to the OPC UA standard will benefit from our years of experience in developing real-time solutions,” says Stefan Schönegger, marketing manager at B&R.
The extended OPC UA standard will use a publish/subscribe network model. B&R is the main proponent of POWERLINK, which also uses publish/subscribe technology, so B&R wants to demonstrate compatibility.
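The publisher-subscriber model the working groups are adding can be sketched in a few lines of Python. This is a generic illustration of the pattern, not the actual OPC UA PubSub wire format; the `Broker` class and topic names are hypothetical:

```python
from collections import defaultdict

class Broker:
    """Minimal publish/subscribe hub: publishers push updates to a topic,
    and every subscriber registered on that topic is notified. Sender and
    receivers are decoupled, which is what makes the model suit M2M traffic."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, value):
        for callback in self.subscribers[topic]:
            callback(value)

broker = Broker()
readings = []
broker.subscribe("line1/temperature", readings.append)
broker.publish("line1/temperature", 72.5)  # every line1/temperature subscriber receives 72.5
```

The key design point is decoupling: the publisher never knows who is listening, so new consumers (an HMI, a historian, a cloud gateway) can attach without touching the data source.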
“This is a fundamental requirement for the M2M communication you find in integrated systems such as packaging lines,” explains Schönegger.
In order to fulfill real-time requirements, the OPC UA standard will make use of the IEEE 802.1 TSN standard. “At the moment, TSN is still a working title for a group of new IEEE standards designed to provide native real-time capability for the IEEE 802 Ethernet standard,” says Schönegger. This would allow for a seamless transition to substantially faster Ethernet standards such as POWERLINK for field-level communication and demanding motion control tasks.
Beyond the automation industry, TSN is currently also being evaluated by the automotive and telecommunications industries. “The first cars based on TSN are expected to hit the market in the very near future,” reports Schönegger. This would help secure the widespread availability of this technology. In addition to B&R, the new OPC working groups will be also supported by other leaders in the field of automation, as was announced by KUKA on April 13, 2015.
OPC UA already plays a central role in the IT-related areas of modern production systems. “The addition of TSN and the publisher-subscriber model will greatly expand the range of potential OPC UA applications,” says Schönegger.
Takeaway
What all this means is that OPC can now become even faster and more usable than before. The little protocol that everyone uses and everyone complains about is getting cred as it becomes more modern. These technological advances should make it more valuable. And that will be significant in this new connected enterprise era.
by Gary Mintchell | Apr 22, 2015 | Automation, Data Management, Internet of Things, Manufacturing IT, Networking, News, Operations Management, Organizations, Software, Standards, Technology
I have been writing about some open source initiatives with OPC UA. I think it’s cool and long overdue that there is so much happening in the OPC world lately. See these:
Open Source OPC UA Development
Open Source OPC UA for Manufacturing
Last week at Hannover, the OPC Foundation announced several items, including the promotion of Stefan Hoppe to Vice President of the Foundation. Another deals with an open (sort of) source initiative designed to broaden the appeal of OPC outside the industrial automation community.
OPC Announces “OPC UA Open Shared Source” Strategy
The OPC Foundation announced that the OPC Unified Architecture (OPC UA) specifications and technology will be made available to companies, consortiums, and end users without requiring membership in the OPC Foundation. The OPC Foundation is implementing an “open shared source” strategy to facilitate widespread adoption of the technology beyond industrial automation.
OPC UA provides a complete solution for information modeling, allowing consortia and vendors to plug their simple or complex information models directly into OPC UA and take advantage of its service-oriented architecture so that generic devices and applications can seamlessly share information.
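The information-modeling idea can be made concrete with a small sketch: vendors attach their own typed, browsable device models under a shared address space, and any generic client can navigate them. This is a loose analogy in plain Python, not the OPC UA SDK; the `Node` class and the node names are hypothetical:

```python
class Node:
    """A simplified address-space node: a typed value plus browsable
    children, loosely mirroring how OPC UA exposes information models."""
    def __init__(self, name, value=None, datatype=None):
        self.name, self.value, self.datatype = name, value, datatype
        self.children = {}

    def add(self, child):
        self.children[child.name] = child
        return child

    def browse(self, path):
        # Walk a slash-separated path down the node tree.
        node = self
        for part in path.split("/"):
            node = node.children[part]
        return node

# A vendor "plugs in" its device model under a shared root:
root = Node("Objects")
device = root.add(Node("TempSensor"))
device.add(Node("Temperature", value=21.5, datatype="Double"))
device.add(Node("Units", value="Celsius", datatype="String"))

print(root.browse("TempSensor/Temperature").value)  # 21.5
```

Because the model is self-describing, a client that has never seen this vendor’s device can still discover what values exist and what types they carry.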
The OPC Foundation’s open shared source strategy gives developers a quick jump start on the technology, enabling prototyping projects without barriers.
The OPC Foundation’s vision of interoperability, built on the best specifications, technology, certification and process, is at the core of this open shared source strategy. The shared source will be hosted on an open-source collaboration workspace. The OPC UA stacks available to OPC Foundation members will remain under the RCL license, allowing members to build the highest-quality OPC UA-enabled products and then certify them through the Foundation’s comprehensive certification and interoperability programs.
Stefan Hoppe, OPC Foundation Vice President, commented: “Adoption of OPC standards in industrial automation, and specifically reaching out to other domains, requires new ways of thinking to evangelize and increase awareness about the OPC technology. OPC Unified Architecture is becoming the dominant infrastructure and information modeling architecture for the Internet of Things and Industry 4.0, and these initiatives require complete transparency and openness about the technology to make it a core part of their infrastructure.”
Continuing the Conversation
Andy Robinson pointed to a neat little OPC app on YouTube. I replied, and that led to a cool conversation between Andy and Rick Bullotta of ThingWorx. Here is an example of how a nice little conversation can happen on Twitter. I’d like to invite more of these.
Andy Robinson @archestranaut @garymintchell re OPCUA, thought you might be interested in this. It’s a small start but the vision is great! http://ow.ly/LWdXq
Then Rick Bullota chimed in:
Rick Bullotta @RickBullotta why introduce #MQTT into the mix? it is a weak subset of OPCUA (no metadata, RPC, discovery)? #IoT
Andy Robinson @archestranaut 100% for typical in the building plant floor apps. But for weak or expensive connectivity may be better.
Rick Bullotta @RickBullotta living with “all of the above” here @Thingworx. MQTT is more like OPC (actually, NetDDE) than OPCUA.#IoT
Andy Robinson @archestranaut also agree #mqtt maybe not ideal for commands as would probably require Qos2, not a huge deal…
Rick Bullotta @RickBullotta btw, I think it’s quite cool what you did though! nice work!
Andy Robinson @archestranaut not my work. Someone else.
Andy Robinson @archestranaut but I do think the basic idea of accepting not many new #IOT device will speak UA out of box leads us to… think about how we might integrate at least a subset of critical info into our already setup SCADA systems.
Andy Robinson @archestranaut I don’t suspect folks like Thingworx are looking to supplant traditional SCADA. At least doesn’t seem logical
Rick Bullotta @RickBullotta we’re already doing this with @ThingWorx via OPC/OPCUA, historians, HMI APIs, databases, and so on.
Andy Robinson @archestranaut which is why you will be fine while other “similar” offerings will disappear on the low end
Rick Bullotta @RickBullotta correct. connecting, augmenting integrating and expanding their reach, whether within the plant or beyond.
Rick Bullotta @RickBullotta been there, done that, twice. 😉 I also founded Lighthammer (now SAP MII).
Andy Robinson @archestranaut nice to have a more civil convo after the last mini-twitp*** of about a month ago with others. 🙂
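Bullotta’s point that MQTT is “a weak subset” worth unpacking: an MQTT message is just a topic string plus an opaque payload, while an OPC UA value travels with self-describing metadata. The sketch below is a schematic comparison in plain Python, not either protocol’s actual wire format, and the field names are hypothetical:

```python
# MQTT-style: topic + opaque payload. The receiver must know out-of-band
# what the bytes mean; the protocol carries no datatype, units, or discovery.
mqtt_message = ("plant/line1/temp", b"72.5")

# OPC UA-style: the value is accompanied by self-describing metadata,
# so a generic client can interpret it without prior agreement.
ua_value = {
    "nodeId": "ns=2;s=Line1.Temperature",
    "value": 72.5,
    "dataType": "Double",
    "engineeringUnits": "degF",
}

# Decoding the MQTT payload requires the caller to assume the encoding:
payload = float(mqtt_message[1])
```

Both carry the same reading; the difference is that the MQTT consumer depends on a convention agreed elsewhere, which is exactly the metadata gap Bullotta flags, and also why MQTT’s lighter footprint can win on the weak or expensive links Robinson mentions.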
by Gary Mintchell | Mar 30, 2015 | Automation, Internet of Things, News, Operations Management, Technology
Developing testbeds for new technology extensions seems to be hot right now. The Smart Manufacturing Leadership Coalition has a couple going in conjunction with US government money. There is also a bid out from the US government for development of some more, related to energy efficiency.
The Industrial Internet Consortium announced its first energy-focused testbed: the Communication and Control Testbed for Microgrid Applications. Industrial Internet Consortium member organizations Real-Time Innovations (RTI), National Instruments, and Cisco are collaborating on the project, working with power utilities CPS Energy and Southern California Edison. Additional industry collaborators include Duke Energy and the power industry organization Smart Grid Interoperability Panel (SGIP).
I recently saw an analyst position the IIC alongside the German Industry 4.0 initiative while ignoring the US Smart Manufacturing group altogether. These advanced manufacturing strategies are showing some growth, and both have commercial technology companies solidly behind them. I would think they will have more impact in the long run than the SMLC. But we’ll see.
Here is some background from the IIC press release. “Today’s power grid relies on a central-station architecture not designed to interconnect distributed and renewable power sources such as roof-top solar and wind turbines. The system must over-generate power to compensate for rapid variation in power generation or demands. As a result, much of the benefit of renewable energy sources in neighborhoods or businesses is lost. Efficiently integrating variable and distributed generation requires architectural innovation.”
The goal of the Communication and Control Testbed is to introduce the flexibility of real-time analytics and control to increase efficiencies in this legacy process, ensuring that power is generated more accurately and reliably to match demand. This testbed proposes re-architecting electric power grids to include a series of distributed microgrids which will control smaller areas of demand with distributed generation and storage capacity.
These microgrids will operate independently from the main electric power grid but will still interact and be coordinated with the existing infrastructure.
The testbed participants will work closely with Duke Energy, which recently published a distributed intelligence reference architecture, as well as SGIP to help ensure a coordinated, accepted architecture based on modern, cross-industry industrial internet technologies.
The Communications and Control framework will be developed in three phases that will culminate in a field deployment that will take place at CPS Energy’s “Grid-of-the-Future” microgrid test area in San Antonio, Texas.
The initial phases will be tested in Southern California Edison’s Controls Lab in Westminster, CA.
by Gary Mintchell | Mar 25, 2015 | Automation, Manufacturing IT, Operations Management, Technology
During my continuous research for topics such as Industry 4.0, digital manufacturing, smart manufacturing and the industrial Internet of things, I came across this Siemens PLM software blog.
In it, Zvi Feuer, Siemens PLM Software’s Senior Vice President, Digital Factory, Manufacturing Engineering Software, shares his perspective on “how Siemens helps companies worldwide to realize innovation in manufacturing.”
Feuer says, “I want to be able to offer our customers industry solutions which provide the means to turn any manufacturing operation into a high-tech manufacturer; for us to sell not only the software but also usage methodologies and, in fact, to increase productivity with the customer and to help the customer deliver to his customers in a better and faster shape. This will obviously create opportunities for people, opportunities for jobs.”
Siemens executives have been explaining the company’s digital manufacturing strategy to me for more than 10 years, and the vision has been remarkably consistent. The first conversations were even before the UGS acquisition that led to the Siemens PLM business.
PLM As ERP for Manufacturing
The blog refers to a white paper, PLM For Manufacturing: “If you are looking for ways to connect all domains of the design/build lifecycle, consider a manufacturing process management (MPM) solution. This provides an enterprise-scalable foundation that allows you to perform product design while simultaneously optimizing manufacturing processes. This means that you can better manage lifecycle cost, meet launch dates and maintain product quality targets.”
That statement reflects Siemens thinking even before the acquisition. Is it possible to design not only the product but also the manufacturing digitally, and then proof it all out digitally before even cutting the first steel?
“We believe that an MPM system that is part of an enterprise PLM system is the best way to move forward. This will provide an environment that supports a flexible process plan capable of reflecting any changes to the product design or requirements. This might be called a single window for enterprise data management: a single application that supports the complete lifecycle of product data in an enterprise environment. The main idea is to provide users with one platform for all their data management needs. Teamcenter PLM software is the only comprehensive system that provides a platform in which users can conduct all their data management needs from engineering to manufacturing to execution.”
Is it sustainable?
This is a grand vision. It reads like Goldratt’s “The Goal” come to life, totally automated. But there are inherent problems with the grand scheme. I have witnessed firsthand the benefits of more and better information informing production and maintenance teams, enabling better decisions and improvements. But to think that this could eventually happen without human intervention: I doubt that will ever work effectively.
The white paper also talks about complexities of manufacturing and software, then it argues that it would be better to put everything into one overarching software application. I would argue, along with my Lean friends, that this would just make for one very complex software application.
Any of us who have actually done automation know that when the application gets too complex, then it doesn’t work. It is not maintained. It is not understood. People begin developing their more simplified (and understandable) workarounds.
The vision is like most things I have witnessed over the past 40 years of applying technology. We develop something. We get benefits. We get overambitious and build something cumbersome. People stop using it. We develop something simpler. People use it. And so on.
Digital manufacturing and Industrie 4.0? Interesting. The jury is still deliberating as to whether it is giving Germany the desired competitive edge in manufacturing.