Catching Up With ABB Automation and Power World

Ulrich Spiesshofer, ABB CEO

I was not able to attend ABB’s Automation and Power World this year. Too many places to go at the same time.

However, someone I trust, Mehul Shah of LNS Research, was there and wrote his observations on the LNS blog.

Mehul focuses on software and linked it to the Internet of Things. “The conference also featured a prime focus on the Internet of Things (IoT), as a panel was presented on stage, containing key event sponsor Microsoft, ABB, and an ABB customer. The trio provided insight and examples into how the IoT trend is impacting the industry.”
Highlighting ABB’s solutions in the IoT space, Spiesshofer discussed the following key areas of focus:
• Robotics
• Intelligent devices
• Control systems
• Advanced communication infrastructure
• Enterprise software
• Analytics solutions

“A notable fact that was highlighted at the conference was that—to my surprise—more than 50% of what ABB currently offers is software related. ABB has made a few major acquisitions over the last decade to build its software offering. The most impactful was the acquisition of Ventyx for $1 billion in 2010. This gave ABB a major boost in asset, operations, energy, and workforce management solutions in some of the asset-intensive industries. ABB has also made other acquisitions, such as Insert Key Solutions and Mincom, to build its Enterprise Asset Management software offerings. It seems clear the company understands the importance of its software business to remaining competitive, and it has also developed a separate Enterprise Software group that houses some of these acquisitions.”

Interesting that the investments were in software applications. Several years ago a CEO told me that software was important to his company—and that there was software in most of the company’s hardware products. That was correct—but my point was about the software business, not software technology. ABB seems to have kept its emphasis on the software business even while Spiesshofer has been divesting some of the acquisitions made under previous CEO Joe Hogan.

Shah’s Takeaways

• It was impressive to see the effort that ABB has invested to bring its acquisitions under one brand.
• ABB has taken a first step in building a technology roadmap by bringing some of the software offerings together as part of the Enterprise Software group. LNS sees this as a big step in the right direction strategically, and one that should prove of great benefit to current ABB customers as well as prospects.
• However, ABB currently has important software products that remain outside of its Enterprise Software group, and it remains to be seen whether these solutions will receive the required attention, especially considering the breadth of ABB’s portfolio. Two examples are the company’s Manufacturing Execution System (MES) offering and its Decathlon for Data Centers.
• ABB has a full-fledged MES offering with some good customers currently leveraging this MES across discrete, process, and batch industries.
• ABB might have some ground to cover in MES compared to some of its closest competitors in this space. Companies like GE, Siemens, Schneider Electric, and Rockwell Automation have been heavily focused on the software business, with many announcing reorganizations over the past several years to increase the resources allocated to software.
• Another area where we would like to hear more from ABB is its IoT offerings. While a number of products were categorized as IoT solutions, ABB will need a holistic offering and a vision for how its industrial clients can leverage these solutions to drive value.
• To answer the question, yes—ABB can compete effectively in the software business. But there is still some ground to cover. ABB has had many of the critical pieces of the software business for quite a while, yet it has been slower than many of its competitors in pulling it all together.

Gary’s Take

I agree with Mehul for the most part. I knew ABB had an MES offering, and I’ve interviewed Marc Leroux many times over the years. But it always seemed a little under the covers. The same with the Ventyx acquisition. It was easy to forget about it as it didn’t seem to get the promotion it deserved.

ABB is such a diverse conglomerate that sometimes it’s hard to know what it focuses on. I always followed the automation—primarily process automation. Several years ago, I think at Hannover but maybe SPS in Nuremberg, ABB executives explained the factory automation offering and the added emphasis the company was placing on it. But there are so many things and so few promotional dollars.

Also, a few years ago ABB decided to add its Power users to its Automation user group conference—hence Automation and Power World. However, the first two of those conferences featured much more power and much less automation. It looks as if the company is now striking a better balance at the conference. But the Power division is still a laggard in performance.

ABB is a strong company, but it has much work to do in order to reach peak performance.

Open Source OPC UA Development

There are many new and cool open source projects going on right now. These are good opportunities for those of you who program to get involved. Or…you could take a hint and turn your passion into an open source project.

I’ve written three articles since November on the subject:
• Open Source Tools Development
• Open Source SCADA
• Open Source OPC UA for manufacturing

Sten Gruener wrote about yet another OPC UA open source project. This one seems to be centered in Europe (but everything on the Web is global, right?). It is an open-source, free C (C99) implementation of the OPC UA communication stack, licensed under the LGPL with a static linking exception. A brief description:

Open
• stack design based solely on IEC 62541
• licensed under open source (LGPL & static linking exception)
• royalty free, available on GitHub
Scalable
• single or multi-threaded architecture
• one thread per connection/session
Maintainable
• 85% of code generated from XML specification files
Portable
• written in C99 with POSIX support
• compiled server is smaller than 100kb
• runs on Windows (x86, x64), Linux (x86, x64, ARM e.g. Raspberry Pi, SPARCstation), QNX and Android
Extensible
• dynamically loadable and reconfigurable user models
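
To give a sense of how little code a server built on this stack requires, here is a minimal sketch modeled on the kinds of examples the open62541 project publishes. The header paths and the UA_ServerConfig_setDefault helper are assumptions based on more recent releases of the stack, so treat the exact calls as illustrative rather than definitive.

```c
/* Minimal open62541 server sketch. Header names and the config helper
 * (UA_ServerConfig_setDefault) are assumptions based on recent releases;
 * older versions of the stack use slightly different APIs. */
#include <open62541/server.h>
#include <open62541/server_config_default.h>
#include <signal.h>
#include <stdlib.h>

static volatile UA_Boolean running = true;

static void stopHandler(int sig) {
    running = false;   /* lets UA_Server_run return cleanly on Ctrl-C */
}

int main(void) {
    signal(SIGINT, stopHandler);
    signal(SIGTERM, stopHandler);

    /* Create a server with the default configuration (binary protocol,
     * listening on the standard OPC UA port 4840) */
    UA_Server *server = UA_Server_new();
    UA_ServerConfig_setDefault(UA_Server_getConfig(server));

    /* Run the server's event loop until the running flag is cleared */
    UA_StatusCode retval = UA_Server_run(server, &running);

    UA_Server_delete(server);
    return retval == UA_STATUSCODE_GOOD ? EXIT_SUCCESS : EXIT_FAILURE;
}
```

Compiled against the single-file distribution of the stack, a server along these lines stays within the small footprint claimed above.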

Background Information

OPC UA (short for OPC Unified Architecture) is a communication protocol originally developed in the context of industrial automation.

OPC UA has been released as an “open” standard (meaning everybody can buy the document) in the IEC 62541 series. As of late, it is marketed as the one standard for non-realtime industrial communication.

Remote clients can interact with a Server by calling remote Services. (These Services are not generic remote procedure calls; a remote procedure call facility is itself provided via the “Call” service.) The server contains a rich information model that defines an object system on top of an ontology-like set of nodes and references between nodes. The data and its “meta model” can be inspected to discover variables, objects, object types, methods, data types, and so on. Roughly, the Services provide access to:

  • Session management
  • CRUD operations on the node level
  • Remote procedure calls to methods defined in the address space
  • Subscriptions to events and variable changes where clients are notified via push messages.

The data structures the services take as input and output can be encoded either as a binary stream or as XML. They are transported via a TCP-based custom protocol or via web services. Currently, open62541 supports only the binary encoding and the TCP-based transport.
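
As a rough illustration of this service-oriented model from the client side, the sketch below opens a connection over the binary TCP transport (opc.tcp) and uses the Read service, via open62541’s high-level client API, to fetch the value attribute of a node in the standard namespace. The endpoint URL, the CurrentTime node id, and the config helpers are illustrative assumptions; exact names vary between releases of the stack.

```c
/* Client sketch: connect over opc.tcp and read one node's value attribute.
 * Endpoint URL, node id, and config helpers are illustrative; API details
 * vary between open62541 releases. */
#include <open62541/client.h>
#include <open62541/client_config_default.h>
#include <open62541/client_highlevel.h>
#include <stdio.h>

int main(void) {
    UA_Client *client = UA_Client_new();
    UA_ClientConfig_setDefault(UA_Client_getConfig(client));

    /* Establish a secure channel and session using the binary TCP transport */
    if(UA_Client_connect(client, "opc.tcp://localhost:4840") != UA_STATUSCODE_GOOD) {
        UA_Client_delete(client);
        return 1;
    }

    /* Invoke the Read service (wrapped by the high-level API) on the
     * server's CurrentTime variable in namespace 0 */
    UA_Variant value;
    UA_Variant_init(&value);
    UA_StatusCode retval = UA_Client_readValueAttribute(client,
        UA_NODEID_NUMERIC(0, UA_NS0ID_SERVER_SERVERSTATUS_CURRENTTIME), &value);

    if(retval == UA_STATUSCODE_GOOD &&
       UA_Variant_hasScalarType(&value, &UA_TYPES[UA_TYPES_DATETIME])) {
        UA_DateTime now = *(UA_DateTime *)value.data;
        printf("server time (100 ns ticks since 1601-01-01): %lld\n", (long long)now);
    }

    UA_Variant_clear(&value);
    UA_Client_disconnect(client);
    UA_Client_delete(client);
    return 0;
}
```

The same pattern extends to the other services listed above: node management calls add or delete nodes, the Call service invokes methods, and subscriptions push value changes back to the client.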

Industrial Internet Testbed Announced

Developing testbeds to prove out emerging technology seems to be hot right now. The Smart Manufacturing Leadership Coalition has a couple going in conjunction with US government money, and there is a solicitation out from the US government for the development of more, also related to energy efficiency.

The Industrial Internet Consortium announced its first energy-focused testbed: the Communication and Control Testbed for Microgrid Applications. Industrial Internet Consortium member organizations Real-Time Innovations (RTI), National Instruments, and Cisco are collaborating on the project, working with power utilities CPS Energy and Southern California Edison. Additional industry collaborators include Duke Energy and the Smart Grid Interoperability Panel (SGIP), a power industry organization.

I recently saw an analyst position the IIC alongside the German Industry 4.0 initiative while ignoring the US Smart Manufacturing group altogether. These advanced manufacturing strategies are showing some growth, and both have commercial technology companies solidly behind them. I would think they will have more impact in the long run than the SMLC. But we’ll see.

Here is some background from the IIC press release. “Today’s power grid relies on a central-station architecture not designed to interconnect distributed and renewable power sources such as roof-top solar and wind turbines. The system must over-generate power to compensate for rapid variation in power generation or demands. As a result, much of the benefit of renewable energy sources in neighborhoods or businesses is lost. Efficiently integrating variable and distributed generation requires architectural innovation.”

The goal of the Communication and Control Testbed is to introduce the flexibility of real-time analytics and control to increase efficiencies in this legacy process – ensuring that power is generated more accurately and reliably to match demand. This testbed proposes re-architecting electric power grids to include a series of distributed microgrids which will control smaller areas of demand with distributed generation and storage capacity.

These microgrids will operate independently from the main electric power grid but will still interact and be coordinated with the existing infrastructure.

The testbed participants will work closely with Duke Energy, which recently published a distributed intelligence reference architecture, as well as SGIP to help ensure a coordinated, accepted architecture based on modern, cross-industry industrial internet technologies.

The Communication and Control framework will be developed in three phases, culminating in a field deployment at CPS Energy’s “Grid-of-the-Future” microgrid test area in San Antonio, Texas.

The initial phases will be tested in Southern California Edison’s Controls Lab in Westminster, CA.

Cloud Platforms for Internet of Things

This past Monday, 3/16, Microsoft held its Convergence 2015 conference in Atlanta. There, Microsoft CEO Satya Nadella announced the Azure IoT Suite.

I think that cloud-based platforms supporting this Internet of Things (IoT) phenomenon will proliferate for a while until we reach some sort of stability. Nadella came from this part of Microsoft, so I’m not surprised to see continued emphasis on these enterprise platform technologies.

An interesting highlight for us manufacturing and production geeks was that Microsoft brought an application by Rockwell Automation front and center. When Rockwell started doing these services, the Internet of Things phrase had not even been invented. It now finds itself in front of the IoT parade in the Microsoft keynote. I guess the time has come.

Quoting from its blog, “Microsoft’s vision is to help companies thrive in this era of IoT, delivering open, scalable platforms and services that any company, whether startup or the most established global enterprises, can use to create new value, right now. Nadella mentioned our investments in the Windows 10 IoT operating system for devices, and equally with the Azure IoT Suite, we’re bringing together a variety of Azure services to help our customers accelerate their transformation to digital businesses.”

This reminds me of conversations with Microsoft people stretching as far back as 1999 where the topic was Microsoft as a platform company that provided a foundation for industrial applications. Looks like it is consistently fulfilling that vision.

Microsoft introduced a preview of the Azure Intelligent Systems Service last April. It is designed to securely connect, manage and capture machine-generated data from sensors and devices. “If the Intelligent Systems Service was a starting point, the Azure IoT Suite is its evolution and maturation – a reflection of what we learned from the feedback provided by our customers and partners throughout the preview.”

The Azure IoT Suite is an integrated offering that takes advantage of all the relevant Azure capabilities to connect devices and other assets (i.e. “things”), capture the diverse and voluminous data they generate, integrate and orchestrate the flow of that data, and manage, analyze and present it as usable information to the people who need it to make better decisions as well as intelligently automate operations. The offering, while customizable to fit the unique needs of organizations, will also provide finished applications to speed deployment of common scenarios we see across many industries, such as remote monitoring, asset management and predictive maintenance, while providing the ability to grow and scale solutions to millions of “things.”

Additionally, the Azure IoT Suite will provide a simple and predictable pricing model despite the rich set of capabilities and broad scenarios it delivers, so our customers can plan and budget appropriately. This approach is aimed at simplifying the complexities that often exist with implementing and costing IoT solutions.

The Azure IoT Suite will be released in preview later this year.

Nadella even talked about the manufacturing industry, featuring Rockwell Automation and playing this video. One of my Twitter contacts pointed this out and asked how much was real and how much was hype. Well, I’ve actually seen similar applications at Rockwell, so it is just good marketing communication of a real service built on what is now known as the Internet of Things.

Internet of Things and Emerson Process Management

Jim Cahill recently wrote about the Emerson Process Management take on the Internet of Things discussion. His report was about a presentation by Charlie Peters at the 2015 Investor Conference. I find it interesting that there is sufficient publicity behind the IoT discussion to bring it up to investors.

Many people strive to define what is included in an Internet of Things technology discussion. Peters’ list hits just about everything. “Ubiquitous connectivity, accessible costs/capacity and powerful & friendly tools. Smart phones, tablets, cellular and wi-fi communications expand connectivity tremendously. Sensors, data storage and computation power lower costs and access. And social networks, big data and prognostics make tools more friendly, intuitive and more valuable to use.”

Why do we care? What applications would be affected (or maybe already are affected)? Peters sees, “monitoring, infrastructure management, intelligent manufacturing and production, energy efficiency and improved environmental performance and compliance.”

I especially appreciate his discussion of the implications of these possibilities and challenges—increased digital and cloud infrastructure, more intelligent products, enriched business models, and enhanced digital customer models.

In the portion of the presentation given by Emerson Process Management president Steve Sonnenberg, he highlighted an example of a new business model created with these technologies and services—a steam management operation on Jurong Island in Singapore. Thousands of wireless acoustic devices are being installed on steam traps, which Emerson experts monitor remotely to instantly spot energy losses. This results in large energy savings and reduced carbon dioxide emissions.

I think we have been designing and installing “Internet of Things” technologies for years in manufacturing. The consumer world of connected mobile phones, thermostats, and now watches has served to popularize the term. Regardless, as both suppliers and their customers learn to design new business models to exploit the technology, we will witness another surge of productivity and profitability in manufacturing.
