Catching Up With ABB Automation and Power World

Ulrich Spiesshofer, ABB CEO

I was not able to attend ABB’s Automation and Power World this year. Too many places to go at the same time.

However, someone I trust, Mehul Shah of LNS Research, was there and wrote his observations on the LNS blog.

Mehul focuses on software and linked it to the Internet of Things. “The conference also featured a prime focus on the Internet of Things (IoT), as a panel was presented on stage, containing key event sponsor Microsoft, ABB, and an ABB customer. The trio provided insight and examples into how the IoT trend is impacting the industry.”
Highlighting ABB’s solutions in the IoT space, Spiesshofer discussed the following key areas of focus:
• Robotics
• Intelligent devices
• Control systems
• Advanced communication infrastructure
• Enterprise software
• Analytics solutions

“A notable fact highlighted at the conference was that—to my surprise—more than 50% of what ABB currently offers is software related. ABB has made a few major acquisitions over the last decade to build its software offering. The most impactful was the acquisition of Ventyx for $1 billion in 2010. This gave ABB a major boost in asset, operations, energy, and workforce management solutions in some of the asset-intensive industries. ABB has also made some other acquisitions such as Insert Key Solutions and Mincom to build its Enterprise Asset Management software offerings. It seems clear the company understands the importance of its software business to remain competitive, and has also developed a separate Enterprise Software group that houses some of these acquisitions.”

Interesting that the investments were in software applications. Several years ago a CEO told me that software was important to his company—and that there was software in most of the company’s hardware products. That was correct—but my point was about software as a business, not merely as a technology. ABB seems to have kept its emphasis on the software business even while Spiesshofer has been divesting some of the acquisitions made under previous CEO Joe Hogan.

Shah’s Takeaways

• It was impressive to see the effort that ABB has invested to bring its acquisitions under one brand.
• ABB has taken a first step in building a technology roadmap by bringing some of the software offerings together as part of the Enterprise Software group. LNS sees this as a big step in the right direction strategically, one that should prove of great benefit to current ABB customers as well as prospects.
• However, ABB currently has important software products that remain outside of its Enterprise Software group, and it remains to be seen if these solutions will receive the required attention, especially when considering the breadth of ABB’s portfolio. Two examples of this are the company’s Manufacturing Execution System (MES) offering and Decathlon for Data Centers.
• ABB has a full-fledged MES offering with some good customers currently leveraging this MES across discrete, process, and batch industries.
• ABB might have some ground to cover in MES compared to some of its closest competitors in this space. Companies like GE, Siemens, Schneider Electric and Rockwell Automation have been heavily focused on the software business, with many announcing reorganizations over the past several years to increase the resources allocated to software.
• Another area where we would like to hear more from ABB is its IoT offerings. While a number of products were categorized as IoT solutions, ABB will need a holistic offering and a vision for how its industrial clients can leverage these solutions to drive value.
• To answer the question, yes—ABB can compete effectively in the software business. But there is still some ground to cover. ABB has had many of the critical parts of the software business for quite a while and has been slower than many of its competitors in pulling it all together.

Gary’s Take

I agree with Mehul for the most part. I knew ABB had an MES offering, and I’ve interviewed Marc Leroux many times over the years. But it always seemed a little under the covers. The same with the Ventyx acquisition. It was easy to forget about it as it didn’t seem to get the promotion it deserved.

ABB is such a diverse conglomerate that sometimes it’s hard to know what it focuses on. I always followed the automation—primarily process automation. Several years ago, I think at Hannover but maybe SPS in Nuremberg, ABB executives explained the factory automation offering and the added emphasis the company was placing on it. But there are so many things and so few promotional dollars.

Also a few years ago, ABB decided to add its Power users to its Automation user group conference—hence Automation and Power World. However, the first two of those featured much more power and much less automation. It looks as if the company is striking a balance at the conference. But the Power division is still a laggard in performance.

ABB is a strong company, but it has much work to do in order to reach peak performance.

Open Source OPC UA Development

There are many new and cool open source projects going on right now. These are good opportunities for those of you who program to get involved. Or…you could take a hint and turn your passion into an open source project.

I’ve written three articles since November on the subject:
Open Source Tools Development
Open Source SCADA
Open Source OPC UA for manufacturing

Sten Gruener wrote about yet another OPC UA open source project, open62541. This one seems to be centered in Europe (but everything on the Web is global, right?). It is an open source and free C (C99) implementation of the OPC UA communication stack, licensed under the LGPL with a static linking exception. A brief description follows, along with a minimal server sketch after the feature list:

Open
• stack design based solely on IEC 62541
• licensed under open source (LGPL & static linking exception)
• royalty free, available on GitHub
Scalable
• single or multi-threaded architecture
• one thread per connection/session
Maintainable
• 85% of code generated from XML specification files
Portable
• written in C99 with POSIX support
• compiled server is smaller than 100kb
• runs on Windows (x86, x64), Linux (x86, x64, ARM e.g. Raspberry Pi, SPARCstation), QNX and Android
Extensible
• dynamically loadable and reconfigurable user models
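
To make that concrete, here is a minimal sketch of an open62541 server. I have written it against the current (v1.x) API of the library; header names and setup calls have changed since the early releases described above, so treat the specifics as illustrative rather than definitive.

#include <open62541/server.h>
#include <open62541/server_config_default.h>
#include <signal.h>
#include <stdlib.h>

/* Flag cleared by the signal handler so the server loop can exit cleanly. */
static volatile UA_Boolean running = true;
static void stopHandler(int sig) { (void)sig; running = false; }

int main(void) {
    signal(SIGINT, stopHandler);
    signal(SIGTERM, stopHandler);

    /* Create a server with the default configuration
     * (binary TCP transport on port 4840). */
    UA_Server *server = UA_Server_new();
    UA_ServerConfig_setDefault(UA_Server_getConfig(server));

    /* Run the event loop until the running flag is cleared. */
    UA_StatusCode retval = UA_Server_run(server, &running);

    UA_Server_delete(server);
    return retval == UA_STATUSCODE_GOOD ? EXIT_SUCCESS : EXIT_FAILURE;
}

Compiled against the library, a server like this is the small, single-executable footprint the project advertises.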

Background Information

OPC UA (short for OPC Unified Architecture) is a communication protocol originally developed in the context of industrial automation.

OPC UA has been released as an “open” standard (meaning everybody can buy the document) in the IEC 62541 series. Lately, it has been marketed as the one standard for non-real-time industrial communication.

Remote clients can interact with a Server by calling remote Services. (These Services are not the same as remote procedure calls, which are provided via the “Call” service.) The server contains a rich information model that defines an object system on top of an ontology-like set of nodes and references between nodes. The data and its “meta model” can be inspected to discover variables, objects, object types, methods, data types, and so on. Roughly, the Services provide access to:

  • Session management
  • CRUD operations on the node level
  • Remote procedure calls to methods defined in the address space
  • Subscriptions to events and variable changes where clients are notified via push messages.

The data structures the Services process as input and output can be encoded either as a binary stream or in XML. They are transported via a TCP-based custom protocol or via web services. Currently, open62541 supports only the binary encoding and TCP-based transport.
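
As a sketch of what the binary TCP transport and the Read service look like from the client side, here is a minimal open62541 client, again written against the current (v1.x) API rather than the version available when this was written; the endpoint URL and the node being read (the standard server-time node in namespace 0) are only illustrative.

#include <open62541/client.h>
#include <open62541/client_config_default.h>
#include <open62541/client_highlevel.h>
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    UA_Client *client = UA_Client_new();
    UA_ClientConfig_setDefault(UA_Client_getConfig(client));

    /* Connect over the opc.tcp binary transport (endpoint URL is an example). */
    UA_StatusCode retval = UA_Client_connect(client, "opc.tcp://localhost:4840");
    if(retval != UA_STATUSCODE_GOOD) {
        UA_Client_delete(client);
        return EXIT_FAILURE;
    }

    /* Use the Read service to fetch the value of a well-known node:
     * the server's current time in namespace 0. */
    UA_Variant value;
    UA_Variant_init(&value);
    retval = UA_Client_readValueAttribute(client,
                 UA_NODEID_NUMERIC(0, UA_NS0ID_SERVER_SERVERSTATUS_CURRENTTIME),
                 &value);
    if(retval == UA_STATUSCODE_GOOD &&
       UA_Variant_hasScalarType(&value, &UA_TYPES[UA_TYPES_DATETIME]))
        printf("Read the server's current time over the binary protocol.\n");

    UA_Variant_clear(&value);
    UA_Client_disconnect(client);
    UA_Client_delete(client);
    return EXIT_SUCCESS;
}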

Manufacturing Software Future Is Loosely Coupled in Layers

Tim Sowell

Tim Sowell, VP and Fellow at Schneider Electric Software, always writes thoughtful and forward-looking blogs about the state of manufacturing software. In this one, he discusses taking a lead from the human body “with reducing risk through an enterprise nervous system for industrial architectures.”

He says, “If we think about it – the human nervous system has over a billion neurons spread throughout the body to help control its various functions. If the brain had to deal constantly with a billion signals, it would “crash” the system. Thus, nature has designed a system where functions are layered in an architecture that helps create a robust sense-and-response mechanism.”

I’ve added a few books about brain science for your intellectual broadening. More and more, those of us in the more “physical” systems business are getting metaphors from biological systems. The human system is a great metaphor.

On Intelligence, Jeff Hawkins (founder of Palm) 

Descartes’ Error, Antonio Damasio

The Feeling of What Happens, Antonio Damasio

And while I’m at it, Loosely Coupled, Doug Kaye, regarding Sowell’s later statements. 

Sowell has lately been asked about flat vs. layered architecture, and a similar question around one platform vs. multiple platforms. Here are his points:

• Layers allow me to contain change.

• Layers allow me to manage complexity, divide and conquer.

• Interoperable layers reduce technology lock-in and increase options for clients.

• Federated means a lower level has autonomy but cannot violate higher-level rules and principles.

He continues, “The world is made up of layers of information, interaction, and decisions. It is important to optimize across a layer, so interaction with the “things” at that layer is focused, efficient, and in context of that layer in content and time. As you traverse layers, so do the context of information, the interaction between different “things,” and the complexity or focus.”

Further, “In industrial automation/operations, control has its layers of execution with the different equipment components in the process unit, requiring speed and tight coupling. As we go up the layers to supervisory, then MES and information, the context changes, responsibility for decisions increases, and the time context changes. The “things” interacting change, combined with more complex messages with more context.”

And here is Sowell’s conclusion. This is very much in tune with what I see as the direction software is going, with interoperability being the key.

“Autonomous functions (layers) that have interoperability are key for fast, relevant, actionable decisions to take place with the most efficiency. So why do we ask about one, when we should design in layers and understand each layer’s context, things, and actions? We must also understand how the layers are “loosely coupled but aligned” so that operational execution aligns with business strategy in near real time.”
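
To make the “loosely coupled but aligned” idea a bit more tangible, here is a toy C sketch of my own (not Sowell’s or Schneider Electric’s code): the control layer pushes events upward through a narrow callback interface, and the supervisory layer decides what is relevant to escalate. The tag name and threshold are hypothetical.

#include <stdio.h>

/* The control layer publishes events through a narrow interface; the
 * supervisory layer subscribes without the control layer knowing anything
 * about what sits above it. The coupling is one function pointer wide. */

typedef struct {
    const char *tag;
    double value;
} ProcessEvent;

typedef void (*EventHandler)(const ProcessEvent *evt);

/* --- control layer: knows only the handler type, not the layer above --- */
static EventHandler subscriber = NULL;

void control_subscribe(EventHandler handler) { subscriber = handler; }

void control_scan(void) {
    ProcessEvent evt = { "FIC-101.PV", 42.0 };   /* hypothetical tag */
    if(subscriber)
        subscriber(&evt);                        /* push upward, in context */
}

/* --- supervisory layer: decides what is relevant to pass further up --- */
static void supervisory_on_event(const ProcessEvent *evt) {
    if(evt->value > 40.0)                        /* filter before escalating */
        printf("escalate to MES: %s = %.1f\n", evt->tag, evt->value);
}

int main(void) {
    control_subscribe(supervisory_on_event);
    control_scan();
    return 0;
}

Each layer can change internally without breaking the one above it, as long as the narrow interface between them stays aligned; that is the design choice Sowell is arguing for.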

Automation Conferences and Jim Pinto

I have a potpourri of items to start the day. In the morning I leave for a week serving at the Tijuana Christian Mission. We will do a variety of service projects including building a section of a cinder-block security wall at its Rosarito orphanage site. We will do some work at the women’s shelter. We will also have some “real” Mexican tacos and check out the Pacific Ocean. I will be writing ahead, but there may be some gaps.

ABB

I decided that I just had too much going on, along with watching my budget, to attend this year’s ABB Automation and Power World event in Houston. This is the first one I’ve missed. And, yes, I do feel some withdrawal pain. What little news I’ve seen so far says that attendance is about 8,000. That is fantastic. I have seen no other news yet.

There were a couple of press releases, though. I subscribe to news feeds using Feedly on my iPad and scan hundreds of items a day. Unfortunately, with whatever Web technology ABB uses, when I click on the teaser lead-in to go to the story on the Website, nothing happens. I’ve reported it to ABB several times in the past. For now, I don’t tweet or write up these items–I can’t see them.

Jim Pinto on Tolerance

My friend Jim Pinto who once wrote a monthly column on automation for me has switched his outlook on life. He has been tackling social problems lately in his new blog.

The latest edition is an impassioned plea for tolerance. He talks about treating other people with dignity. Certainly that is a life skill that will help you become successful in all but the most toxic of organizational environments. And it will certainly make you successful as a person.

The piece did send me in search of a book in my library from the late 60s called “A Critique of Pure Tolerance.” For you philosophers, you might get just a sniff of Kant in the title. Rightly so. Three philosophers contributed essays–a Hegelian, a Kantian, and a positivist. One author was Robert Paul Wolff. I can neither remember the other two nor find the book right now. The point was (a throwback to the anti-Vietnam War protests) that sometimes you really shouldn’t tolerate the thoughts of others. I just offer that as a token of meaningless debate.

Real news from Dassault

Just received this update. By the way, I think these pre-configured apps are the beginning of the future for manufacturing software. Seems Apriso is making us smart–at least according to the press relations manager. Version 4.0 of Dassault Systèmes’ DELMIA Apriso Manufacturing Process Intelligence (MPI) application suite is now available. New Maintenance, Logistics and Warehouse Intelligence Packs add visibility to another 200+ new KPIs.

Manufacturers operating globally are challenged to accurately measure analytics across sites to identify “best-in-class” performance. MPI 4.0 now offers 700+ pre-configured, built-in measures and KPIs within seven DELMIA Apriso Intelligence Packs. Intelligence Packs are pre-configured to work out-of-the-box with existing Apriso products (or may be integrated with other vendor products) to deliver the industry’s most robust EMI solution for global manufacturing excellence.

MPI 4.0 now offers Maintenance, Logistics and Warehouse Intelligence Packs, in addition to existing Production, Machine, Labor and Quality Intelligence Packs.

Advanced manufacturing strategies

There is one thing that puzzles me. Does anyone care about the variety of “smart manufacturing” theories and initiatives that take up so much room in magazines and blogs these days? I keep asking and writing, but the response is muted.

Granted, the European initiatives, principally Industrie 4.0, seem to be supplier driven. The US counterpart, Smart Manufacturing, has a government component but is largely academic, backed by some private companies that wish to take advantage of a pool of Ph.D. candidate researchers. It does talk about building a platform. However, the commercial impact is still in the distant future.

Just checking in. I’m working on a paper. If you have anything to contribute, I’m all ears.

Standards and Roadblocks to Manufacturing Software Development

Looks like there is a debate in the software development community again. This time it’s around Node.js.

Dave Winer is a pioneer in software development. I used his first blogging platform, Radio Userland, from 2003 until about 2009, when it closed and I moved first to SquareSpace and then to WordPress. Below I point to a discussion about whether the Node.js community needs a foundation.

His points work out for manufacturing software development, too. Groups of engineers gather to solve a problem. The problem usually involves opening up to some level of interoperability.

This is a double-edged sword for major suppliers. They’d prefer customers buy all their solutions from them. And, yes, if you control all the technology, you can make communications more solid and faster. However, no supplier supplies all the components a customer wants. Then some form of interoperability is required.

Hence technologies such as OPC, HART, CIP, and the like. These all solved a problem and advanced the industry.

There are today still more efforts by engineers to write interoperability standards. If these worked, then owner/operators would be able to move data seamlessly, or almost seamlessly, from application to application solving many business problems.

Doing this, however, threatens the lucrative market of high-end consultants, whose lock-in through custom code writing and maintenance is a billion-dollar business. Hence their efforts to prevent adoption of standards.

Winer nails all this.

I am new to Node but I also have a lot of experience with the dynamics [Eran] Hammer is talking about, in my work with RSS, XML-RPC and SOAP. What he says is right. When you get big companies in the loop, the motives change from what they were when it was just a bunch of ambitious engineers trying to build an open underpinning for the software they’re working on. All of a sudden their strategies start determining which way the standard goes. That often means obfuscating simple technology, because if it’s really simple, they won’t be able to sell expensive consulting contracts.

He was right to single out IBM. That’s their main business. RSS hurt their publishing business because it turned something incomprehensible into something trivial to understand. Who needs to pay $500K per year for a consulting contract to advise them on such transparent technology? They lost business.

IBM, Sun and Microsoft, through the W3C, made SOAP utterly incomprehensible. Why? I assume because they wanted to be able to claim standards-compliance without having to deal with all that messy interop.

As I see it Node was born out of a very simple idea. Here’s this great JavaScript interpreter. Wouldn’t it be great to write server apps in it, in addition to code that runs in the browser? After that, a few libraries came along, that factored out things everyone had to do, almost like device drivers in a way. The filesystem, sending and receiving HTTP requests. Parsing various standard content types. Somehow there didn’t end up being eight different versions of the core functionality. That’s where the greatness of Node comes from. We may look back on this having been the golden age of Node.
