Digital Transformation has generated so much news that company executives have begun ordering projects and task forces within the company to begin that transformation. The pressure on engineers and IT people increases with each new directive. To help clients deal with these new directives, ARC Advisory Group launched the Digital Transformation Council (DTC) at its 2018 Forum.
The council is a member community for industry, energy, and public-sector professionals. Membership is by invitation only and restricted to end users of digital transformation technology, such as professionals working for manufacturers, utilities, and municipalities. There is no fee to join.
“As data-driven market disruption grows, professionals across similar industries need to connect and learn from one another,” according to Jesus Flores-Cerrillo, Associate R&D Director at Praxair, one of the world’s largest providers of industrial gases. He added, “It’s becoming mission-critical to understand how to use data to develop services and products and optimize operations and assets. That can only be accomplished by understanding the possibilities provided by modern data tools such as artificial intelligence, machine learning, and digital twins.”
“We are delighted to support the Digital Transformation Council by bringing members together in person and online,” commented Greg Gorbach, Vice President at ARC Advisory Group. “This community will enable individuals and companies to get up to speed quickly on digital transformation innovations and share ideas about what provides value and what doesn’t.”
Each February, a member-only meeting, anchored to the annual ARC Industry Forum, will bring the Council together to set the focus and agenda for the coming year. Members will also gather via virtual quarterly meetings to discuss research findings, activities, and other topics.
In addition to annual in-person meetings and quarterly virtual meetings, Digital Transformation Council members will have year-round access to research and fellow members via an online community. ARC Advisory Group’s role will be to conduct research, organize meetings, provide venues, and facilitate peer-to-peer discussions. ARC will also deliver technical support for the group’s online presence.
The DTC will address topics such as analytics, industrial Internet of Things (IIoT), artificial intelligence and machine learning, cybersecurity, and additive manufacturing.
I am a sucker for open platforms. When the PR agency wrote with a teaser about discussing open platforms with Marc Lind, SVP Strategy at Aras, a PLM supplier, I bit. They threw in “digital twin” and “digital thread” as the topping and cherry atop the sundae, and the appointment was made.
We talked just before Christmas, but I’ve had such a crazy January that I’ve just now gotten to this in my pile of things to write.
PLM is often thought of as an enterprise application and covered by analysts who also watch such areas as ERP. I’ve talked with suppliers for years as a magazine editor, but they didn’t really seem to fit well within the magazines and they most likely were not advertising prospects, so there wasn’t pressure to write much. I’m saying that I’m not an expert in the area like some of my friends.
But I’ve followed the technology for many years. I’ve seen it coming—this coordination of digital and physical. As soon as the digital folks could get it all together—especially better databases and interfaces—then I knew we’d be much closer to the realization of digital manufacturing.
Lind told me something about the Aras platform. First, he said, was the attempt at doing away with silos, where you might have your mechanical CAD, then your electrical CAD, then perhaps your MES and your ERP. He said the problem isn’t confined to manufacturing; think about the next step, say connected cars and other systems of systems, where things really need to interact across boundaries.
Check out the Aras platform. It’s interesting. And once again as I’m seeing more often, it is exploring a different business model that can make its platform and products available to a wider customer base. For other writing I’ve done on open platforms, click the small “ad” on my site to download the MIMOSA white paper.
We also talked digital twin, one of the foundation concepts for digital manufacturing.
He said the term Digital Twin was coined back in 2002 by Dr. Michael Grieves while at the University of Michigan. Effectively, the Digital Twin is an exact virtual representation of a physical thing. It’s as if the physical product or system was looking in a virtual mirror.
Grieves describes it as a mirroring (or twinning) of what exists in the real world and what exists in the virtual world. It contains all the informational sets of the physical ‘thing,’ meaning it’s cross-discipline: not just a mechanical/geometric representation, but also the electronics, wiring, software, firmware, etc.
Many people talk about Digital Twins in the context of monitoring, simulation, and predictive maintenance, all incredibly valuable and potentially transformative in their own right. However, there would seem to be much more to it.
“As products of all types move to include connectivity, sensors, and intelligence we can’t just think about the data streaming back from the field.”
Without accurate context, the Digital Twin, time-series data generated during production and ongoing operation is difficult or even impossible to understand and analyze. In addition, the ability to interpret and act upon these data often requires traceability to prior information from related revisions: the Digital Thread.
“To complicate matters further as artificial intelligence / cognitive computing is introduced the necessity for the Digital Twin becomes even greater. If Knowledge = Information in Context, then without a Digital Twin, machine learning won’t work as intended, will be rendered ineffective or worse… potentially leading to risky misinterpretations or misdirected actions.”
Finally, Lind warns, “Because without Context – Digital Twin – the IoT-enabled value proposition is severely limited and could introduce real liability.”
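Lind’s point about context is easy to illustrate in code. Here is a minimal sketch, entirely my own and not Aras’s data model (every field name below is made up for illustration), of how a digital-twin record turns an anonymous sensor reading into interpretable information:

```python
# Sketch: a digital twin record supplies the context that raw
# time-series data lacks. All structures and names are illustrative.
twin = {
    "asset_id": "pump-17",
    "sensors": {
        "T1": {"component": "bearing", "unit": "degC", "normal_max": 80.0},
        "P1": {"component": "discharge", "unit": "bar", "normal_max": 12.0},
    },
}

def interpret(reading, twin):
    """Attach twin context to a raw {'sensor': id, 'value': v} reading."""
    meta = twin["sensors"].get(reading["sensor"])
    if meta is None:
        # Without the twin, we cannot even say what this number measures.
        return {"status": "unknown sensor", **reading}
    return {
        "asset": twin["asset_id"],
        "component": meta["component"],
        "value": reading["value"],
        "unit": meta["unit"],
        "status": "alarm" if reading["value"] > meta["normal_max"] else "ok",
    }

# A bare 95.2 means little; with twin context it becomes
# "bearing temperature 95.2 degC on pump-17, above normal".
print(interpret({"sensor": "T1", "value": 95.2}, twin))
```

The same lookup is where machine learning would plug in: a model fed `95.2` with no twin behind it has no way to know whether that is a temperature, a pressure, or which component it describes.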
ABB and Hewlett Packard Enterprise (HPE) announced a strategic global partnership that combines ABB’s digital offerings, ABB Ability, with HPE’s innovative hybrid information technology (IT) solutions. I’m in Madrid, Spain with the opportunity to talk with both parties to the announcement. This is yet another example of the so-called IT/OT convergence where we are finally getting real interaction between operations and enterprise IT.
“This strategic partnership marks the next level of the digital industrial transformation. Together, we will bring intelligence from cloud-based solutions to on-premises deployments in industrial plants and datacenters for greater uptime, speed and yield,” said ABB CEO Ulrich Spiesshofer. “ABB and HPE will deliver solutions that span the entire range of computing required by enterprises today, from the edge to the cloud to the core.”
“This alliance between two global leaders is unprecedented in terms of breadth and depth, and it will be ground-breaking for the progress of the Industrial Internet of Things,” said Meg Whitman, CEO, HPE. “Together with ABB, we will shape a digital industrial future where everything computes, equipping machines with intelligence to collaborate, allowing plants to flexibly adapt to changing demands, and enabling global supply chains to instantaneously react to incidents. This partnership will create exciting business opportunities for our joint customers.”
To provide a true end-to-end experience for customers, the ABB-HPE partnership will include co-innovation, co-development, and joint go-to-market and service efforts.
Research firm IDC forecasts that worldwide spending on the Internet of Things (IoT) will grow to $1.4 trillion in 2021 from an expected $800 billion in 2017. The largest investments are being made in areas such as manufacturing, transportation and utilities. To tap into the opportunities of the IoT, companies are investing in new solutions that digitize their industrial equipment and integrate it with their broader IT environments. By joining forces, ABB and HPE are bringing together the capabilities needed to accelerate this transformation.
Computing where it is required
Running data acquisition, analytics, and control processes near industrial equipment helps customers avoid the latency, security and reliability issues associated with data communication through remote IT systems. ABB and HPE will jointly develop, market and service digital industrial solutions that help customers:
• Create deep insights and automatic action from industrial data – by running ABB Ability applications on enterprise-grade IT systems close to the industrial equipment to accelerate the processing of vast amounts of data; and
• Manage and control industrial processes across the supply chain – by leveraging hybrid IT to provide a seamless experience from edge to cloud to core, making critical data available across locations.
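The rationale for running analytics near the equipment is easy to sketch. The example below is my own illustration, not ABB’s or HPE’s actual software: an edge node aggregates high-rate readings locally and forwards only compact summaries, avoiding the latency and bandwidth cost of shipping every raw sample to a remote system:

```python
from statistics import mean

def edge_summarize(readings, window=10):
    """Aggregate raw high-rate readings into per-window summaries
    so only compact statistics travel from the edge to cloud/core."""
    summaries = []
    for i in range(0, len(readings), window):
        chunk = readings[i:i + window]
        summaries.append({
            "n": len(chunk),
            "mean": round(mean(chunk), 2),
            "max": max(chunk),  # preserve the peak for alarming
        })
    return summaries

raw = [20.1, 20.3, 19.9, 20.0, 35.7, 20.2, 20.1, 20.4, 20.0, 19.8,
       20.2, 20.1]
# Two summaries travel upstream instead of twelve raw points,
# and the 35.7 spike still survives in the "max" field.
print(edge_summarize(raw))
```

A real deployment would of course also run closed-loop logic at the edge; the point here is only the data-reduction pattern.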
The partnership will enable ABB Ability solutions to run on hybrid platforms such as HPE ProLiant for Microsoft Azure Stack, enabling customers to deploy applications to their preferred location – on HPE infrastructure in industrial plants and data centers or in the Microsoft Azure public cloud – to meet the specific requirements regarding performance, security or cross-site collaboration.
ABB and HPE will also deliver joint solutions for data centers, including
• Data center automation – to enable data center power, cooling, and building systems to adapt automatically to changing IT demands or incidents. To that end, ABB and HPE will integrate ABB Ability Data Center Automation, which controls, monitors and optimizes mission-critical datacenter facilities infrastructure, with HPE OneView, HPE’s IT infrastructure automation software.
• Secure edge data center – specifically designed to run in harsh industrial environments, bringing enterprise-grade IT capacity closer to the point of data collection and machine control. This solution is being developed in collaboration between ABB, HPE and Rittal, the world’s largest manufacturer of IT racks, and will be an off-the-shelf ready IT data center solution for industrial customers enabling real-time insight and action.
While at the Dell IQT kickoff event in New York last month, I learned more about the breadth of Dell’s thinking about the Internet of Things.
It began with morphing the embedded computer into a gateway (with memory, processing power, and multiple connections). Then it added a platform (EdgeX Foundry).
But that hardly seemed like something that warranted Michael Dell’s time during his past three Dell World / Dell EMC World keynotes.
Revealing the coming together of the various divisions of Dell Technologies, I learned about VMware Pulse IoT Center, an enterprise-grade Internet of Things (IoT) infrastructure management solution that will enable IT and operational technology (OT) teams to have complete control of their IoT infrastructure and things.
Interestingly, the Internet of Things group has been promoted to division status, led by VMware CTO Ray O’Farrell.
Here are a few details on the Pulse Center.
Solving the problems
Customers face challenges scaling from IoT proof-of-concept to production. They must:
• on-board and manage thousands to hundreds of thousands of connected devices;
• make sure those devices are working as they are supposed to; and
• keep the devices and data secure.
Dell says its core strengths in device and application management, infrastructure analytics, and security give it the IP and expertise to address these issues with an easy-to-use, single-pane-of-glass solution that helps customers more efficiently manage, operate, scale, and protect their IoT projects from the edge to the cloud.
VMware Pulse IoT Key Features
• Edge Device Management – Support for heterogeneous things and gateways with different hardware, operating systems, and communication protocols
• Real-Time Infrastructure Analytics – Ability to identify anomalies with real-time monitoring and infrastructure analytics
• Sophisticated & Flexible Rules Engine – Ability to granularly define what, where, and when things are updated
• Single-Point Console – A single point of monitoring and management for the IoT infrastructure (across private networks comprising edge systems and connected devices) for both IT and OT users
• OTA Updates – Ability to provide over-the-air, real-time updates to all things/gateways no matter how remote the location
• Smart Data Orchestration – Delivery of relevant data where and when it is needed across the edge and in the cloud by integrating into enterprise systems
• Security Across the IoT Value Chain – Security at the thing, network, and user level with software updates and NSX and VMware Identity Manager integration
• Thing-to-Gateway Relationship Visualization – Pictorial representation of the topology of the IoT infrastructure (two-tier or three-tier) in a parent-child relationship diagram
• Highly Scalable – Supports hundreds of thousands of edge systems and IoT connected devices such as sensors and actuators
• On-Prem Support – Offered as an on-premises solution for deployment flexibility and security; future versions will also be offered as cloud-hosted
• Enterprise Integrations – Quick and easy integration with existing server systems through a comprehensive API abstraction layer
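To make the rules-engine idea from the feature list concrete, here is a minimal sketch of predicate-based update targeting. To be clear, this is my own illustration, not VMware’s implementation; every name in it is made up:

```python
def matches(device, rule):
    """A rule's 'when' clause is a dict of attribute -> required value;
    a device matches when every listed attribute agrees."""
    return all(device.get(k) == v for k, v in rule["when"].items())

def plan_updates(devices, rules):
    """Return (device_id, action) pairs for every rule a device matches,
    i.e. granularly deciding what gets updated, and where."""
    return [(d["id"], r["action"])
            for d in devices for r in rules if matches(d, r)]

devices = [
    {"id": "gw-1", "os": "linux", "site": "plant-a"},
    {"id": "gw-2", "os": "windows", "site": "plant-b"},
]
rules = [{"when": {"os": "linux", "site": "plant-a"},
          "action": "push-fw-2.1"}]

print(plan_updates(devices, rules))  # [('gw-1', 'push-fw-2.1')]
```

A production rules engine would add scheduling windows (“when”) and staged rollout, but the matching core is the same idea.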
Chief Customer Officer
I also met with Jim Ganthier, a Vice President who works in the office of the Chief Customer Officer. OK, there are lots of “Chiefs” running around corporations today. Since I am most interested in technologies and their uses in manufacturing and industrial, I didn’t have lots of questions. It was interesting to see that there is a “voice of the customer” at the executive level of a major corporation. We talked a lot about whether it was difficult for a global technology company to meet the varying privacy requirements found from nation to nation. He assured us that they had the technology to comply.
Chief Marketing Officer
A comment stood out in our conversation with Jeremy Burton, the corporation’s Chief Marketing Officer. “The last 20 years has seen technology used for efficiency. Now technology is a differentiator.”
Hmm, sounds like what I heard at Emerson. Maybe it’s a meme.
Disclaimer: Dell pays my expenses to its events and an occasional fee for posts. The views are always mine, and they never review before publication.
Gathering data, visualization on many devices and screens, and connecting with standards including OPC UA and BACnet attracted a crowd of developers and users to the Iconics World Wide Customer Conference this week in Providence, RI.
“Connected Intelligence is our theme at this year’s summit and it has a dual meaning for us,” said Russ Agrusa, President and CEO of Iconics. “First, it refers to our extensive suite of automation software itself and how it provides out-of-the-box solutions for visualization, mobility, historical data collection, analytics and IIoT. The second point is that Iconics, over the last 30 years, has built a community of partners and customers who will have the opportunity to meet our software designers and other employees and have one-on-one discussions on such topics as Industry 4.0, IIoT, cloud computing, artificial intelligence (AI) and the latest advances in automation software technology. It is truly a high energy and exciting event.”
Key technologies showcased at the Iconics Connected Intelligence Customer Summit included:
1. Industry 4.0 and the Industrial Internet of Things
2. Unlocking data and making the invisible, visible
3. Secure strategies and practices for industrial, manufacturing and building automation
4. Predictive AnalytiX using expert systems such as FDD and AI Machine Learning
5. Hot, warm and cold data storage with plant historians for the cloud and IIoT
Integration With AR, VR, and Mixed Reality Tech
The recent v10.95 release of GENESIS64 HMI/SCADA and building automation suite includes 3D holographic machine interface (HMI), which can be used with Microsoft’s HoloLens self-contained holographic computing device. This combination of Iconics software with Microsoft hardware allows users to visualize real-time data and analytics KPIs in both 2D and 3D holograms. When combined with Iconics AnalytiX software, users can take advantage of additional fault detection and diagnostics (FDD) and Hyper Historian data historian benefits, providing needed “on the spot” information in a hands-free manner.
“These new hands-free and mixed reality devices enable our customers and partners to ‘make the invisible visible’,” said Russ Agrusa, President and CEO of ICONICS. “There is a massive amount of information and value in all that collected and real-time data. Data is the new currency and we make it very easy to uncover this untapped information. We welcome this year’s summit attendees to get a glimpse at the future of HMI wearable devices such as Microsoft’s HoloLens and RealWear HMT1, HP and Lenovo Virtual reality devices.”
Head-mounted, tablet-style device
The v10.95 release of the GENESIS64 HMI/SCADA and building automation suite includes Any Glass technology, which can be used with self-contained head-wearable computing devices. The HMT-1 from RealWear demonstrated the visualization of real-time and historical data KPIs with voice-driven, hands-free usage.
Featuring an intuitive, completely hands-free interface, the RealWear HMT-1 is a rugged head-worn solution for industrial IoT data visualization, remote video collaboration, technical documentation, assembly and maintenance instructions and streamlined inspections right to the eyes and ears of workers in harsh and loud field and manufacturing environments.
Support for multiple OSs and devices
Iconics has always been a Microsoft Windows application company and will continue to be. However, its IoTWorX Industrial Internet of Things (IIoT) software automation suite includes support for multiple operating systems, including Windows 10 IoT Enterprise and Windows 10 IoT Core, as well as a large variety of embedded Linux operating systems, including Ubuntu and Raspbian.
Users can connect to virtually any automation equipment through supported industry protocols such as BACnet, SNMP, Modbus, OPC UA, and classic OPC Tunneling. Iconics’ IoT solution takes advantage of Microsoft Azure cloud services to provide global visibility, scalability, and reliability. Optional Microsoft Azure services such as Power BI and Machine Learning can also be integrated to provide greater depth of analysis.
The following Operating systems are currently being certified for IoTWorX:
• Windows 10 IoT Enterprise
• Windows 10 IoT Core
• Red Hat Enterprise Linux 7
• Ubuntu 17.04, Ubuntu 16.04, Ubuntu 14.04
• Linux Mint 18, Linux Mint 17
• CentOS 7
• Oracle Linux 7
• Fedora 25, Fedora 26
• Debian 8.7 or later versions, openSUSE 42.2 or later versions
• SUSE Enterprise Linux (SLES) 12 SP2 or later versions
Hot, Warm, Cold Data Storage
Hyper Historian data historian integrates with and supports Microsoft Azure Data Lake for more data storage, archiving and retrieval.
When real-time “hot” data is collected at the edge by IoT devices and other remote collectors, it can then be securely transmitted to “warm” data historians for mid-term archiving and replay. Hyper Historian now features the ability to archive to “cold” long-term data storage systems such as data lakes, Hadoop or Azure HD Insight. These innovations help to make the best use of historical data at any stage in the process for further analysis and for use with machine learning.
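The hot/warm/cold pattern ultimately comes down to a routing decision based on data age and access needs. Here is a minimal sketch of such a policy; the thresholds and the policy itself are my own illustration, not Hyper Historian’s actual logic:

```python
from datetime import datetime, timedelta, timezone

def storage_tier(timestamp, now=None):
    """Route a sample to hot (edge/real-time), warm (historian),
    or cold (data lake) storage based on its age. The thresholds
    are illustrative; real policies depend on access patterns."""
    now = now or datetime.now(timezone.utc)
    age = now - timestamp
    if age <= timedelta(days=1):
        return "hot"
    if age <= timedelta(days=90):
        return "warm"
    return "cold"

now = datetime(2018, 1, 1, tzinfo=timezone.utc)
print(storage_tier(now - timedelta(hours=2), now))   # hot
print(storage_tier(now - timedelta(days=30), now))   # warm
print(storage_tier(now - timedelta(days=365), now))  # cold
```

The payoff is economic as much as technical: recent data stays on fast, expensive storage for operators and analytics, while bulk history migrates to cheap object storage for machine learning.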
Among the new analytical features are a new 64-bit BridgeWorX64 data bridging tool, a new 64-bit ReportWorX64 reporting tool, several new Energy AnalytiX asset performance charts and usability improvements. In addition, Iconics has introduced a new BI Server.
• AnalytiX-BI – Provides data aggregation using data modeling and data sets
• ReportWorX64 – Flexible, interactive, drag-and-drop, drill-down reporting dashboards
• BridgeWorX64 – Data bridging with drag-and-drop workflows that can be scheduled
• Smart Energy AnalytiX – A SaaS-based energy and facility solution for buildings
• Smart Alarm AnalytiX – A SaaS-based alarm analysis product that uses EEMUA guidelines
GE Digital continues to build out its platform and ecosystem of applications while new GE CEO John Flannery confirmed his commitment to the digitalization strategy begun under his predecessor.
The sixth Minds + Machines conference featured about 90% growth in attendance from last year. Begun five years ago as a thought-leadership gathering, not long after the company began assembling its digital strategy, the conference has evolved into a substantial user conference. Attendance was reported at about 3,700, filling much of Moscone Center West in San Francisco.
I’ve summarized the announcements from the event below. My initial takeaway for the biggest news of the day was GE’s emphasis on building a partner ecosystem. As the company built out its Predix platform, it seemed to be on a track for keeping everything close to home. Saying that they could move more quickly to market, they talked about working more with partners. One executive told me that the partnership with Microsoft for Predix on Azure was the most significant announcement of the week.
This is my first time here, and the event reinforces the idea that GE Digital is a major player in the industry segment, begging comparison with Siemens. Some also mentioned ABB (and they should not have forgotten Schneider).
Most of my discussions involved Asset Performance Management, the new Operations Performance Management (see below), and helping me understand Predix.
Following is a summary of announcements:
Flannery touched on some statistics from a survey concerning the “digital gap” between the perceived importance of digital transformation and how far along companies actually are.
GE Digital Industrial Evolution Index
The inaugural Index reflects a total score of 63 on a scale of 100 and indicates that while the outlook for the Industrial Internet is very strong, scoring 78.3 out of 100, company readiness significantly lags at 55.2. This disconnect between outlook and readiness presents both a challenge and an opportunity for companies seeking to benefit from the IIoT. Some 86% believe digital industrial transformation is important to the competitiveness of their companies, with the majority (76%) rating the ability to provide higher-quality services as the foremost outcome of digital industrial transformation.
GE unveiled expansions to its suite of edge-to-cloud technologies and industrial applications.
Edge-to-Cloud Intelligence on Any Industrial Asset, Anywhere
GE Digital is expanding its Predix Edge capabilities to help run analytics as close to the source of data as possible. Predix Edge gives customers with limited connectivity, latency limitations, regulatory or other constraints a way to deploy applications closer to the originating data.
• Predix Edge Manager allows customers to support large fleets of edge devices – up to 200,000 connected devices from a single console.
• Predix Machine enables microservice-based applications to run at the edge on customers’ virtualized data center infrastructure or on server-class hardware from GE or its partners. This also supports Predix Edge Manager, which was previously available only as a cloud service.
• Predix complex event processing (CEP) allows for faster and more efficient analytics and other event processing at extremely low latency, available at the edge in Q1 2018.
Predix Platform on Microsoft Azure
Announced last year and generally available in 2018, the GE Digital and Microsoft partnership extends the accessibility of Predix to Microsoft’s global cloud footprint, including data sovereignty, hybrid capabilities, and advanced developer and data services, enabling customers around the world to capture intelligence from their industrial assets.
Alongside its core application, Asset Performance Management (APM), GE Digital introduced Operations Performance Management (OPM), a solution that helps industrial companies optimize the throughput of industrial processes.
OPM uses real-time and historical data – along with advanced analytics – to help customers make better operational decisions. The solution provides an early warning if industrial processes deviate from plan, arms operators with the information and time to troubleshoot operational issues and helps them take preventative actions to meet business goals. GE Digital’s OPM software initially targets the mining industry and will expand to additional industries early next year.
Enhanced Field Service Management Solutions
With service technicians looking to embrace technology to improve their productivity and deliver a better experience for customers, ServiceMax from GE Digital announced several enhancements to its FSM suite – enabling even greater efficiencies and bringing advanced analytics to service operations.
• Artificial intelligence-enabled predictive service times now integrate the Apache Spark AI engine to improve service time estimates.
• Additionally, a new application integration solution enables service providers to launch and share FSM data with third-party mobile applications installed on the same device.
• New capabilities in schedule optimization allow for dependent job scheduling between work orders for multiple visits aimed at improving first-time fix rates.
GE Digital also introduced Predix Studio to help companies build and scale their own industrial applications and extend its Asset Performance Management (APM) suite. Available in Q1 2018, Predix Studio simplifies the development process by giving customers the ability to extend applications and empowering industrial subject matter experts to build apps in a low-code, high-productivity environment.
Digital Twin Analytics Workbench
The Analytics Workbench applies a library of algorithms and templates to make it faster and easier for companies to build their own digital twins on Predix. Currently a technology preview from GE Power, it can be used to augment existing digital twins with new data streams. For example, power producers using drones to inspect wind turbine blades, pipelines, or fuel reserves can integrate visual inspection data into the digital twins they already use to manage generation assets and grid infrastructure.