The “Edge” is a hot space right now, although I’m not sure everyone agrees on what “Edge” means as they develop products and solutions. However, first thing this morning I saw this tweet from Tom Bradicich of Hewlett Packard Enterprise (@HPE) referring to an article that mentions him in ComputerWeekly.com. I’ve written about HPE at the edge and with IoT before. Looks like something’s up.
Tweet from @TomBradicichPhD Not only computing at the #edge, but also a new product category of “converging IT with #OT” systems (such as controls, DAQ, industrial protocols). Watch this space, my team’s next innovation is all this as-a-service. #aaS
Here is the rationale from the Computer Weekly article. “The benefits of edge computing have the potential to help businesses dramatically speed up their data analysis time while cutting down costs. @HPE’s Mark Potter and @TomBradicichPhD share how we can make this possible.”
In the past, all data processing was run locally on the industrial control system. But while there is industry consensus that real-time data processing for decision-making, such as the data processing needed in an industrial control system, should be run at the edge and not in the public cloud, there are many benefits in using the public cloud or an on-premise datacentre to assimilate data across installations of internet of things (IoT)-connected machines. Such data aggregation can be used to improve machine learning algorithms.
It is fascinating to see our environment described by an enterprise IT writer. The truth is that following the Purdue Model, suppliers tried to make PLCs and DCSs part of the information infrastructure in parallel to supervising or executing control functions. That proved too unwieldy for control engineers to manage within the programming tools used. It was also too slow and not really optimized for the task.
Along came IT companies. I have followed a few over the past five years. They have had trouble with figuring out how to make a business out of edge compute, gateways, networking, and the like.
In the past, data acquisition and control systems were considered operational technology, and so were outside the remit of enterprise IT. But, as Tom Bradicich, global head of the edge and IoT labs at HPE explains, IT has a role to play in edge computing.
Bradicich’s argument is that edge computing can provide a converged system, removing the need for standalone devices that were previously managed by those people in the organisation responsible for operational technology (OT). According to Bradicich, convergence is a good thing for the industry because it is convenient, makes it easy to buy devices, lowers cost, improves reliability, and offers better power consumption because all the disparate functions required by an industrial system are integrated in one device.
Bradicich believes convergence in IoT will be as big as the convergence of camera and music players into a device like the iPhone, which made Apple the biggest music and camera company in the world. For Bradicich, convergence at the edge will lead to industry disruption, similar to what happened when smartphones integrated several bits of functionality that were previously only available as separate devices. “The reason Uber exists is because there is a convergence of GPS, the phone and the maps,” he says. “This disrupts the whole industry.”
I get this analogy to converging technologies into a device such as the iPhone. I don’t know if we want to cede control over to an HPE compute platform (although it has plenty of horsepower), but the idea is tempting. And it would be thoroughly disruptive.
Forrester has forecast that the edge cloud service market will grow by at least 50%. Its Predictions 2020 report notes that public cloud providers such as Amazon Web Services (AWS) and Microsoft; telecommunication companies such as AT&T, Telstra and Vodafone Group; platform software providers such as Red Hat and VMware; content delivery networks including Akamai Technologies; and datacentre colocation providers such as Digital Realty are all developing basic infrastructure-as-a-service (IaaS) and advanced cloud-native programming services on distributed edge computing infrastructure.
HPE has also invested in a new company called Pensando, which recently emerged from stealth mode and is founded and staffed by former Cisco technologists, with former Cisco CEO John Chambers installed as chairman. The belief is that new categories of device aimed at edge computing will come to market, perhaps a plethora of new devices to perform data acquisition and real-time data processing.
Mark Potter recently wrote in a blog post, “By becoming the first solutions providers to deliver software-defined compute, networking, storage and security services to where data is generated, HPE and Pensando will enable our customers to dramatically accelerate the analysis and time-to-insight of their data in a way that is completely air-gapped from the core system.
These are critically-important requirements in our hyper-connected, edge-centric, cloud-enabled and data-driven world – where billions of people and trillions of things interact.
This convergence is generating unimaginable amounts of data from which enterprises seek to unearth industry-shaping insights. And as emerging technologies like edge computing, AI and 5G become even more mainstream, enterprises have an ever-growing need to harness the power of that data. But moving data from its point of generation to a central data center for processing presents major challenges — from substantial delays in analysis to security, governance and compliance risks.
That’s where Pensando and HPE are making an industry-defining difference. By moving the traditionally data center-bound network, storage and security services to the server processing the data, we will eliminate the need for round-trip data transfer to centralized network and security appliances – and at a lower cost, with more efficiency and higher performance.
Here are benefits that Potter listed:
- Lower latency compared to competitive solutions, as operations will be carried out at a 100Gbps network line-rate speed;
- Controller management framework to scale across thousands of nodes with a federation of controllers allowing scale to 1M+ endpoints; and
- Security, governance and compliance policies that are consistently applied at the edge.
Announcements and discussions at this year’s iteration of the Industry Forum sponsored by ARC Advisory Group were amazingly diverse. Another IT supplier appeared. Security remained an issue. Most conversations revolved around open (open source and open interoperability), edge, 5G, collaboration/partnerships, software-defined, machine learning, MQTT and its companion Sparkplug, and most importantly, solving problems for end users.
Following is a brief recap. Follow the links for in-depth information. Of course, many company announcements fit into more than one bucket.
Examples of the variety of “open” include the Eclipse Foundation and The Open Group’s Open Process Automation Forum, with the IT technology of Kubernetes thrown in.
The Eclipse Foundation launched the Sparkplug Working Group. Founding members Chevron, Canary Labs, Cirrus Link Solutions, HiveMQ, Inductive Automation, and ORing are defining an open standard specification to create interoperable MQTT IIoT solutions.
The Working Group will encourage the definition of technical specifications and associated implementations that rationalize access to industrial data, improve the interoperability and scalability of IIoT solutions, and provide an overall framework for supporting Industry 4.0 for oil and gas, energy, manufacturing, smart cities and other related industries.
Sparkplug is relatively new. Without it, each supplier and end user must create all the definitions of the data themselves, which leads to interoperability problems. Success of this Working Group is essential for any widespread adoption. The Eclipse Foundation pointed out that the intent and purpose of the Sparkplug specification is to define an MQTT topic namespace, payload, and session state management that can be applied generically. By meeting the operational requirements for these systems, Sparkplug will enable MQTT-based infrastructures to provide more valuable real-time information to business stakeholders as well.
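For readers curious about what the namespace being standardized actually looks like, here is a minimal sketch of how a Sparkplug B topic and metric payload fit together. The group, node, device, and metric names are hypothetical, and the real specification serializes payloads as Protocol Buffers; JSON is used here only for readability:

```python
import json
import time

# Sparkplug B defines a fixed MQTT topic namespace:
#   spBv1.0/<group_id>/<message_type>/<edge_node_id>[/<device_id>]
NAMESPACE = "spBv1.0"

def sparkplug_topic(group_id, message_type, edge_node_id, device_id=None):
    """Build a Sparkplug B topic; device-level messages append a device_id."""
    parts = [NAMESPACE, group_id, message_type, edge_node_id]
    if device_id:
        parts.append(device_id)
    return "/".join(parts)

def ddata_payload(metrics):
    """Illustrative DDATA payload. The real spec serializes metrics as
    Protocol Buffers; JSON is used here only to show the structure."""
    return json.dumps({
        "timestamp": int(time.time() * 1000),
        "metrics": [{"name": n, "value": v} for n, v in metrics.items()],
    })

# Hypothetical group/node/device names:
topic = sparkplug_topic("Plant1", "DDATA", "EdgeNode7", "FlowMeter3")
payload = ddata_payload({"Flow/Rate": 42.7, "Flow/Total": 18234.2})
```

The point of fixing the topic structure and payload this way is that any compliant host application can discover and decode data from any compliant edge node, which is exactly the interoperability gap plain MQTT leaves open.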
The Open Group Open Process Automation Forum progresses. This topic touches on both open and software-defined control. The Open Group Open Process Automation Forum (OPAF), with the first major update to the standard for open process automation systems since February 2019, has progressed perhaps more than I would have predicted after its unveiling only a few years ago at the ARC Forum.
Its first release focused on interoperability while the O-PAS Standard Version 2.0 provides a vendor-neutral Reference Architecture which enables the construction of scalable, reliable, interoperable, and secure process automation systems. The latest release, which is a Preliminary Standard of The Open Group, has further emphasis on standardized system configuration portability to significantly reduce capital cost and time investment for end-users. With these capabilities, end-users can easily exchange equipment without being tied to a single vendor or requiring individual configuration parameters to be written in different operating languages.
With their standard moving from interoperability to portable configurations, leaders told me that the next release will expand on this portability theme.
Bedrock Automation integrates Flow-Cal Flow Measurement into Open Secure Automation (OSA) Platform.
Speaking of both software-defined and open, Bedrock Automation Founder, CEO, and CTO Albert Rooyakkers explained its extension to the “Open Secure Automation (OSA)” platform with the addition of Flow-Cal algorithms “bringing oil and gas measurement and custody transfer securely into the digital age.” This was essentially a software addition to the platform to bring a new twist on flow computer functionality.
The new OSA +Flow family embeds industry-leading Flow-Cal measurement applications. Flow-Cal’s software has long been the industry’s choice for flow measurement and production-accounting data. Affirming Flow-Cal’s stature is the fact that the American Petroleum Institute (API) has selected it to develop, support, and distribute its standard flow measurement calculations.
The OSA +Flow software has been incorporated across all Bedrock controllers providing scalability for PLC, RTU, or DCS flow control requirements at custody transfer stations, separators, and other oil and gas production facilities. These solutions include full support of multi-drop serial, Ethernet, and HART for Coriolis, ultrasonic, and smart transmitters.
The system also includes an API-compliant library, OPC UA, Inductive Automation software, and MQTT, as well as software-defined I/O.
Diamanti Accelerates Energy and Service Organizations’ Adoption of AI/ML
AI and ML applications often leverage GPU processing for training models, and they benefit from containers and Kubernetes, an open source container orchestration project. However, these processes are often complicated to adopt and run at scale. With the recent announcement of GPU support in the Diamanti AI/ML platform, enterprises have an easier on-ramp to managing large-scale containerized workloads under Kubernetes.
“We’re pleased to share the early customer traction we are seeing on our newest solutions in a wide range of industries including energy, services and more,” said Tom Barton, CEO of Diamanti. “These customers are validating state-of-the-art technologies internally while also benefiting from the reduced physical footprint and cost-savings that come with the Diamanti AI/ML platform.”
The new solution, announced in late 2019, is in early access today and fully supports Nvidia’s NVLink technology for higher performing workloads, as well as Kubeflow, an open source machine learning framework for Kubernetes that provides highly available Jupyter notebooks and ML pipelines. Combined with Diamanti’s Kubernetes control plane, this allows customers to deliver highly scalable environments for performance-intensive AI/ML workloads, accelerating model development and training.
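As a rough illustration of what “GPU support under Kubernetes” means in practice, here is a minimal pod manifest requesting GPUs, built as a Python dict. The `nvidia.com/gpu` key is the standard extended-resource name exposed by NVIDIA’s Kubernetes device plugin; the pod and image names are hypothetical, and this is a generic sketch, not Diamanti’s actual configuration:

```python
def gpu_training_pod(name, image, gpus=1):
    """Minimal Kubernetes pod manifest for a GPU training job.
    GPUs are requested via resource limits; the scheduler places the
    pod on a node with that many free GPUs."""
    return {
        "apiVersion": "v1",
        "kind": "Pod",
        "metadata": {"name": name},
        "spec": {
            "containers": [{
                "name": "trainer",
                "image": image,
                "resources": {"limits": {"nvidia.com/gpu": gpus}},
            }],
            # Training jobs run to completion rather than restarting
            "restartPolicy": "Never",
        },
    }

# Hypothetical OCR model-training workload
pod = gpu_training_pod("ocr-train", "example.com/ocr-trainer:latest", gpus=2)
```

In a real deployment this manifest would be submitted via `kubectl apply` or the Kubernetes Python client; Kubeflow layers notebook and pipeline management on top of this same scheduling primitive.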
A major energy company turned to Diamanti for a new workload leveraging AI/ML for optical character recognition (OCR) to scan invoices. The customer needed to scan more than 15,000 invoices a day. The legacy infrastructure could not keep up with the demand and eventually accrued a backlog of more than 200,000 invoices. Deploying the Diamanti solution with GPU support eliminated that backlog within hours.
Edge – 5G
As the other influencers at an HPE event told me once, “Gary, everything you do is the edge.” So it is not surprising that I had many conversations about the Edge. But 5G technology was also on many minds. The consensus opinion–5G will drive decision making to the edge.
As an example of edge at the Forum, here is an announcement from Opto 22. For as long as I’ve known the company, it continues to push the latest IT technologies mashed up with control and automation. This product release highlights its pioneering role in IoT.
Industrial automation manufacturer and industrial internet of things (IIoT) developer Opto 22 announced groov RIO, a family of intelligent, distributed input/output (I/O) for IIoT and automation applications. groov RIO represents a first-in-class solution for its ability to quickly connect traditional wired switches and sensors directly to Ethernet networks, software applications, and cloud platforms without intermediary control or communication hardware, such as PLCs, PACs, or PCs.
The first shipping version of groov RIO is the GRV-R7-MM1001-10, a standalone, 10-channel, multi-signal, multifunction I/O unit for signals including thermocouples (TCs), integrated circuit temperature devices (ICTDs), voltage inputs, current inputs, millivolt inputs, discrete DC inputs, self-wetting discrete inputs, discrete DC sinking outputs, and Form C mechanical relays. In addition, two channels provide special features like pulse counting, on- and off-time totalization, software latching, frequency measurement, and more. GRV-R7-MM1001-10 is completely standalone and software-configurable through a browser-based interface.
“When we designed groov RIO, we were looking for ways to democratize I/O data, because that’s what the IIoT is all about,” said Vice President of Product Strategy at Opto 22, Benson Hougland. “Although groov RIO can be used as remote I/O with our groov EPIC system or another control system, we also wanted it to operate autonomously, facilitating direct connection between I/O signals and databases, business software, or cloud IoT platforms.”
GRV-R7-MM1001-10 supports 12 different types of field I/O circuits. It also provides no-hassle, enclosure-free installation with multiple power options, including standard 802.3af Power-over-Ethernet (PoE); an extended operating temperature range; and UL Hazardous Locations and ATEX approvals.
Once installed, groov RIO can be independently managed and configured through browser-based tools. Per-channel I/O type and signal-processing options through groov Manage eliminate the need for a master control unit, and support for standard enterprise network services like DNS, DHCP, and VPN facilitates network connectivity. Embedded communication options range from efficient data publishing with MQTT Sparkplug to advanced signal processing, data aggregation, and transactions with databases and web services, using the low-code Node-RED environment and runtime.
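The per-channel signal-processing features mentioned above, such as on/off-time totalization and pulse counting, can be modeled in a few lines. This is a simplified software sketch of the idea, not Opto 22’s firmware:

```python
def totalize(samples, sample_period_s):
    """On/off-time totalization and pulse counting for a periodically
    sampled discrete input. A pulse is counted on each off-to-on
    transition. Returns (on_time_s, off_time_s, pulse_count)."""
    on_time = 0.0
    off_time = 0.0
    pulses = 0
    prev = False
    for s in samples:
        if s:
            on_time += sample_period_s
            if not prev:   # rising edge: off -> on
                pulses += 1
        else:
            off_time += sample_period_s
        prev = bool(s)
    return on_time, off_time, pulses

# Hypothetical 1 Hz samples of a valve limit switch
on_s, off_s, count = totalize([0, 1, 1, 0, 0, 1, 0, 1, 1, 1], 1.0)
```

Doing this kind of accumulation at the I/O channel itself, rather than in a PLC scan loop, is what lets a device like this report pre-digested values straight to a database or broker.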
Data → Action
It’s all about data, they all say. But when I talked with Mike Brooks, who is now advising at AspenTech, he counseled, “Not too much data.” The action is in using data, not collecting it. Hence the drawback (indeed, failure?) of data lakes: too much storage, not enough usability. AspenTech exemplifies using machine learning not just to say it is in AI, but to find usable information that companies can use to improve operations.
Collaboration – Partnerships
The Eclipse Foundation and OPAF exemplify collaboration and partnerships. Inductive Automation has community as a strategic initiative. Both founder Steve Hechtman and chief strategy officer Don Pearson highlighted it at last year’s Ignition Community Conference.
This announcement highlights community along with edge and other trends. Inductive Automation announced improvements to three products and a new development resource within Ignition by Inductive Automation. Ignition is an industrial application platform with tools for building solutions in human-machine interface (HMI), supervisory control and data acquisition (SCADA), and the Industrial Internet of Things (IIoT).
The solutions include:
- New and improved products for Ignition Edge.
- An expansion of the Ignition Onboard program.
- Improvements to the Ignition Perspective Module.
- A new, free resource for developers: Ignition Exchange.
Ignition Edge will soon have three new products. Ignition Edge is a line of lightweight, limited, low-cost Ignition software solutions designed for embedding into field and OEM devices at the edge. They allow organizations to extend data collection, visualization, and system management to the edge of the network. With the new products coming soon, the lineup will include Ignition Edge Panel, Ignition Edge Compute, Ignition Edge Sync Services, Ignition Edge EAM (Enterprise Administration Module), and Ignition Edge IIoT.
The Ignition Onboard program now has easier access to industrial hardware that comes with Ignition already installed, configured, and licensed. Numerous device manufacturers are embedding Ignition and Ignition Edge into their devices — including Advantech, Moxa, OnLogic, Opto 22, and ORing.
The Ignition Perspective Module lets users easily build mobile industrial applications in HTML5 for monitoring and control of their processes directly from their mobile phones.
A significant part of the Inductive Automation strategy is to promote community among its customers and partners. The development has been ongoing for some time, culminating in Ignition Exchange — a new, online space where developers can get free Ignition resources provided by Inductive Automation and the Ignition community. These resources can save time for developers.
OPAF and Bedrock Automation — as in taking a hardware platform and adding flow metering — exemplify the trend toward software-defined hardware.
I discussed ML in relation to AspenTech for decision making. Perhaps the industry is moving past the SciFi “artificial intelligence” part of the technology to emphasize real use cases deployed today.
To name a trend “operations” may sound archaic, but many conversations moved from technology to solving real problems for customers. This announcement from AVEVA exemplifies that trend.
AVEVA unveiled its new Discrete Lean Management software. The new offering improves operational efficiency through the digitalization of lean work management for both manual and automated production lines. AVEVA’s quick-to-deploy and easy to use digital tools enable access to production information, KPIs and notifications on dashboards, workstations and mobile devices to improve overall equipment and labor effectiveness, and to facilitate data-driven continuous improvement.
AVEVA Discrete Lean Management is designed to address the issues faced by operating manufacturing plants still using paper-based systems for lean and work order management, work instructions, and data collection procedures. It enables physical records to be replaced with digital tools that mitigate the risk of manual processes and provide real-time visibility into production performance, allowing team collaboration in response to production issues.
The AVEVA Discrete Lean Management software solution is used in Schneider Electric’s manufacturing plants and has been successfully deployed in more than 70 smart factories globally, resulting in a 10% productivity increase due to downtime mitigation and a 70% improvement in response time due to automated escalation of production issues.
I actually visited one of the plants in the deployment—one in Lexington, KY. It was an excellent example of using software tools to enhance a lean process rather than getting in the way.
MQTT was mentioned all over the conference. This is a data transport technology, usable both for OPC UA and for Sparkplug. Many companies touted their use of the technology.
I didn’t have as many security conversations as the past few years, but I did chat with some PAS Global executives, and the company announced several new products, along with some new branding.
PAS, now PAS Global, keeps building on its platform of alarm management and safety, and on its ability to see what is on the process plant’s network, assuring the integrity of the process control system.
While at ARC Forum, company executives stressed industrial operations must increase focus on cybersecurity while maintaining continuous vigilance on safety. Stated simply, organizations need to ensure OT integrity in the face of unprecedented opportunity and risk. PAS has introduced new and updated products to optimize the integrity of industrial assets and reduce cyber risk, improve process safety and reliability, and ensure OT data health.
- PAS Cyber Integrity prevents, detects, and remediates industrial cyber threats. Version 6.5 introduces an enhanced user experience for OT asset inventory information and data collection and transfer. This release also provides support for multiple integration methods (REST API, Syslog, SQL, CSV, SDK), integration with Darktrace, and Microsoft Windows event analytics.
- PAS PlantState Integrity Version 8.7 introduces enhancements to Independent Protection Layer (IPL) Assurance that include sensor monitoring and voting, analysis filtering, and process trip reporting.
- PAS Decision Integrity enables trusted data for decision-making. Version 1.0 leverages capabilities from PAS Automation Integrity and adds support for OT data health monitoring (data chain accuracy and visualization) and data lake enrichment.
These new product releases will be generally available by the end of March.
I picked this news item up from The Economist Espresso app.
For years, technologists have gushed about the promise of the “Internet of Things”, enabling ordinary objects—from kettles to cargo ships—to communicate autonomously with each other. The two essential technologies speeding the IoT’s arrival, inexpensive sensors and super-fast networking kit, are advancing rapidly. Gartner, a research group, predicts that the global number of devices embedded with sensors will leap from 8.4bn in 2017 to 20.4bn in 2020. So is 5G, a telecoms-networking technology superior to today’s 4G mobile networks. But the world’s 5G system could split into two different and potentially incompatible entities. One has been developed by Huawei, a Chinese telecoms-equipment giant, at a cost of $46bn. But some are worried about the company’s links to the Chinese Communist Party. Several countries, led by America, have banned the use of Huawei’s gear in their systems for security reasons. The year 2020 could herald the arrival of the Splinternet of Things.
I daresay that many countries around the world are likely concerned about the ability of the US government to monitor internet traffic through the technology of American companies. These swords always cut both ways when you take the larger view.
More relevant to this topic, though, could a potential splintering into two 5G systems globally impact IoT?
In the short term from what I can gather interviewing technologists, benefits from 5G will accrue from the ability for private, plant-wide broadband rather than from some global linking of sensors.
Perhaps we are a bit early for journalists to be raising fear, uncertainty, and doubt. Listening to people actually building out the technology, I think we are going to experience much benefit from 5G in the not-too-distant future.
Suppliers of manufacturing software, some from surprising places, are putting sizable investments into products that will help customers reap the rewards of digitalization. Today, I’m looking at both ABB and Emerson Automation Solutions. Previously I checked out GE Digital and Rockwell Automation. Each has taken a slightly different course toward the goal, but notice the common thread of enhancing software products to help customers prosper.
ABB enhances manufacturing management technology
The new version of ABB Ability Manufacturing Operations Management will offer new features including:
- Enhanced user experience based on new HTML 5 web client;
- A new smart interactive dashboard application that provides greater visibility and collaboration;
- A new statistical process control (SPC) application, to determine if each process is in a state of control;
- A new Batch Compare application – for advanced batch analysis.
“ABB Ability Manufacturing Operations Management is a comprehensive, scalable and modular software suite that optimizes visibility, knowledge and control throughout the operations domain,” said Narasimham Parimi, Head of Digital Products – Product Management, Process Control Platform. “This release provides a range of rich new functionality and a new enhanced user experience that enables operations to become more productive and responsive.”
ABB Ability Manufacturing Operations Management is designed to simplify production management by enabling performance monitoring, downtime management, and maintenance support, as well as providing statistical production analysis tools. It provides solutions and tools to facilitate the collection, consolidation and distribution of production, quality and energy information via the plant’s web-based reports, trends, and graphs.
A new, self-service dashboard application promotes increased collaboration, providing visibility from shop floor to top floor and spanning IT and OT environments. It increases data connectivity to all apps and modules within the MOM suite, combining historic and manufacturing data and providing the user with improved customization capabilities. Dashboards can be shared amongst users, further promoting collaboration between teams. Trends and events are displayed together, which enables customers to identify issues and opportunities enabling informed and timely decisions.
The new common services platform features an HTML 5 web platform that runs across all suites ensuring customers have a seamless user experience, so that applications can be viewed on different devices right down to a 10-inch tablet.
Statistical data process control (SPC) is used in manufacturing to determine if each process is in a state of control. The new SPC application works across all the different apps and modules and helps the user to improve quality and production related performance.
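For the curious, the calculation at the heart of an SPC application is straightforward. The sketch below uses textbook Shewhart individuals-chart (I-MR) limits, estimating process sigma from the average moving range rather than the raw standard deviation (which an outlier would inflate); it is illustrative only, not ABB’s implementation:

```python
def control_limits(x):
    """Shewhart individuals (I-MR) chart limits. Process sigma is
    estimated from the average moving range divided by d2 = 1.128
    (the constant for subgroups of size 2)."""
    center = sum(x) / len(x)
    mr_bar = sum(abs(b - a) for a, b in zip(x, x[1:])) / (len(x) - 1)
    sigma = mr_bar / 1.128
    return center - 3 * sigma, center, center + 3 * sigma

def out_of_control(x):
    """Indices of points falling outside the 3-sigma control limits."""
    lcl, _, ucl = control_limits(x)
    return [i for i, v in enumerate(x) if v < lcl or v > ucl]
```

Run against a stable series, `out_of_control` returns nothing; a single process upset pushes a point past the upper control limit and gets flagged, which is exactly the “state of control” determination described above.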
In addition to the existing Batch View and Batch Investigate features, a comparison option has been added to the platform’s batch analysis applications, allowing different types of comparison.
Cyber security remains one of the key issues in the advancement of Industry 4.0, and the new features in MOM include enhanced security.
Emerson Expands Analytics Platform
Plantweb Insight platform adds two new Pervasive Sensing applications that manage wireless networks more efficiently with a singular interface to the enterprise.
Emerson has added two new IIoT solutions to its Plantweb Insight data analytics platform that will enable industrial facilities to transform the way they manage their enterprise-level wireless network infrastructure.
As digitalization and wireless technology adoption continue to rapidly expand in industrial facilities throughout the world, the need for greater visibility of network infrastructure performance is key. These new Plantweb Insight applications provide a quick-to-implement, scalable IIoT solution that helps customers advance their digital transformation strategies and achieve greater operational efficiencies.
The new Plantweb Insight Network Management application provides continuous, centralized monitoring of WirelessHART networks. This first-of-its-kind application provides a singular, consolidated view of the status of all wireless networks in a facility, with embedded expertise and guidance for advanced network management.
A key feature of the Plantweb Insight Network Management application is a configurable mesh network diagram, providing visualization of network design and connections along with device-specific information. It also provides an exportable record of syslog alerts, network details outlining conformance to network best practices and more.
While the new network management application provides a holistic look at wireless networks, the Plantweb Insight Power Module Management application drills down to the device level, allowing facilities to keep their wireless devices appropriately powered so they can continuously transmit key monitoring data. By aggregating power module statuses, users can evolve traditional maintenance planning and implement more efficient and cost-effective practices.
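The kind of aggregation described, rolling per-device power-module status up into a maintenance worklist, can be sketched in a few lines. Device tags and remaining-life figures below are made up for illustration:

```python
def modules_needing_service(statuses, threshold_days=90):
    """Given {device_tag: estimated_days_of_battery_remaining}, return
    devices below the planning threshold, most urgent first. Aggregating
    this way lets maintenance be batched instead of reactive."""
    low = [(tag, days) for tag, days in statuses.items() if days < threshold_days]
    return sorted(low, key=lambda item: item[1])

# Hypothetical fleet of wireless transmitters
fleet = {"PT-101": 312, "TT-204": 45, "LT-330": 88, "FT-412": 400}
worklist = modules_needing_service(fleet)
```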
“We were able to infuse a decade of experience with wireless technology into these new offerings,” said Brian Joe, wireless product manager with Emerson’s Automation Solutions business. “Our customers will now be able to manage and improve hundreds of networks through a singular interface, realizing significant efficiencies in individual network and wireless device management and maintenance.”
These new applications further enhance the Plantweb Insight platform, a set of pre-built analytics primarily focusing on monitoring key asset health. Other applications in the platform include pressure relief valve monitoring, heat exchanger monitoring and steam trap monitoring.
Top Tens and Top Twenties of the past or future year have never been my favorites. However, one can perceive trends and strain out little nuggets of gold by scanning several, especially those covering industrial broadly along with the Internet of Things (IoT) and other current digital trends. I just had an interesting chat with Sean Riley, Global Director of Manufacturing and Transportation for Software AG, who released his Top Ten for 2020.
Following are his ideas interspersed with a few of my comments.
Cost Management Becomes Exceptional
As uncertainty enters the global manufacturing outlook, enterprises will become myopically focused on cost reductions. This will drive organizations to find more efficient methods of providing IT support, leveraging supplier ecosystems and simplifying value chains. [GM-much of my early work was in cost management/reduction; this is a never-ending challenge in manufacturing; however, tools continue to evolve giving us more and better solutions.]
A Blurred Line Between Products & Services
Manufacturers continue their product innovation quest, and more manufacturers will begin focusing on how to deliver products as a service. The manufacturers that have already created smart products and elevated service levels will now begin to work out the financing considerations needed to shift from a sales-based to a usage-based revenue model. [GM-This is a trend most likely still in its infancy, or maybe toddler-hood; we see new examples sprouting monthly.]
Moving To Redefine Cost Models To Match Future Revenue Streams
Anticipating the shift to continual revenue streams, manufacturers will seek to shift costs to be incurred in a similar manner. This will initially be seen as a continued push toward subscription-based IT applications. While much progress has already been made, a larger focus will occur. [GM-I like his idea here of balancing capital versus expense budgets, continually finding the best fund source for shifting costs.]
IT Focuses on Rapid Support for Growth
The lines between business and IT users become blurred as no-code applications allow business users to create integration services. IT professionals will leverage DevOps and Agile methodologies alongside microservices and containers to rapidly develop applications that can generate incremental growth as requested by business users. This will be critical to the near-term success of manufacturers, especially with economic headwinds that seem to be growing stronger. [GM-I didn’t ask about DevOps, but this idea is springing into the industrial space; cloud and software-as-a-service provide scalability both up and down for IT to balance costs and services.]
Industrial Self-Service Analytics Become Mission Critical
Industrie 4.0 / Smart Manufacturing initiatives continue to receive greater amounts of investment, but in the near term, manufacturers will focus on unleashing the power of the data they already have. Historians, LIMS, and CMMS systems hold valuable data, and enabling production engineers to leverage that data rapidly is critical. Industrial self-service analytics that allow production and maintenance professionals to apply predictive analytics without IT assistance will be sought as a powerful differentiating factor. [GM-we are beginning to see some cool no-programming tools to help managers get data access more quickly.]
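To make the idea concrete, here is a hypothetical sketch of the kind of lightweight predictive check a production engineer might run against historian data without IT help. The data, tag values, and 3-sigma threshold are illustrative assumptions, not taken from any vendor's tool.

```python
# Hypothetical sketch: flag anomalous sensor readings from a process
# historian export using a trailing mean and standard deviation.
# Window size and 3-sigma threshold are illustrative choices.
from statistics import mean, stdev

def flag_anomalies(readings, window=10, sigmas=3.0):
    """Return indices of readings that deviate more than `sigmas`
    standard deviations from the trailing `window` average."""
    anomalies = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sd = mean(baseline), stdev(baseline)
        # Skip flat baselines (sd == 0) to avoid division-free false alarms
        if sd > 0 and abs(readings[i] - mu) > sigmas * sd:
            anomalies.append(i)
    return anomalies

# Example: a steady temperature signal with one spike at index 15.
temps = [72.0, 72.1, 71.9, 72.0, 72.2, 71.8, 72.1, 72.0, 71.9, 72.1,
         72.0, 72.2, 71.9, 72.1, 72.0, 85.0, 72.1, 72.0]
print(flag_anomalies(temps))  # the spike at index 15 is flagged
```

Commercial self-service analytics tools wrap this sort of logic behind point-and-click interfaces, but the underlying statistics are often no more exotic than this.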
Industrie 4.0 / Smart Manufacturing Initiatives Continue to Draw Investment
It’s no surprise that manufacturers will continue to invest in Industrie 4.0, because the promises are great; however, the scaled returns have not been realized and won’t be in the near term. The difficulty of implementing these initiatives has surpassed manufacturers’ expectations for several reasons. First, traditional OT companies were trusted to deliver exceptional, open platforms, and that wasn’t delivered. Second, collaboration between IT and OT professionals proved more convoluted and difficult than expected. [GM-I’m thinking these ideas became overblown and complex, and that is not a good thing; to swallow the whole enchilada causes stomach pain.]
Artificial Intelligence Enters the Mix
AI won’t let users sit back and relax while it handles all of their tasks, but it will make an appearance in back-office work. Freight payment auditing, invoice payment, and, in select areas, chatbots will be the initial mainstream uses of AI; this year they will come to be seen not as anomalies but as mainstream. [GM-I think still an idea looking for a problem; however some AI ideas are finding homes a little at a time.]
3D Printing Finds New Uses
While this technology has steadily crept into production lines, the push toward usage-based product pricing will move it into aftermarket services. Slow-moving parts will be the first target, helping to free up much-needed working capital to support financial transformation. [GM-watch for better machines holding tighter tolerances making the technology more useful.]
5G & Edge Analytics Enable New Possibilities
As Industrie 4.0 continues to be pursued, manufacturers will implement new initiatives that could not previously be realized without the high-speed data transmission promised by 5G or the ability to conduct advanced analytics at the edge, where production occurs. This will also give manufacturers new methods to securely implement Smart Manufacturing initiatives, including in locations that were not previously feasible due to connectivity issues. [GM-5G is still pretty much a dream, but there is great potential for some day.]
Security Remains a Critical Focus
With the proliferation of IoT sensors, IT-OT convergence, the use of APIs, and the interconnectivity of ecosystems, ensuring data security remains a top priority for manufacturers. As more data becomes available, the need for stronger security grows ever greater. [GM-ah, yes, security–a never-ending problem.]
Honeywell and Cisco Team Up on Industrial Wireless
Suddenly the wireless networking side of IoT connectivity is hitting my radar. Since the culmination of the “wireless wars” of 10 years ago, this technology/market area has settled into supplying usable products. This information came from Honeywell. In short, by supplying ISA100 Wireless and WirelessHART connectivity to Cisco’s next-generation Wi-Fi access point, Honeywell’s OneWireless IoT Module can help users increase industrial plant productivity, worker safety, and digital transformation readiness.
Honeywell is developing a OneWireless IoT Module for the next generation of Cisco’s industrial access points, the Cisco Catalyst IW6300 Heavy Duty Series Access Point. The Honeywell and Cisco technologies will form the backbone of Honeywell’s OneWireless Network.
The joint wireless solution enables Honeywell customers to quickly and easily deploy wireless technologies as an extension of their Experion Process Knowledge System (PKS). Combining the leading IT network technology from Cisco with the leading Honeywell OneWireless multi-protocol technology provides customers with a single infrastructure that meets all their industrial wireless needs.
“For the past decade, Cisco and Honeywell have worked together to deliver secure, wireless solutions to connect mobile workers and field instrumentation in the most challenging process manufacturing environments,” said Liz Centoni, senior vice president and general manager, Cisco IoT. “We’ve had great success in bringing IT and operational teams together to reduce complexity and improve efficiency. Now, we are building on that foundation to extend the power of intent-based networking to the IoT edge.”
When combined with the Honeywell OneWireless IoT Module, the Cisco Catalyst IW6300 Heavy Duty Series Access Point offers the security, speed, and network performance needed to allow the seamless extension of the process control network into the field.
“The OneWireless IoT Module is Honeywell’s latest innovation as a leader in wireless technology,” said Diederik Mols, business director Industrial Wireless, Honeywell Process Solutions. “Our customers will benefit from OneWireless functioning as a seamless extension of Experion PKS and simplified deployment made possible by integrating the IoT module and aerials into a single unit.”