Industrial Software Companies Make Financial Moves

For the past couple of years, I’ve been convinced that there is a coming consolidation within the industrial software market. You would think that this would be a profitable business, but evidently it’s harder than it looks.

This thought converges with all the Industrial Internet of Things plays. We have platforms and a large variety of software—not to mention a variety of hardware plays. As buyers begin to sort out preferences, there will be changes.

GE Digital on the block

I was trying to figure out where GE was going to wind up in all this. Last fall I thought that GE Digital's Minds + Machines conference was doomed. Then the 2018 edition was announced. Then yesterday morning, scanning news feeds around 6 am, I saw that most of the GE Digital assets are on the auction block, evidently including Predix.

GE had a "not invented here" syndrome. Rolling your own platform when tried-and-proven ones already exist is always shaky. So the new CEO mandated partnerships. There's no reason to build a platform when Amazon's AWS and Microsoft's Azure are available. Now it appears that much of the portfolio is for sale.

Investments

But all is not lost. At the smaller end of the spectrum of industrial software, there is investment money available, according to a note I received from OSIsoft. The note pointed out that IIoT company Seeq raised $23 million; TrendMiner, Falkonry and Toumetis all recently received investments; and last year, SoftBank also invested in OSIsoft.

Consolidation at the top usually means it's time for innovation from small companies in the newly available openings.

Consolidation

I could obviously point to PTC doing its part to consolidate in the IoT software space. But news just came about Plex Systems, a cloud-based ERP and MES supplier.

Plex announced it has acquired DATTUS Inc., whose solutions connect manufacturing equipment and sensors to the cloud, manage high-volume data streams, and analyze in-motion equipment data. The acquisition, completed in July 2018, is expected to accelerate Plex's IIoT strategy, extending the Plex Manufacturing Cloud to new streams of machine data and the underlying intelligence.

DATTUS brings to the Plex Manufacturing Cloud three major capabilities that will become central to Plex's long-term IIoT roadmap:

• IIoT Connectivity: DATTUS has simplified machine connectivity, providing plug-and-play solutions that work with the wide variety of protocols and data types used by equipment and sensors on the manufacturing shop floor.

• IIoT Data Management: The DATTUS IIoT platform captures and manages the extraordinary volume and variety of machine data to support real-time visibility into activity across production operations.

• IIoT Data Analysis: DATTUS analytics enable operational and business leaders to understand IIoT data in motion, providing decision support in areas such as predictive maintenance and machine performance.
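
To make "analyzing in-motion equipment data" a little more concrete, here is a minimal sketch of the idea in Python (not DATTUS code; the machine names, tags and threshold are invented): readings stream in one at a time, a rolling window is kept per machine, and an alert fires when vibration drifts past a limit.

```python
from collections import defaultdict, deque
from statistics import mean

WINDOW = 20             # readings kept per machine
VIBRATION_LIMIT = 4.0   # hypothetical alert threshold (mm/s)

windows = defaultdict(lambda: deque(maxlen=WINDOW))

def ingest(reading):
    """Handle one machine reading as it arrives (data 'in motion')."""
    machine, vibration = reading["machine"], reading["vibration"]
    windows[machine].append(vibration)
    rolling = mean(windows[machine])
    if rolling > VIBRATION_LIMIT:
        print(f"ALERT {machine}: rolling vibration {rolling:.2f} exceeds limit")

# Simulated stream from shop-floor sensors
stream = [
    {"machine": "press-07", "vibration": 3.1},
    {"machine": "press-07", "vibration": 4.6},
    {"machine": "press-07", "vibration": 5.2},
]
for r in stream:
    ingest(r)
```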

GE Digital Ends Not Invented Here Syndrome

GE Digital initiates a huge turnaround in its attitude toward software and Industrial Internet development. GE invested large sums to build a Silicon Valley presence for its software. Hired many engineers. Took its industrial software base up a notch or two with its Predix platform. Tried to build its own cloud infrastructure. The mantra—not invented here.

[Late Breaking News: I was wrong. There will be another Minds + Machines. San Francisco, October 30-31. That’s an expensive trip. Anyone want to fund me? 😉 ]

During the last Minds + Machines conference in San Francisco, new CEO John Flannery, barely two months into the job, said that GE Digital needed to work more closely with partners. Soon thereafter came the axe.

That is the context for this major partnership announcement (this one came from Microsoft, so it may carry a bit of Microsoft's bias). The following report is based on a media blog from Microsoft.

GE and Microsoft announced an expanded partnership, bringing together operational technology and information technology “to eliminate hurdles industrial companies face in advancing digital transformation projects.” GE Digital plans to standardize its Predix solutions on Microsoft Azure and will deeply integrate the Predix portfolio with Azure’s native cloud capabilities, including Azure IoT and Azure Data and Analytics. The parties will also co-sell and go-to-market together, offering end customers premier Industrial IoT (IIoT) solutions across verticals. In addition, GE will leverage Microsoft Azure across its business for additional IT workloads and productivity tools, including internal Predix-based deployments, to drive innovation across the company.

GE also plans to leverage Azure across the company for a wide range of IT workloads and productivity tools, accelerating digital innovation and driving efficiencies. This partnership also enables the different GE businesses to tap into Microsoft’s advanced enterprise capabilities, which will support the petabytes of data managed by the Predix platform, such as GE’s monitoring and diagnostics centers, internal manufacturing and services programs.
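
The announcement doesn't spell out how Predix workloads will sit on Azure, but the basic device-to-cloud path in Azure IoT looks roughly like this sketch using Microsoft's azure-iot-device Python SDK (the connection string and payload fields are placeholders, not anything from GE).

```python
import json
from azure.iot.device import IoTHubDeviceClient, Message  # pip install azure-iot-device

# Placeholder connection string for a device registered in an Azure IoT hub
CONNECTION_STRING = "HostName=<your-hub>.azure-devices.net;DeviceId=<device>;SharedAccessKey=<key>"

def send_telemetry(asset_id: str, temperature_c: float) -> None:
    """Send one device-to-cloud telemetry message to Azure IoT Hub."""
    client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
    payload = json.dumps({"assetId": asset_id, "temperatureC": temperature_c})
    client.send_message(Message(payload))
    client.disconnect()

if __name__ == "__main__":
    send_telemetry("gt-001", 412.7)
```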

According to Microsoft, leveraging Azure enables GE to expand its cloud footprint globally, helping the companies’ mutual customers rapidly deploy IIoT applications.

The global IoT market is expected to be worth $1.1 trillion in revenue by 2025 as market value shifts from connectivity to platforms, applications and services, according to new data from GSMA Intelligence. Note: I find this a very interesting comment.

As part of this expanded partnership, the companies will go-to-market together and also explore deeper integration of Predix IIoT solutions with Power BI, PowerApps and other third-party solutions, as well as integration with Microsoft Azure Stack to enable hybrid deployments across public and private clouds.

Process Solutions Evolving

Did Honeywell Process Solutions (HPS) short-circuit the Open Process Automation work? Inquiring minds wonder. Once again, some news and analysis of a conference that I couldn’t attend—three of these the same week in June.

HPS and ExxonMobil sent this release. Subsequently, I talked with some sources at competitor companies who broached the question to me—did this news short-circuit the ExxonMobil-led effort for a new process control solution? An interesting caveat is that there is more than one group within ExxonMobil—and they don’t necessarily agree.

From the first release:

The Open Process Automation group was initiated by ExxonMobil, which was trying to find a better (less expensive) upgrade path for control systems that had fallen behind those of its competitors. The oil & gas supermajor still has in operation a significant number of older systems installed as far back as the 1980s—systems that have served the company well for more than 30 years. But as older electronic components have been replaced by more modern alternatives, spare-parts shortages and looming obsolescence put ExxonMobil and other owner-operators in a difficult place.

When facing obsolescence, rip-and-replace is clearly the option of last resort—incurring high costs, protracted downtime and the loss of all the intellectual property invested in developing a system’s displays, databases, control strategies and third-party interfaces, according to David Patin, distinguished engineering associate – control systems, ExxonMobil Research & Engineering.

The company’s installed base of Honeywell TDC 3000 systems, in particular, looked to be facing a critical shortage of spare parts in the year 2025, Patin explained. “So in 2011 we met with Honeywell regarding the future of TDC 3000,” Patin began, addressing a plenary session of the Honeywell Users Group Americas 2018 conference this week in San Antonio.

Challenge issued

Unwilling to settle for rip-and-replace, “We challenged Honeywell to develop and prove a method to migrate TDC forward,” Patin said. The two companies established a joint task team to investigate the problem.

ExxonMobil’s wish list of deliverables included avoiding wholesale system replacement (especially the I/O); preserving the company’s intellectual property investment; allowing for on-process migration of system components (meaning without shutting down the process); enabling new capabilities not currently possible with TDC; and unifying TDC with Honeywell’s current state-of-the-art Experion platform.

This last item encapsulated a desire for a solution that would “be usable by a younger workforce, yet stand the test of time,” Patin said. “I picture a third-grader who’s also a future TDC engineer,” he said. “They just don’t know it yet.”
Also implicit in ExxonMobil’s requirements were continued “rock solid” reliability and security, Patin added.

Solution identified

Since the technical obstacles to bringing TDC forward hinged on hardware obsolescence, notably controller microprocessors and communications chips that would no longer be available, the team settled on an emulation approach that would effectively abstract TDC system functionality from the specifics of the older hardware.

And in February 2018, seven years after that first meeting of the minds—and two years ahead of schedule—Honeywell answered ExxonMobil’s challenge with the release of Experion LCN R501.1. The Experion LCN, or ELCN, effectively emulates the TDC system as software. “It’s 100% binary compatible and interoperable with the old system,” Patin explained. “Current TDC code runs unmodified in this virtual environment, greatly reducing the technical risks. Intellectual property such as application code, databases and displays are preserved.”
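
For readers who want a picture of what "abstracting system functionality from the hardware" means, here is a purely conceptual sketch (my illustration, not Honeywell's design): the legacy control logic stays untouched, and only the layer beneath it changes from physical I/O to an emulated, software-defined node.

```python
class PhysicalControllerIO:
    """Stand-in for the obsolete controller hardware."""
    def read(self, channel: str) -> float:
        raise NotImplementedError("would talk to real I/O hardware here")

class EmulatedControllerIO:
    """Software emulation of the same interface; the logic above it cannot tell the difference."""
    def __init__(self, values):
        self._values = values
    def read(self, channel: str) -> float:
        return self._values[channel]

def legacy_control_logic(io) -> str:
    """Unmodified 'legacy' application code: it only knows the I/O interface."""
    level = io.read("TANK_LEVEL")
    return "CLOSE_VALVE" if level > 90.0 else "HOLD"

# The same legacy logic runs against the emulated node
virtual_node = EmulatedControllerIO({"TANK_LEVEL": 93.5})
print(legacy_control_logic(virtual_node))   # -> CLOSE_VALVE
```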

In the end, the Experion Station, Server, ACE and APP nodes can take the shape of Windows-based “physical” applications or virtual machines. Application Modules, Network Gateway and Network Interface Module functionality is redeployed on Universal Embedded Appliances or as virtual appliances. Only the Enhanced PLC Gateway cannot be readily virtualized because the emulation of serial network connectivity is not well behaved, Patin explained. “This means you can build an almost 100% virtualized or 100% physical system—or somewhere in between.”

With the new solution, LCN and UCN messages are now encapsulated in standard Internet Protocol. “All the old networks now exist as logical constructs on Fault Tolerant Ethernet,” Patin said. “We’re no longer locked into proprietary networks.”
And to address the challenge of on-process migration, Honeywell has also introduced several bridge devices that effectively facilitate the virtualization of TDC system node functionality—without the need to interrupt the process under control.
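
As a toy illustration of the encapsulation idea (the "legacy frame" bytes below are invented, not a real LCN or UCN message format), a legacy message simply becomes the payload of a standard UDP/IP datagram so it can ride ordinary Ethernet:

```python
import socket

# Invented stand-in for a legacy control-network frame
legacy_frame = bytes([0x10, 0x02, 0xA7, 0x01, 0x55, 0xFE])

# Encapsulate: the legacy frame is the payload of a standard UDP/IP datagram
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(legacy_frame, ("127.0.0.1", 50000))  # would be the peer node's address in practice
sock.close()
```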

Benefits achieved

Virtualization of the TDC environment has come with some added benefits, including the ability to use Honeywell's cloud-based Open Virtual Engineering Platform to engineer TDC solutions; lower cost, smaller footprint training simulators; peer-to-peer integration of virtualized HPM controller nodes with current-generation C300/ACE nodes; support for OneWireless (ISA 100 and WirelessHART) connectivity; and integration with ControlEdge and Unit Operations Controllers.

“It’ll be a game-changer,” said Patin. “We don’t know all that’s possible as yet.”
Other benefits include a drastic reduction—or elimination—of spare parts, as well as reductions in cabinet space requirements. “We’ve gone from two nodes to six in a single cabinet,” Patin said. “We’ve not fully realized unification with Experion, but that process has begun.”

Overall, Patin gave high marks to the Honeywell team for its response to ExxonMobil's needs. "The challenge was met, and expectations exceeded," he said. "The need to replace an entire system is eliminated, future component issues are virtually eliminated (pun intended), intellectual property is preserved and on-process migration is supported."

“ELCN technology essentially resets the odometer on your TDC 3000 investment,” Patin added. “It’s the best example of Honeywell’s commitment to continuous evolution that I’ve ever seen. And if it were a final exam, I’d give Honeywell an A on this one.”

<End of release>

Safety Manager

HPS also announced Safety Manager SC, the next generation of its flagship Safety Manager platform. Its modular, scalable design enables it to function as a single platform for all enterprise safety applications, allowing customers – who are often using four or five different safety systems – to consolidate and reduce their training and engineering costs, and spare parts inventories.

Safety Manager SC incorporates a new Series C-based controller and Honeywell technologies such as LEAP, Universal Safety IO, offline simulation and Experion integration, which collectively simplify safety system engineering, development and testing.

“Our customers increasingly want integrated safety and control solutions and the simplicity of partnering with one supplier for all their needs,” said Tim LeFevre, global customer marketing manager for safety systems, HPS. “We deliver exactly that by combining unrivaled expertise in distributed control systems (DCS) and safety systems with deep integration know-how. Honeywell is one of the few vendors that can support the full safety lifecycle.”

HUG

The ability of Honeywell Connected Plant's offerings to deliver higher levels of safety, reliability, efficiency and profitability will continue to be the primary discussion point at the 43rd Honeywell Users Group (HUG) Americas symposium. More than 1,300 delegates from across the oil and gas, chemical, pulp and paper, and metals and mining sectors are attending the event, which features numerous displays of the newest technologies along with dozens of Honeywell- and customer-led sessions and technical discussions.

Throughout the conference, Honeywell will showcase how turning data into actionable insight requires more than just upgrading technology; it requires a system for capturing, retaining and sharing knowledge that allows both the plant and its workers to perform at their best every day.

“Digital transformation has to be about more than just moving data into the cloud,” said John Rudolph, president of Honeywell Process Solutions (HPS). “It ultimately has to be about the outcomes, including driving increased productivity and savings for our customers while allowing them to increase knowledge capture, knowledge sharing and knowledge retention among their employees.”

Rudolph was named president of HPS on May 31, 2018, succeeding Vimal Kapur, who was named president and CEO of Honeywell Building Technologies. Rudolph led the Projects and Automation Solutions, and Lifecycle Solutions and Services businesses for HPS over the past six years, driving significant growth. Rudolph also has held leadership roles with TAS Energy, General Electric and Ingersoll Rand.

Here is a revealing comment from the press release about HPS’s strategy and direction—something we’ve all been wondering about. “HUG attendees will be able to see and experience the Company’s ongoing transformation into a software-industrial provider.”

Announcements in brief:

  • Thermal IQ – Enables maintenance engineers and plant managers to more effectively monitor and manage their thermal process equipment, minimizing unplanned downtime and maximizing uptime.
  • Uniformance Cloud Historian – This software-as-a-service cloud hosting solution for enterprise-wide data capture, visualization and analysis helps customers improve asset availability, optimize processes and increase plant uptime.
  • Asset Performance Management – Integrates asset and process data for actionable insights to improve asset performance and plant profitability.
  • Immersive Competency – This cloud-based simulation offering uses a combination of augmented reality (AR) and virtual reality (VR) to train plant personnel on critical industrial work activities, empowering them to directly improve plant performance, uptime, reliability and safety.
  • Personal Gas Safety – This solution integrates with Honeywell’s leading plant control system to protect workers and speed emergency response in case of hazardous leaks or worker injury.
  • Intelligent Wearables – This hands-free, wearable technology allows industrial workers to more safely, reliably and efficiently accomplish their tasks in the plant or the field. It uses a head-mounted visual display that responds to voice and brings live data, documents, work procedures, as well as health and safety information into view and can connect field workers with remote experts in real time.
  • Experion Batch – Combines Experion distributed control, batch automation, and new visualization technology for improved efficiency, quality and throughput.
  • Measurement IQ for Gas – Provides measurement under control by transforming metering operations with 24/7 real-time condition-based monitoring.

NI, Focusing on Test and Measurement, Updates LabVIEW

NI Week was last week, and for only the second time in 20 years, I didn't go. NI, formerly National Instruments, has been focusing more on test and measurement lately. Not so much automation. My interest is mostly in its IoT efforts, especially TSN (time-sensitive networking). I figure I can get an interview with Todd Walter or whomever without the expense of a conference.

NI's core competency is providing a software-defined platform that helps accelerate the development and performance of automated test and automated measurement systems. At NI Week it announced the release of LabVIEW 2018.

Applications that impact our daily lives are increasing in complexity due to the rapid innovation brought on by industry trends such as 5G, the Industrial Internet of Things, and autonomous vehicles. Consequently, the challenge of testing these devices to ensure reliability, quality and safety introduces new demands and test configurations, with less time and budget. Engineers need better tools to organize, develop and integrate systems so they can accomplish their goals within acceptable boundaries.

Engineers can use LabVIEW 2018 to address a multitude of these challenges. They can integrate more third-party IP from tools like Python to make the most of the strengths of each package or existing IP from their stakeholders. Test engineers can use new functionality in LabVIEW 2018 to strengthen code reliability by automating the building and execution of software through integration with open interface tools like Jenkins for continuous delivery. Capabilities like this empower test engineers to focus on system integration and development where they can offer unique differentiation, rather than get bogged down in the semantics of how to use software tools or move IP from one to another. For test engineers using FPGAs for high-performance processing, new deep learning functions and improved floating-point operations can reduce time to market.
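
LabVIEW 2018's Python integration means a test VI can call into ordinary Python modules; something like the following (the function, limits and file name are made up for illustration) is the sort of code a test engineer might keep on the Python side while LabVIEW handles instrument control.

```python
"""measurement_checks.py -- functions intended to be called from a LabVIEW test VI."""
from statistics import mean, pstdev

def summarize(readings, low_limit, high_limit):
    """Return (mean, standard deviation, pass/fail) for a list of measurements.

    Simple scalar and array parameters keep the boundary with LabVIEW straightforward.
    """
    avg = mean(readings)
    spread = pstdev(readings)
    passed = all(low_limit <= r <= high_limit for r in readings)
    return avg, spread, passed

if __name__ == "__main__":
    # Quick self-test with invented voltage readings and limits
    print(summarize([3.29, 3.31, 3.30, 3.33], 3.25, 3.35))
```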

“NI’s continued commitment to its software-centric platform accelerates my productivity so I can focus on the challenges that yield the highest ROIs,” says Chris Cilino, LabVIEW framework architect at Cirrus Logic. “LabVIEW continues to minimize the effort of adding tests and code modifications to our validation framework, delivering a consistent process to maintain our software and incorporate the reuse of valuable IP without rewrites.”

To meet demands like higher-complexity DUTs and shorter timeframes, engineers need tools tailored to their needs that they can use efficiently throughout their workflow to meet exact application requirements. LabVIEW 2018 is the latest addition to NI's software-centric platform, which features products tailored to distinct stages of that workflow – products that have been adopted in whole or in part by more than 300,000 active users.

With InstrumentStudio software providing an interactive multi-instrument experience, TestStand test management software handling overall execution and reporting, and SystemLink software managing assets and software deployments, this workflow improves the productivity of test and validation labs across many industries. Each piece of the workflow is also interoperable with third-party software to maximize code and IP reuse and draws on the LabVIEW Tools Network ecosystem of add-ons and tools for more application-specific requirements.

Engineers can access both LabVIEW 2018 and LabVIEW NXG with a single purchase of LabVIEW.

SAP Introduces Digital Twin and IIoT Solutions at Hannover

Hannover Messe was the place to learn the latest about all things digital—digital twin, Industry 4.0, Industrial Internet of Things (IIoT). SAP, one of the many stops on my itinerary, was among the companies advancing the trend.

My contact at the SAP booth at Hannover wasn’t around when I arrived for my appointment, so I left—only to get a text a half-hour later that he had arrived. But I was off to another appointment by then. However I did glean this information from the company at and following the show.

SAP enters the digital twin era

SAP SE has introduced SAP S/4HANA Cloud for intelligent product design, a new solution for collaborative research and development.

The solution, which is built on SAP Cloud Platform using SAP’s latest digital twin technology, is one of the building blocks for a network of digital twins to enable new business models.

Powered by SAP Leonardo and integrated with business processes in the digital core, SAP S/4HANA Cloud for intelligent product design enables customers to accelerate product design and development with requirement-driven systems engineering and instant collaboration across an extended network of suppliers and partners.

“The solution provides shared views of digital twin information for customers to gain live insights on new products and to store, share and review engineering documents with internal and external participants,” said Bernd Leukert, Member of the Executive Board of SAP SE, Products & Innovation.

SAP’s network of digital twins synchronizes the virtual, physical, conditional, and commercial definitions of assets and products in real time to accelerate innovation, optimize operating performance, predict service requirements, improve diagnostics and enhance decision-making. It enables new levels of collaboration among manufacturers of products, operators of assets, suppliers and service companies. The approach combines digital twins with manufacturing solutions from SAP, cloud networks and SAP Leonardo capabilities, including machine learning, blockchain and Internet of Things (IoT), to optimize the product lifecycle with:

• Digital representation: SAP synchronizes digital twin business data, product information, asset master data and IoT-connected data from both on-premise and cloud solutions enabling companies to represent the world digitally. Solutions including SAP Predictive Engineering Insights, SAP Predictive Maintenance and Service and the SAP 3D Visual Enterprise applications provide access to rich data processing capabilities and live configuration, state, condition and control information.

• Business process: Rich enterprise-grade data processing capabilities allow customers to create, access and update digital twins to support business processes. SAP solutions provide an integrated data model from design, production and maintenance to service, including packaged integration to existing systems for computer-aided design, ERP, and product lifecycle management. Offerings providing end-to-end process support for manufacturers and operators include SAP S/4HANA, the SAP Engineering Control Center integration tool, SAP Hybris Service Cloud solutions, and the SAP Manufacturing Integration and Intelligence and SAP Manufacturing Execution applications.

• Business networks: With leading network offerings such as SAP Ariba solutions, SAP Asset Intelligence Network, and the SAP Distributed Manufacturing application, SAP is uniquely positioned to provide a virtual platform for collaboration on products and assets. The network of digital twins enables secure data access, sharing and governance on a global scale.

• Networks of digital representation: SAP enables twin-to-twin connections in systems within a specific asset and on an asset-to-asset level. SAP solutions such as SAP Asset Intelligence Network provide semantic and industry-standards support in an asset core modeling environment to enable live enrichment during the product or asset lifecycle.
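
Stripped of the product names, the core idea of a digital twin is a virtual record kept synchronized with condition data from the physical asset. Here is a minimal sketch of that idea (field names invented; this is not SAP's data model):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DigitalTwin:
    """Minimal virtual record of a physical asset, synchronized from IoT readings."""
    asset_id: str
    design_max_temp_c: float                          # engineering (virtual) definition
    condition: dict = field(default_factory=dict)     # latest physical state
    history: list = field(default_factory=list)       # audit trail of updates

    def sync(self, reading: dict) -> None:
        """Fold a new sensor reading into the twin and flag service needs."""
        stamped = {**reading, "at": datetime.now(timezone.utc).isoformat()}
        self.condition = stamped
        self.history.append(stamped)
        if reading.get("temp_c", 0.0) > self.design_max_temp_c:
            print(f"{self.asset_id}: operating above design limit, schedule service")

pump = DigitalTwin(asset_id="pump-114", design_max_temp_c=80.0)
pump.sync({"temp_c": 84.2, "vibration": 2.1})
```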

Digital Manufacturing Cloud

SAP Digital Manufacturing Cloud helps companies optimize performance, elevate production quality and efficiency, and ensure worker safety.

Drawing on SAP’s expertise in the Industrial Internet of Things (IIoT), predictive analytics and supply networks, the solution enables manufacturers to deploy Industry 4.0 technologies in the cloud.

The new cloud solution extends and complements the digital manufacturing portfolio of on-premise solutions from SAP and is available in different bundles to serve manufacturers of varying sizes in both discrete and process industries and roles within their respective organizations.

SAP customers can choose from the SAP Digital Manufacturing Cloud solution for execution, which provides all solutions in the manufacturing cloud portfolio, or the SAP Digital Manufacturing Cloud solution for insights, which focuses on performance management and predictive quality.

“Manufacturers in the era of Industry 4.0 require solutions that are intelligent, networked and predictive,” said Leukert. “Our manufacturing cloud solutions help customers take advantage of the Industrial Internet of Things by connecting equipment, people and operations across the extended digital supply chain and tightly integrating manufacturing with business operations.”

SAP Digital Manufacturing Cloud includes the following:

• SAP Digital Manufacturing Cloud for execution: Industry 4.0-enabled shop floor solution features “lot size one” and paperless production capabilities. It integrates business systems with the shop floor, allowing for complete component and material-level visibility for single and global installations.

• SAP Digital Manufacturing Cloud for insights: Centralized, data-driven performance management enables key stakeholders to achieve best-in-class manufacturing performance and operations.

• Predictive quality: This helps manufacturers gain valuable insights to conform to specifications across processes and streamline quality management. It also allows manufacturers to apply predictive algorithms that can reduce losses from defects, deficiencies or variations, and recommend corrective actions.

• Manufacturing network: The network provides a cloud-based collaborative platform integrated with SAP Ariba solutions connecting customers with manufacturing service providers, such as suppliers of 3D and computer numerical control (CNC) printing services, material providers, original equipment manufacturers (OEM) and technical certification companies.
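
Taking the predictive quality item above as an example, here is a minimal sketch of the kind of prediction involved (the spec limit, units and recommendation are invented; this is not SAP's algorithm): fit a simple trend to recent measurements and recommend a corrective action before the process drifts out of spec.

```python
from statistics import mean

SPEC_UPPER = 10.05   # hypothetical upper spec limit (mm)

def predict_out_of_spec(measurements, horizon=5):
    """Fit a simple linear trend to recent measurements and project it forward."""
    n = len(measurements)
    xs = range(n)
    x_bar, y_bar = mean(xs), mean(measurements)
    slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, measurements)) / \
            sum((x - x_bar) ** 2 for x in xs)
    projected = y_bar + slope * ((n - 1 + horizon) - x_bar)
    if projected > SPEC_UPPER:
        return f"projected {projected:.3f} mm in {horizon} cycles: adjust tooling now"
    return "within spec, no action"

print(predict_out_of_spec([10.00, 10.01, 10.01, 10.02, 10.03]))
```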

Also at Hannover Messe 2018, SAP announced SAP Connected Worker Safety, a solution designed to reduce risks and costs and protect employees. Information from wearables and other sensor-enabled equipment can help companies react immediately to a hazardous situation or incident while proactively managing worker fatigue and other hazard inducers. Real-time information allows monitoring of compliance at all times against regulatory and other parameters.

Company Emerges from Stealth to Power Real-Time Apps at the Edge

The Internet of Things ecosystem is changing computing in almost a seismic shift. But like geology, it builds up over time and then the event happens before you know it.

We had centralized, on-site computing revolutionized by PCs. We networked PCs and wound up with centralized computing in the cloud. Demands from building the Internet of Things (or Industrial Internet of Things for us manufacturing and production geeks) expose the flaws of cloud computing. The next hot thing—edge.

Yesterday the CEO/co-founder of Zededa talked with me about the computing platform his company is building with no less a mission than to build the largest computing company on Earth without owning infrastructure. Its vision—create a new edge economy that allows applications to run anywhere.

Some of what follows may sound familiar. I’ve talked with many companies doing a piece of what Zededa has laid out, but none are as audacious as this.

In brief, ZEDEDA…

  • Closes $3.06M in Seed Funding
  • Pioneering a secure, cloud-native approach to real-time edge applications at hyperscale for solutions ranging from self-driving cars to industrial robots
  • Built a team comprised of distinguished engineers from top tech companies in cloud, networking and open source to solve the edge computing puzzle and disrupt the status quo
  • Seed round was led by Wild West Capital; other investors include Almaz Capital, Barton Capital and industry veteran Ed Zander, former CEO of Motorola and former COO of Sun Microsystems

“Tomorrow’s edge computing environment that enables digital transformation will be distributed, autonomous and cooperative. The edge is complex and not only has to scale out securely, but simultaneously must become friendlier for app developers. That’s the problem we are solving at ZEDEDA,” stated ZEDEDA CEO and Co-Founder Said Ouissal. “It will require a drastic shift from today’s embedded computing mindset to a more secure-by-design, cloud-native approach that unlocks the power of millions of cloud app developers and allows them to digitize the physical world as billions of ‘things’ become smart and connected.”

ZEDEDA will use the funding for continued research and product development, investment in community open-source projects for edge computing, as well as further investment in sales and marketing initiatives. ZEDEDA investors include Wild West Capital and Almaz Capital, whose funding was part of a broader group of investors, some of whom also invested in IoT/edge companies Theatro and Sensity Systems (now Verizon).

In the coming wave of pervasive computing, real-time apps, cyber-physical systems and data services such as machine learning and analytics will become commonplace. ZEDEDA envisions an open ecosystem and a completely new technology stack that creates a service fabric essential to achieving the hyperscale that will be required in edge computing.

To realize that goal, ZEDEDA has pulled together a distinguished roster of industry veterans from legendary technology companies with expertise in areas of operating systems, virtualization, networking, security, blockchain, cloud and application platforms. This unique blend of skills combines with the team’s deep connections to core open-source projects and standardization bodies. The team’s work has directly contributed to software and system patents as well as industry standards used by billions of people around the world today.

“A new paradigm and massive innovation is needed to meet demand for IoT and edge computing,” said Kevin DeNuccio, Founder of Wild West Capital and ZEDEDA’s lead investor. “Massive shifts in technology, including the proliferation of IoT, paves the way for industry disruption, which large incumbents tend to inhibit. Disruption takes a combination of an entrepreneurial team with a very unique set of collective experience, groundbreaking ideas, and the ability to garner immediate traction with global industrial leaders, who can transform their business with machine learning and artificial intelligence delivered by the Edge connected IoT world. ZEDEDA is simply one of the most promising edge computing startups out there.”

“Operations Technology teams face major challenges when it comes to fully realizing the advantages of an IoT world. Their worlds are becoming massively connected systems dealing with virtualization, networking and security,” stated Christian Renaud, Research Director, IoT at 451 Research. “Our recent research shows that while OT teams have the application plans for leveraging IoT, the vast majority of organizations’ IT resources and capabilities are maxed out. This leaves open the question of how these edge applications and IoT will scale out without compromising security or taxing resources even further in the future.”

Ouissal told me, “Edge is the next big wave, bigger than cloud, simply because of the sheer size of the number of devices. The goal is ubiquitous compute where applications want to interact real-time. The problem with the cloud is that it’s centralized. This ecosystem is truly Cyberphysical—just like your Industry 4.0.”

The current IoT model of sending all data to the cloud for processing won't scale due to:

  • Bandwidth
  • Latency
  • Privacy issues

Three problems that the company is attacking:

1. Moving apps now running in the cloud to the edge

2. Edge-to-edge communication, key for autonomous systems, peer-to-peer

3. Security: the cloud requires cyber security, but at the edge we must add physical security; someone could walk in and carry off an intelligent device
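
To see why pre-processing at the edge answers the bandwidth and latency points, here is a minimal sketch (thresholds and payloads invented): the device reduces a burst of raw samples to a compact summary, or an immediate local alert, instead of shipping every reading to the cloud.

```python
from statistics import mean

ALERT_LIMIT = 95.0   # hypothetical temperature limit

def process_locally(samples):
    """Reduce raw sensor samples to what is worth sending upstream."""
    summary = {"count": len(samples), "avg": mean(samples), "max": max(samples)}
    if summary["max"] > ALERT_LIMIT:
        return {"type": "alert", **summary}    # act on it now, locally and loudly
    return {"type": "summary", **summary}      # tiny payload instead of the raw stream

# A burst of raw readings stays on the device; only this dict would leave it
print(process_locally([88.1, 90.4, 89.7, 96.3, 91.0]))
```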

Ouissal often mentioned the need to rethink management of the edge. There is a big difference between managing the cloud and managing the edge. ZEDEDA is tackling the variety of management challenges involved in updating and managing thousands to millions of embedded devices.

Solutions the team is developing include:

1. Security: built into the platform, with keys, trust established and a health check with every plug-in, and embedded virtualization

2. Management: virtualization means a device can run multiple sessions, e.g., robot motion in one session and analytics in another, all on the same embedded system, and this can scale to millions of devices

3. Networking: monitoring, watch lists, anomaly detection with analysis of why, and a VPN architecture
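
To illustrate the second item, independent workloads running side by side on one embedded box, here is a toy sketch using separate processes (real deployments would rely on virtualization or containers for isolation; the workloads are placeholders):

```python
import time
from multiprocessing import Process

def motion_loop():
    """Stand-in for a deterministic control task (e.g., robot motion)."""
    for step in range(3):
        print(f"[motion] executing move {step}")
        time.sleep(0.1)

def analytics_loop():
    """Stand-in for a best-effort analytics task on the same device."""
    for batch in range(3):
        print(f"[analytics] scoring batch {batch}")
        time.sleep(0.15)

if __name__ == "__main__":
    jobs = [Process(target=motion_loop), Process(target=analytics_loop)]
    for j in jobs:
        j.start()
    for j in jobs:
        j.join()
```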

This is all fascinating. I can’t wait to talk with competitors and potential competitors in a couple of weeks in Hannover and during some upcoming trips to get responses.
