A press release proclaiming an update to a “modern edge” product from Litmus found its way into my email client. Since I’m a fanatic about defining things, I wound up talking with co-founders Vatsal Shah and John Younes. I had written about the company in September, but it has been around since 2014.
The IT company conferences I’ve been attending have been all over the concept of “edge,” and OT companies have recently picked up the phrase. For an IT company, the edge sits at the end of the network, with a device located close to where the work is done. For OT companies, I think the view is similar but runs in the opposite direction. The Litmus product is software that can reside on an amazing variety of compute devices.
Litmus contains a number of interesting and useful features. I was most captivated by the implementation of an app marketplace. There is a “public” one for apps from Litmus or third parties that a customer can install. There is also a “private” marketplace. For example, say an engineer in one plant solves a problem with an app added to their Litmus installation. That engineer can add it to the private app store for use by engineers in other plants.
At the end of this post, I’ll include some edge definitions from Litmus that I found helpful. First, here is the latest news.
Litmus announced the release of Litmus Edge 3.0, a modern edge platform to collect and analyze data, build and run applications, and integrate edge data with any cloud or enterprise system. Litmus Edge 3.0 adds more device drivers to bring the industry-leading total to more than 250, with enhanced analytics, improved integration connectors, digital twin support, and expanded device management features.
“Litmus Edge is the only modern edge platform on the market that connects to all industrial assets and provides a complete data picture to improve industrial operations,” said Vatsal Shah, co-founder and CEO of Litmus. “Version 3.0 expands upon the product that already leads the industry with more device drivers, pre-built analytics and OT/IT integration capabilities, so customers can capture edge data and use it to perform local analytics or advanced use cases like machine learning and AI in the cloud.”
New features of Litmus Edge 3.0 include:
- Launched second generation industrial communication drivers focusing on security and scalability for southbound communications
- Enhanced Ready Analytics which now includes the ability to run Tensorflow and other machine learning algorithms natively on real-time ingested data
- Flows Manager updated to allow multiple instances of Flows – which can be tightly integrated, scaled, or isolated with sandbox and production logic
- Enhanced cloud and enterprise Integration connectors including support for Splunk, Oracle DB, and other databases
- Improved user interface for application marketplace for one-click application orchestration
- Device management improvements including security, backup/restore and digital twin templates
Litmus Edge is a modern edge platform that collects data from any industrial asset, offers pre-built applications, KPIs and analytics, provides the ability to build and run custom applications, and integrates data with any cloud or enterprise system. Litmus Edge is easy to use and easy to deploy, offering the edge connectivity and data intelligence needed to power industrial use cases ranging from predictive maintenance to machine learning.
Litmus transforms the way companies enable Industrial IoT, Industry 4.0 and Digital Transformation with one goal in mind – unmatched time-to-value. Our modern edge platform for industry provides instant data connectivity, ready-to-use analytics, and the ability to orchestrate applications at scale. Litmus liberates the data locked in any industrial system to transform critical edge data into actionable intelligence that can power predictive maintenance, condition-based monitoring, and machine learning. Customers include 10+ Fortune 500 manufacturing companies, while partners like Siemens, HPE, Intel and SNC Lavalin expand the company’s path to market.
The edge is focused on bringing computing as close to the data source as possible. The edge means running fewer processes in cloud and enterprise systems and moving them onto hardware closer to the devices generating data, such as a standalone computer, an IoT device, or an edge server. Localizing computing minimizes the amount of long-distance communication between a client and server, thus transforming the way data is handled, processed, and delivered.
Industrial edge computing refers to the process of connecting all assets used in manufacturing, oil and gas, energy, transportation and more. Industrial edge computing analyzes all of the data at the asset and processes it instantly for real-time analytics or to integrate optimized data into cloud systems for further processing.
Edge and cloud technologies need to work together. To suggest one offers greater value over the other is simply not true. The edge is valuable for its ability to process high-volume data in real-time and handle complex analytics at the data source. The cloud is valuable for its ability to aggregate and analyze volumes of data from all data sources, including the edge.
The edge has three main components. Edge connectivity is the ability to connect to any industrial system and collect and normalize data for immediate use. Edge intelligence is concentrating data processing and analytics functions at the edge to take action and derive value at the data source. Edge orchestration is the ability to create, deploy, manage and update edge applications.
The vast importance of the edge is beginning to come to light as more industrial use cases are enabled. The edge powers preventative maintenance, condition-based monitoring, OEE, vision systems, quality improvements and more. Edge data can also power more advanced use cases like artificial intelligence and machine learning in the cloud. The intelligent edge is powering significant operations and process improvements.
About six months ago, ABB completed the divestiture of about 80% of its holding in its power grids business, which Hitachi acquired. The new business, a joint venture, is called Hitachi ABB Power Grids. Today, it announced the integration of its Digital Enterprise solution with Hitachi Vantara’s Lumada portfolio of digital solutions and services for turning data into insights.
The two Hitachi business entities have agreed to rebrand the DE components as Lumada Asset Performance Management (APM), Lumada Enterprise Asset Management (EAM), and Lumada Field Service Management (FSM), adding to the growing portfolio of DataOps and Industrial IoT solutions.
The DE portfolio of solutions and its predecessors enable customers spanning multiple global industries to operate, analyze and optimize over $4 trillion of assets every day. With the incorporation of the DE portfolio into Lumada, this experience is further complemented by a leading technology engine to deliver access to information, systems, people and analytics across asset-intensive organizations.
With Digital Enterprise’s incorporation into Lumada, Hitachi ABB Power Grids’ energy domain experience will be augmented by Hitachi’s Lumada Industrial IoT platform. Hitachi was recently named a Leader in the 2020 Gartner Magic Quadrant for Industrial IoT Platforms based on Gartner Inc.’s evaluation of the company and its Lumada IoT software.
“Our software solutions and Lumada are highly complementary,” said Massimo Danieli, managing director, grid automation business unit, Hitachi ABB Power Grids. “Combining best-in-class Lumada IoT capabilities and the domain expertise built into Digital Enterprise applications provides both new and existing customers unparalleled flexibility and faster time to value, while preserving the value of their past software investments. The journey we began with our customers as part of the Digital Enterprise evolution story has become broader and more compelling, as we join the Lumada ecosystem.”
“Lumada Enterprise Asset Management and Field Service Management allow us to seamlessly expand our Ellipse EAM, enabling us to share information across all parts of our organization, tearing down silos and giving us the opportunity to formulate a longer-term, holistic strategy that reflects our specific business outcomes,” said Brian Green, general manager, asset management, from the Australian Rail Track Corporation (ARTC). “In addition, implementing these solutions allows us to optimize the quality of the data we collect and ensure safe, compliant and efficient business operations.”
“Bringing these solutions that each encapsulate deep domain expertise into the greater Lumada ecosystem gives customers an extremely powerful combination of tools to modernize their business,” said Chris Scheefer, senior vice president, Industry Practice, Hitachi Vantara. “The holistic view of assets and information provided by Lumada allows leadership to analyze and react in real-time, enabling efficient, effective operations and a foundation to create a more sustainable future.”
DE and Lumada also share core foundational features: a modern microservices design, vendor-agnostic interoperability, and a flexible deployment model, including cloud, on-premises and hybrid.
With the combination of Hitachi ABB Power Grids’ Digital Enterprise application portfolio and Hitachi’s Lumada solutions offered by Hitachi Vantara, customers will be able to benefit from additional data services including data integration, data cataloging, edge intelligence, data management, analytics and more.
The new integrated Lumada portfolio will offer advantages to customers in the following key areas:
1. Digital Transformation & Data Modernization – improving access to and insights from data
2. Connected Asset Performance – helping to predict and prevent asset failures
3. Intelligent Operations Management – improving oversight and maintenance of assets
4. Health, Safety & Environment – enabling safer environments for workers and the public
Amazon popped up in a recent post regarding Amazon Web Services. This news came to me from the analyst firm Interact Analysis. I’ve talked with executives there a few times, and I generally like their approach. I’m amused that when IT companies think manufacturing, they think maintenance; they then add the predictive analytics they all have, combine the two into predictive maintenance, and go looking for a killer app.
Anyway, this analysis by Blake Griffin, senior analyst at Interact Analysis, offers food for thought. Just what is Amazon up to in the manufacturing space?
- “The full development of Amazon’s industrial digitalization offering represents the first time a supplier has the ability to provide both the cloud storage and analytic capabilities under one entity”
- “If customers are looking to utilize the cloud for their industrial digitalization initiatives, Amazon would represent the fewest number of touchpoints between customer and supplier during the sales process”
- “Additionally, many manufacturers may already be using AWS for cloud storage but have yet to invest into further industrial digitalization technology. In these scenarios, Amazon would already have a ‘foot in the door’”
- Amazon offers an on-premise version of AWS for low latency applications – AWS Outpost: “In our opinion however, outpost will also serve as an option for customers looking to implement predictive maintenance who may be shy of hosting their operational data on the cloud… AWS Outpost will be regularly updated and patched… which ensures that users are still able to take advantage of the scale at which AWS operates”
On December 1st, 2020, Amazon announced a suite of new AWS machine learning services. To many, this announcement appeared to be Amazon’s jumping-off point toward becoming a major supplier of predictive maintenance solutions. However, it follows a long history of Amazon carving out its capabilities in industrial digitalization. Since the ecommerce behemoth’s 2018 release of AWS IoT SiteWise, a service that enables users to gather and organize asset-health data housed in repositories such as a historian, Amazon has consistently added to its industrial digitalization offering. Now, the company has a highly competitive solution with one capability completely unique to Amazon.
Amazon’s Industrial Digitalization Offering Has Been Developing for Years
In some ways, Amazon’s announcement of its new suite of machine learning services represents a rounding out of a predictive maintenance offering rather than a jumping-off point. When manufacturers look at implementing predictive maintenance in their facilities, they ask two fundamental questions:
- Which assets do I have visibility into already? How can I leverage this data?
- Which assets do I not have visibility into? What can I do to change that?
The announcement of AWS IoT SiteWise was Amazon’s solution to the first question. Many manufacturers in process industries generate large amounts of data from the devices controlling their machines. This data is often stored in a historian, and without the tooling necessary to manage and analyze it effectively, much of its value can be lost. AWS IoT SiteWise was developed so manufacturers could use this data more effectively for condition monitoring and predictive maintenance. The solution is deployed through software housed in a gateway, which then communicates the collected data to the AWS cloud. In our opinion, this marked Amazon’s true entry into the predictive maintenance market. Strategically, it was a logical first move: Amazon already had a wealth of analytical tools it could deploy against data housed in a historian; the only thing missing was a mechanism for gathering and organizing that data for analysis.
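To make the gateway-to-cloud flow concrete, here is a minimal sketch of shaping historian samples into the entry format that the AWS IoT SiteWise `BatchPutAssetPropertyValue` API (via boto3) accepts. The property alias, sample values, and plant path are invented for illustration; a real deployment would use the aliases modeled in SiteWise.

```python
# Illustrative sketch: shaping historian samples into SiteWise batch entries.
# The property alias and readings below are hypothetical.
import time

def to_sitewise_entries(property_alias, samples):
    """Convert (unix_timestamp, float_value) pairs into SiteWise batch entries."""
    return [
        {
            "entryId": f"{property_alias}-{i}",
            "propertyAlias": property_alias,
            "propertyValues": [
                {
                    "value": {"doubleValue": value},
                    "timestamp": {"timeInSeconds": int(ts)},
                    "quality": "GOOD",
                }
            ],
        }
        for i, (ts, value) in enumerate(samples)
    ]

samples = [(time.time() - 60, 71.2), (time.time(), 72.8)]  # e.g. bearing temperature, °C
entries = to_sitewise_entries("/plant1/line3/pump7/temperature", samples)
# A gateway-side uploader would then call:
#   boto3.client("iotsitewise").batch_put_asset_property_value(entries=entries)
```

The point is the division of labor the analysis describes: the gateway only gathers and shapes data; all heavy analysis happens once the entries land in AWS.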
Fast forward to Amazon’s recent announcement and we see the company moving to provide a solution to the second question. One asset class often cited as being “offline” from a condition-data perspective is the mechanical portion of a motor-driven system: induction motors, gearboxes, bearing blocks, etc. These components are numerous throughout factory floors, and their failure can mean significant lost production if they are part of an application-critical process. The industry has responded by offering smart sensors: wireless-enabled sensors that can be attached to the side of a motor to gather data on vibration and temperature behavior. These two data points, when combined with machine learning algorithms, can quickly illuminate what kind of stress motor components are under and alert users to problems ahead of failure.
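The core idea behind that alerting can be sketched very simply: establish a baseline on known-healthy readings, then flag samples that drift far from it. This is a plain z-score check with made-up vibration numbers, not the proprietary ML that products like Monitron actually apply.

```python
# Minimal sketch of condition monitoring on vibration readings: flag samples
# far from a baseline established on known-healthy data. Data is invented.
from statistics import mean, stdev

def find_anomalies(baseline, readings, threshold=3.0):
    """Return indices of readings more than `threshold` std devs from baseline."""
    mu, sigma = mean(baseline), stdev(baseline)
    return [i for i, r in enumerate(readings) if abs(r - mu) / sigma > threshold]

healthy = [2.0, 2.1, 1.9, 2.0, 2.2, 1.8, 2.1, 2.0]   # vibration RMS, mm/s (made up)
new = [2.0, 2.1, 6.5, 2.0]                            # one bearing-fault-like spike
print(find_anomalies(healthy, new))                   # → [2]
```

Real products refine this with multivariate models over vibration and temperature together, but the principle is the same: deviation from a learned healthy baseline triggers an alert before failure.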
One of the services announced in late 2020 is Amazon Monitron. The solution uses smart sensors and gateways produced by Amazon to offer up data on the health of motor-system equipment, effectively solving the problem of gathering data on assets not monitored via historian data. This solution is in direct competition with predictive maintenance providers like ABB, Siemens, SKF, etc. In our view, the announcement of Monitron means Amazon now has a solution that fully addresses the needs of manufacturers looking to invest in predictive maintenance as part of a broader industrial digitalization initiative. Amazon’s utilization of data housed in a historian, combined with its smart sensor offering and the vast analytics capability offered through AWS, makes this solution as competitive as any on the market. Amazon does, however, have one distinct advantage over the competition: being a provider of cloud storage.
Amazon’s Unique Capability: Cloud Storage Ownership
Every platform offered by the major providers of predictive maintenance is built on cloud storage technology offered largely by either AWS or Microsoft Azure. ABB cites Microsoft Azure as the landscape in which Ability operates. Similarly, Schneider Electric’s EcoStruxure platform utilizes Microsoft Azure. Siemens MindSphere has developed the capability to be used with either AWS or Azure, announcing its compatibility with the latter in 2018. The full development of Amazon’s industrial digitalization offering represents the first time a supplier has the ability to provide both the cloud storage and the analytic capabilities under one entity.
It is difficult to foresee what impact this will have on the partnerships AWS has in place with current industrial digitalization providers. What is easy to see, however, are the numerous advantages Amazon will have in potentially winning the business of those investing in industrial digitalization for the first time. If customers are looking to utilize the cloud for their industrial digitalization initiatives, Amazon would represent the fewest number of touchpoints between customer and supplier during the sales process. Additionally, many manufacturers may already be using AWS for cloud storage but have yet to invest in further industrial digitalization technology. In these scenarios, Amazon would already have a ‘foot in the door’, yielding an advantage when the time comes for users to begin evaluating providers of digitalization.
Amazon’s AWS Outposts Helps Overcome a Major Barrier to Predictive Maintenance Adoption
One of the largest barriers facing suppliers of predictive maintenance solutions is manufacturers’ reluctance to host their operational data in the cloud. Recently, Interact Analysis partnered with the Association for Packaging and Processing Technologies (PMMI) to produce a white paper and accompanying survey on the adoption of predictive maintenance technology within the packaging industry; both are available for free download. One of the survey questions looked at the adoption of predictive maintenance within OEM and system integrator offerings. “Our customers will not allow remote access to their machinery” received the second-highest weighted score in the survey.
Question: To what extent are the following statements describing the adoption of predictive maintenance (PdM) technologies at your company, true or false?
- We are not familiar with PdM technology.
- The added cost of PdM technology is too high to justify.
- We do not want to have to pay for an ongoing subscription to access sensor data from an automation vendor.
- The technology is too new.
- We currently offer machines with PdM technology.
- None of our customers have expressed interest in PdM technology.
- Our customers will not allow remote access to their machinery (remote monitoring).
This hesitancy by users to allow access to operational data has led suppliers to develop solutions which, instead of aggregating and analyzing data in the cloud, host their data for analysis onsite.
Amazon has addressed this concern by offering an on-premise version of AWS. This version, termed AWS Outposts, was released in 2019 and is designed to serve applications requiring low latency. In our opinion, however, Outposts will also serve as an option for customers looking to implement predictive maintenance who may be shy of hosting their operational data in the cloud. Keeping data onsite rather than in the cloud ensures the door to the OT network remains closed, something many manufacturers are keen to maintain.
Having the power of a modular cloud system like AWS on-premise is an incredibly powerful development in the predictive maintenance market. AWS Outposts will be regularly updated and patched by a regional AWS team, which ensures that users can still take advantage of the scale at which AWS operates. This is an important consideration when working with machine learning algorithms, which become more accurate when deployed at scale. Current on-premise predictive maintenance solutions sacrifice this accuracy in favor of the increased security that on-premise brings. With AWS Outposts, users will no longer have to make that sacrifice.
Additionally, if you define an edge device as the point at which data is pushed to the cloud, this solution effectively eliminates the need for such devices, thus simplifying the overall architecture.
At the very least, this announcement should be taken as a signpost of future growth within an already fast-growing predictive maintenance market. Amazon does not enter markets which are expected to appreciate modestly; it enters markets whose opportunity could one day be worth billions of dollars. The amount of time spent developing, releasing, and improving upon Amazon’s industrial digitalization offering should be indicative of the faith the company has in the future of this market.
Inductive Automation’s Ignition Users Can Run AWS Infrastructure and Services On-Premises
Amazon has made some interesting moves in the manufacturing space lately. Many companies are porting their software to run on Amazon Web Services (AWS). Then in December Amazon touted its Monitron, Panorama, and Lookout at its developer conference. We can only imagine what sort of impact this company will have in the Industrial Internet of Things and Digital Transformation markets.
Especially when it entices established partners such as Inductive Automation. This week’s announcement, I believe, is just a first step. I perceive much more potential for the two companies to work together. But then, I always see potential outcomes. This one should pay off in the future. Keep an eye out.
Inductive Automation announced January 20 that it has achieved the AWS Outposts Ready designation, part of the Amazon Web Services (AWS) Service Ready Program. This designation recognizes that Inductive Automation has demonstrated successful integration with AWS Outposts deployments. AWS Outposts is a fully managed service that extends AWS infrastructure, AWS services, APIs, and tools to virtually any datacenter, co-location space, or on-premises facility for a truly consistent hybrid experience.
Inductive Automation is a fast-growing industrial automation software company. Its key product, Ignition by Inductive Automation®, is used in a wide variety of industries, in more than 100 countries. Ignition is an industrial application platform with fully integrated tools for building solutions in human-machine interface (HMI), supervisory control and data acquisition (SCADA), manufacturing execution systems (MES), and the Industrial Internet of Things (IIoT).
Achieving the AWS Outposts Ready designation differentiates Inductive Automation as an AWS Partner Network (APN) member with a product fully tested on AWS Outposts. AWS Outposts Ready products are generally available and supported for AWS customers, with clear deployment documentation for AWS Outposts. AWS Service Ready Partners have demonstrated success building products integrated with AWS services, helping AWS customers evaluate and use their technology productively, at scale and varying levels of complexity.
“Customers are looking for ways to make their production plants run more efficiently using sensors and machine learning,” said Joshua Burgin, general manager of AWS Outposts, Amazon Web Services, Inc. “We are delighted to welcome Inductive Automation to the AWS Outposts Ready Program. Inductive Automation can help these customers modernize their plant operations using cloud services and do it with the low latency these customers need, by running Ignition on AWS Outposts at the plant location.”
“We’re very proud to achieve AWS Service Ready status,” said Travis Cox, co-director of sales engineering for Inductive Automation. “We’re ready to help organizations achieve their technology goals by leveraging the agility, breadth of services, and constant innovation from AWS.”
To support the seamless integration and deployment of AWS Outposts Ready solutions, AWS established the AWS Outposts Ready Program to help customers identify products integrated with AWS Outposts, so they spend less time evaluating new tools and more time scaling their use of products that are integrated with AWS Outposts deployments.
New standard in data virtualization enables organizations to support interactive analytics on the data lake by leveraging Varada ‘dynamic indexing’ technology that automatically accelerates and optimizes analytics workloads with ‘zero data ops’
Data Ops is hot right now. We have our data lakes and ponds and clouds and probably rain, but finding, de-siloing, and manipulating all that stuff requires work. This company just crossed my horizon. Varada has built and released a data platform to help you out. Check out its press release.
Varada unveiled its data virtualization platform which helps organizations instantly monetize all of their available data with a predictable and controlled budget. Using a dynamic indexing technology, the Varada Data Platform enables data teams to balance performance and cost of queries at massive scale, without ceding control of their data to third-party vendors.
The Varada Data Platform, available today, offers advantages compared with other data virtualization tools:
- Embraces the data lake architecture, allowing organizations to retain full control of their data and avoid vendor lock-in. Because the Varada Data Platform sits atop a customer’s existing data lake, there is no need to move data or budget for additional ETLs and storage, which reduces both cost and complexity while enabling data teams to keep data secure under consistent policies.
- Offers “glass box” visibility into how workloads perform. Data teams get deep visibility into workload performance and cluster utilization. They can easily define workload priorities, business requirements and budget. Varada automatically optimizes workloads to meet those performance and budget requirements. Even without the input of data architects, Varada continuously monitors workloads to identify heavy users, hotspots, bottlenecks and other issues and, using machine learning, elastically adjusts the compute and storage cluster. Alternatively, data teams have the option to exercise fine-grained control of budgets and business requirements, so they can gain full control and flexibility.
- Applies unique “adaptive indexing” technology to effectively accelerate queries. The Varada Data Platform drastically reduces query execution time and the required compute resources. The key is Varada’s proprietary indexing technology, which breaks data across any column into nano blocks and automatically chooses the most effective index for each nano block based on the data content and structure. This unique indexing technology is what makes queries extremely fast without the need to model data or move it to optimized data platforms.
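To illustrate what per-block index selection might look like, here is a hypothetical sketch: a column is split into small “nano blocks,” and each block gets an index type chosen from the shape of its data. The heuristics (bitmap for low cardinality, range for sorted data, tree otherwise) and block size are invented for illustration; Varada’s actual selection logic is proprietary.

```python
# Hypothetical sketch of adaptive per-block indexing: split a column into
# "nano blocks" and pick an index type per block from the data's shape.
def choose_index(block):
    distinct = len(set(block))
    if distinct <= max(1, len(block) // 10):
        return "bitmap"        # few distinct values: bitmap index is compact
    if block == sorted(block):
        return "range"         # already sorted: min/max range index suffices
    return "tree"              # high cardinality, unsorted: tree index

def index_column(values, block_size=4):
    """Pair each block of the column with its chosen index type."""
    blocks = [values[i:i + block_size] for i in range(0, len(values), block_size)]
    return [(blk, choose_index(blk)) for blk in blocks]

col = [1, 1, 1, 1,            # low cardinality
       2, 5, 7, 9,            # sorted run
       42, 3, 17, 8]          # unsorted, high cardinality
print([idx for _, idx in index_column(col)])  # → ['bitmap', 'range', 'tree']
```

The design point this illustrates is why such indexing can avoid up-front data modeling: the index adapts to whatever content and structure each small block happens to have.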
“The beta period for this product has proven two things,” said Eran Vanounou, CEO of Varada. “First, that organizations are desperate for a way to simplify data ops management while getting the cost of query acceleration under control. Second, the path we’ve chosen is striking a chord: Varada is a ‘zero data ops’ approach that eliminates data silos by serving many workloads from one platform. And because all queries will run atop the data lake, there is a single source of truth that eliminates the need to move or model data. With several dozen early users on the platform, it’s time to bring this innovative approach to a market that’s ready for it.”
Pricing and Supported Data Sources
The Varada Data Platform currently runs on AWS and supports reserved, on-demand and spot instances. Pricing is per-node, based on a predefined scaling group. The Varada Data Platform is available on AWS Marketplace with integrated billing through AWS, or via AMI (Amazon Machine Image). Enterprise support is also available from Varada.
The platform supports a wide range of data sources and formats, including:
- Data Formats: ORC, Parquet, JSON, CSV and more
- Data Catalogs: Hive Metastore, AWS Glue
- Additional Data Sources: PostgreSQL, MySQL and more
Support for GCP and Azure is coming soon.
The Varada mission is to enable data practitioners to go beyond the traditional limitations imposed by data infrastructure and instead zero in on the data and answers they need—with complete control over performance, cost and flexibility. In Varada’s world of big data, every query can find its optimal plan, with no prior preparation and no bottlenecks, providing consistent performance at a petabyte scale. Varada was founded by veterans of the Dell EMC XtremIO core team and is dedicated to leveraging the data lake architecture to take on the challenge of data and business agility. Varada has been recognized in the Cool Vendors in Data Management report by Gartner Inc.
Free Step-by-step Wizard Creates RFP Content to Jumpstart IIoT Projects
Two problems consistently stand in the way of wide adoption for open, collaborative, and even open-source projects. The specifications must be adopted as a corporate standard. And the buying authorities, whether corporate or plant, must define the specification as part of the bid package.
Each of these is a hurdle. The first can be overcome by including as many end-user corporations as feasible in the standards development process. The second can be a major roadblock, especially if the purchasing authority is decentralized, perhaps not technically aware, and more apt to be influenced by the local incumbent supplier’s sales team.
Those influences make it imperative that the standards bodies make it as easy as possible to specify the standard as part of the bid package. I’ve seen failure upon failure because of this one roadblock. That makes this new toolkit from IIC that much more valuable.
The Industrial Internet Consortium (IIC) has released its IIC RFP Toolkit, a collection of best practices and online tools to help guide IIoT project managers and procurement managers and buyers through the process of procuring all the different components and resources required for a complete end-to-end IIoT solution.
“Digital Transformation (DX) projects require unique procurement skills to navigate the considerations needed when building an RFP. The procurement process for a typical IoT project is quite different from that of an enterprise software project,” said Dirk Slama, Director of the Co-innovation Hub at Ferdinand-Steinbeis-Institute.
“This IIC toolkit helps IIoT project managers and procurement managers/buyers through the process of procuring all the different components and resources required for a complete end-to-end IIoT solution. The RFP wizard helps users create and manage effective RFPs for IIoT solutions, helping to ensure that users of IIoT technology are using the right partners and getting the best possible IIoT solution for the most affordable price,” said Transforma Insights Founding Partner Jim Morrish.
The IIC RFP Toolkit comprises six modules, developed by the IIC member ecosystem of IIoT technology users, vendors, and consultants. These modules are:
- Challenges, risks, and mitigation
- Project planning
- RFP creation
- RFP wizard
- RFP distribution and vendor selection
- Expert advice and discussion
“As companies struggle to ensure they are successfully setting up their digital transformation projects it becomes more important to see what the rest of the market is doing and that’s exactly what we’ve provided with the IIC RFP Toolkit. Our ecosystem of members, from across the IIoT landscape, provide insights and lessons they learnt from their own projects and created the modules in the RFP Toolkit. The IIC ecosystem is unparalleled in its ability to crowdsource solutions and share best practices to solve IIoT and digital transformation challenges,” said IGnPower Executive Vice President Bassam Zarkout.
The IIC RFP Toolkit is accessible for free on the IIC Resource Hub, a central repository for the collective resources of the IIC community. Conversations about common challenges and crowd-sourced answers from IIC members can be found on the IIC Community Forum, the space for industry experts to exchange ideas, to discuss Industrial IoT (IIoT) problems in need of solutions and to network.