Much time at Dell Technologies World last week was devoted to Dell’s Legacy of Good program, highlighting people and companies doing some really cool and worthwhile things. I’m especially impressed with the AeroFarms people (see photos below), who are using IoT to find a better way to grow wholesome vegetables. Hey engineers–maybe there’s a thought in here to spark your next creative interest.
Let me take you on a photo journey through the prominent booth on the DT World Expo floor highlighting a number of projects.
Plastic waste floating in the ocean is fast becoming an environmental catastrophe. Here is someone doing something about it.
How about genetic mapping improvements for fighting rare diseases?
A bug’s eye view with drones to help the honeybee population.
All kinds of wild robot science fiction stories are hitting mainstream media. How about a reality check?
Oh, another mainstream media hype fest–AI. In reality it can be a boost to business, not something scary.
Here is a manufacturing product lifecycle story.
And the AeroFarms story.
Innovation springs from small and new companies, or so holds Andrew Johnson, CEO of ShelfAware, a software startup in what could be called the MRO commodity business. His family business, Oringsales.com, is a master distributor of those crucial but often overlooked components called O-rings. Suppliers to maintenance shops and OEMs have been tackling vendor-managed inventory and other inventory tracking processes for many years. But how do you economically track something as small as an O-ring?
The brothers running the distributorship figured out that RFID tags had become inexpensive enough to warrant use on small bags of these small components. They wrote an application, embedded RFID tags in the bags, and established a workflow. Originally built for their O-ring customers, the system was soon in demand for other small components as well.
The key proposition: remotely monitor consumption of small parts. Johnson calls it getting “the dudes in trucks” off the streets so they can better utilize their time rather than driving around counting parts.
Johnson is entrepreneurial and evangelistic. He told me, “I’m reaching out to you because I have a message I would like to send to our USA manufacturing friends that I think they would find very interesting. The time is now to innovate our manufacturing infrastructure if we intend on bringing manufacturing back to the USA in a big way. It’s easier than ever before to take the tech leap, with many Internet of Things systems popping up every month that don’t require integration into your ERP systems to achieve a big ROI. It’s literally plug-and-play technology for manufacturers.”
This is his innovation story.
I am a young entrepreneur (32) who has grown up in the industrial manufacturing industry as a member of a family-run industrial parts distributor. I spent many summers of my childhood inspecting O-rings, gaskets, and other seals… very exciting summer job. Now I am working with my 3 brothers in our family business as we try to innovate the industrial landscape. We recently invented and patented an intelligent inventory supply chain that is powered by passive RFID technology. We deployed our Internet of Things supply chain system at 3 Midwestern manufacturers last year (Eskridge Mfg, Energy Mfg, Oilgear Mfg) and the system is performing better than we could have ever imagined.
Very simply put, our system, ShelfAware, monitors the consumption of commodity inventory in real time using RFID chips that are embedded into the product packaging. This consumption data, Big Data, is then analyzed and fed to the manufacturer’s supply chain partners to guarantee no stock-outs, lean inventory, and lean inventory pipelines, which all means the right parts, at the right time, cheaper. The three key questions we practically chant while working with our system are: Is this accurate? Efficient? Effective?
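The core loop is easy to picture in code. Here is a minimal sketch of the idea, not ShelfAware's actual implementation; the part numbers, reorder points, and class names are all invented for illustration:

```python
# Hypothetical sketch: each RFID read event marks one tagged bag of parts
# as consumed, and consumption is checked against a reorder point so the
# supplier is notified before a stock-out. All values are illustrative.
REORDER_POINTS = {"ORING-214": 10, "ORING-117": 25}

class ShelfMonitor:
    def __init__(self, starting_stock):
        self.stock = dict(starting_stock)

    def record_consumption(self, part_number, bags=1):
        """Called when a tagged bag crosses the RFID read point."""
        self.stock[part_number] -= bags
        return self.needs_reorder(part_number)

    def needs_reorder(self, part_number):
        return self.stock[part_number] <= REORDER_POINTS.get(part_number, 0)

monitor = ShelfMonitor({"ORING-214": 12, "ORING-117": 40})
monitor.record_consumption("ORING-214")          # 11 bags left: above reorder point
alert = monitor.record_consumption("ORING-214")  # 10 bags left: time to reorder
```

The point of the design is that the "dudes in trucks" never have to count anything: the consumption event itself is the count.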
Yes, RFID inventory systems have been around for years, but they were never really applied to consumable commodity products, aka “small parts.” The main tech advancements that have made a system such as ShelfAware viable now are:
- RFID tags are getting really really cheap, sometimes less than $0.05 each
- The internet has allowed the software driving the system to be flexible and easily accessible
- RFID hardware is much less expensive and now highly reliable
The business plan, a bit audacious, announces “The Opportunity to Disrupt a Marketplace with a Collaborative Platform”.
The traditional large industrial supply incumbents who offer vendor-managed inventory (VMI) have expanded their product offering horizontally, leaving them spread too thin. They are good at some product groups, but great at very few product groups. This has created vulnerabilities related to product expertise like sourcing, engineering, and general product support. ShelfAware intends to exploit these vulnerabilities by giving many niche product vendors the ability to collaborate on ShelfAware’s IoT inventory platform, thus creating a more efficient, crowdsourced inventory supply chain.
The company’s stated objective: “To create an IoT Intelligent Inventory Platform that can support multiple independent product vendors who collectively support the large supply chains demanded by large or complex OEMs. ShelfAware will create value in the industrial supply market by revolutionizing supply chain theory through the use of a collaborative IoT, RFID Intelligent Inventory Platform.”
The platform must fulfill two primary roles for this supply chain model to be successful, with emphasis placed on the IoT vendor collaboration software.
- Deploy an intelligent inventory management system inside manufacturing facilities.
- Deploy a vendor-side supply chain cooperative system.
I am intrigued by the whole concept. And it seems to be working now, even in its infancy.
Manufacturing technology professionals have been working with data of many types for years. Our sensors, instrumentation, and control systems yield terabytes of data. Then we bury them in historians or other databases on servers we know not where.
Companies are popping up like mushrooms after a spring rain with a variety of approaches for handling, using, analyzing, and finding all this data. Try on this one.
Io-Tahoe LLC, a pioneer in machine learning-driven smart data discovery products, announced the general availability (GA) launch of the Io-Tahoe smart data discovery platform. Its products span a wide range of heterogeneous technology platforms, from traditional databases and data warehouses to data lakes and other modern repositories.
The GA version adds Data Catalog, a new feature that allows data owners and data stewards to use a machine learning-based smart catalog to create, maintain, and search business rules; define policies; and provide governance workflow functionality. Io-Tahoe’s data discovery capability provides complete business rule management and enrichment. It enables a business user to govern the rules and define policies for critical data elements, and it allows data-driven enterprises to automatically enhance information about data, regardless of the underlying technology, and build a data catalog.
“Today’s digital business is driving new requirements for data discovery,” said Stewart Bond, Director Data Integration and Integrity Software Research, IDC. “Now more than ever enterprises are demanding effective, and comprehensive, access to their data – regardless of where it is retained – with a clear view into more than its metadata, but its contents as well. Io-Tahoe is delivering a robust platform for data discovery to empower governance and compliance with a deeper view and understanding into data and its relationships.”
“Io-Tahoe is unique as it allows the organization to conduct data discovery across heterogeneous enterprise landscapes, ranging from databases, data warehouses and data lakes, bringing disparate data worlds together into a common view which will lead to a universal metadata store,” said Oksana Sokolovsky, CEO, Io-Tahoe. “This enables organizations to have full insight into their data, in order to better achieve their business goals, drive data analytics, enhance data governance and meet regulatory demands required in advance of regulations such as GDPR.”
Increasing governance and compliance demands have created a dramatic opportunity for data discovery. According to MarketsandMarkets, the data discovery market is estimated to grow from $4.33 billion USD in 2016 to $10.66 billion USD in 2021. This is driven by the increasing importance of data-driven decision making and self-service business intelligence (BI) tools. However, the challenge of integrating the growing number of disparate platforms, databases, data lakes and other silos of data has prevented the comprehensive governance, and use, of enterprise data.
Io-Tahoe’s smart data discovery platform features a unique algorithmic approach to auto-discover rich information about data and data relationships. Its machine learning technology looks beyond metadata, at the data itself for greater insight and visibility into complex data sets, across the enterprise. Built to scale for even the largest of enterprises, Io-Tahoe makes data available to everyone in the organization, untangling the complex maze of data relationships and enabling applications such as data science, data analytics, data governance and data management.
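What "looking beyond metadata, at the data itself" means in practice can be illustrated with a toy example. This is an assumption-laden sketch of the general technique, not Io-Tahoe's actual algorithm: classify a column by sampling its values, regardless of what the column header claims.

```python
import re

# Illustrative sketch of value-based data discovery: infer a column's
# semantic type from its contents. The patterns, labels, and threshold
# below are invented for this example, not Io-Tahoe's implementation.
PATTERNS = {
    "email": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
    "us_phone": re.compile(r"^\d{3}-\d{3}-\d{4}$"),
}

def infer_semantic_type(values, threshold=0.8):
    """Return the semantic label whose pattern matches most sampled values."""
    for label, pattern in PATTERNS.items():
        hits = sum(1 for v in values if pattern.match(v))
        if hits / len(values) >= threshold:
            return label
    return "unknown"

# A column cryptically named "field_07" turns out to hold email addresses.
print(infer_semantic_type(["a@x.com", "b@y.org", "c@z.net"]))  # email
```

A production system would use far richer models than regular expressions, but the principle is the same: the data, not the schema, tells you what a column really contains.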
The technology-agnostic platform spans silos of data and creates a centralized repository of discovered data upon which users can enable Io-Tahoe’s Data Catalog to search and govern. Through convenient self-service features, users can bolster team engagement through the simplified and accurate sharing of data knowledge, business rules and reports. Here users have a greater ability to analyze, visualize and leverage business intelligence and other tools, all of which have become the foundation to power data processes.
Much of the interesting activity in the Industrial Internet of Things (IIoT) space lately happens at the edge of the network. IT companies such as Dell Technologies and Hewlett Packard Enterprise have built upon their core technologies to develop powerful edge computing devices. Recently Bedrock Automation and Opto 22 on the OT side have also built interesting edge devices.
I’ve long maintained that all this technology—from intelligent sensing to cloud databases—means little without ways to make sense of the data. One company I rarely hear from is FogHorn Systems. This developer of edge intelligence software has recently been quite active on the partnership front. One announcement regards Wind River and the other Google.
FogHorn and Wind River (an Intel company) have teamed to integrate FogHorn’s Lightning edge analytics and machine learning platform with Wind River’s software, including Wind River Helix Device Cloud, Wind River Titanium Control, and Wind River Linux. This offering is said to accelerate harnessing the power of IIoT data. Specifically, FogHorn enables organizations to place data analytics and machine learning as close to the data source as possible; Wind River provides the technology to support manageability of edge devices across their lifecycle, virtualization for workload consolidation, and software portability via containerization.
“Wind River’s collaboration with FogHorn will solve two big challenges in Industrial IoT today, getting analytics and machine learning close to the devices generating the data, and managing thousands to hundreds of thousands of endpoints across their product lifecycle,” said Michael Krutz, Chief Product Officer at Wind River. “We’re very excited about this integrated solution, and the significant value it will deliver to our joint customers globally.”
FogHorn’s Lightning product portfolio embeds edge intelligence directly into small-footprint IoT devices. By enabling data processing at or near the source of sensor data, FogHorn eliminates the need to send terabytes of data to the cloud for processing.
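The payoff of processing at the source is easy to see in a sketch. The following is my own illustration of the general edge-analytics pattern, with invented thresholds and window sizes, not FogHorn's actual code:

```python
# Hedged sketch of edge processing: summarize high-frequency sensor
# readings locally and forward only compact aggregates (plus an alarm
# flag), rather than streaming every raw sample to the cloud.
def process_at_edge(samples, window=60, alarm_threshold=95.0):
    """Reduce each window of raw readings to one summary record."""
    uplink = []  # what actually leaves the edge device
    for start in range(0, len(samples), window):
        chunk = samples[start:start + window]
        uplink.append({
            "min": min(chunk),
            "max": max(chunk),
            "mean": sum(chunk) / len(chunk),
            "alarm": any(s > alarm_threshold for s in chunk),
        })
    return uplink

# 120 raw readings collapse to 2 summary records bound for the cloud.
readings = [70.0] * 119 + [99.0]
print(len(process_at_edge(readings)))  # 2
```

Scale the same reduction up to thousands of sensors sampling many times per second and the bandwidth and storage savings are exactly the "terabytes that never leave the plant" argument.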
“Large organizations with complex, multi-site IoT deployments are faced with the challenge of not only pushing advanced analytics and machine learning close to the source of the data, but also the provisioning and maintenance of a high volume and variety of edge devices,” said Kevin Duffy, VP of Business Development at FogHorn. “FogHorn and Wind River together deliver the industry’s most comprehensive solution to addressing both sides of this complex IoT device equation.”
Meanwhile, FogHorn Systems also announced a collaboration with Google Cloud IoT Core to simplify the deployment and maximize the business impact of Industrial IoT (IIoT) applications.
The companies have teamed up to integrate FogHorn’s Lightning edge analytics and machine learning platform with Google’s Cloud IoT Core.
“Cloud IoT Core simply and securely brings the power of Google Cloud’s world-class data infrastructure capabilities to the IIoT market,” said Antony Passemard, Head of IoT Product Management at Google Cloud. “By combining industry-leading edge intelligence from FogHorn, we’ve created a fully-integrated edge and cloud solution that maximizes the insights gained from every IoT device. We think it’s a very powerful combination at exactly the right time.”
Device data captured by Cloud IoT Core gets published to Cloud Pub/Sub for downstream analytics. Businesses can conduct ad hoc analysis using Google BigQuery, run advanced analytics, and apply machine learning with Cloud Machine Learning Engine, or visualize IoT data results with rich reports and dashboards in Google Data Studio.
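The publish/subscribe hand-off in that pipeline is worth a quick sketch. This is a toy in-memory stand-in for the Pub/Sub fan-out described above, written for illustration only; it is not the google-cloud-pubsub client API, and the topic and field names are invented:

```python
import json

# Toy simulation of the Cloud IoT Core -> Pub/Sub hand-off: a message
# published to a topic fans out to every downstream subscriber.
class Topic:
    def __init__(self, name):
        self.name = name
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, payload: dict):
        data = json.dumps(payload)  # messages travel serialized, not as objects
        for callback in self.subscribers:
            callback(json.loads(data))

# Downstream "analytics" consumers, standing in for BigQuery and Data Studio.
warehouse_rows, dashboard_points = [], []
telemetry = Topic("device-telemetry")
telemetry.subscribe(warehouse_rows.append)
telemetry.subscribe(lambda msg: dashboard_points.append(msg["temp_c"]))

telemetry.publish({"device_id": "sensor-42", "temp_c": 21.5})
print(len(warehouse_rows), dashboard_points)  # 1 [21.5]
```

The design point is decoupling: the device publishes once, and any number of analytics, ML, or visualization consumers can subscribe without the device knowing they exist.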
“Our integration with Google Cloud harmonizes the workload and creates new efficiencies from the edge to the cloud across a range of dimensions,” said David King, CEO at FogHorn. “This approach simplifies the rollout of innovative, outcome-based IIoT initiatives to improve organizations’ competitive edge globally, and we are thrilled to bring this collaboration to market with Google Cloud.”
Digital Transformation has generated so much news that company executives have begun ordering projects and task forces within the company to begin that transformation. The pressure on engineers and IT people increases with each new directive. To help clients deal with these new directives, ARC Advisory Group launched the Digital Transformation Council (DTC) at its 2018 Forum.
The council is a member community for industry, energy, and public-sector professionals. Membership is by invitation only and restricted to end users of digital transformation technology, such as professionals working for manufacturers, utilities, and municipalities. There is no fee to join.
“As data-driven market disruption grows, professionals across similar industries need to connect and learn from one another,” according to Jesus Flores-Cerrillo, Associated R&D Director at Praxair, one of the world’s largest providers of industrial gases. He added, “It’s becoming mission-critical to understand how to use data to develop services and products and optimize operations and assets. That can only be accomplished by understanding the possibilities provided by modern data tools such as artificial intelligence, machine learning, and digital twins.”
“We are delighted to support the Digital Transformation Council by bringing members together in person and online,” commented Greg Gorbach, Vice President at ARC Advisory Group. “This community will enable individuals and companies to get up to speed quickly on digital transformation innovations and share ideas about what provides value and what doesn’t.”
Each February, a member-only meeting, anchored to the annual ARC Industry Forum, will bring the Council together to set the focus and agenda for the coming year. Members will also gather via virtual quarterly meetings to discuss research findings, activities, and other topics.
In addition to annual in-person meetings and quarterly virtual meetings, Digital Transformation Council members will have year-round access to research and fellow members via an online community. ARC Advisory Group’s role will be to conduct research, organize meetings, provide venues, and facilitate peer-to-peer discussions. ARC will also deliver technical support for the group’s online presence.
The DTC will address topics such as analytics, industrial Internet of Things (IIoT), artificial intelligence and machine learning, cybersecurity, and additive manufacturing.
I am a sucker for open platforms. When the PR agency wrote with a teaser about discussing open platforms with Marc Lind, SVP Strategy at Aras, a PLM supplier, I bit. They threw in “digital twin” and “digital thread” as the topping and cherry atop the sundae, and the appointment was made.
We talked just before Christmas, but I’ve had such a crazy January that I’ve just now gotten to this in my pile of things to write.
PLM is often thought of as an enterprise application and covered by analysts who also watch such areas as ERP. I’ve talked with suppliers for years as a magazine editor, but they didn’t really seem to fit well within the magazines and they most likely were not advertising prospects, so there wasn’t pressure to write much. I’m saying that I’m not an expert in the area like some of my friends.
But I’ve followed the technology for many years. I’ve seen it coming—this coordination of digital and physical. As soon as the digital folks could get it all together—especially better databases and interfaces—then I knew we’d be much closer to the realization of digital manufacturing.
Lind told me something about the Aras platform. First, he said, was the attempt to do away with silos, where you might have your mechanical CAD, your electrical CAD, then perhaps your MES and your ERP. Not only is that a problem within manufacturing; think about the next step, say connected cars and other systems of systems, where things really need to interact across boundaries.
Check out the Aras platform. It’s interesting. And once again as I’m seeing more often, it is exploring a different business model that can make its platform and products available to a wider customer base. For other writing I’ve done on open platforms, click the small “ad” on my site to download the MIMOSA white paper.
We also talked digital twin, one of the foundation concepts for digital manufacturing.
He said the term Digital Twin was coined back in 2002 by Dr. Michael Grieves while at the University of Michigan. Effectively, the Digital Twin is an exact virtual representation of a physical thing. It’s as if the physical product or system were looking in a virtual mirror.
Grieves describes it as a mirroring (or twinning) of what exists in the real world and what exists in the virtual world. It contains all the informational sets of the physical ‘thing,’ meaning it’s cross-discipline – not just a mechanical/geometric representation, but also including the electronics, wiring, software, firmware, etc.
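That cross-discipline idea, plus the revision traceability discussed below as the digital thread, can be sketched as a data structure. The field names here are my own illustrative assumptions, not Aras's data model:

```python
from dataclasses import dataclass, field

# Minimal sketch of a cross-discipline digital twin: one virtual record
# mirroring the mechanical, electronic, and software aspects of a physical
# asset, with a revision log standing in for the "digital thread."
@dataclass
class DigitalTwin:
    serial_number: str
    mechanical: dict = field(default_factory=dict)   # geometry, materials
    electronics: dict = field(default_factory=dict)  # boards, wiring
    firmware_version: str = ""
    revision_history: list = field(default_factory=list)

    def apply_revision(self, note, **changes):
        """Record a change so field data can be traced to a known revision."""
        self.revision_history.append({"note": note, "changes": changes})
        for attr, value in changes.items():
            setattr(self, attr, value)

twin = DigitalTwin("PUMP-0042", firmware_version="1.0.3")
twin.apply_revision("field firmware update", firmware_version="1.1.0")
print(twin.firmware_version, len(twin.revision_history))  # 1.1.0 1
```

The revision log is the part that matters for analytics: sensor data streaming back from serial number PUMP-0042 only makes sense if you know which firmware and hardware revision produced it.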
Many people talk about Digital Twins in the context of monitoring, simulation, and predictive maintenance, which are all incredibly valuable and potentially transformative in their own right. However, there would seem to be much more to it.
“As products of all types move to include connectivity, sensors, and intelligence we can’t just think about the data streaming back from the field.”
Without accurate “Context” – Digital Twin – time series data generated during production and ongoing operation is difficult or even impossible to understand and analyze.
In addition, the ability to interpret and act upon this data often requires traceability to prior information from related revisions – the Digital Thread.
“To complicate matters further as artificial intelligence / cognitive computing is introduced the necessity for the Digital Twin becomes even greater. If Knowledge = Information in Context, then without a Digital Twin, machine learning won’t work as intended, will be rendered ineffective or worse… potentially leading to risky misinterpretations or misdirected actions.”
Finally, Lind warns, “Because without Context – Digital Twin – the IoT-enabled value proposition is severely limited and could introduce real liability.”