Andrew Ng of Landing AI on Building Vision AI Projects

The new A3 organization (the merged motion, vision, and robotics associations) held its annual show virtually over five days this week. I was busy, but I did tune in for some keynotes and panel discussions. I also browsed the trade show.

The virtual event platforms are getting better all the time. I was blown away by all the cool things today’s keynoter was able to pull off. But they still can’t quite get the trade show experience up to expectations.

Today’s keynote was given by Andrew Ng, CEO of Landing AI, a machine vision AI company. His talk was a low-key, effective explanation of AI and how to implement a successful AI-enabled vision inspection project. I’d almost call this “beyond hype”. 

Here are a few key points:

  • 75% of AI projects never go live.
  • Vision inspection has moved from rules-based systems to deep learning (aka AI/ML), which learns automatically.

Ng polled his audience about their experiences with AI projects; the key responses:

  • Lack of data
  • Unrealistic expectations
  • Use case not well defined
  • Hype—perception of AI as futuristic

Challenges

  • Not sufficiently accurate
  • Insufficient data
  • More than just the initial ML code is needed
  • The system must be able to learn continuously

AI Systems = Model + Data

Improving the system depends upon improving either the model or the data; experience in manufacturing shows the best results come from improving the data.

One Landing AI partner estimated 80% of his work was on preparing data (data processing) and only 20% on training a model.
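This data-centric emphasis is easy to demonstrate. Below is a minimal, hypothetical sketch (not Landing AI’s actual tooling) of the idea: hold the model fixed and use a per-class evaluation to decide where the 80% of effort spent on data should go.

```python
# Hypothetical data-centric iteration loop: random stand-in features and defect
# labels; the point is the workflow, not the numbers.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 32))      # stand-in for features from inspection images
y = rng.integers(0, 3, size=1000)    # defect classes: 0=ok, 1=scratch, 2=dent

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Hold the model fixed; iterate on the data instead.
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# The per-class report tells you WHERE to spend labeling and cleaning effort:
# the class with the worst recall needs more (or better-labeled) examples.
print(classification_report(y_test, model.predict(X_test)))
```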

AI Project Lifecycle

Scope → Collect Data → Train Model → Deploy in Production

Train Model feeds back to Collect Data; Deploy in Production feeds back to both Train Model and Collect Data.

A common problem: is the data labeled consistently? E.g., are defects consistently defined?

Common data issues: inconsistent labels, ambiguous definitions between two defect types, and too few examples.
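Label consistency is measurable. Here is a minimal sketch, assuming two inspectors have labeled the same images (names and labels invented), that uses Cohen’s kappa to surface ambiguous defect definitions before any training starts.

```python
# Quantify label agreement between two inspectors on the same parts.
# Kappa near 1.0 means consistent definitions; low kappa means the
# defect classes themselves are ambiguous and need tightening.
from sklearn.metrics import cohen_kappa_score

inspector_a = ["scratch", "dent", "ok", "scratch", "dent", "ok", "scratch"]
inspector_b = ["scratch", "scratch", "ok", "scratch", "dent", "ok", "dent"]

print(f"Cohen's kappa: {cohen_kappa_score(inspector_a, inspector_b):.2f}")

# The specific disagreements (here, scratch vs. dent) point at the
# definition that needs rewriting in the labeling guide.
pairs = [(a, b) for a, b in zip(inspector_a, inspector_b) if a != b]
print("disagreements:", pairs)
```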

Final advice:

  • Start quickly
  • Focus on data
  • End-to-end platform support (lifecycle)

Coincidentally, Ng was interviewed by MIT Technology Review, and I received an email notice today. I’ve included a link, but you may need a subscription to get in.

Karen Hao for MIT Technology Review: I’m sure people frequently ask you, “How do I build an AI-first business?” What do you usually say to that?

Andrew Ng: I usually say, “Don’t do that.” If I go to a team and say, “Hey, everyone, please be AI-first,” that tends to focus the team on technology, which might be great for a research lab. But in terms of how I execute the business, I tend to be customer-led or mission-led, almost never technology-led.

A very frequent mistake I see CEOs and CIOs make: they say to me something like “Hey, Andrew, we don’t have that much data—my data’s a mess. So give me two years to build a great IT infrastructure. Then we’ll have all this great data on which to build AI.” I always say, “That’s a mistake. Don’t do that.” First, I don’t think any company on the planet today—maybe not even the tech giants—thinks their data is completely clean and perfect. It’s a journey. Spending two or three years to build a beautiful data infrastructure means that you’re lacking feedback from the AI team to help prioritize what IT infrastructure to build.

For example, if you have a lot of users, should you prioritize asking them questions in a survey to get a little bit more data? Or in a factory, should you prioritize upgrading the sensor from something that records the vibrations 10 times a second to maybe 100 times a second? It is often starting to do an AI project with the data you already have that enables an AI team to give you the feedback to help prioritize what additional data to collect.

In industries where we just don’t have the scale of consumer software internet, I feel like we need to shift in mindset from big data to good data. If you have a million images, go ahead, use it—that’s great. But there are lots of problems that can use much smaller data sets that are cleanly labeled and carefully curated.

IIC White Papers Survey Information Models and Digital Transformation

Here are announcements from the Industrial Internet Consortium (IIC) regarding two white papers it has released. One deals with IIoT information models and the other with innovation processes for digital transformation. A lot of thinking went into these.

The Industrial Internet Consortium (IIC) announced the publication of the Characteristics of IIoT Models White Paper. Interoperability between applications, subsystems, and devices in Industrial Internet of Things (IIoT) systems requires agreement on the context and meaning of the data being exchanged, or semantic interoperability, which is typically captured in an information model. The new white paper addresses the challenge of integrating subsystems in IIoT systems that use different information models and examines how standardized information models that use a descriptive or semantic approach enable interoperability and ultimately digital transformation.

The variety of digital data and information systems is an indispensable attribute of the modern world of IIoT. In each industrial vertical, one way or another, work is underway to reach agreements between stakeholders through the development of standards and data schemas. Our white paper provides a simple definition of the characteristics and properties of information models, which can be useful in the design of IIoT systems and, which is especially important, for multiple systems to work seamlessly with each other.

“Semantically based information models can share data across domain boundaries using a descriptive approach (instead of a translational approach) as the data has meaning in both domains, and the full fidelity of the original data are maintained,” said Kym Watson, Co-chair of the IIC Distributed Data Interoperability and Management Task Group, an author of the white paper and Scientist at Fraunhofer IOSB. “Our intent in this white paper is to survey a subset of information models that are relevant to the IIoT and characterize those information models using a meta-model developed for this purpose. With this we capture commonalities and can begin to address the challenge of integrating subsystems that use different information models.”

“An information model is a representation of concepts, relationships, constraints, rules, and operations to specify data structures and semantics,” said Niklas Widell, Co-chair of the IIC Distributed Data Interoperability and Management Task Group, an author of the white paper and a Standardization Manager at Ericsson. “There are multitudes of information models available or under active development for a variety of application domains or industries. We focused on information models above the Industrial Internet of Things Connectivity Framework layer where semantic interoperability, including translation between different models, plays a key role.”

The white paper examines the following standardized information models (among others) that are widely applied in IIoT applications; a minimal illustrative example follows the list:

• Web of Things – a set of standards by the W3C for solving the interoperability issues of different IoT platforms and application domains.

• SensorThings API – an Open Geospatial Consortium standard providing an open and unified framework to interconnect IoT sensing devices, data, and applications over the Web.

• OPC UA – a machine-to-machine communication protocol for industrial automation developed by the OPC Foundation focusing on communicating with industrial equipment and systems for data collection and control.

• Asset Administration Shell – a key concept of Industry 4.0 used to describe an asset electronically in a standardized manner. Its purpose is to exchange asset-related data among industrial assets and between assets and production orchestration systems or engineering tools.

• IPSO Smart Objects – a lightweight design pattern and object model to enable data interoperability between IoT devices, building on the LwM2M IoT device management standard specified by OMA SpecWorks.

• One Data Model/Semantic Definition Format – an initiative to improve interworking across different ecosystems’ data models using an emerging standard from the IETF. The OneDM Liaison Group adopts and aligns IoT models contributed by participating organizations, so best practice models for desired features or purposes can be identified.
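To make “information model” concrete, here is a minimal Web of Things Thing Description for a hypothetical temperature sensor, written as a Python dict. The vocabulary (@context, properties, forms) comes from the W3C standard; the device id and URL are invented for illustration.

```python
# A minimal W3C Web of Things Thing Description, serialized to JSON-LD.
# Hypothetical device: the id, title, and href are placeholders.
import json

thing_description = {
    "@context": "https://www.w3.org/2019/wot/td/v1",
    "id": "urn:dev:ops:example-temp-sensor-0001",
    "title": "LineTempSensor",
    "securityDefinitions": {"nosec_sc": {"scheme": "nosec"}},
    "security": ["nosec_sc"],
    "properties": {
        "temperature": {
            "type": "number",
            "unit": "degree Celsius",
            "readOnly": True,
            "forms": [{"href": "https://sensor.example.com/temperature"}],
        }
    },
}

# Any WoT-aware consumer can discover what data the device exposes and how
# to read it, without a custom, per-device integration.
print(json.dumps(thing_description, indent=2))
```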

“Standardized information models with defined semantics and APIs are an essential foundation for any form of digital transformation,” said Andrei Kolesnikov, Co-chair of the IIC Distributed Data Interoperability and Management Task Group, an author of the white paper, and director of the Internet of Things Association IOTAS. “There must be a seamless integration across the system life cycles, especially engineering and operations for all data sharing technologies.”

IIC members who wrote the Characteristics of IIoT Models White Paper and a list of members who contributed to it can be found here on the IIC website.

IIC White Paper Identifies Innovation Process For Digital Transformation

BizOps for digital transformation in industry facilitates IT and OT integration for better business outcomes

The Industrial Internet Consortium (IIC) today announced the publication of the BizOps for Digital Transformation in Industries white paper. The new white paper identifies the BizOps for Digital Transformation in Industry (BDXI) innovation process, offering examples of a BDXI framework as crucial for IIoT solutions operators undergoing digital transformation.

“Digital transformation is a huge topic influencing almost every department of a firm,” said Co-author of the white paper Kai Hackbarth, Business Owner Industrial at Bosch.IO. “Solutions operators must integrate IT and OT to achieve better business outcomes, especially in asset-driven industries such as agriculture, energy, health care, manufacturing, retail, smart cities, and transportation. This is not an easy task as the process is slow and likely to conflict with existing processes and management systems.”

“The BDXI process is a fast, open, and customer-centric innovation process that considers the constraints and complexity of IT/OT integration and the physical world,” said Co-author of the white paper Chaisung Lim, Group Chair of the IIC BizOps for Digital Transformation in Industry Contributing Group, Chairman of the Korea Industry 4.0 Association, and a professor of Konkuk University. “A BDXI process helps IIoT solutions operators manage the innovation process from idea to launch successfully.”

A BDXI process includes discovering customer needs, developing solutions, learning whether solutions are feasible, and putting them into action. This necessitates dialogue between IT and OT stakeholders who would otherwise be constrained by organizational silos, a customer-centric process of checking solution validity, and fast experimentation with minimum viable products and agile methods. The most common features of BDXI processes include the adoption of best innovation practices from design thinking, lean startup and agile methods, and BizDevOps (the integration of IT/OT). A BDXI process must be supported by a BDXI framework that offers a guide for implementing the process concretely.

The BizOps for Digital Transformation in Industry white paper delves into the most common features of BDXI processes, examples of BDXI processes and frameworks, conflicts between BDXI processes and management systems, and IIC initiatives to help guide BDXI processes. IIC authors and contributors to the BizOps for Digital Transformation in Industry white paper can be found here on the IIC website.

Collaboration Brings Predictive Maintenance for Planning, Prediction, Prevention of Performance Issues

What do you do when you want to bring the latest IoT, Digital Twin, data analysis to legacy equipment that is still producing in your plant? This announcement just came my way to answer the question. Getting past my particular hang-up on predictive maintenance, there are some really useful solutions in here.

Using advanced 3D Digital Twin, AI, and machine learning technology, an innovative platform and sensor solution provides real-time insights into legacy equipment for triage and issue resolution

UrsaLeo, an enterprise software company that enables users to visualize operational data in a photorealistic 3D representation of their facility or product, and Shiratech, a world-leading specialist in Industry 4.0-based condition monitoring and predictive maintenance technologies, announced a collaboration to offer advanced 3D Digital Twin, AI, sensor, and machine learning technology. The combination of the UrsaLeo platform with Shiratech’s iCOMOX solution integrated into legacy equipment allows manufacturers to plan, predict, and prevent performance issues.

“For many manufacturers, replacing legacy equipment can cost anywhere from hundreds of thousands to millions of dollars and may not be necessary with machinery that already operates at a high-performance level,” said John Burton, CEO of UrsaLeo. “Many types of older assembly equipment can be IIoT-enabled quickly, easily and cost-effectively, which is why the collaboration with Shiratech is vital to help bring companies with older equipment into the world of Industry 4.0.”

When I asked Burton how this happens, he told me, “We use a Shiratech sensor box that can be attached to the outside of an existing piece of machinery. It monitors vibration, sound, magnetic field strength, and current consumption. Once the ‘patterns’ the machine gives off are learned during normal operation, variations in those patterns can be detected by a human or by a machine learning algorithm. Not as good as having sensors inside equipment, but still good at detecting problems before failures happen.”

“The iCOMOX solution enables the precise monitoring of vibrations, magnetic field, temperature, sound, and current. Using advanced AI and machine learning technology at the edge, this innovative solution provides real-time data about machine health, which is relayed directly to the cloud for analysis and real-time issue resolution,” said David Vactor, Managing Director of Shiratech.
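Burton’s “learn the normal pattern, flag the variation” description maps to standard anomaly detection. A rough sketch, assuming scikit-learn and invented vibration features (this is not Shiratech’s actual algorithm):

```python
# Learn a machine's normal signature, then flag deviations in live readings.
# Features are random stand-ins for RMS vibration, band energies, etc.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Baseline: captured while the machine is known to be running normally.
baseline = rng.normal(loc=0.0, scale=1.0, size=(500, 4))
detector = IsolationForest(contamination=0.01, random_state=42).fit(baseline)

# Live readings: mostly normal, plus one drifted sample simulating a fault.
live = np.vstack([rng.normal(size=(9, 4)), rng.normal(loc=4.0, size=(1, 4))])

# predict() returns +1 for "looks like baseline", -1 for "anomalous".
for i, flag in enumerate(detector.predict(live)):
    if flag == -1:
        print(f"reading {i}: deviation from learned pattern, inspect machine")
```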

Industrial machinery is designed to be a workhorse and can often last for many years before needing to be replaced. Without having to build an advanced factory or invest capital in new equipment, the UrsaLeo/Shiratech solution is ideal for cost-conscious executives looking to reap the benefits of Industry 4.0.
 

The Open Group OSDU Forum Launches the OSDU Data Platform Mercury Release

Open Source, standards-based data platform to stimulate innovation, industrialize data management, and reduce time to market for new solutions in the energy industry.

Readers here know The Open Group perhaps mainly through the Open Process Automation Forum. But this technology consortium has many open-source projects in the works. Today’s news shows the continued vitality of the open-source movement.

The Open Group announced the OSDU Data Platform Mercury Release. Developed by The Open Group OSDU Forum, the OSDU Data Platform is an Open Source, standards-based, and technology-agnostic data platform for the energy industry that stimulates innovation, industrializes data management, and reduces time to market for new solutions.

Over time, the OSDU Data Platform will provide access to a vast portfolio of open and proven vendor-developed applications spanning a broad range of energy sources. By tapping into this ecosystem, developers no longer have to build and maintain the monolithic architecture needed to deliver unique value-add services. Now, with a single set of well-defined, industry-specific APIs, organizations can accelerate platform design and develop proprietary applications on top of the OSDU Data Platform.
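For a sense of what “a single set of well-defined APIs” looks like in practice, here is a hedged sketch of querying an OSDU search endpoint from Python. The base URL, token, partition id, and record kind are placeholders; the endpoint path follows the OSDU search service convention, but consult the Mercury Release documentation for the exact API.

```python
# Hypothetical query against an OSDU deployment's search service.
import requests

OSDU_BASE = "https://osdu.example.com"   # placeholder deployment URL
TOKEN = "..."                            # obtained from your identity provider

response = requests.post(
    f"{OSDU_BASE}/api/search/v2/query",
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "data-partition-id": "opendes",  # example partition name
        "Content-Type": "application/json",
    },
    json={
        "kind": "osdu:wks:master-data--Well:1.0.0",  # example record kind
        "query": "data.FacilityName:*",
        "limit": 10,
    },
)
response.raise_for_status()
for record in response.json().get("results", []):
    print(record.get("id"))
```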

With an open-source approach, any company – from established corporations to start-up challenger companies – can contribute new features to the platform, supporting a variety of business workflows. All work is validated by the OSDU Program Management Committee (PMC) to ensure it is aligned with the overall direction of the Forum. 

With a single view of industry data, the OSDU Data Platform can be harnessed for innovative business applications. The Mercury Release of the OSDU Data Platform is now available to Operators and Software Developers who want to:

• Liberate data from traditional silos and make all data discoverable and usable in a single data platform

• Enable new integrated exploration and development workflows that reduce overall cycle time

• Take advantage of emerging digital solutions to provide scalability and accelerate decision making

Steve Nunn, President and CEO of The Open Group, commented: “The OSDU Data Platform Mercury Release represents an important achievement by the OSDU Forum in a very short space of time. Established in 2018, the OSDU Forum has accumulated over 185 Member organizations who are collaborating together to accelerate innovation and reduce costs in the energy sector. With a standard data platform, energy companies will be able to drive innovation by integrating digital technologies and utilizing open standards for better decision making. Looking ahead, this will be imperative to meet the world’s increasing energy demands while reducing greenhouse gas emissions.” 

Johan Krebbers, GM Emerging Digital Technologies / VP IT Innovation, Shell, commented: “At the heart of most energy companies’ strategies is embracing the transformational technologies taking us forward in today’s digital era. This makes the need for a common architectural design clear, one that underpins how our industry works with its data.”

David Eyton, EVP Innovation & Engineering, bp, commented: “Data is at the heart of bp’s transformation into an integrated energy company. We believe that the future of the energy industry will be data driven and dependent on its ability to manage data in a manner that promotes data sharing with partners, innovation through data science, and rapid decision making throughout the lifecycle of the energy value chains. Being a founding member of the OSDU, bp has had an opportunity to be part of an organization that is fundamentally changing the data landscape for our industry. By integrating energy organizations, cloud services providers and software vendors the OSDU is providing an opportunity for collaboration that will be beneficial for all involved. We are very excited about the release of OSDU Mercury and look forward to expanding this approach into engineering, emissions, and new energy.”

To learn more about how to develop applications on the OSDU Data Platform, visit the application developer community page here.

The current OSDU Forum Member List is available here.

Powerful Electric Vacuum Gripper for Heavy-Duty Palletizing Applications

What news I receive that isn’t related to software or networking seems to be about collaborative robots. Every time I think that there can’t be anything new in grippers, here comes another announcement. Check this from OnRobot.

Large, unwieldy bags of dog food, non-airtight clothing or consumer goods packages, and bulky, porous cardboard boxes: these are just some examples of demanding packaging and palletizing applications that OnRobot’s new VGP20 gripper can address. The VGP20 is the world’s most powerful electric vacuum gripper. Compatible with all leading robot brands, the gripper can handle payloads of 20 kg (44 lbs), making it a great fit for a wide range of applications in industries from cosmetics and electronics to pharmaceuticals and food and beverage.

“Our customers asked for a cost-effective, easy-to-deploy vacuum gripper that can pick up bulky, heavy-duty payloads while being intelligent enough to handle a wide range of items, including those with irregular shapes and porous surfaces,” says Enrico Krog Iversen, CEO at OnRobot. “The VGP20 combines power, intelligence and ease-of-use that competes with expensive, complex pneumatic grippers.” 

End-of-line operations such as palletizing are labor-intensive and costly. Researchers estimate that, on average, labor costs account for 65% of warehouse facility operating budgets, dwarfing the costs associated with utilities, taxes, distribution and rent combined. 

On this basis alone, automation is a compelling proposition for companies of all sizes. According to researchers, adoption of automated palletizing solutions in the food and beverage sector is estimated to have increased at a CAGR of more than 13% since 2017 and is set to reach USD 390 million by 2022.

OnRobot’s new VGP20 electric vacuum gripper can take on applications that have traditionally been handled by powerful pneumatic grippers–at a fraction of the cost and complexity.

While pneumatic grippers require compressed air to operate, the VGP20 is all-electric and ready to go out of the box, enabling companies to save up to 90% on operating and maintenance costs compared to traditional pneumatic gripper deployments.

The VGP20 provides unlimited cup and airflow customization and multichannel functionality, allowing it to be deployed on multiple items of different shapes and sizes.

Additionally, the VGP20 gripper’s built-in intelligence, combined with its easy-to-use software, provides precise airflow control functionality that is beyond the capabilities of traditional pneumatic grippers. This functionality allows users to vary the type of grip used in different applications, ranging from the soft grip used to handle delicate items to the hard grip required for handling bulky, heavy cardboard boxes with porous surfaces.

The cost of cardboard for packaging has risen by nearly 40% from 2010 to 2020. And with e-commerce demand continuing to grow strongly, estimates of further increases lead shippers to look for lower-cost packaging materials. Thinner, more porous cardboard and lightweight shipping bags present challenges for traditional automated packaging and palletizing, however. The powerful, customizable OnRobot VGP20 easily handles these thinner and less expensive packaging materials, allowing shippers to save considerably on both automation and shipping costs.

OnRobot’s VGP20 gripper also provides an option to enable continuous monitoring of the gripper’s airflow. If this option is selected and the vacuum is interrupted for any reason, the robot will come to an immediate halt and an alert pop-up window will be displayed in the gripper software.
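A hypothetical sketch of that halt-on-vacuum-loss behavior is below. OnRobot’s actual gripper software is proprietary; the threshold and the read_vacuum_kpa, halt_robot, and show_alert callbacks are invented stand-ins for whatever the real interface provides.

```python
# Invented sketch of "stop the robot if suction is lost"; not OnRobot's API.
import time

VACUUM_THRESHOLD_KPA = -40.0  # assumed limit: gauge pressure above this = lost grip

def monitor_airflow(read_vacuum_kpa, halt_robot, show_alert, period_s=0.05):
    """Poll the gripper's vacuum level; stop the robot if suction is lost."""
    while True:
        if read_vacuum_kpa() > VACUUM_THRESHOLD_KPA:  # weaker vacuum than required
            halt_robot()  # immediate stop, as described above
            show_alert("Vacuum interrupted: check part seal and cup condition")
            return
        time.sleep(period_s)
```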

“Efficient packaging and palletizing performance is crucial to success for manufacturers, e-commerce and logistics companies. However, labor shortages present an ongoing challenge and performing these jobs by hand is both monotonous and unergonomic,” says Iversen. “The powerful and versatile VGP20 gripper enables companies to automate these tasks, providing relief to workers while improving overall productivity and quality.”
