Augmented Reality At Work

I found this article through one of my daily news sources, Emerging Tech Brew. It’s a cool idea. Lots of questions. Click the link and see the visual. It’s a laptop without a screen. It projects the display into the “air” in front of you as if on a giant monitor. I’m not suggesting you run out and buy one. But I think the idea deserves development.

Founded by Magic Leap alums, Sightful is focused on creating a “daily use case” for AR.

By Patrick Kulp

The maker of a new laptop device is trying to expand the size of your computer screen by ditching the monitor display altogether.

The startup Sightful, which announced itself last week with $61 million in initial funding, is rolling out what it bills as the world’s first augmented reality (AR) laptop of its kind.

The $2,000 Spacetop device is essentially the lower half of a laptop with a mounted webcam and a pair of glasses attached by a cord, through which users can see a 100-inch projection of a workspace screen.

Sightful’s co-founders said the goal is to establish a natural daily use case for AR technology at a time when data shows that people generally don’t yet see a need for such extra dimensionality at work. The setup is designed to expand the workspace display with relatively compact hardware for the segments of the workforce that have remained largely remote since the onset of the pandemic.

“The reason people are not using AR and VR is because there’s no reason for them to use it,” Sightful CEO and co-founder Tamir Berliner said.

Three Product Announcements From Honeywell User Group 2023

It will take me a few days to compile all my notes from a busy two days at HUG 2023, the annual Honeywell User Group conference this week in Orlando. Rather than a couple of hours of presentations with maybe one or two interviews like many conferences, we had two days of presentations and 1:1 interviews. Well organized and a lot of information. Of course, that means Honeywell has been active in product development.

Digital twins exist not only in discrete manufacturing. The first product covered is an extension of the Honeywell digital twin called Digital Prime. I’ll have more to add after digesting my notes from the conversation with the team leader. Interesting (to me, anyway, having just purchased a Hyundai Ioniq 6 EV) is a new battery energy storage product called Ionic. The final product is the Upstream Production Performance Suite, which includes interesting new sensing.

CLOUD-BASED DIGITAL TWIN

Honeywell introduced the Digital Prime solution, a cloud-based digital twin for tracking, managing, and testing process control changes and system modifications. Digital Prime is a cost-effective solution that allows users to test frequently for more accurate results while reducing reactive maintenance overall.

Digital Prime offers the highest level of quality control through an efficient and collaborative solution for managing changes, running factory acceptance tests and improving project execution and training without having to disrupt the production system. The platform can be used by companies in industries such as Oil & Gas, sheet manufacturing, and chemicals to test modifications during planned shutdown periods to reduce rework.

Most solutions require dedicated hardware, can be vulnerable to security breaches, and don’t stay current with live operations. Digital Prime addresses these challenges by providing a “lab system as a service” that continually updates to reflect changes in the production environment, providing a dependable digital twin.

Digital Prime is a collaborative ecosystem with secure cloud-based connectivity, a virtual engineering platform, and built-in security protection. The digital ecosystem can be accessed by users across the globe through its subscription service using multi-factor authentication, enabling customers to standardize across the enterprise.

HONEYWELL IONIC, A MODULAR BATTERY ENERGY STORAGE SYSTEM

Honeywell Ionic is a compact, end-to-end modular battery energy storage system (BESS) and energy management tool that offers improved energy density compared to what’s currently available on the market, while delivering a significant reduction in installation costs.

Built around lithium-ion battery cells, the design emphasizes flexibility and futureproofing. Honeywell Ionic includes Honeywell’s Experion Energy Control System and a chemistry-agnostic Battery Management System (BMS). Experion helps users manage and optimize energy use by improving uptime, enabling peak shaving, and providing the ability to create a virtual power plant. The BMS provides insight into performance at the cell level and can be reconfigured as battery chemistries advance, insulating the end user from future supply-chain risks.

Due to its structural design and compact dimensions, the modular architecture can be deployed with a standard forklift and simpler site preparation, which significantly reduces installation costs. The modular design also provides higher energy storage per square foot, scalable from approximately 700 kilowatt hours (kWh) to 300 megawatt hours (MWh) of energy capacity in a standard configuration and lasting up to two hours at the maximum rate of discharge. This gives customers a lower installed cost per kilowatt hour. The modular approach allows customers to incrementally add storage capacity at their own pace, without having to invest in more capacity than they need.
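To put the scaling in perspective, here is a minimal Python sketch of my own illustrative arithmetic (not a Honeywell sizing tool): it estimates how many ~700 kWh building blocks a site would need for a target capacity, and the maximum discharge power implied by the two-hour figure.

```python
import math

# Illustrative assumptions based on figures in the announcement:
# each modular building block stores roughly 700 kWh, and the system
# can discharge its full capacity in as little as two hours.
BLOCK_KWH = 700          # approximate energy per forklift-able block
MAX_DISCHARGE_HOURS = 2  # duration at maximum rate of discharge

def size_system(target_kwh: float) -> tuple[int, float, float]:
    """Return (blocks needed, installed kWh, implied max kW) for a target capacity."""
    blocks = math.ceil(target_kwh / BLOCK_KWH)
    installed_kwh = blocks * BLOCK_KWH
    max_kw = installed_kwh / MAX_DISCHARGE_HOURS
    return blocks, installed_kwh, max_kw

# Example: a 10 MWh site works out to 15 blocks (10,500 kWh installed),
# able to deliver roughly 5,250 kW at the maximum discharge rate.
print(size_system(10_000))
```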

Honeywell Ionic BESS also offers a sophisticated set of safety capabilities, including three layers of the latest-generation Battery Management Systems (BMS) ensuring battery health and system safety. Additionally, it can be configured with Li-Ion Tamer, which further improves the safety of Li-ion batteries by sensing the precursors of thermal runaway before it occurs.

Features of Honeywell Ionic include:

  • Scalable and flexible ~700kWh building block for a forklift-able installation, significantly reducing the cost and time of installation
  • Battery module is designed to accommodate cells from multiple battery suppliers
  • Battery management system supports all battery types and chemistries, helping futureproof the core architecture of the system and facilitating system augmentation
  • Liquid cooling to help equalize battery temperatures through charge and discharge cycles, prolonging battery life
  • 1500VDC battery stack to maximize power conversion efficiency
  • DC augmentation support

UPSTREAM PRODUCTION PERFORMANCE SUITE

Honeywell also introduced the Upstream Production Performance Suite (UPP Suite), an end-to-end solution that automates and digitalizes operations from the wellhead to the control room.

The solution is offered in three tiers – Lite, Supreme and Ultimate, providing a variety of control and monitoring options for operators.

Lite offers users a basic sensing and monitoring solution, complete with Honeywell Versatilis sensors and software. Supreme is a standard solution with process enhancement that adds a control and safety system, advanced process control, and asset performance monitoring. Ultimate offers users the most comprehensive solution, adding visual analytics, cyber and network security, partner solutions, and asset performance monitoring simulation.

Honeywell’s UPP Suite solution is available now. 

Open-Source Software for Miniature Quantum Computers

I don’t know when we’ll begin using quantum computing in industrial applications, but heck, we’re already beginning to see ChatGPT usage. This news marks an advance in the quantum computing state of the art.

  • The Quantum Brilliance Qristal SDK moves from beta into broad release for developing on-premise and edge applications for compact, room-temperature quantum accelerators

Quantum Brilliance, the leading developer of miniaturised, room-temperature quantum computing products and solutions, announced June 8 the full release of the Qristal SDK, an open-source software development kit for researching applications that integrate the company’s portable, diamond-based quantum accelerators.

Previously in beta, the Quantum Brilliance Qristal SDK is now available for anyone to develop and test novel quantum algorithms for real-world applications specifically designed for quantum accelerators rather than quantum mainframes. Potential use cases include classical-quantum hybrid applications in data centres, massively parallelised clusters of accelerators for computational chemistry and embedded accelerators for edge computing applications such as robotics, autonomous vehicles and satellites.

“With enhancements based on input from beta users, the Qristal SDK allows researchers to leverage quantum-based solutions in a host of potential real-world applications,” said Mark Luo, CEO and co-founder of Quantum Brilliance. “We believe this powerful tool will help organizations around the world understand how quantum accelerators can enable and enhance productisation and commercialisation.”

Qristal SDK users will find fully integrated C++ and Python APIs, NVIDIA CUDA features and customizable noise models to support the development of their quantum-enhanced designs. The software also incorporates MPI, the global standard for large-scale parallel computing, allowing users to optimise, simulate and deploy hybrid applications of parallelised, room-temperature quantum accelerators in high-performance computing (HPC) deployments from supercomputers to edge devices.
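The press release doesn’t show code, but as a rough sketch of what an MPI-parallelised hybrid workload can look like, here is a minimal mpi4py example. The `run_on_accelerator` function is a hypothetical placeholder for whatever call the Qristal SDK actually exposes; only the MPI calls are real.

```python
# Minimal sketch of a parallelised hybrid workload using MPI (mpi4py).
# run_on_accelerator() is a hypothetical placeholder, NOT the actual
# Qristal SDK API, for executing a circuit on a node-local accelerator.
from mpi4py import MPI

def run_on_accelerator(circuit: str, shots: int = 1024) -> dict:
    # Placeholder: pretend the local quantum accelerator ran the circuit
    # and returned measurement counts.
    return {"00": shots // 2, "11": shots - shots // 2}

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Each MPI rank takes a slice of a parameter sweep and runs its circuits
# on the accelerator attached to its node.
parameters = list(range(32))
my_params = parameters[rank::size]
my_results = [run_on_accelerator(f"circuit(theta={p})") for p in my_params]

# Gather all partial results back to rank 0 for classical post-processing.
all_results = comm.gather(my_results, root=0)
if rank == 0:
    print(f"collected results from {size} ranks")
```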

Quantum Brilliance’s quantum systems use synthetic diamonds to operate at room temperature in any environment. Unlike large mainframe quantum computers, Quantum Brilliance’s small-form devices do not require cryogenics, vacuum systems or precision laser arrays, consuming significantly less power and enabling deployment onsite or at the edge.

The hardware is currently the size of a desktop PC; the company is working to further miniaturise its technology to the size of a semiconductor chip that can be used on any device, wherever classical computers exist today, unlocking practical quantum computing for everyone. The Qristal SDK source code can be downloaded here. The source code includes extensive application libraries for VQE, QAOA, quantum machine learning, natural language processing and more.

Nokia and EY Study Reveals Enterprise and Industrial Metaverse Exceed Expectations

Nokia and EY released the results of The metaverse at work, a survey of 860 business leaders from six countries soliciting their thoughts on applying the “metaverse” to their businesses. In short, many are already using some form of it, and many expect business benefits.

  • Across use cases, early metaverse adopters report benefits more often than companies still in the planning phase, with CAPEX reduction (15%) and sustainability (10%) showing the largest differences
  • Companies believe in the power of the metaverse – only 2% of respondents see the metaverse as a buzzword or a fad
  • The industrial metaverse is creating substantial business value – 80% of early adopters say use cases tested will have a significant or even transformative impact
  • US and UK lead in terms of actual experience – 65% and 64% of respondents, respectively, had piloted or fully deployed at least one industrial or enterprise metaverse use case, while Asia-Pacific is less advanced (Japan, 49%; South Korea, 49%)
  • Cloud computing (72%), AI/ML (70%), and network connectivity, including private 5G/6G (70%), fiber broadband (68%), and public 5G/6G (67%), are seen as the most important technical enablers of metaverse use cases
  • Enterprises saw the highest potential in the use of extended reality for training to onboard and upskill the workforce, while three out of the four industries surveyed chose the use of virtual R&D to enhance product design and processes.

I concur that metaverse use cases around training show the most promise. This assumes close linking to the “digital twin.” I can also see use cases in product and machine design.

When I probed into industrial use cases, the only surprising thing was the anticipated benefit from “predictive maintenance.” As fate would have it, I interviewed Siemens Digital VP of Industrial Machinery Rahul Garg at the Siemens Digital customer conference and media/analyst symposium. So, I asked him.

We discussed, first of all, that the metaverse need not imply only the use of opaque ski goggles; the displays can be on normal screens. The idea is that the metaverse provides more than just a better display. It is enhanced through diagnostics, deep dives into the drawing data, and historical trends. In addition to use by operations staff, maintenance and service staff can more easily find root causes and other diagnostics, leading to faster time to repair.

Predictive maintenance is more historical—say, looking at a tool or component over time and noticing trends and events. Knowing that increased chatter in a machining tool, no matter how minute, or heat and vibration in a bearing will lead to breakdown, the metaverse in this case can help technicians fix something before it breaks.
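As a generic illustration of that trend-watching idea (my own sketch, not any vendor’s API), here is a minimal check that flags a bearing for maintenance when a rolling average of vibration readings drifts past a threshold.

```python
from collections import deque

def needs_maintenance(readings, window: int = 10, threshold: float = 4.5) -> bool:
    """Flag a component when the rolling mean of recent vibration readings
    (e.g., mm/s RMS) exceeds a threshold -- a stand-in for the kind of trend
    a technician would watch in a digital-twin view."""
    recent = deque(maxlen=window)
    for value in readings:
        recent.append(value)
        if len(recent) == window and sum(recent) / window > threshold:
            return True
    return False

# Example: a slow upward drift in vibration eventually trips the flag.
history = [3.0 + 0.05 * i for i in range(60)]
print(needs_maintenance(history))  # True once the trend crosses 4.5
```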

Vincent Douin, Executive Director, Business Consulting and Business Transformation, Ernst & Young LLP, said: “The industrial and enterprise metaverses are here, this study shows the clear appetite for these technologies such as extended reality and digital twins to achieve business goals. We are already seeing many organizations going above and beyond the planning stages and recognizing tangible benefits from their initial implementations.”

Thierry E. Klein, President of Bell Labs Solutions Research, Nokia, said: “It is great to see that companies clearly believe in the power of the Metaverse for business value creation in both enterprise and industrial use cases. This strongly aligns with our vision, informed by more than 8 years of research at Nokia Bell Labs, that the Industrial Metaverse is an extension of Industry 4.0. Consequently, those who have already implemented mission-critical communications networks for Industry 4.0 are now well placed to experience the benefits of the Metaverse that clearly some companies are already seeing.”

The metaverse is defined in the report as a fusion of the digital and physical worlds. The enterprise metaverse is driven by demand for better digital collaboration and communication tools. It will envelop the core productivity applications that make business function and allow for the next generation of virtual connections. The industrial metaverse is characterized by physical-digital fusion and human augmentation focused on industrial applications; this includes digital representations of physical industrial environments, systems, processes, assets, and spaces that participants can control, monitor and interact with.

Apple Vision Pro–Useful for Industrial and Manufacturing Applications?

Welcome to the era of spatial computing, where digital content blends seamlessly with your physical space. So you can do the things you love in ways never before possible. This blurb came from Apple PR’s write-up of the new Vision Pro—the long-awaited AR/VR headset.

Apple made no mention of the “M” word. “Spatial computing” is what they called it instead.

It is a headset. Even though they say that with previous Apple products you looked at the glass and with this one you look through the glass—that is not what it is. You do not actually see through the glass as you do with the Microsoft HoloLens. There are many cameras, and a couple of them send the outside world to the screens (which look like eyeglass lenses) inside the headset. This is a typical case of great Apple hardware engineering and design.

Still…

What is the problem being solved?

Apple didn’t really answer that. What they did was throw out a great piece of hardware, an operating system (visionOS), and a bunch of ideas. Developers will figure out what problems they’d like to solve with this product.

I’m still thinking, but from an industrial/manufacturing point of view I don’t see any new applications. Simulation with a digital twin for training. Perhaps remote maintenance and troubleshooting. Simulation along with design in order to see the product being designed and perhaps catch interferences and other gotchas at an early stage of design.

I have worn HoloLens as an operator interface device. I doubt that this would ever be a viable alternative.

Some people, such as MG Siegler (see link below), see this as a device to consume media. Much is made of the great display capabilities replacing your computer monitors. But I ask…

Do you want your screen attached to your face?

The promo emphasized collaboration with cool “real” avatars of people in the meeting and the ability (?) to see both people and the presentation. I’m not turned on by that.

They also showed 3D visualization and photography. Is that really useful? Maybe to the dad shooting 3D images of his kids—but I always wonder how much you miss out on being present in the moment rather than videoing events. And how often will you actually go back and watch?

Ideas? Send me a note. Right now, will I rush out and spend $3,500 to buy one? I think that if I have that much money lying around to burn, I’ll take a vacation to Europe or South America.

Vision Pro links.

M.G. Siegler, 500ish Blog—Apple’s history, Compute, Collaborate (iPhone, iPad), Consume (Vision Pro)

Another Podcast, Benedict Evans and Toni Cowan-Brown.

Accidental Tech Podcast, John Siracusa, Marco Arment, Casey Liss.

And, most thoroughly, a long report of personal experience with the Vision Pro from John Gruber at Daring Fireball.

Other thoughts on the “metaverse” in general I’ve posted over the past year:

My podcast.

Metaverse Solutions, interview with GridRaster

Open Metaverse Foundation

Initial Thoughts on Industrial Metaverse

Xaba and Rolleri Partner to Develop a Cognitive Autonomous Cobot Workcell

This press release, just out today, looked interesting, but it also hit some of my anti-hype hot buttons. First, most automation news coming my way lately concerns robotics and, more specifically, cobots—collaborative robots. The “collaborative” may be interpreted in one of several ways. Today’s news concerns a collaboration (ahem) of Xaba, developer of xCognition, billed as the first AI-driven robotics and CNC machine controller, and Rolleri Holding SpA, focused on the development of a cognitive, autonomous collaborative robot (cobot) workcell for welding operations in manufacturing. The collaboration enables the integration of xCognition with Rolleri Robotic cobots.

So, I had to tackle “first” AI-driven robotics and CNC machine controller, and then “cognitive.” I asked for some technical backup. This from Massimiliano Moruzzi, CEO of Xaba.

Yes, we have had AI, ML, and neural nets for many years, but 95% of the AI in manufacturing belongs to two classes: predictive analytics, mainly to support maintenance, and object detection and recognition, in essence vision systems. What we have developed at Xaba is not another AI predictive analytics or computer vision application like those that have flooded the manufacturing industry in the last few years.

Our xCognition technology is composed of two critical and unique AI-powered modules:

  • A proprietary physics-informed machine learning model to truly model the real physics of any robotics system, in essence its elasto-mechanical-dynamic behaviour.
  • A proprietary large language model to enable any robotics system to auto-generate its own programs and tasks. Our LLM is unique compared to other LLMs because of our patent-pending technology that enables the creation of the “Manufacturing Domain Experts Data Corpus,” which is absolutely needed to train any LLM about manufacturing processes (welding, assembling, drilling, polishing, cutting…) and how to automate program generation.

That is why our xCognition is considered a cognitive brain for robotics: it has both critical components of any brain: a) the deep side of the brain, in essence the part that controls body motion and task execution, and b) the cortex side of the brain, in essence the capacity to receive instructions from humans or sensors and auto-generate the program for any desired task to be executed.
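Xaba has not published its model internals, so as background only, here is a minimal, generic sketch of the physics-informed idea: a network is penalized both for mismatching measured data and for violating an assumed dynamics equation (here, a toy mass-spring-damper system). None of the constants or equations below come from Xaba.

```python
import torch

# Generic physics-informed loss sketch (not Xaba's model): fit a network that
# predicts a deflection x(t) while also penalizing violation of an assumed toy
# mass-spring-damper equation m*x'' + c*x' + k*x = f(t).
m, c, k = 1.0, 0.3, 5.0  # assumed toy constants, not real robot parameters

net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1)
)

def physics_residual(t: torch.Tensor) -> torch.Tensor:
    """Residual of the assumed dynamics; ~0 where the network obeys the physics."""
    t = t.requires_grad_(True)
    x = net(t)
    dx = torch.autograd.grad(x.sum(), t, create_graph=True)[0]
    ddx = torch.autograd.grad(dx.sum(), t, create_graph=True)[0]
    f = torch.sin(t)  # assumed external forcing term
    return m * ddx + c * dx + k * x - f

t_data = torch.rand(64, 1)   # times where (synthetic) measurements exist
x_data = torch.sin(t_data)   # placeholder measurements
t_phys = torch.rand(256, 1)  # collocation points for the physics term

# Total loss = data mismatch + physics violation; a training loop would
# repeat this with an optimizer step.
loss = torch.mean((net(t_data) - x_data) ** 2) \
     + torch.mean(physics_residual(t_phys) ** 2)
loss.backward()
print(float(loss))
```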

From the press release:

With Xaba’s xCognition, any industrial robot can be empowered with both deep and cortex intelligence, enabling it to fully control its body and understand its environment using sensor data such as images, sounds, temperatures, and accelerations.

Xaba and Rolleri recently completed ISO 9283 tests in Xaba’s robotic lab. A FARO Vantage Laser Tracker System was used to acquire all data needed to train the xCognition machine learning model and to validate trajectory accuracy improvements. The successfully completed tests showed 10 times performance improvements in absolute positioning and trajectory accuracy, and five times improvements in relative positioning and trajectory accuracy.
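For context, ISO 9283 positioning accuracy is essentially the distance between a commanded position and the barycenter of the positions the robot actually reaches over repeated cycles. Here is a minimal sketch of that calculation from laser-tracker-style measurements; the numbers are illustrative only, and the full test procedure uses more poses and cycles than shown.

```python
import numpy as np

def positioning_accuracy(commanded: np.ndarray, attained: np.ndarray) -> float:
    """ISO 9283-style positioning accuracy: distance between the commanded
    point and the barycenter (mean) of the attained points, in the same
    units as the measurements (e.g., mm from a laser tracker)."""
    barycenter = attained.mean(axis=0)
    return float(np.linalg.norm(barycenter - commanded))

# Illustrative numbers only: a commanded pose and five measured returns.
commanded = np.array([500.0, 200.0, 300.0])  # mm
attained = commanded + np.array([
    [0.42, -0.15, 0.30],
    [0.38, -0.10, 0.28],
    [0.45, -0.18, 0.33],
    [0.40, -0.12, 0.29],
    [0.41, -0.14, 0.31],
])
baseline = positioning_accuracy(commanded, attained)
# A "10x improvement" would mean the same measurement lands near baseline / 10.
print(f"accuracy error: {baseline:.3f} mm, 10x target: {baseline / 10:.3f} mm")
```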

As a follow-up to the initial tests, Xaba and Rolleri will be undertaking TIG and laser welding tests to further validate welding quality improvements such as improved accuracy and repeatability.
