by Gary Mintchell | Mar 18, 2025 | Robots
Teradyne is a conglomerate that has been snarfing up collaborative robotics assets, many based on Universal Robots technology. This news expands AI-powered applications. Of course, AI is the entry-point phrase for any new technology currently. That’s OK; it is an advancement in usability, and that’s a good thing.
Teradyne Robotics and its partners are set to unveil a suite of advanced, AI-driven robotics solutions at NVIDIA GTC 2025 March 17-21. The unveiling represents the first public demonstration of the AI Accelerator in commercially viable applications.
“Physical AI equips robots with the capacity to perceive and respond to the real world, providing the versatility and problem-solving capabilities that are often required by complex use cases that have been out of scope until now,” says James Davidson, Chief AI Officer, Teradyne Robotics. “Instead of merely executing pre-programmed instructions, robots empowered by AI gain the ability to learn, adapt, and make informed decisions grounded in their sensory input,” he says.
Davidson adds, “Think of a logistics operation where robots navigate through a warehouse with constantly changing layouts and obstacles. Or a construction site where robots assist in assembly tasks within unpredictable and changing environments. The AI Accelerator helps our cobots better understand their surroundings, plan optimal paths, and execute tasks safely and efficiently in previously unmanageable spaces.”
The AI Accelerator is a toolkit designed by Teradyne Robotics company Universal Robots (UR) in collaboration with NVIDIA to enable the development of AI-powered applications by bringing AI acceleration to UR’s next-generation software platform, PolyScope X. The toolkit is powered by NVIDIA Isaac accelerated libraries and AI models, running on the NVIDIA Jetson AGX Orin system-on-module.
Partner displays:
- 3D Infotech – Dynamic Metrology: Showcasing a UR3e cobot that scans workpieces, compares them with CAD models, and highlights dimensional inaccuracies by projecting them onto the workpiece surface. The AI Accelerator enhances the perception capabilities of the cobot, allowing it to more accurately locate the workpiece for inspection.
- T-Robotics – GenAI Driven Programming: Demonstrating a UR5e cobot with LLM-driven CNC machine tending. With T-Robotics ActGPT, users can describe an application setup in natural language and have it translated into a robot program with the help of the AI Accelerator.
- AICA – Reinforcement Learning Assembly: Featuring a UR5e cobot executing single-arm gear assembly using reinforcement learning. The cobot locates a part using AI Accelerator-based perception and then uses a reinforcement learning skill to complete a contact-rich assembly process.
- Acumino – Bimanual Assembly: The AI Accelerator facilitates a bimanual UR5e cobot’s ability to learn complex manipulation tasks from human demonstrations. The system completes an electrical cable handling task and attendees can collaborate with the robot to solve the task.
- Groundlight – Workpiece Detection for Streamlined Picking: The AI Accelerator trains a model to detect a workpiece and generate a robot program for picking, then verifies that the robot workspace is ready, ensuring a robust and easily deployed solution.
While the AI Accelerator is primarily focused on UR cobots, the underlying AI and accelerated computing technology also benefit Mobile Industrial Robots (MiR), another Teradyne company featured at GTC. The MiR1200 Pallet Jack uses 3D vision and the NVIDIA Jetson AGX Orin module for pallet detection, allowing the pallet jack to identify, pick up, and deliver pallets with precision in complex environments.
by Gary Mintchell | Mar 11, 2025 | Robots
The development of collaborative robots in the Odense, Denmark area spurred important advances in end-of-arm tooling (end effectors). OnRobot became a leading developer of these tools. This new release marks OnRobot’s highest-payload gripper to date, with intelligent control and seamless integration for palletizing and handling.
OnRobot’s new VGP30 vacuum gripper, with a robust 30 kg (66 lb) payload, is designed to excel at palletizing boxes and handling irregular shapes and porous surfaces, even those constructed from cost-saving, thinner cardboard. The VGP30 is ready for immediate deployment out of the box and includes all the hardware and software needed for all leading robot brands.
Key features and benefits of the VGP30:
- High Payload: Capable of handling up to 30 kg (66 lbs.).
- Immediate Deployment: Ready to use out of the box, including all necessary hardware and software for seamless integration with all leading robot brands.
- Intelligent Vacuum Control: Automatically adjusts to any box size or interlayer, optimizing air consumption and reducing energy costs.
- Multichannel Capability: Provides failsafe and flexible operations. The VGP30 has two channels that can be operated together or independently, convenient for handling small boxes.
- Seamless Integration with D:PLOY: The VGP30 seamlessly integrates with OnRobot’s D:PLOY platform, the first automatic software development platform that enables the configuration of complete, off-the-shelf robotic systems for high-mix manufacturing applications.
by Gary Mintchell | Feb 17, 2025 | Robots, Safety
It seemed as if technology and applications for industrial robotics had stagnated for quite some time. The last few years have witnessed advances on many robotic fronts. It seems fitting that updated industrial robot safety standards are being released.
The Association for Advancing Automation (A3) has announced the conclusion of eight years of work with the publication of the revised ISO 10218, the global flagship standard for industrial robot safety. In the standard’s first major revision since 2011, the new documents offer a leap forward in ensuring the safety of robotics in industrial environments.
The two main areas of concern include Safety Requirements for Industrial Robots (Manufacturers) and Safety Requirements for Industrial Robot Applications and Robot Cells (System Integrators).
Here’s what’s new:
- Clarified functional safety requirements for easier compliance and reduced risk
- First-time inclusion of cybersecurity standards for industrial robot safety
- Simplified guidance for collaborative robots (previously separate in ISO/TS 15066)
- New robot classifications & test methodologies
“Working alongside hundreds of global experts, A3 played a pivotal role in shepherding this update to publication, to refine safety requirements in response to evolving automation technologies and workplace demands,” said Jeff Burnstein, A3 president. “This effort reflects A3’s ongoing commitment to enhancing robotic safety and supporting the widespread adoption of automation.”
The new ISO 10218 Parts 1 and 2 feature extensive updates that focus on making functional safety requirements more explicit rather than implied. This shift enhances clarity and usability, making compliance more straightforward for manufacturers and integrators alike.
In North America, ISO 10218 had been previously adopted as ANSI R15.06 in the United States and CSA Z434 in Canada. Work is underway to adopt the new 10218 in both jurisdictions with new versions of R15.06 and Z434 expected to be released later this year.
Key Updates in ISO 10218 (2025)
ISO 10218 consists of two parts:
- Part 1: Safety Requirements for Industrial Robots (Manufacturers)
- Part 2: Safety Requirements for Industrial Robot Applications and Robot Cells (System Integrators)
The key updates include:
- Clarified functional safety requirements that offer more precise safety guidelines to enhance compliance and risk mitigation.
- Integrated safety requirements for collaborative robot applications that consolidate the previously separate ISO/TS 15066.
- Incorporated safety guidance for manual load/unload procedures and end-effectors (sometimes called end-of-arm tooling or EOAT) from previously separate technical reports (TR 20218-1 and TR 20218-2).
- New robot classifications with corresponding functional safety requirements and test methodologies.
- Cybersecurity requirements pertaining to industrial robot safety.
Carole Franklin, director of standards at A3 Robotics, emphasized the significance of these updates: “With automation evolving at an unprecedented pace, it is essential that safety standards keep up with the latest advancements. This is a critical step in ensuring that as automation grows, worker safety remains a top priority. These revisions provide clearer guidelines and new classifications that will help manufacturers and system integrators implement the latest technology for safer robotic solutions.”
The 2025 edition of ISO 10218 is now available for purchase to U.S. customers through A3. Companies and professionals looking to stay compliant with the latest safety requirements can acquire the standard through the A3 webstore. Pricing starts at $244 USD, with options available for digital bundles.
by Gary Mintchell | Feb 5, 2025 | Robots, Software
No-Code tools appear in many applications. I’ve written about a software company with no-code tools talking with a customer about how easy (on a relative scale, of course) it was to develop new input forms for an added use case.
No-code has now invaded the robot interface arena. This from ABB.
- Unique software tool features drag and drop functionality for easy creation of customized operator interfaces for ABB robots and cobots
- Intuitive functionality and collaborative cloud-based library for application templates reduce set-up times by up to 80 percent
- No-code programming lowers the barriers to automation for beginners and experts
ABB launched AppStudio, an intuitive no-code software tool designed to empower users of all experience levels to quickly and easily create customized robotic user interfaces. With intuitive functionality and features including a collaborative cloud-based library enabling users to share application templates, AppStudio will reduce setup times by up to 80 percent.
“A growing shortage of skilled labor requires the further simplification of automation and programming, especially among small and medium size enterprises (SMEs) where complexity is seen as a major barrier to implementing robotic automation,” said Marc Segura, President, ABB Robotics Division. “Designed for novices and experts alike, AppStudio is an exciting addition to ABB’s current software offering. By making it easier to create robot interfaces, it will save users significant time on setup and allow for fast and seamless robot integration across diverse applications.”
Compatible with all ABB robots on the OmniCore controller platform, AppStudio offers unprecedented flexibility and ease for creating customized robotic user interfaces. After installing the software, users can repurpose a previously used setup or select from a cloud-based library of templates, models, modules, and examples, with options available in twenty languages. Alternatively, customized interfaces can be created to fit any device and application, including the OmniCore FlexPendant, tablets, and mobile phones.
AppStudio also supports customers migrating from the IRC5 controller to the OmniCore platform, significantly reducing the time needed to rebuild interfaces from days to minutes. This feature ensures a smooth transition to the latest technology, enhancing efficiency and productivity.
With intuitive drag-and-drop functionality, AppStudio offers simplified configuration where users can add icons, dropdown menus, buttons, and other functions to tailor the interface to their specific needs. It also supports more advanced users, who can build custom interface elements using a JavaScript-based component kit, enabling a high degree of customization. Once developed, these elements can be shared with other users for streamlined project collaboration.
Once an interface has been created, it can be deployed to the ABB robot or a digital twin in ABB’s RobotStudio programming tool. Through this connection, the robot can be programmed to carry out specific commands, such as performing an action or opening a gripper.
Together with ABB’s Wizard Easy Programming tool and RobotStudio, AppStudio makes the ABB robot portfolio one of the easiest to program, marking the start of a new chapter that will see robots being opened to a new audience of potential users.
AppStudio is available as a free download from ABB’s website.
by Gary Mintchell | Jan 31, 2025 | Robots
I’ve been watching developments of robotics and peripherals with anticipation. Engineers continually get closer to mechanisms that can make life so much better for handicapped or aged people. Not to mention extensive use cases in manufacturing.
This news concerns a company emerging from stealth with a hand exoskeleton offering exciting sensory input, along with its own operating system (as Alan Kay put it, people who are really serious about software should make their own hardware).
Haptikos (Sunnyvale, CA and Athens) emerged from stealth January 27 at the 1st Annual MIT Experiential Innovation Event in Cambridge, MA. The company specifically announced a groundbreaking new hand exoskeleton, which is part of the Haptikos ecosystem of integrated hardware and software (priced at $2,500 per evaluation unit and available singly as well), with the eventual goal of a sub-$1000 final price point by 2026 when the final production units ship.
Paired with the company’s AI-infused Haptik_OS (immediately available for license), the exoskeleton and OS combination brings a complete sense of touch to VR and AR apps and use cases. Haptikos exoskeleton prototypes are already in use by leading companies such as Siemens and Leonardo and are designed for markets ranging from medical and defense to robotics. Whether it’s a single drop of water or the pulse of a heartbeat, Haptikos makes every virtual interaction deeply immersive and allows users to “touch tomorrow, today.”
Haptikos Exoskeleton:
The exoskeleton prototype makes it possible to feel the distinct sensations of different materials, from smooth surfaces to intricate textures. Whether handling virtual objects in training or design, Haptikos makes digital environments feel tangible and lifelike, enhancing overall realism. With a rise time of 12 ms and a fall time of 55 ms, Haptikos ensures every touch and interaction feels immediate and smooth.
The Haptikos hand exoskeleton has 24 degrees of freedom (DoF) per hand, with sub-millimeter motion accuracy and eight hours of continuous usage that is unmatched in the industry, even by long-shipping commercial alternatives. With control over every joint, users are assured of lifelike movements, from the most delicate gestures to the most complex actions.
Included in the exoskeleton are:
- Tracking sensors
- Haptic sensors
- Kinaesthetic sensors (to be added later in 2025)
Through these sensors, Haptikos tracks the movement of each hand joint with incredible detail. This means full, natural control over every motion, from finger curls to complex hand gestures, making digital interactions feel fluid and intuitive. With sub-millimeter precision, Haptikos captures even the smallest user motions.
This high level of accuracy ensures that every gesture, no matter how subtle, is translated into the virtual world with fidelity.
HAPTIK_OS:
The haptik_OS app is the central hub for connecting Haptikos exoskeletons, seamlessly integrating their capabilities into projects and motion control and redirecting this data into the Unity Software Development Kit (SDK) with ease.
The haptik_OS is itself at the core of the Haptikos ecosystem, connecting developers, designers, and end-users to create lifelike, intuitive and immersive experiences. Through built-in AI functionality, haptik_OS automatically adapts pre-built libraries into new haptic XR applications in Unity, simplifying both integration and scaling.
The haptik_OS is available immediately via three tiers of license:
- Freemium: Basic tools for developers to start exploring.
- Basic Tier: SDK for integration into applications.
- Pro Tier: Full access to AI-driven tools for automated customization.
Quotes:
“Haptikos exoskeletons have expanded the training program for the postpartum balloon procedure,” said Dr. Aoife McEvoy, Specialist Registrar & Clinical Tutor in Obstetrics & Gynaecology, National Maternity Hospital and UCD School of Medicine. “Both clinicians and students experienced a significant improvement in their procedural skills following the use of Haptikos. The realistic tactile feedback and precise control enabled by Haptikos allowed for a more immersive and effective learning experience, greatly enhancing trainee confidence levels in their skillset for this critical procedure.”
“Haptikos really impressed the entire MIT Innovation Technology team with their advanced haptics, and we are extremely pleased they saw the value of launching at our premiere event in tandem with the MIT Reality Hack,” said Maria Rice, Executive Director of the MIT Reality Hack. “Haptics have been a long-overlooked component of a complete XR solution, and their combination of haptics, operating system and AI combined with the industry-leading Unity SDK is a win for all developers seeking the next XR frontier.”
“We are extremely excited to finally be emerging from stealth in the hallowed halls of MIT, where their slogan of ‘mens et manus’ – Latin for ‘mind and hand’ – has taken on a new and very literal meaning,” said Greg Agriopoulos, CEO and co-founder of Haptikos, Inc. “Competitive haptics solutions are normally glove-based, and never fit correctly on most hands – our exoskeleton fits everyone perfectly, is far more accurate, far less costly and is the only solution coupled with an actual software solution that makes it genuinely useful across a range of markets.”
by Gary Mintchell | Jan 21, 2025 | Robots, Technology, Workforce
I receive the Peter Diamandis Abundance newsletter. He’s an over-the-top optimist—but we need a dose of that in these pessimistic times. (In the “it’s a small world” category, a daughter of a couple who regularly attend a morning coffee group with me in Elgin, IL works for him.) He recently included a link to a “Metatrend Report” on humanoid robots.
Reports of robotic advances targeting human assistance have trickled my way and piqued my interest. This seems to me to be a great field for some of our best robotic engineering minds.
This is from Diamandis’s report.
I was compelled to create this Metatrend report because the coming wave of humanoid robots will have a vast impact on society that is widely underappreciated. It will transform our lives at home and work.
How Many?: In my conversations with Elon Musk, Brett Adcock, Cathie Wood, and Vinod Khosla, the predictions on how many humanoid robots we will have working alongside us by 2040 are shocking. At the lowest bound, the number is 1 billion (which is more than the number of automobiles on Earth) and at the upper bound, proclaimed by Musk and Adcock, the number will exceed 10 billion.
How Much?: But equally impressive as the sheer number of robots is the price point, predicted to be between $20,000 and $30,000, which translates to a leased cost on the order of $300 per month for a robot helper working 24 hours per day, 7 days per week.
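The report’s lease figure holds up under simple straight-line arithmetic. The sketch below assumes a $25,000 midpoint price amortized over an 84-month (seven-year) term with no financing or maintenance costs; both the term and the zero-interest assumption are illustrative, not stated in the report.

```python
# Back-of-the-envelope check of the report's ~$300/month lease figure.
# Assumptions (not from the report): midpoint price, 84-month term, no interest.
price_low, price_high = 20_000, 30_000
midpoint = (price_low + price_high) / 2   # $25,000
term_months = 84                          # assumed 7-year amortization
monthly = midpoint / term_months
print(f"${monthly:.0f}/month")            # roughly $298/month

# At 24/7 operation (~720 hours/month), the cost per robot-hour is tiny.
hours_per_month = 24 * 30
print(f"${monthly / hours_per_month:.2f}/robot-hour")
```

Even under these rough assumptions, a round-the-clock robot helper would cost well under $0.50 per working hour, which is the context that makes the "$300 per month" figure striking.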
Why Now?: The first question to ask is why now? Why are we seeing such an explosion of activity in the humanoid robot field now? Beyond any single technical advancement, the convergence of 5 major technological areas is super-charging this field: multimodal generative AI, high-torque actuators, increased compute power, enhanced battery life, and cameras and tactile sensors.
This, in combination with AI voice recognition, is transformative: As Brett Adcock recently told me, “We can literally talk to our robot and it can implement the tasks you request — the end-state for this is you really want the default UI to be speech.”
Impact on Jobs: Naturally, the prospect of billions of humanoid robots raises questions about their impact on jobs and society. According to Adcock: “Our goal is to really be able to do a lot of the jobs that are not desirable by humans.” As of Q3 2024, there are nearly 8 million US job openings — jobs that people just don’t want to do.
Creating a Future of Abundance: As Musk has commented regarding a future involving humanoid robots: “This means a future of abundance, a future where there is no poverty, where people, you can have whatever you want, in terms of products and services. It really is a fundamental transformation of civilization as we know it.” Adcock echoes this vision, “You can basically create a world where goods and services prices are trending to zero in the limit and GDP spikes to infinity … You basically can request anything you would want and it would be relatively affordable for everybody in the world.”
Included in the introduction are 7 Key Takeaways:
1 Market Explosion: The humanoid robots market is poised for exponential growth, with projections ranging from $38 billion by 2035 (Goldman Sachs) to a staggering $24 trillion (Ark Invest). In the U.S. alone, at the lower-bound, Morgan Stanley estimates 63 million humanoid robots could be deployed by 2050, potentially affecting 75% of occupations and 40% of employees. On the upper bounds, Brett Adcock and Elon Musk predict as many as 1 billion to 10 billion humanoid robots by 2040.
2 Technological Convergence: The rapid advancement of humanoid robots is driven by converging breakthroughs in AI, hardware components (actuators, sensors), and battery technology. Multimodal generative AI in particular is enhancing robots’ adaptability and decision-making capabilities, while hardware costs are plummeting.
3 Labor Shortage Solution: Humanoid robots are emerging as a critical solution to global labor shortages, particularly in elderly care, manufacturing, and dangerous jobs. By 2030, the U.S. is projected to have a 25% “dependency ratio” of people over 70, driving demand for robotic assistance in healthcare and social care. In China and other parts of Asia and Europe, an aging population and lower birth rates make humanoid robotics critical for their economy.
4 Cost Reduction Trends: The cost of humanoid robots is plummeting rapidly, with high-end models dropping from $250,000 to $150,000 in just one year: a 40% decrease compared to the expected 15-20% annual decline. Ambitious targets, such as Tesla’s goal of a $20,000 selling price for its Optimus robot, suggest mass adoption will become feasible across various sectors.
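The 40% figure in that takeaway is straightforward to verify. The sketch below checks it and, as an illustration not drawn from the report, compares where prices would have landed under the expected 15-20% annual decline over the same year.

```python
# Verify the stated one-year price drop: $250,000 -> $150,000.
old_price, new_price = 250_000, 150_000
one_year_drop = (old_price - new_price) / old_price
print(f"actual decline: {one_year_drop:.0%}")  # 40%, matching the report

# For comparison: the expected 15-20% annual decline over one year
# would have left prices between $200,000 and $212,500.
for rate in (0.15, 0.20):
    print(f"expected at {rate:.0%}/yr: ${old_price * (1 - rate):,.0f}")
```

In other words, the observed decline is roughly double the expected trend, which is the reason the takeaway calls it "plummeting."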
5 Investment Opportunities: The humanoid robot sector is attracting significant investment, exemplified by Figure AI’s recent $675 million funding round at a $2.6 billion valuation. Morgan Stanley’s “Humanoid 66” list provides a roadmap for investors interested in both robotics developers and potential beneficiaries across various industries.
6 Broad Societal Impact: The widespread adoption of humanoid robots has the potential to usher in an era of unprecedented abundance, dramatically reducing the cost of goods and services while freeing humans to focus on creative and fulfilling pursuits. This transformation could reshape our concept of work and fundamentally alter the structure of our economy and society.
7 Job Disruption: The speed at which multimodal generative AI and humanoid robot development is progressing, paired with the lack of public discourse on this subject, indicates that there will be significant job disruption and societal upheaval. Mechanisms to address these concerns, such as universal basic income (UBI), will need to be developed. Some have proposed funding such UBI programs by taxing companies that utilize “robots and AIs” to displace previously human-filled jobs.