Software Development Kit Boosts Usage Of Robotic Platform

Whenever people hear about automation or manufacturing technology, they always respond with “robots?” Reading mainstream media, where writers discuss manufacturing without a clue, they also seem fixated on robots. And almost all of this ranges from partial information to misinformation. I seldom write about robotics because the typical SCARA or six-axis robots are still doing the same things they’ve always done—pick-and-place, welding, painting, material handling. They are better, faster, more connected, and in more industries, but in the end it’s still the same thing.

That is why I’m a fan of Rethink Robotics. These engineers are out there trying new paradigms and applications. Here is a recent release that I think bears watching. This news is especially relevant in the context of the visit I made last week to Oakland University and conversations with some students.

Rethink Robotics unveiled the Sawyer Software Development Kit (SDK), a software upgrade designed for researchers and students to build and test programs on the Sawyer robot. With a wide range of uses for university research teams and corporate R&D laboratories around the world, Sawyer SDK offers further compatibility with ROS and state-of-the-art open-source robotics tools, as well as an affordable way to increase access to advanced robotics in the classroom.

Sawyer SDK includes several advanced features that allow users to visualize and control how the robot interacts with its environment. Sawyer SDK now integrates with the popular Gazebo simulator, which creates a simulated world that visualizes the robot and its contact with the environment, allowing researchers to run and test code in simulation before running it on the robot. Sawyer’s Gazebo integration is completely open source, allowing students to run simulations from their individual laptops without a robot until they’re ready to test the code in real time. This approach allows professors to provide students with access to industry-leading collaborative robots.

In addition to the Gazebo integration, Sawyer SDK includes a new motion interface that allows researchers to program the robot in Cartesian space. This development lowers the barriers to motion planning for programmers without a full robotics background. The new release also allows researchers to leverage new impedance and force control modes. Sawyer SDK also includes support for ClickSmart, the family of gripper kits that Rethink announced in 2017, to create a fully integrated robotic solution.
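The impedance control mode mentioned above is easiest to picture as a virtual spring-damper at the end effector. Here is a minimal, self-contained sketch of that idea in plain Python (an illustration of the concept only, not the Sawyer SDK's actual API; the function name, gains, and coordinates are hypothetical):

```python
def impedance_force(x, v, x_des, stiffness, damping):
    """Virtual spring-damper law at the end effector, per axis:
    F = K * (x_des - x) - D * v. High stiffness tracks the target
    rigidly; low stiffness lets the arm yield on contact."""
    return [k * (xd - xi) - d * vi
            for xi, vi, xd, k, d in zip(x, v, x_des, stiffness, damping)]

# Compliant in z (low stiffness) so the arm gives on contact,
# stiff in x/y to hold position in the plane. Units: N, m, m/s.
f = impedance_force(
    x=[0.50, 0.10, 0.20], v=[0.0, 0.0, -0.05],
    x_des=[0.50, 0.10, 0.15],
    stiffness=[800.0, 800.0, 50.0], damping=[40.0, 40.0, 10.0],
)
```

Setting a low stiffness on one axis lets the arm yield along that axis during contact tasks such as insertion, while the other axes still hold position.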

“Rethink’s robots are used in the world’s leading research institutions, which provides us with a wealth of feedback on what our research customers really want,” said Scott Eckert, president and CEO, Rethink Robotics. “As we have with all of our SDK releases, we’re continuing to set the standard in research with industry-leading features that allow universities and corporate labs to push the field of robotics forward and publish their research faster.”

Sawyer SDK is being piloted in robotics programs at multiple universities, including Stanford University, University of California at Berkeley, Georgia Institute of Technology and Northwestern University. Stanford’s Vision and Learning Lab works on endowing robots with diverse skills for both industrial and day-to-day personal robotics applications.

“Robotics is a field that combines technological and engineering skills with creativity, and the inventiveness our students have shown so far with the robots has been astounding,” said Dr. Animesh Garg, postdoctoral researcher in the Stanford University department of computer science. Animesh and his team of researchers have put Sawyer to use executing tasks directly from virtual reality (VR) input, using automatic decomposition into simpler activities. Sawyer is also used for ongoing work in learning to use simple tools, such as hammers and screwdrivers.

Stanford University’s Experimental Robotics class allows students to think beyond day-to-day industrial tasks. They’ve trained Sawyer to draw and to track moving targets and hovering drones. Rethink’s Sawyer has enabled faster learning curves for researchers and students alike, and the Sawyer SDK release makes it easier than ever.

The SDK will be available on all Sawyer robots, allowing access to both the Intera manufacturing software and the SDK software, starting in March 2018.

Digital Transformation Impacts Manufacturing Innovation and Supply Chain

Digital transformation professional services companies provide value to owner/operators and manufacturing companies. But their services are often expensive. I have done some work on platforms such as MIMOSA’s OIIE (which is still in development) that are designed to use standards and interoperability to help these customers reduce their expense and dependence on these firms.

It’s sort of a “good news, bad news” thing.

At any rate, the PR firm representing Cognizant contacted me toward the end of December with an opportunity to interview an executive. The purpose of this interview would be to update me on the company and talk a little about digital supply chain, workforce, and other manufacturing innovation topics.

Anxious to get something done before the end of the year (billable hours?), they even offered times between Christmas and New Year’s. Prasad Satyavolu, global head of innovation, manufacturing, and logistics practice, talked with me shortly before Christmas.

When I laid out the conversation on a mind map, the map was huge. So I thought about it off and on for the past couple of weeks. These thoughts reflect about half of the conversation. There is a lot to think about.

Cognizant was a very familiar name, but I couldn’t place it. “We are familiar with SCADA and plant floor,” Satyavolu told me. “We acquired Wonderware’s R&D operations. In fact, we still work with Schneider Electric. We also work with Rockwell Automation.”

Core Manufacturing services include:

• Transportation/Material Handling

• Process industry

• Energy/Oil & Gas

• Aerospace (some)

• Automation

• Utilities/Smart Grid/Smart Meters

When Cognizant evaluates a customer’s processes and lays out a plan, it includes everything from incoming supply to manufacturing to shipping to the customer—the demand and supply chain.

One opportunity Satyavolu sees is adding instrumentation for sensing the movement of materials and workers, in order to enable better decisions and efficiencies.

Then consider the confluence of a changing workforce and technology. “Consider the reality on the shop floor. Five to ten years ago, a maintenance engineer listened to a machine, diagnosed the problem, and fixed the machine.

“The next generation doesn’t have that knowledge. Today the time to fix has gone from 15 minutes to 4 hours. How can we tackle the knowledge gap? Further, is the next generation even interested in this sort of work?”

Looking ahead, by 2025 we will be short 8 million people with manufacturing skills. How does this impact global mid-sized companies? How can we further leverage robotics to help solve this problem? Would robotics technology even make the work more attractive to a new generation of workers from the world of gaming and drones?

Huge opportunities exist in extending visibility outside the plant to planning and execution. It’s the Amazon effect—velocity so high that you almost have to produce on demand. Predictive maintenance systems enable managers to manage schedules and demands, leveraging infrastructure such as cloud and digital technologies. These improve scheduling and rescheduling, lowering carrying costs, and aid risk management and mitigation. For global organizations bringing parts from around the world, global demand and supply increase uncertainty.

On the shop floor, a plant has fixed schedules and horizons. Scheduling systems and a lot of modeling bring stability and improve effectiveness. You can simulate production quickly, get the status of inbound parts, track changes on the demand side, and sync with labor requirements. With better scheduling you get better visibility—synchronization alone can save 12-13% of costs. You can track the supply chain and transportation, and change schedules in advance, improving risk management.
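The synchronization Satyavolu describes can be pictured with a toy model: run each job no earlier than its inbound parts arrive, and read both line idle time and inventory waiting time off the same timeline. This is a simplified sketch, not Cognizant's tooling; the job data and greedy ordering are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    parts_arrive: int   # hour the inbound parts land
    duration: int       # processing hours

def schedule(jobs, start=0):
    """Greedy sync: process jobs in order of part arrival, starting
    each no earlier than its parts arrive. Line idle hours and hours
    parts sit in inventory (carrying cost) come from one timeline."""
    t, plan, idle, waiting = start, [], 0, 0
    for job in sorted(jobs, key=lambda j: j.parts_arrive):
        begin = max(t, job.parts_arrive)
        idle += begin - t                    # line waiting on parts
        waiting += begin - job.parts_arrive  # parts waiting on line
        plan.append((job.name, begin, begin + job.duration))
        t = begin + job.duration
    return plan, idle, waiting

jobs = [Job("A", 0, 3), Job("B", 5, 2), Job("C", 2, 4)]
plan, idle, waiting = schedule(jobs)
```

Rescheduling against updated arrival times is just a re-run of the same function, which is the kind of quick what-if simulation the passage describes.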

Human And Machine Future Collaboration – A Further Look

The future of work—humans and machines working together in a truly collaborative fashion. A “partnership” said Dell Technologies.

My blogs on the future have not necessarily brought in tons of new page views (but this blog compares pretty well with industry trade publications), but they have drawn the attention of PR agency account managers looking for a way to prove their worth. Sometimes one of the inquiries strikes gold, as they say.

Yesterday James Lawton, Chief Product and Marketing Officer, Rethink Robotics, gave me 30 minutes of his time to talk about a vision of the future for humans and robotics. Rethink Robotics was founded by Rodney Brooks, an MIT professor and co-founder of iRobot (Roomba). I have written about the company several times including here, here, and here. You may remember Sawyer and Baxter, its two robots.

 

Baxter the Robot

 

We began talking a little about why robotics conversations got stale for several years. “Perhaps people had written robots off because of the way they were used,” he told me. Those uses were mainly pick-and-place and welding. “If we can break the traditional barriers with automation technologies, then we can begin to see many new applications.”

Rethink Robotics has been focusing on a handful of form factors—safe by design.

I asked about one of the hot topics among traditional robotic suppliers—collaboration. “Collaboration has been internalized as ‘run without a cage’ instead of ‘people and robots working closely together.’ We need to break that mental model. Some writers are worried about robots replacing people, but it’s about machines making people ‘super people,’ providing both physical and cognitive assists to humans,” Lawton said.

Lawton offered this challenge to the industry: “Make it so that a ‘robot’ isn’t a custom construction project, but more like hiring a temp worker. To do this, we need a common language interface, something like Amazon Echo: ‘Alexa, do this.’ We have linguistics to overcome to accomplish this. We say ‘pick up the box’ and have an idea of a box; a robot needs to be taught what ‘box’ means in all its many varieties.”

What about the near future? “In 5-10 years we’ll see these technologies begin to build out. We’ll also start to see more robotic technology in homes,” he predicted.

As far as collaborating goes, Peter Senge wrote in his book The Fifth Discipline that it is hard for humans to learn from patterns when they are separated by time and space. Lawton explained, “Computers, on the other hand, can do that. I can’t pull in all the data and derive insights. Say I’m applying torque to a part I’m assembling. A computer can archive all the past data and then compare that with customer feedback on the assembly. With robot assist on the production process, it can reconfigure its own work on the fly and we can assemble the part in a way that better satisfies the customer.”
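Lawton's torque example can be made concrete: archive each applied torque alongside the downstream quality feedback, then pick the setting with the best observed pass rate. A minimal illustration in plain Python (the data and function are hypothetical, not Rethink's implementation):

```python
from collections import defaultdict

def best_torque(history):
    """history: archived (torque, passed_qc) pairs from past
    assemblies joined with downstream quality feedback. Returns
    the torque setting with the highest observed pass rate - a
    pattern spread across time that a machine can aggregate
    but a single operator never sees whole."""
    stats = defaultdict(lambda: [0, 0])   # torque -> [passes, total]
    for torque, passed in history:
        stats[torque][1] += 1
        if passed:
            stats[torque][0] += 1
    return max(stats, key=lambda t: stats[t][0] / stats[t][1])

# Hypothetical archive: torque setting (N*m) and whether the
# assembled part passed quality feedback.
history = [(8, True), (8, True), (8, False),
           (10, True), (10, True), (10, True),
           (12, False), (12, True)]
setting = best_torque(history)
```

A real system would of course weigh sample sizes and re-run this continuously as new feedback arrives, which is the "reconfigure on the fly" part of the quote.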

“This change in how we interact with machines will change how we live and how we work, as well as how we create value.”

 

Dell Anticipates Tech Trends to 2030 Seeing Human Machine Partnerships

In 2030 every organization will be a technology organization, and as such businesses need to start thinking today about how to future-proof their infrastructure and workforce, according to a report published by Dell Technologies. The research, led by the Institute for the Future (IFTF) alongside 20 technology, academic and business experts from across the globe, looks at how emerging technologies such as artificial intelligence, robotics, virtual reality, augmented reality and cloud computing will transform our lives and how we work over the next decade. The report, titled “The Next Era of Human-Machine Partnerships,” also offers insight on how consumers and businesses can prepare for a society in flux.

The interesting thing about this report is that it is not simply Dell’s technology or market strategy wrapped in the guise of a “research” report, like the typical analyst job.

The report forecasts that emerging technologies, supported by massive advancements in software, big data and processing power, will reshape lives. Society will enter a new phase in its relationship with machines, which will be characterized by:

  • Even greater efficiency and possibility than ever before, helping humans transcend our limitations
  • Humans as “digital conductors” in which technology will work as an extension of people, helping to better direct and manage daily activities
  • Work “chasing” people, as organizations use advanced data-driven matchmaking technologies to find and employ talent from across the world
  • People learning “in the moment,” as the pace of change will be so rapid that new industries will be created and new skills will be required to survive

Dell Technologies commissioned the study to help companies navigate an uncertain world and prepare for the future. Today, digital disruption is ruthlessly redrawing industries. For the first time in modern history, global leaders can’t predict how their industry will fare further down the line. According to Dell’s Digital Transformation Index, 52 percent of senior decision makers across 16 countries have experienced significant disruption to their industries as a result of digital technologies. And nearly one in two businesses believe there’s a possibility their company will become obsolete within the next three to five years.

Not your usual analyst firm, the Institute for the Future (IFTF) is an independent, nonprofit 501(c)(3) strategic research and educational organization with nearly 50 years of forecasting experience. The core of its work is identifying emerging trends and discontinuities that will transform global society and the global marketplace. IFTF is based in Palo Alto, California.

IFTF relied on its decades-long study on the future of work and technology, in-depth interviews with key stakeholders, and the opinions and ideas generated during an all-day facilitated workshop with a diverse set of experts from across the globe.

They studied robotics, artificial intelligence and machine learning, augmented reality and virtual reality, and cloud computing with the goal of projecting the impacts of these technologies by 2030. I had the opportunity to talk with Liam Quinn, senior vice president and CTO of Dell Technologies, about this report, and he added comments about the Internet of Things. More on that interview in my next reflection on the report.

“Never before has the industry experienced so much disruption. The pace of change is very real, and we’re now in a do-or-die landscape. To leap ahead in the era of human-machine partnerships, every business will need to be a digital business, with software at its core,” said Jeremy Burton, chief marketing officer, Dell. “But organizations will need to move fast and build capacity in their machines, ready their infrastructure and enable their workforce in order to power this change.”

“We’ve been exposed to two extreme perspectives about machines and the future: the anxiety-driven issue of technological unemployment or the over optimistic view that technology will cure all our social and environmental ills,” said Rachel Maguire, research director, Institute for the Future. “Instead we need to focus on what the new relationship between technology and people could look like and how we can prepare accordingly. If we engage in the hard work of empowering human-machine partnerships to succeed, their impact on society will enrich us all.”

Other report highlights include:

  • In 2030 humans’ reliance on technology will evolve into a true partnership, with humans bringing skills such as creativity, passion and an entrepreneurial mindset. This will align with the machines’ ability to bring speed, automation and efficiencies, and the resulting productivity will allow for new opportunities within industries and roles.
  • By 2030 personalized, integrated artificial intelligence (AI) assistants will go well beyond what assistants can do now. They’ll take care of us in predictive and automated ways.
  • Technology won’t necessarily replace workers, but the process of finding work will change. Work will cease to be a place but a series of tasks. Machine learning technologies will make individuals’ skills and competencies searchable, and organizations will pursue the best talent for discrete tasks.
  • An estimated 85 percent of jobs in 2030 haven’t been invented yet. The pace of change will be so rapid that people will learn “in-the-moment” using new technologies such as augmented reality and virtual reality. The ability to gain new knowledge will be more valuable than the knowledge itself.

Exploring the technology areas:

Robotics—Buoyed by their commercial success, the adoption of robots will extend beyond manufacturing plants and the workplace. Family robots, caregiving robots, and civic robots will all become commonplace as deep learning improves robots’ abilities to empathize and reason.

Artificial Intelligence and Machine Learning—According to Michelle Zhou, an expert in AI, development can be thought of in three stages. The first is recognition intelligence—algorithms that recognize patterns; followed by cognitive intelligence—machines that make inferences from data; with the final stage being virtual human beings. It is plausible that, by 2030, we will enter the second stage in AI as this technology progresses.

Virtual Reality and Augmented Reality—Despite their differences, both technologies represent a fundamental shift in information presentation because they allow people to engage in what Toshi Hoo, Director of IFTF’s Emerging Media Lab, calls “experiential media” as opposed to representative media. The information layer that both technologies create will accelerate the melding of digital and physical identities, with digital trails and traces forming a digital coating over individuals’ physical environments.

Cloud Computing—It’s important to recognize that Cloud Computing isn’t a place, it’s a way of doing IT. It is already in wide use. For example, Chitale Dairy (in India) launched a ‘cow to cloud’ initiative in which each cow is fitted with RFID tags to capture data that is held in the cloud. The relevant analysis of this data is then sent to the local farmers via SMS and the web, to alert farmers when they need to change the cows’ diet, arrange vaccinations, etc. The timely delivery of this information is increasing the cows’ yield, supporting local farmers, whose livelihoods depend on the dairy farms, and enabling Chitale to manage a part of the supply chain which is normally fraught with uncertainty.
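The "cow to cloud" loop described above reduces to a simple pattern: compare cloud-held per-cow readings against thresholds and dispatch alerts to the farmers. Here is a minimal sketch of that pattern (the metric names and threshold values are hypothetical, and the actual SMS dispatch would be a separate gateway call):

```python
def alerts(readings, thresholds):
    """readings: cloud-held per-cow data, e.g. {"cow42": {"yield_l": 9.5}}.
    thresholds: per-metric minimums below which the farmer is notified.
    Returns the alert messages that would be sent via SMS/web."""
    out = []
    for cow, data in readings.items():
        for metric, floor in thresholds.items():
            if data.get(metric, floor) < floor:
                out.append(f"{cow}: {metric} low ({data[metric]} < {floor})")
    return sorted(out)

# Hypothetical daily snapshot: one cow's milk yield has dropped
# below the floor, so one alert is generated.
msgs = alerts(
    {"cow17": {"yield_l": 12.0}, "cow42": {"yield_l": 9.5}},
    {"yield_l": 11.0},
)
```

The value is in the timeliness: the analysis runs where the data lives, and only the actionable result travels to the farmer.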

You can also check out the Dell blog.

Robots Rock at Automate

Automate is the biennial trade show featuring robots, vision, and motion control technology and products. It is sponsored by the Association for Advancing Automation. I wrote about it Wednesday, discussing statistics about robots and jobs.

Even though I was deeply involved in robotic technology and did some vision implementations in past lives, this all became sort of boring to me for quite a few years. Probably ever since the delta robot. Recent developments have made robots much more interesting.

Several companies were exhibiting some innovations at this year’s event. Here are a few I saw.

Autonomous Mobile Robots

One area that holds much promise is autonomous mobile robots (AMRs). I walked through a booth where several little “pets” wandered around freely, going from station to station.

This was in the booth of the Danish company, Mobile Industrial Robots (MiR). It launched its newest robot, the MiR200. It is an upgrade to the company’s flagship MiR100, which has already been installed in more than 30 countries by companies such as Airbus, Boeing, Flex, Honeywell, Michelin, Procter & Gamble, Toyota and Walmart.

AMRs are a dramatic improvement over legacy automated guided vehicles (AGVs), which require the expensive and inflexible installation of sensors or magnets into factory floors for guidance. MiR products are designed to give owners the flexibility to easily redeploy the robots to different processes or facility layouts to support changing business needs and agile manufacturing processes.

“Our robots are changing the game for any size business, from small, regional companies to large multinationals,” said MiR CEO Thomas Visti. “With the new interface of the MiR200, it’s even easier for companies to program the robot themselves and adapt its deployment as their business evolves. That’s critical for their competitiveness, and supports extremely fast ROI. The robot typically pays for itself in less than a year, even in very dynamic environments.” 

Gripper Technology

Another recent development that shows great promise is robots working nicely with people. This technology is known as collaborative robots (cobots). I ran into a new company from Denmark (a hotbed of robotic development—I wrote about Universal Robots last September) called On Robot. Its two-finger RG2 grippers—available in both single and dual versions— mount easily on the arms of cobots without any external wires; for robots that have infinite rotation of the last joint, this enables unprecedented flexibility and productivity.

The RG2 grippers can be easily programmed directly from the same interface as the robot, and the gripper can be modified without previous programming experience, making them ideal for collaborative robot users.

“User-friendly robot arms need user-friendly grippers, and until RG2, the ease-of-use and flexibility required just wasn’t available,” said Torben Ekvall, On Robot CEO. “Most traditional grippers work by using compressed air, which takes up a lot of space, is energy-intensive and is far too complicated for many users. On Robot’s RG2 is an electronic solution that is easy to mount, is very flexible and can be modified by an operator on the factory floor without the assistance of an engineer. This ease-of-use will help speed development for an increasing number of manufacturers’ applications.”

Unique features of the RG2 gripper include:

  • Simple and intuitive programming: RG2 lets operators easily choose what they need the gripper to do and the gripper responds with motion as flexible as the cobot itself.
  • Angle mounting: From 0° to 180° in 30° steps, in both the single- and dual-gripper setup, the gripper ensures great flexibility and adaptability for comprehensive tasks.
  • Customizable fingertips: The gripper fingers support the use of customized fingertips, which can be designed by end users to fit production requirements.
  • Assisted center-of-gravity calculation: Users enter the value of the payload and the robot calculates the rest, making programming easier, enhancing overall productivity and improving safety by enabling more accurate robot arm movements.
  • Continuous grip indication: The gripper can discern any lost or deliberately removed object.
  • Automatic Tool Center Point (TCP) calculation: Automatic calculation of how the robot arm moves around the calculated TCP of an object, depending on the position in which the gripper is mounted, for easier programming and use.
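The automatic TCP calculation in the list above can be pictured with a toy planar version: given the gripper's reach and one of the discrete mounting angles, compute where the tool center point lands relative to the flange. This is an illustration only, not On Robot's algorithm; the reach value and the 2D simplification are hypothetical:

```python
import math

def tcp_offset(reach, mount_deg):
    """Planar sketch of TCP calculation: where the tool center
    point lands relative to the robot flange when a gripper of
    the given reach (m) is mounted at one of the discrete angles
    (0 to 180 degrees in 30-degree steps). The real calculation
    also folds in dual-gripper geometry and payload."""
    if mount_deg % 30 or not 0 <= mount_deg <= 180:
        raise ValueError("mounting angle must be 0-180 in 30 degree steps")
    a = math.radians(mount_deg)
    return (round(reach * math.sin(a), 4), round(reach * math.cos(a), 4))

straight = tcp_offset(0.195, 0)    # gripper pointing straight out
angled = tcp_offset(0.195, 90)     # mounted at a right angle
```

Folding this into the robot's kinematics is what lets the arm move "around" the object rather than around the flange, which is why it simplifies programming.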

Exhibiting along with On Robot was OptoForce, a Hungarian company with an ingenious force sensor that mounts on the robot’s end of arm together with the On Robot grippers. The combination enables many really cool applications.

Dual Arm Cobot

DEONET factory, the Netherlands

Speaking of collaborative robots or cobots, ABB introduced YuMi. “The new era of robotic coworkers is here and an integral part of our Next Level strategy,” said ABB CEO Ulrich Spiesshofer. “YuMi makes collaboration between humans and robots a reality. It is the result of years of research and development, and will change the way humans and robots interact. YuMi is an element of our Internet of Things, Services and People strategy creating an automated future together.”

While YuMi was specifically designed to meet the flexible and agile production needs of the consumer electronics industry, it has equal application in any small parts assembly environment thanks to its dual arms, flexible hands, universal parts feeding system, camera-based part location, lead-through programming, and state-of-the-art precise motion control.

YuMi can operate in very close collaboration with humans thanks to its inherently safe design. It has a lightweight yet rigid magnesium skeleton covered with a floating plastic casing wrapped in soft padding to absorb impacts. YuMi is also compact, with human dimensions and human movements, which makes human coworkers feel safe and comfortable—a feature that garnered YuMi the prestigious “Red Dot ‘best of the best’ design award.” Check out the YuMi Information Portal for more information.
