Established in 2020
Thursday, January 21, 2021

Deep learning helps robots grasp and move objects with ease
UC Berkeley engineers have created new software that combines a neural network with a motion planner to give robots the speed and skill to assist in warehouse environments. Screenshot from a UC Berkeley video, courtesy of the Ken Goldberg lab.

BERKELEY, CA.- In the past year, lockdowns and other COVID-19 safety measures have made online shopping more popular than ever, but the skyrocketing demand is leaving many retailers struggling to fulfill orders while ensuring the safety of their warehouse employees.

Researchers at the University of California, Berkeley, have created new artificial intelligence software that gives robots the speed and skill to grasp and smoothly move objects, making it feasible for them to soon assist humans in warehouse environments. The technology is described in a paper published online in the journal Science Robotics.

Automating warehouse tasks can be challenging because many actions that come naturally to humans — like deciding where and how to pick up different types of objects and then coordinating the shoulder, arm and wrist movements needed to move each object from one location to another — are actually quite difficult for robots. Robotic motion also tends to be jerky, which can increase the risk of damaging both the products and the robots.

“Warehouses are still operated primarily by humans, because it’s still very hard for robots to reliably grasp many different objects,” said Ken Goldberg, William S. Floyd Jr. Distinguished Chair in Engineering at UC Berkeley and senior author of the study. “In an automobile assembly line, the same motion is repeated over and over again, so that it can be automated. But in a warehouse, every order is different.”

In earlier work, Goldberg and UC Berkeley postdoctoral researcher Jeffrey Ichnowski created a Grasp-Optimized Motion Planner that could compute both how a robot should pick up an object and how it should move to transfer the object from one location to another.

However, the motions generated by this planner were jerky. While the software's parameters could be tweaked to produce smoother motions, those smoother solutions took an average of about half a minute to compute.

In the new study, Goldberg and Ichnowski, in collaboration with UC Berkeley graduate student Yahav Avigal and undergraduate student Vishal Satish, dramatically sped up the computing time of the motion planner by integrating a deep learning neural network.

Neural networks allow a robot to learn from examples; afterward, the robot can often generalize to similar objects and motions.

However, these learned approximations aren't always accurate enough. Goldberg and Ichnowski found that the approximation generated by the neural network could then be refined by the motion planner.

“The neural network takes only a few milliseconds to compute an approximate motion. It’s very fast, but it’s inaccurate,” Ichnowski said. “However, if we then feed that approximation into the motion planner, the motion planner only needs a few iterations to compute the final motion.”

By combining the neural network with the motion planner, the team cut average computation time from 29 seconds to 80 milliseconds, or less than one-tenth of a second.
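The warm-start pattern the team describes, in which a fast but inexact prediction is handed to an optimizer for a few refinement iterations, can be sketched in a few lines. The snippet below is a hypothetical illustration, not the team's actual planner: a near-smooth initial trajectory stands in for the network's output, and a handful of gradient-descent steps on a squared-acceleration ("jerkiness") cost stand in for the motion planner's refinement.

```python
import numpy as np

def jerk_cost(traj):
    """Sum of squared discrete accelerations: a rough 'jerkiness' measure."""
    a = traj[:-2] - 2 * traj[1:-1] + traj[2:]
    return float(np.sum(a * a))

def refine(traj, iterations=20, step=0.05):
    """Smooth a 1-D waypoint trajectory by gradient descent on jerk_cost,
    holding the two endpoints (start and goal) fixed."""
    traj = traj.copy()
    for _ in range(iterations):
        a = traj[:-2] - 2 * traj[1:-1] + traj[2:]
        # Gradient of sum(a**2) with respect to each waypoint (up to a
        # constant factor absorbed into the step size).
        grad = np.zeros_like(traj)
        grad[:-2] += a
        grad[1:-1] += -2 * a
        grad[2:] += a
        grad[0] = grad[-1] = 0.0  # endpoints are constraints, not variables
        traj -= step * grad
    return traj

n = 8
straight = np.linspace(0.0, 1.0, n)   # start at 0, goal at 1
zigzag = 0.2 * (-1.0) ** np.arange(n)
zigzag[0] = zigzag[-1] = 0.0          # keep endpoints exact

cold = straight + zigzag              # jerky cold-start guess
warm = straight + 0.05 * zigzag      # near-smooth guess, standing in
                                      # for the network's prediction

refined = refine(warm, iterations=5)
```

With the same small iteration budget, the warm-started trajectory ends up far smoother than the cold-started one, which is the effect the team exploits: the network supplies a good first guess in milliseconds, and the optimizer only has to polish it.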

Goldberg predicts that, with this and other advances in robotic technology, robots could be assisting in warehouse environments in the next few years.

“Shopping for groceries, pharmaceuticals, clothing and many other things has changed as a result of COVID-19, and people are probably going to continue shopping this way even after the pandemic is over,” Goldberg said. “This is an exciting new opportunity for robots to support human workers.”

This work was supported, in part, by the National Science Foundation’s National Robotics Initiative Award #1734633: Scalable Collaborative Human-Robot Learning (SCHooL) and by donations from Google and the Toyota Research Institute Inc.

Editor & Publisher: Jose Villarreal
Art Director: Juan José Sepúlveda Ramírez
