Robot Hand Teaches Itself How to Get a Grip

The University of Washington built a five-fingered robot hand that uses machine learning to teach itself how to grasp and manipulate objects on its own, rather than having humans program its actions.


Robots still have trouble gripping and manipulating objects. It remains one of the biggest unsolved challenges in robotics.

But it appears the University of Washington (UW) has found a way to help robots get a better grip. UW built a five-fingered robot hand that teaches itself how to grasp and manipulate objects on its own, and it gets better with more practice.

The robot hand uses machine learning algorithms both to model the physics involved and to plan its course of action. At the 1:47 mark of the accompanying video, for example, the robot hand gets better at spinning a tube. As the hand performs different tasks, the system collects data from various sensors and motion-capture cameras and uses machine learning algorithms to continually refine its models and make them more realistic.
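That loop, act, record sensor and motion-capture data, refit a physics model, then plan with the refined model, is the core idea of model-based learning. The sketch below is a minimal, illustrative Python version of such a loop; the linear least-squares dynamics model, the random-shooting planner, and all dimensions are simplifying assumptions for clarity, not the UW team's actual implementation, which uses far richer models and trajectory optimization.

```python
import numpy as np

# Hypothetical dimensions for illustration only: a hand with 24 joints
# observed via joint sensors and motion-capture markers.
STATE_DIM, ACTION_DIM = 24, 24

def collect_rollout(policy, steps=50):
    """Gather (state, action, next_state) tuples. The 'real hand' is stubbed
    out with noisy synthetic transitions; on hardware this would read the
    sensors and motion-capture cameras instead."""
    states, actions, next_states = [], [], []
    s = np.zeros(STATE_DIM)
    for _ in range(steps):
        a = policy(s)
        s_next = s + 0.1 * a + 0.01 * np.random.randn(STATE_DIM)  # stand-in for real physics
        states.append(s); actions.append(a); next_states.append(s_next)
        s = s_next
    return np.array(states), np.array(actions), np.array(next_states)

def fit_linear_dynamics(states, actions, next_states):
    """Fit s' ~= [s, a] @ W by least squares: a crude learned physics model."""
    X = np.hstack([states, actions])
    W, *_ = np.linalg.lstsq(X, next_states, rcond=None)
    return W

def plan_action(state, goal, W, candidates=256):
    """Pick the candidate action whose predicted next state lands closest to the goal."""
    acts = np.random.uniform(-1, 1, size=(candidates, ACTION_DIM))
    X = np.hstack([np.tile(state, (candidates, 1)), acts])
    errs = np.linalg.norm(X @ W - goal, axis=1)
    return acts[np.argmin(errs)]

# Iterate: act, collect data, refit the model, then plan with the refined model.
goal = np.full(STATE_DIM, 0.5)
policy = lambda s: np.random.uniform(-1, 1, ACTION_DIM)   # start with pure exploration
for iteration in range(5):
    S, A, S_next = collect_rollout(policy)
    W = fit_linear_dynamics(S, A, S_next)
    policy = lambda s, W=W: plan_action(s, goal, W)        # plan using the learned model
```

Each pass through the loop gives the planner a model fitted to more data, which is the sense in which the hand "gets better with more practice."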

“A lot of robots today have pretty capable arms but the hand is as simple as a suction cup or maybe a claw or a gripper,” said lead author Vikash Kumar, a UW doctoral student in computer science and engineering.

UW’s hand is quite expensive at roughly $300,000, so don’t expect to see it in real-world applications anytime soon. It uses a Shadow Hand skeleton actuated with a custom pneumatic system and can move faster than a human hand. The UW team plans to use the robot hand “to push core technologies and test innovative control strategies.”

UW’s robot hand has 40 tendons, 24 joints and more than 130 sensors. (Credit: University of Washington)

The team emphasized how different its autonomous learning approach to dexterous manipulation is from conventional, hand-coded methods. “Usually people look at a motion and try to determine what exactly needs to happen - the pinky needs to move that way, so we’ll put some rules in and try it and if something doesn’t work, oh the middle finger moved too much and the pen tilted, so we’ll try another rule,” said senior author and lab director Emo Todorov, UW associate professor of computer science and engineering and of applied mathematics.

So far, UW has tested the robot hand’s ability to improve its manipulation of the same object. The next step is global learning: the hand’s ability “to manipulate an unfamiliar object or a new scenario it hasn’t encountered before.”

“There are a lot of chaotic things going on and collisions happening when you touch an object with different fingers, which is difficult for control algorithms to deal with,” said co-author Sergey Levine, UW assistant professor of computer science and engineering who worked on the project as a postdoctoral fellow at University of California, Berkeley.  “The approach we took was quite different from a traditional controls approach.”




About the Author

Steve Crowe · Steve Crowe is managing editor of Robotics Trends. Steve has been writing about technology since 2008. He lives in Belchertown, MA with his wife and daughter.



