Scientists have analysed more than 5000 one-handed human grasps and object handovers in an attempt to help robots better take hold of objects.
Grasping something, such as a cup or ball, may sound like a simple skill, but most robots find it tricky.
By analysing how people do it, researchers led by Francesca Cini of the Biorobotics Institute at Italy’s Scuola Superiore Sant’Anna and Valerio Ortenzi of the Australian Centre for Robotic Vision at the Queensland University of Technology (QUT) mapped out where humans place their hands when grasping and handing over objects, as well as the hand movements of the person on the receiving end.
The study, published in the journal Science Robotics, involved objects as diverse as a pen, a screwdriver, a bottle and a toy monkey.
Humans grasp and exchange objects using unconscious, intuitive signals and actions, such as leaving handles or space free for the receiver to grab, which smooths the completion of what is essentially a collaborative task.
“A handover is a perfect example where little adjustments are performed to best achieve the shared goal to safely pass an object from one person to the other,” says Ortenzi.
However, attempts to replicate the procedure in robots have so far produced unnatural and often unsafe results.
Choosing the right grasp is not simple for robots, because choices depend on the situation, and there may be multiple factors in play.
Cini and Ortenzi evaluated 5202 different grasps of various objects, involving 17 pairs of people passing them back and forth.
The passer first picked each object up from a table and performed two different tasks with it, then handed it to a receiver, who performed the same two tasks.
The researchers found that in 73% of handovers, passers favoured precision grasps, in which only the tips and upper halves of the fingers were used.
These behaviours are largely unconscious in humans, because they are patterns learned over time through repetition and routine. “Learning this grasping and manipulation process is subtle and elusive for robots,” the researchers conclude.
[Video: example trials showing the object-tracking views. The trials shown were recorded to illustrate the experimental procedure and were not included in the analysis of grasp type and location. Credit: Cini et al., Sci. Robot. 4, eaau9757 (2019)]
Originally published by Cosmos as Get a grip! Designing robots that can pick things up
Ben Lewis
Ben Lewis is a science communicator with the Royal Institution of Australia.