Breaking: robot makes breakfast

Researchers have taught a robot to make breakfast using coordinated two-handed movements previously beyond robotic ability.

Until now, robots have not been able to use two hands in the way that humans do.

The human brain controls both hands as a coherent whole in tasks such as unscrewing the lid of a jar, applying the appropriate strength and tension to each hand and adjusting for feedback as the lid loosens.

Robots, however, treat such a two-handed, or bimanual, task as two independent limbs undertaking two separate tasks with limited coordination.

As a result, many tasks that humans take for granted have been beyond the motor skills of robots.

The research team led by Daniel Rakita from the University of Wisconsin–Madison, US, set out to find a way to replicate the so-called “gestalt” effect of human two-handed movement, in which arms and hands move together to achieve what each individual limb cannot do alone.

The team identified the most common two-handed movements, including transferring an object from one hand to the other, and holding an object steady while performing an action on it, such as stirring a pot. 

From this analysis the team created a list of movements it called a “bimanual action vocabulary”, which was programmed into the robot’s neural network – a computer system modelled on the human brain and nervous system.
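To make the idea concrete, here is a minimal sketch in Python of what such a vocabulary might look like: a small set of named two-handed motion categories and a toy rule that picks one from simple motion features. The category names, features and matching rules are illustrative assumptions, not the team's published method.

```python
from enum import Enum, auto

class BimanualAction(Enum):
    """Hypothetical vocabulary of common two-handed motion categories."""
    SELF_HANDOVER = auto()   # pass an object from one hand to the other
    ONE_HAND_FIXED = auto()  # hold steady with one hand, act with the other (e.g. stirring)
    FIXED_OFFSET = auto()    # both hands move together at a constant offset (e.g. carrying a tray)
    INDEPENDENT = auto()     # hands work on unrelated sub-tasks

def classify_action(left_speed: float, right_speed: float,
                    hands_converging: bool, offset_stable: bool) -> BimanualAction:
    """Toy matcher: map simple motion features to a vocabulary entry.

    The thresholds and rules below are assumptions made for illustration only.
    """
    if hands_converging:
        return BimanualAction.SELF_HANDOVER
    if offset_stable and left_speed > 0.01 and right_speed > 0.01:
        return BimanualAction.FIXED_OFFSET
    if min(left_speed, right_speed) < 0.01 <= max(left_speed, right_speed):
        return BimanualAction.ONE_HAND_FIXED
    return BimanualAction.INDEPENDENT

if __name__ == "__main__":
    # Example: left hand nearly still, right hand stirring -> ONE_HAND_FIXED
    print(classify_action(left_speed=0.002, right_speed=0.15,
                          hands_converging=False, offset_stable=False))
```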

And then it was time for breakfast.

Volunteers wearing motion capture gloves attempted to control the robot to complete 15 different breakfast-making tasks.

The robot cracked open two large prop eggs and released the contents into a bowl; mixed them by pouring them from one cup into another three times; removed the lid of a canister and poured the contents into a bowl; flipped the top off a container; unscrewed the lid from a bottle and poured the contents into a cup; and removed three plates from a drying rack and set them out on the table.

The robot captured the participants' hand poses and inferred the correct motion by drawing on its bimanual vocabulary.

Importantly, the machine could even “adapt” the participant’s motions – sometimes overriding human commands entirely – to perform the task more efficiently.
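As a rough illustration of that shared-control idea, the sketch below blends the operator's captured hand pose with the motion suggested by the matched vocabulary entry, overriding the human input entirely when the classifier is very confident. The blending rule, confidence scale and threshold are assumptions for illustration, not the published system.

```python
def blend_command(human_pose: list[float], vocab_pose: list[float],
                  confidence: float, override_threshold: float = 0.9) -> list[float]:
    """Blend a human-commanded hand pose with a vocabulary-derived pose.

    `confidence` (0..1) is how strongly the matcher believes the vocabulary
    motion applies; above `override_threshold` the human input is ignored.
    All values and rules here are illustrative assumptions.
    """
    if confidence >= override_threshold:
        return vocab_pose  # full override of the human command
    # Otherwise, a weighted average of the human and vocabulary poses
    return [(1.0 - confidence) * h + confidence * v
            for h, v in zip(human_pose, vocab_pose)]

if __name__ == "__main__":
    human = [0.30, 0.10, 0.50]  # captured hand position (x, y, z), metres
    vocab = [0.32, 0.12, 0.45]  # pose suggested by the matched vocabulary action
    print(blend_command(human, vocab, confidence=0.5))   # blended pose
    print(blend_command(human, vocab, confidence=0.95))  # vocabulary overrides the human
```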

In a paper published in the journal Science Robotics, the researchers say their work could lead to robots helping people at home with everyday tasks.

Rakita says the researchers chose the breakfast-making task because a key target area for their work is in-home care.

“Consider a bimanual robot platform that is installed in a home environment to provide assistance for an older adult,” he says.

“The robot would need to perform a wide variety of bimanual tasks in this scenario, such as opening pill bottles, carrying a laundry basket, or stirring a meal while keeping the pan stable on the stove.”

Using the approach developed by the researchers, “novice” users would be able to control a robot to carry out such tasks, the team concludes.
