Robots often rely on visual imagery of objects to manipulate them – but when people turn things over in their hands, they use touch.
A team of US researchers has added cheap touch sensors to a robotic hand, and found it can smoothly rotate objects without any visual input.
The hand carries 16 of the sensors, each costing about US$12 (A$18), and can rotate objects such as toys, cans, fruit and vegetables without damaging them.
The sensors use binary information – touching or not touching – to do their job.
“Here, we use a very simple solution,” says study lead Professor Xiaolong Wang, an electrical and computer engineer at the University of California, San Diego, US.
“We show that we don’t need details about an object’s texture to do this task. We just need simple binary signals of whether the sensors have touched the object or not, and these are much easier to simulate and transfer to the real world.”
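To make the binary-signal idea concrete, here is a minimal sketch – not the team’s actual code – of how raw contact readings could be collapsed into on/off touch flags. The readings and the threshold value are illustrative assumptions:

```python
import numpy as np

# Hypothetical raw readings from the 16 contact sensors (arbitrary units).
raw_readings = np.array([0.00, 0.82, 0.04, 1.25, 0.00, 0.31, 0.02, 0.90,
                         0.11, 0.00, 0.67, 0.03, 0.55, 0.00, 0.08, 1.01])

# Assumed contact threshold; the real hardware's calibration isn't described here.
CONTACT_THRESHOLD = 0.1

# Each sensor reduces to a single bit: touching (1) or not touching (0).
# These 16 bits are the only tactile information the controller sees.
binary_contacts = (raw_readings > CONTACT_THRESHOLD).astype(np.uint8)

print(binary_contacts)  # -> [0 1 0 1 0 1 0 1 1 0 1 0 1 0 0 1]
```

Because each sensor contributes just one bit rather than a detailed pressure map, the same signal is easy to reproduce in a simulator and on the physical hand.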
The researchers first trained the system by running simulations of a virtual hand rotating objects with this sensor setup.
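As a rough illustration of that simulate-then-transfer recipe – a sketch under assumed names, since the article doesn’t describe the team’s simulator, policy or reward – the training loop might be organised like this:

```python
import random

class ToyHandSim:
    """Stand-in for the physics simulator used during training.
    Observations are just the 16 binary contact flags; the reward
    and episode length here are illustrative assumptions."""

    N_SENSORS = 16

    def reset(self):
        self.t = 0
        return [random.randint(0, 1) for _ in range(self.N_SENSORS)]

    def step(self, action):
        self.t += 1
        obs = [random.randint(0, 1) for _ in range(self.N_SENSORS)]
        # Placeholder reward standing in for "the object rotated further".
        reward = sum(obs) / self.N_SENSORS
        done = self.t >= 50
        return obs, reward, done

def rollout(env, policy):
    """Run one simulated episode and return its total reward."""
    obs, done, total = env.reset(), False, 0.0
    while not done:
        obs, reward, done = env.step(policy(obs))
        total += reward
    return total

# A trained controller would map contact bits to finger motions;
# a random policy just exercises the loop here.
random_policy = lambda obs: [random.uniform(-1, 1) for _ in range(16)]
print(rollout(ToyHandSim(), random_policy))
```

Once a policy performs well on simulated contact bits, it can be run on the real hand, which supplies the same kind of binary signals.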
They then applied the system to a real robotic hand and gave it objects it hadn’t been trained on, like a tomato and a can. The hand had the most trouble with challenging shapes such as a rubber duck, which took longer to rotate.
Next, Wang and colleagues are hoping to make robotic hands throw, catch and juggle.
“In-hand manipulation is a very common skill that we humans have, but it is very complex for robots to master,” says Wang. “If we can give robots this skill, that will open the door to the kinds of tasks they can perform.”
They’ve presented their research at the 2023 Robotics: Science and Systems Conference, held recently in Korea.