Robotic feelings?

Suppose you’re part of a team working on a group activity. One of your team members is uncommunicative, even taciturn. That behaviour has a knock-on effect on you and the others.

You’re not sure how to behave. Talk more or talk less? Look at the grumpy one, or avoid their gaze? Overall, it’s a less pleasant experience than one in which people are honest and open and everyone shares easily in the conversation.

Turns out that robots can affect humans and human interactions in pretty much the same way.

A new study published in the journal Proceedings of the National Academy of Sciences shows that humans on teams that include a robot expressing vulnerability communicate with each other more, and have a more positive group experience, than people teamed with silent robots or with robots that make only neutral statements.

“We know that robots can influence the behaviour of humans they interact with directly, but how robots affect the way humans engage with each other is less well understood,” says lead author Margaret Traeger from Yale University, US.

“Our study shows that robots can affect human-to-human interactions.”

Traeger says it’s important to understand how AI shapes human behaviour because robots are increasingly pervading human society. People are encountering them in shops, hospitals and other everyday places, she says.

“In this case, we show that robots can help people communicate more effectively as a team,” Traeger says.

The study saw 153 people divided into 51 groups composed of three humans and a robot. Each group played a tablet-based game in which members worked together to build the most efficient railroad routes over 30 rounds. 

Groups were assigned to one of three different types of robot behaviour. At the end of each round, robots either remained silent; uttered a neutral, task-related statement (such as the score or number of rounds completed); or expressed vulnerability through a joke, personal story, or by acknowledging a mistake. All of the robots occasionally lost a round.

Results showed that people teamed with a robot that made vulnerable statements spent about twice as much time talking to each other during the game, and reported enjoying the experience more, than people in the other two kinds of groups.

Conversation among the humans increased more during the game when the robot made vulnerable statements than when it made neutral ones, and it was more evenly distributed when the robot was vulnerable rather than silent.

The experiment also found more equal verbal participation among team members in groups with the vulnerable and neutral robots than among members in groups with silent robots, suggesting that the presence of a speaking robot encourages people to talk to each other in a more even-handed way.

“We are interested in how society will change as we add forms of artificial intelligence to our midst,” says senior author Nicholas Christakis.

“As we create hybrid social systems of humans and machines, we need to evaluate how to program the robotic agents so that they do not corrode how we treat each other.”

Understanding the social influence of robots in human spaces is important even when the robots do not serve an intentionally social function, says co-author Sarah Strohkorb Sebo.

“Imagine a robot in a factory whose task is to distribute parts to workers on an assembly line,” she says.

“If it hands all the pieces to one person, it can create an awkward social environment in which the other workers question whether the robot believes they’re inferior at the task. Our findings can inform the design of robots that promote social engagement, balanced participation, and positive experiences for people working in teams.”
