Unfair! When AI-controlled Tetris turns sour

When a two-player Tetris game unfairly allocates turns, the player with the smaller share perceives their teammate as less likeable, even when they know an algorithm made the decision.

US researchers used the timeless video game in an experiment to better understand the effects of artificial intelligence (AI) decisions on human relationships. 

“When it comes to allocating resources, it turns out Tetris isn’t just a game – it’s a powerful tool for gaining insights into the complex relationship between resource allocation, performance and social dynamics,” says co-author Dr Houston Claure, an engineer and computer scientist from Cornell University in the US.

The researchers invited 232 participants to play a collaborative version of the best-selling puzzle game Tetris, where two players work together to move falling geometric blocks into place.

In Co-Tetris, a version developed for the study, the two players share a single game, but only one has control at any given time.

Players were told an ‘allocator’ – either a human or an AI – decided how many turns each player received (though in every case the decision was made by an algorithm).

In the experiment, Tetris turns weren’t always allocated fairly. In some games, one player was allocated 90% of the turns and the other 10%. In other games, the turns were distributed equally.
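For illustration only – the article doesn’t describe the study’s actual Co-Tetris code, so the function name and parameters below are assumptions – an automated turn allocator of this kind could be sketched in a few lines of Python:

```python
import random

# Hypothetical sketch of a turn allocator; not the researchers' implementation.
def next_player(split=(0.9, 0.1)):
    """Pick which of two players controls the next falling block.

    `split` gives the share of turns for player 0 and player 1:
    (0.5, 0.5) would be the fair condition, (0.9, 0.1) the unfair one.
    """
    return 0 if random.random() < split[0] else 1

# Simulate 100 turns under the unfair condition.
turns = [next_player((0.9, 0.1)) for _ in range(100)]
print(f"Player 0 got {turns.count(0)} turns; player 1 got {turns.count(1)}")
```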

The share of resources (in this case, turns) a participant received didn’t affect their perception of the allocator that made the decision, but it did influence the relationship between the players.

When the allocation was done by AI, the player receiving more turns saw their partner as less dominant. Yet when the allocation was done by a human, perceptions of dominance weren’t affected.

Co-author of the study, Assistant Professor Rene Kizilcec, hopes the work leads to more research on the effects of AI decisions on people and their relationships.

“AI tools such as ChatGPT are increasingly embedded in our everyday lives, where people develop relationships with these tools over time,” Kizilcec says.

“How, for instance, teachers, students, and parents think about the competence and fairness of an AI tutor based on their interactions over weeks and months matters a great deal.”

The study, published in Computers in Human Behavior, investigates how people respond and behave when artificial intelligence systems allocate resources, sometimes unfairly.
