Blaming the driver in a ‘driverless’ car

Many people worry about the concept of a “driverless car”, but a more immediate concern may be our readiness to blame fallible human drivers rather than the machines they share control with, new research suggests.

When an international team led by the Massachusetts Institute of Technology, US, asked people to assign blame in hypothetical cases where people were killed in accidents that involved errors by both driver and car, respondents pointed the finger more at the person than the machine.

And that, the researchers suggest, may have implications not just for how the law approaches human-machine shared-control vehicles, but also for how they are manufactured, purchased and driven.

In their study, the researchers ran a number of different scenarios, but their main interest was in situations where either the machine was the primary driver and a human had the power to override, or vice versa.

Some cases involved “bad interventions”, in which the primary driver made the right decision and the secondary driver incorrectly overrode it; others involved “missed interventions”, in which the primary driver made an error and the secondary driver failed to pick it up.

Not surprisingly, in the first situation, where only one driver made an error, blame was attributed equally: whichever party intervened incorrectly, human or machine, was judged to be in the wrong.

However, in the case of missed interventions – where both made an error – humans were blamed considerably more, whether they made the initial bad call or failed to intervene.

“Our central finding (diminished blame apportioned to the machine in dual-error cases) leads us to believe that, while there may be many psychological barriers to self-driving car adoption, public over-reaction to dual-error cases is not likely to be one of them,” they write in the journal Nature Human Behaviour. 

“In fact, we should perhaps be concerned about public underreaction.” 

There are real-life parallels, the researchers say, in two well-publicised cases: a fatal accident in 2016 involving a Tesla vehicle in which the Autopilot system made an error and the driver did not respond; and the death in 2018 of a pedestrian who was seen by neither an automated Uber test vehicle's system nor its human safety driver.

“The set of anecdotes around these two crashes begins to suggest a troubling pattern, namely that humans might be blamed more than their machine partners in certain kinds of automated vehicle crashes,” they write. 

“Was this pattern a fluke of the circumstances of the crash and the press environment? Or does it reflect something psychologically deeper that may colour our responses to human-machine joint action and, in particular, when a human-machine pair jointly controls a vehicle?”

The broader risk, they add, is that if there is something that colours our responses to human-machine joint action, it could “potentially slow the transition to fully automated vehicles if this reaction is not anticipated and managed appropriately in public discourse and legal regulation”. 

“Moreover, manufacturers that are working to release cars with a machine secondary driver should plan appropriately for the probable legal fall-out for these unique cases where that driver receives more blame than a human,” they write. 

“Our data portend the sort of reaction we can expect to automated car crashes at the societal level (for example, through public reaction and pressure to regulate). 

“Once we begin to see societal level responses to automated cars, that reaction may shape incentives for individual actors. For example, people may want to opt into systems that are designed such that, in the event of a crash, the majority public response will be to blame the machine. 

“Worse yet, people may train themselves to drive in a way that, if they crash, the blame is likely to fall to the machine (for instance, by not attempting to correct a mistake that is made by a machine override).”

The research team brought together expertise in psychology, economics, brain and cognitive science, data and systems, and media, from institutions in the US, Canada, the UK, France and Germany.
