Shallow and mechanical: no evidence that sex with robots can be therapeutic

Given onscreen depictions of robots designed for pleasure in Blade Runner and the recent Westworld series, and even artificial intelligence (AI) romance seen in Spike Jonze’s Her, the world seems ready for ‘sexbots’. But a new paper reveals that we have no idea what’s coming – the world of machine intimacy is far more complex and poorly understood than imagined.

With an already established sex technology industry worth roughly US$30 billion and a quickly expanding Virtual Reality sex market, robots designed for sexual gratification are sure to have consumer appeal. There are plenty of ‘robosexuals’, as Futurama’s Bender might say. 

Already, four companies are manufacturing and selling “female” customisable sexbots to an overwhelmingly male market. Matched with AIs, they might provide ever more human interactions – making eye contact, speaking, and responding to the moods and needs of the individual user.

Much of the marketing for the technology hinges on health claims: sexbots will help make sex safer, play a therapeutic role for couples, the anxious and the lonely, and even potentially aid in curbing and treating dangerous sexual deviancy.

Chantal Cox-George of St George’s University Hospitals NHS Foundation Trust and Susan Bewley from the Women’s Health Academic Centre at King’s College London, both in the UK, decided to see if there is any scientific evidence to back up these claims. 

Their findings, reported in the journal BMJ Sexual & Reproductive Health, show that there is exactly zero real research on the topic. What they did find were interesting arguments for and against the sex robot industry.

Some claim sexbots could potentially end sex tourism and prostitution, removing a vector for the spread of sexually transmitted infections. They may even help to destroy sex trafficking. Yet the authors conclude that it is unknown if sexbots “will lead to lesser risk of violence and infections, or drive further exploitation of human sex workers”.

The therapeutic value of a bit of robo-action is also questionable. While the technology may help to relieve the sexual frustration of isolated individuals, it equally might drive them to “become even more isolated by the illusion of having a substitute satisfaction”. This could lead to a dangerous situation, given the rise of the violent, misogynist “incel” movement.

Similarly, therapeutic use for companionship and intimacy might prove hollow or even increase distress, as a human may genuinely desire a robot, but “reciprocation can only be artificially mimicked”. Advocates of Strong AI – a philosophy that seeks a state in which a “machine’s intellectual capability is functionally equal to a human’s” – might beg to differ.

More controversially still, some claim that sexbots could be prescribed to treat rapists and paedophiles. One company, headed by a CEO with admitted paedophilic tendencies, is already manufacturing child sex dolls, arguing such technology helps to “redirect dark desires”. 

Given the lack of evidence, however, the authors “strongly caution against the use of paedobots as putative ‘treatment’”. 

The health claims made for sexbots, write the authors, “are rather specious”.

“Currently the precautionary principle should reject the clinical use of sexbots until their postulated benefits, namely ‘harm limitation’ and ‘therapy’, have been tested empirically,” they conclude.
