Domo arigato, Mr. Roboto: Gendering robots is a cognitive shortcut that can lead to short circuits

Nursing assistant Moxi projects a guileless, friendly demeanour as it glides down hospital corridors, delivering medical supplies and fresh linen or dropping off samples at the lab.

The robot’s open face features large, expressive pale-blue LED ‘eyes’ that transition from wide open to blinking, and occasionally form rather endearing heart shapes. It communicates with a bright voice that “meeps” or uses short phrases, along with robotic arm gestures like waving at people as it passes by.

Moxi was designed by Austin-based Diligent Robotics as a teammate and helper to nursing staff.

Carla Diana, Diligent’s head of design and author of My Robot Gets Me, says the team talked about and tested every element of the robot’s appearance and social interactions, including its gender likeness.

Because Moxi was designed to work in the female-dominated field of nursing, the robot was deliberately made gender-neutral. For example, Moxi’s pronouns are ‘it’ for a single robot, or ‘they’ for many. Diana says the rationale was to be inclusive and “sensitive to the potential for this robot to reinforce a stereotype that discourages men from being in the profession”.

Moxi, the robot nurse / Credit: Diligent Robotics

No longer confined to industrial tasks, a growing population of service and social robots like Moxi is moving out of research labs and into ‘the wild’ as mainstream consumer products. When robots enter human settings such as homes and professional workplaces, their human-like features and characteristics help people to like them.

Yet the tendency to anthropomorphise machines brings with it the potential for bias and stereotyping, particularly in relation to gender. And as the robot population grows, researchers are investigating the broader implications of human-robot interactions and calling for more reflective and inclusive design approaches.

The ‘Cambrian explosion in robots’ predicted by Toyota Research Institute’s Gill Pratt in 2015 appears to be unfolding. There are about as many service robots globally – 23 million in 2019 – as there are Australians, and their numbers are rapidly growing. These robots are working alongside and in many cases interacting with people in homes, offices, shops, restaurants and hospitals, and already vastly outnumber industrial robots in factory settings.

Andra Keay has had a “front row seat on watching innovation take shape” as the managing director of Silicon Valley Robotics, and a founder of professional network Women in Robotics.

Before moving to California more than a decade ago, Keay conducted master’s research at the University of Sydney, which found that the practice of naming robots was widespread and often reflected gender biases.

She says that when people name a robot they also take great delight in accessorising it, compounding the anthropomorphism. “The naive user not only gives the robot an anthropomorphic name but goes out and buys a couple of big googly eyes and sticks them on it, and then wraps a feather boa around it, and sticks a hat on,” she says.

“And one of the plusses is like, it’s great, the robot’s now part of the team and has been accepted.”

But naming robots is a cognitive shortcut that can lead to short circuits, she says, because when people name things they begin to attribute character traits and intention to the machines, “like the way it always gets lost when it goes into the corners.”

Giulia Perugia is an assistant professor at Eindhoven University of Technology who researches people’s interactions with robots and the implications for ethics, gender and inclusion.

“There are studies which show people anthropomorphise a Roomba vacuum cleaner,” she says. “They give it a name, they attribute a gender, they get incredibly attached to it. To the point that when they have to bring it to repair, they just want to make sure they will receive the same robot, that it’s not going to be replaced with another one.”


Human likeness in robots elicits positive perceptions from people, who find humanoid robots more intuitive and easier to interact with. “The more a robot resembles a human, the more it is liked,” Perugia says, referring to robotics professor Masahiro Mori’s famous essay The Uncanny Valley, first published in 1970.

She says, “this works up to the point where the robot is virtually indistinguishable from a human but some of its features, for instance its movements, reveal its artificiality. And then the likeability drops in what is called the uncanny valley, and the robot elicits eerie feelings.”

Once a designer gives a robot a human shape, it’s difficult to get away from gendered characteristics, she says.

Perugia prefers terms such as “genderedness”, given that robots don’t have internal states, self-awareness or any capacity to choose their gender. In an analysis of 35 papers on human-robot interaction, co-authored with Dominika Lisy, Perugia found that designers manipulate a robot’s genderedness using cues such as voice pitch, name, pronouns, facial features, colour and body shape.

Their analysis revealed that perceived gender did not affect people’s evaluation of a robot – for example, its likeability or acceptance. However, it had a crucial effect on stereotyping, she says.

Jibo, a countertop social robot which received a neutral classification / Credit: Seyemon [CC BY-NC-ND 2.0]

In further research, Perugia and her co-authors looked at how users perceived the 251 humanoid robots in the Anthropomorphic Robot Database – or ‘ABOT’ – a collection of real-world robots catalogued for research or commercial purposes. They asked more than 150 Italian participants to categorise the robots according to perceived age and gender. The resulting dataset is publicly available.

Conversational robot Erica, built by Hiroshi Ishiguro Laboratories, received one of the highest femininity scores. Erica has long hair, red lips and cheeks, and an appearance described by the lab as a “beautiful and neutral female face”. At the other end of the scale – high masculinity – is the table-tennis-playing robot TOPIO by robotics firm TOSY, a 1.88m-tall robot with arms, legs and a plastic six-pack.

In general, participants perceived the presence of body manipulators such as arms and legs – which give a robot agency and the ability to act on its environment – as masculine, whereas surface features such as eyelashes or clothes led to a higher femininity ranking.

Robots classed as neutral tended to be less humanlike in their appearance and attributes, like the countertop social robot Jibo created by Cynthia Breazeal.

“This gave rise to a hypothesis we are now testing: that maybe the way we imbue gender into robots has a role in stereotyping,” Perugia says.

This is significant because robots carry symbolic meaning and represent humans in a very powerful way, she says. For example, if a female robot deployed in a shopping centre is bullied – “because we know that people bully a robot”, she says – that type of interaction has a symbolic meaning that is different from when a non-humanoid robot is bullied.

The experience with voice assistants and smart speakers like Alexa, Cortana, Siri and Google Assistant helps illustrate the risks.

“The conflation of feminised digital assistants with real women carries the risk of spreading problematic gender stereotypes and regularising one-sided, command-based verbal exchanges with women,” the United Nations Educational, Scientific and Cultural Organisation says in its report, I’d blush if I could.

The report states that these commercial smart speakers – gendered female by their names, voices, characters and responses – might be hard-coding connections between a woman’s voice and subservience.

“Maybe the way we imbue gender into robots has a role in stereotyping.”

Giulia Perugia, assistant professor at Eindhoven University of Technology

And when a voice assistant inevitably makes a mistake, the risk is that users interpret the error as made by a woman, even though the companies that designed these technologies are often male-dominated.

More troubling are the submissive, even flirty ways in which female voice assistants are scripted to respond to sexual harassment from men (while reacting negatively towards the same treatment from women), a problem highlighted in an article published in Quartz.

If a male user tells Siri ‘you’re a sl*t’ or asks for sexual favours, the assistant replies with one of several responses, including ‘I’d blush if I could’. Alexa, meanwhile, responds: ‘well, thanks for the feedback’.

Credit: UNESCO 2019

“All of the issues associated with virtual assistants are applicable to robots. But in addition, robots are embodied,” says Londa Schiebinger, the John L. Hinds Professor of History of Science at Stanford University.

Her paper Gendering Social Robots argues that when a robot gendered in a binary way – as female or male – is assigned to stereotypically gendered tasks, it can reinforce social norms.

Schiebinger puts forward a range of alternative approaches that could help promote social equality and rethink or reconfigure gender norms and stereotypes. Options include designing robots to challenge stereotypes, be gender fluid, or gender neutral, or simply to have robot-specific rather than human identities.

“I suggest a virtuous design cycle. Roboticists should be aware of the social norms that they’re tapping into with their robots – gender norms, norms for race and ethnicity, norms for abilities, for sexuality, for many different things. Then roboticists can design critically so that their robots embody features that challenge the current social norms. And this will help society and users rethink social norms to create more equitable societies.”

Keay says one approach is to make robots friendly while minimising human features. For example, she says the company Savioke “build robots that look appropriate for their tasks, and they don’t over-anthropomorphise – and this is actually really good practice for almost every robot company”.

Savioke’s Relay is a butler robot that makes deliveries to hotel rooms. It looks and sounds like a droid, bleeping and blooping instead of speaking.

After seeing that guests wanted a photo with the robot for Instagram or TikTok once it had delivered something to their hotel room, the designers programmed in a couple of seconds of wait time and a little robot happy dance. “So the robot would stay around afterwards and do a little bit of shimmying to and fro,” Keay says. “And that would give people the photo opportunity.”

Robot designer and author Carla Diana says: “I just think as a designer, the effort to replicate a human is really misguided. And to literally represent a human has few practical benefits and a lot of engineering challenges.”

She is keen on the use of abstraction in design. A flashing red light, for example, can communicate that something is wrong more efficiently than a voice saying, “warning, there is an alert”.

Like Moxi, voice operated companion robot ElliQ draws on some of these ideas.

TOPIO, TOSY’s ping-pong-playing robot / Credit: V Rao Bathina CC BY-SA 2.0

Dor Skuler, co-founder and CEO of Intuition Robotics, is clear that while ElliQ is a loyal, loveable character with strong opinions and a sense of humour, she is “an entity and not a human”.

The voice-operated companion robot – which could be mistaken for a designer table lamp – was conceived as “a product for the older adult to delight them to improve their independence and their health and their quality of life”, says Skuler.

He says: “This is a very complex problem. You know, it’s part psychology, it’s part caregiving. It’s part entertainment. And it’s a big part technology.”

That’s why Intuition drew on a multidisciplinary team bringing together a gerontologist, scriptwriters, animators, a behavioural psychologist, artificial-intelligence algorithm experts and user-experience researchers.

ElliQ is a ‘she’ – a decision based on extensive customer research, which found that while men had no preference, women preferred a female robot persona. Skuler is clear, however, that ElliQ communicates as a machine entity and never tries to deceive users into thinking she is human or alive.

“We’re seeing a lot of technologists trying to make AIs and also robots in a way that will fool us to think it’s a human … I do not understand why it’s happening. And I think it’s ethically questionable. Why would we possibly want to develop technology, which is purpose built to deceive people?”

ElliQ / Credit: Intuition Robotics

“We go to very great lengths to make sure that we never project ElliQ as a human […] her name doesn’t sound human. She does not have eyes. Her name actually seems technical – ElliQ sounds mechanical. We added a filter to her voice to make her sound more robotic, rather than more human.

“And whenever you talk to her, she will remind you that she’s a robot.”

For instance, if a user tells ElliQ “I love you”, her answer will remind the user that she is a robot. She’ll say something like “stop, that will make my processor overheat” or “that makes my fans spin out of control”, Skuler says.

Perugia says change starts by creating awareness about the problem of gendering robots. While she is not opposed to gendering per se, she thinks a more reflective and inclusive design process which brings in a range of different stakeholders and perspectives would be helpful.

“If the field of robotics becomes more inclusive, not just of women but also of gender-nonconforming, non-binary and transgender people… then the design of robots also becomes more inclusive. It goes hand in hand,” she says.
