The threat of robots which embody “gendered stereotypes” is already emboldening the purple-haired brigade to make an early move against the prospect of robot catgirls.
The writer in this case is convinced that robots will reflect the human race as a whole and that their impending gender stereotypes will prove to be a problem:
“A robot is a mirror held up not just to its creator, but to our whole species: What we make of the machine reflects what we are. That also means we have the very real opportunity to screw up robots by infusing them with exaggerated, overly simplified gender stereotypes. So maybe robots aren’t simply a mirror.”
Citing "research", the article points out that people prefer male voices for an authoritative presence and female voices for helpful guidance, a preference that, should it be applied to robots, is apparently bound to bar women from more commanding positions:
“Perhaps the biggest issue—yet most subtle—is gender. How gender biases manifest in the design of voice assistants is well-worn territory. Research shows that users tend to like a male voice when an authoritative presence is needed and a female voice when receiving helpful guidance. Scientists are just beginning to consider how these gender biases materialize in physical robots.”
The writer believes that robot makers may exploit these preferences to "give robots a gender" and build female simulacra with agreeable personalities, which is posited as a clear case of cisploitation underway:
“Gender is a complicated mix of biology, which robots don’t have, and how we feel about that biology, feelings that robots also lack. Yet we are already finding ways to mirror our social problems in our robots. One study, for instance, found that participants judged a robot programmed to perform security work as more masculine, while they judged the same robot instead programmed for guidance to be more feminine (echoing the gender preferences toward voice assistants). The danger is that robot makers, consciously or not, may exploit gender stereotypes to try to make their machines more effective—designing a receptionist robot to be more feminine and therefore more ‘welcoming,’ or a security robot to be more broad-shouldered and therefore more ‘authoritative.’
It doesn’t have to be this way. Robots could just as easily be used to confront, and begin changing, those stereotypes.”
Robots with female figures are particularly threatening to those who don't have them, it would seem:
“With human-like robots, even subtle design choices can telegraph gender. A recent study found that when shown images of humanoid robots, people consistently chose a particular pronoun to go along with them: They referred to a robot with a straight torso with a male pronoun almost 90 percent of the time, but robots with a more pronounced waist were deemed more feminine. Big shoulders, on the other hand, were classified as more masculine.”
Genderfluid feminators can presumably be expected to be stolidly cylindrical and purple-topped.