22 May, 2019 16:37

‘How can you harass code?’ UN report calling ‘feminine’ Alexa & Siri SEXIST prompts ridicule online

Default feminine voices used in AI assistants like Amazon’s Alexa or Apple’s Siri promote gender stereotypes of female subservience, a new UN report has claimed, prompting the internet to ask: “How can you harass code?”

The report, released Wednesday by the UN’s cultural and scientific body UNESCO, found that the majority of AI assistant products – from how they sound to their names and personalities – were designed to be seen as feminine. They were also designed to respond politely to sexual or gendered insults from users, which the report said leads to the normalization of sexual harassment and gender bias.

Using the example of Apple’s Siri, the researchers found that the AI assistant was programmed to respond demurely to derogatory remarks, replying to being called “a bitch” with the phrase “I’d blush if I could.”

“Siri’s submissiveness in the face of gender abuse – and the servility expressed by so many other digital assistants projected as young women – provides a powerful illustration of gender biases coded into technology products,” the study said.

The report warned that as access to voice-powered technology becomes more prevalent around the world, this feminization could have a significant cultural impact by spreading gender biases.

However, many have responded with ridicule to the UN report on social media, asking questions like “how can you sexually harass code?” and accusing the UN of assuming Siri’s gender.

Others lamented the futility of the report, pointing out that as long as the voice is changeable, they don’t see how it could be considered a problem.

Meanwhile, Amy Diehl, a researcher on unconscious gender bias at Shippensburg University in Pennsylvania, suggested that manufacturers should “stop making digital assistants female by default & program them to discourage insults and abusive language.”

But the UN’s calls for gender-neutral digital assistants may already be becoming a reality. In March, researchers unveiled Q, a voice that can be used by AI assistants and smart speakers and developed to sound “neither male nor female.” In an eerie introductory video, Q says it’s been created “for a future where we’re no longer defined by gender, but rather how we define ourselves.”
