Published: Fri, May 24, 2019
Electronics | By Kelly Massey

Alexa, Siri, and Google Assistant promote sexist attitudes towards women, says UN

Female-voiced AI assistants such as Siri, Alexa and Cortana reinforce harmful gender biases, according to a report by the UN.

"Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have built AI systems that cause their feminized digital assistants to greet verbal abuse with catch-me-if-you-can flirtation", the report says.

Tech juggernauts like Apple and Amazon have spent years finessing voice assistant technology, but they've kept one aspect of it in the dark ages: these "assistants" are almost always female, and they're perpetuating a damaging myth. In the section "The Rise of Gendered AI and Its Troubling Implications", the report says it's a problem that millions of people are becoming accustomed to commanding female-voiced assistants that are "servile, obedient and unfailingly polite", even when confronted with harassment. These assistants honour commands and respond to queries regardless of their tone or hostility.

Titled "I'd blush if I could", after a response Siri utters when receiving certain sexually explicit commands, the paper explores the effects of bias in AI research and product development and the potential long-term negative implications of conditioning society, particularly children, to treat these digital voice assistants as unquestioning helpers who exist only to serve owners unconditionally.

Many tech companies have chosen female voices over male ones because women are seen as "helpful", while men's voices are seen as "authoritative".

"The assistant's submissiveness in the face of gender abuse remains unchanged since the technology's wide release in 2011", the United Nations said, adding that the companies were "staffed by overwhelmingly male engineering teams".

A handful of media outlets have documented the many ways soft sexual provocations elicit flirtatious or coy responses from these machines. Why, then, are female-sounding voice assistants so ubiquitous?

Voice assistants with female voices, such as Siri or Alexa, reinforce gender bias. Samsung's personal assistant Bixby received praise for allowing the user to select a male or female voice at the outset. When asked, "Who's your daddy?", Siri answered, "You are".

The report analyzes inherent gender bias in voice assistants for two purposes: to demonstrate how unequal workplaces can produce sexist products, and how sexist products can perpetuate unsafe, misogynistic behaviors.

According to CNET, Amazon and Apple did not respond to its requests for comment, and Microsoft declined to comment following its coverage of the report. Women constitute only 12% of AI researchers.
