Hey Siri, you're sexist, finds UN report on gendered technology

27 May, 2019

Popular digital assistants that reply in a woman's voice and are styled as female helpers are reinforcing sexist stereotypes, according to a United Nations report released on Wednesday. The vast majority of assistants such as Apple's Siri, Amazon Alexa and Microsoft's Cortana are designed to be seen as feminine, from their names to their voices and personalities, said the study.
They are programmed to be submissive and servile - including politely responding to insults - meaning they reinforce gender bias and normalise sexist harassment, said researchers from the UN scientific and cultural body UNESCO.
"Siri's submissiveness in the face of gender abuse - and the servility expressed by so many other digital assistants projected as young women - provides a powerful illustration of gender biases coded into technology products," it said.
Apple, Amazon and Microsoft did not immediately respond to requests for comment.
A spokeswoman for Microsoft has previously said the company researched voice options for Cortana and found "a female voice best supports our goal of creating a digital assistant".
Voice assistants have quickly become embedded in many people's everyday lives, and they now account for nearly one-fifth of all internet searches, said the report, which argued they can have a significant cultural impact.
As voice-powered technology reaches into more communities worldwide, the feminisation of digital assistants may help gender biases to take hold and spread, the researchers added.