Popular digital assistants that reply in a woman's voice and are styled as female helpers are reinforcing sexist stereotypes, according to a United Nations report released on Wednesday. The vast majority of assistants such as Apple's Siri, Amazon's Alexa and Microsoft's Cortana are designed to be seen as feminine, from their names to their voices and personalities, the study said.
They are programmed to be submissive and servile - including politely responding to insults - meaning they reinforce gender bias and normalise sexist harassment, said researchers from the UN scientific and cultural body UNESCO.
"Siri's submissiveness in the face of gender abuse - and the servility expressed by so many other digital assistants projected as young women - provides a powerful illustration of gender biases coded into technology products," it said.
Apple, Amazon and Microsoft were not immediately available for comment.
A spokeswoman for Microsoft has previously said the company researched voice options for Cortana and found "a female voice best supports our goal of creating a digital assistant".
Voice assistants have quickly become embedded in many people's everyday lives and now account for nearly one-fifth of all internet searches, said the report, which argued they can have a significant cultural impact.
As voice-powered technology reaches into more communities worldwide, the feminisation of digital assistants may help gender biases take hold and spread, the researchers added.

Copyright Reuters, 2019
