Digital assistants like Siri and Alexa entrench gender biases, says UN
by Kevin Rawlinson
Female-voiced tech often gives submissive responses to queries, Unesco report finds
Assigning female genders to digital assistants such as Apple's Siri and Amazon's Alexa is helping entrench harmful gender biases, according to a UN agency.
Research released by Unesco claims that the often submissive and flirty responses these systems give to many queries, including outright abusive ones, reinforce ideas of women as subservient.