'I think my blackness is interfering': does facial recognition show racial bias?
by Nellie Bowles in San Francisco
The latest research into facial recognition technology used by police across the US has found that systems disproportionately target vulnerable minorities
Cameras are used routinely by police across the US to identify citizens, their faces cross-matched against databases of suspects and past criminals.
Yet researchers claim there is too little scrutiny of how these tools work, and have found inherent racial bias in the systems. So does a sophisticated visual-analysis tool reflect human prejudice, and if so, whom does it affect?