Facial recognition must not introduce gender or racial bias, police told
by Frances Perraudin
Benefits should be great enough to outweigh any public distrust, says ethics report
Facial recognition software should only be used by police if they can prove it will not introduce gender or racial bias to operations, an ethics panel has said.
A report by the London policing ethics panel, which was set up to advise City Hall, concluded that while there were "important ethical issues to be addressed" in the use of the controversial technology, those issues did not justify ruling out its use altogether.