
Police use of facial recognition violates human rights, UK court rules

by Kate Cox, Ars Technica

A close-up of a police facial recognition camera in use at the Cardiff City Stadium on January 12, 2020 in Cardiff, Wales. Police used the technology to identify individuals who were issued with football banning orders in an attempt to prevent disorder. Critics argued that the use of such technology is invasive and discriminatory. (credit: Matthew Horwood | Getty Images)

Privacy advocates in the UK are claiming victory as an appeals court ruled today that police use of facial recognition technology in that country has "fundamental deficiencies" and violates several laws.

South Wales Police began using automated facial recognition technology on a trial basis in 2017, deploying a system called AFR Locate overtly at several dozen major events such as soccer matches. Police matched the scans against watchlists of known individuals to identify people who were wanted by the police, had open warrants against them, or were in some other way persons of interest.

In 2019, Cardiff resident Ed Bridges filed suit against the police, alleging that having his face scanned in 2017 and 2018 violated his legal rights. Backed by the UK civil rights organization Liberty, Bridges lost his suit in 2019, but the Court of Appeal today overturned that ruling, finding that the South Wales Police facial recognition program was unlawful.

