
A New Olympics Event: Algorithmic Video Surveillance

by
Lucas Laursen
from IEEE Spectrum on (#6HDZ7)

As skiers schussed and swerved in a snow park outside Beijing during the 2022 Winter Olympics, a few may have noticed a string of towers along the way. Did they know that those towers were collecting wavelengths across the spectrum and scouring the data for signs of suspicious movement? Did they care that they were the involuntary subjects of an Internet of Things-based experiment in border surveillance?

This summer, at the Paris Olympic Games, security officials will perform a much bigger experiment in the heart of the City of Light, covering the events, the entire Olympic village, and the connecting roads and rails. It will proceed under a temporary law allowing automated surveillance systems to detect "predetermined events" of the sort that might lead to terrorist attacks.

This article is part of our special report Top Tech 2024.

This time, people care. Well, privacy activists do. "AI-driven mass surveillance is a dangerous political project that could lead to broad violations of human rights. Every action in a public space will get sucked into a dragnet of surveillance infrastructure, undermining fundamental civic freedoms," said Agnès Callamard, Amnesty International's secretary general, soon after the law passed.

Yet the wider public seems unconcerned. Indeed, when officials in Seine-Saint-Denis, one of the districts hosting the Olympics, presented information about a preliminary AI-powered video surveillance system that would detect and issue fines for antisocial behavior such as littering, residents raised their hands and asked why it wasn't yet on their streets.

"Surveillance is not a monolithic concept. Not everyone is against surveillance," says anthropology graduate student Matheus Viegas Ferrari of the Universidade Federal da Bahia, in Brazil, and the Université Paris 8 Vincennes-Saint-Denis, in Paris, who attended the community meeting in Seine-Saint-Denis and published a study of surveillance at the 2024 Olympics.

Anyone who fumes at neighbors who don't pick up after their dogs can identify with the surveillance-welcoming residents of Seine-Saint-Denis. If, however, the surveillance system fines one neglectful neighbor more than another because its algorithm favors one skin color or clothing style over another, opinions could change.

Indeed, France and other EU countries are in the midst of hammering out the finer details of the European Union's AI Act, which seeks to protect citizens' privacy and rights by regulating government and commercial use of AI. Already, the poor implementation of an AI law related to welfare policy has felled one European government.


It seems the temporary surveillance law, whose video-processing clause expires in March 2025, was written to avoid that outcome. It insists that algorithms under its authority "do not process any biometric data and do not implement any facial recognition techniques." They cannot "carry out any reconciliation, interconnection or automated linking with other processing of personal data."

Paolo Cirio, an artist who once printed posters of police officers' faces and put them up around Paris in an unsanctioned exercise in crowd-sourced facial recognition, sees such language as progress. "The fact that even during the Olympics in France, the government has to write in the law that they're not going to use biometric tech, that's already something incredible to me," he says. "That's the result of activists fighting for years in France, in Europe, and elsewhere."

Safety in Numbers?

What officials can do instead of biometric analysis and face recognition is use computers for real-time crowd analysis. The technique goes back a long time, and many aspects of many kinds of crowd behavior have been studied; it has even been used to prevent hens from murdering each other. And while crowds may be irrational, the study of crowds is a science.

A crowd, however, may not really offer anonymity to its members. European civil-society groups argued in an open letter that the surveillance would necessarily require isolating and therefore identifying individuals, depriving innocent people of their privacy rights.

Whether this is true is unclear; the fast evolution of the technologies involved makes it a difficult question to answer. "You don't have to identify the people," says data scientist Jonathan Weber of the University of Haute-Alsace, in Mulhouse, France, and coauthor of a review of video crowd analysis. Instead, programmers can train a neural network on people-like shapes until it reliably identifies human beings in subsequent video. Then they can train the neural network on more sophisticated patterns, such as people falling over, running, fighting, even arguing, or carrying a knife.
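For illustration, here is a minimal Python sketch of that two-stage pipeline. The choice of an off-the-shelf torchvision detector is an assumption, and the behavior classifier and its class labels are hypothetical stand-ins; the article names no specific tools.

```python
# Stage 1: detect person-shaped regions. Stage 2: classify each one's behavior.
# A sketch only -- the behavior classifier below is an untrained placeholder.
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

# An off-the-shelf detector pretrained to find people (COCO class 1), among other objects.
detector = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

# Hypothetical behavior classifier over cropped person regions. In practice it
# would be trained on labeled clips of falling, running, fighting, etc. --
# exactly the data Weber says is scarce.
behavior_classifier = torch.nn.Sequential(
    torch.nn.AdaptiveAvgPool2d(1),
    torch.nn.Flatten(),
    torch.nn.Linear(3, 4),  # 4 assumed classes: walking, falling, running, fighting
)

def analyze_frame(frame: torch.Tensor, score_threshold: float = 0.8):
    """Detect people in one video frame (3xHxW, values in [0,1]) and label each one's behavior."""
    with torch.no_grad():
        detections = detector([frame])[0]
        events = []
        for box, label, score in zip(detections["boxes"], detections["labels"], detections["scores"]):
            if label.item() != 1 or score < score_threshold:  # keep confident "person" boxes only
                continue
            x0, y0, x1, y1 = box.int().tolist()
            crop = frame[:, y0:y1, x0:x1]
            behavior = behavior_classifier(crop.unsqueeze(0)).argmax().item()
            events.append((box.tolist(), behavior))
    return events  # positions and behavior labels -- no identities
```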

"The alerts we raise are not based on biometrics, just a position, such as whether a person is lying on the ground," says Alan Ferbach, cofounder and CEO of Videtics, a company in Paris that submitted a bid for part of the 2024 Olympics security contract. Videtics already sells software that detects falls in buildings or illegal dumping outdoors, neither of which requires identifying individuals.
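A toy sketch of such identity-free alerting might look like the following. The aspect-ratio heuristic is an illustrative assumption, not Videtics' actual method: the point is that the alert depends only on a person's position and posture, never on who they are.

```python
# Identity-free fall detection: flag person-shaped boxes that are much
# wider than they are tall, suggesting someone lying on the ground.
def is_lying_down(box, min_aspect: float = 1.5) -> bool:
    """True if a bounding box (x0, y0, x1, y1) is wider than it is tall."""
    x0, y0, x1, y1 = box
    width, height = x1 - x0, y1 - y0
    return height > 0 and (width / height) >= min_aspect

def raise_alerts(person_boxes):
    """Yield the positions (not identities) of people who appear to have fallen."""
    for box in person_boxes:
        if is_lying_down(box):
            yield {"event": "person_on_ground", "position": box}

# Example: one upright pedestrian, one person lying on the ground.
print(list(raise_alerts([(10, 20, 50, 200), (100, 300, 400, 380)])))
```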

[Image: A surveillance camera watches over the sledding center at the 2022 Winter Olympics. Getty Images]

But that might not be enough to satisfy critics. "Even just categorizing people's behavior can be equally invasive and dangerous as identifying people because it can lead to errors, discrimination, violation of privacy and anonymity in public spaces and can impact on fair trial rights and access to justice," says Karolina Iwańska, the digital civic space advisor at the European Center for Not-for-Profit Law, a civil-society organization based in The Hague, Netherlands. It has filed an amicus brief on the Olympics surveillance law with France's Constitutional Council.

Weber is particularly concerned with how skewed training data could lead to problematic crowd-analysis AIs. For example, when the ACLU compared photos of U.S. congressional representatives to mug shots, the software disproportionately falsely identified darker-skinned people as matches. The potential biases in such an algorithm will depend on how its software developers train it, says Weber: "You have to be very careful, and it's one of the biggest problems: Probably you won't have tons of video of people with dangerous behavior available to train the algorithm."
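One way to surface that kind of skew is the audit the ACLU's experiment amounts to: comparing false-alert rates across demographic groups in a labeled evaluation set. The data and group labels in this Python sketch are hypothetical placeholders.

```python
# Bias audit sketch: compare false-positive rates of a behavior classifier
# across groups. A gap between groups signals skewed training data.
from collections import defaultdict

def false_positive_rates(examples):
    """examples: (group, predicted_dangerous, actually_dangerous) triples."""
    fp, negatives = defaultdict(int), defaultdict(int)
    for group, predicted, actual in examples:
        if not actual:                 # only innocent behavior can yield a false alert
            negatives[group] += 1
            if predicted:
                fp[group] += 1
    return {g: fp[g] / negatives[g] for g in negatives if negatives[g]}

# Hypothetical audit data: the classifier wrongly flags group B three times as often.
audit = [("A", False, False)] * 90 + [("A", True, False)] * 10 \
      + [("B", False, False)] * 70 + [("B", True, False)] * 30
print(false_positive_rates(audit))  # {'A': 0.1, 'B': 0.3}
```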

"In my opinion, we have to certify the training pipeline," Ferbach says. Then different companies could develop their own models based on certified training sets. "If we need to certify each model, the cost will be huge." EU regulators have yet to resolve how the AI Act will address that.

If software developers can put together enough real-life or simulated video of bad behavior to train their algorithms without bias, they will still have to figure out what to do with all the real-world data they collect. "The more data you collect, the more danger there is in the future that that data can end up in the public or in the wrong hands," Cirio says. In response, some companies use face-blurring tools to reduce the possibility of a leak containing personal data. Other researchers propose recording video from directly overhead, to avoid recording people's faces.
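Here is a minimal sketch of the face-blurring idea, using OpenCV's bundled Haar-cascade face detector. That detector choice is an illustrative assumption, not what any of the companies mentioned actually ship; the point is that faces are destroyed before the footage is ever stored.

```python
# Strip personal data from a frame by Gaussian-blurring every detected face.
import cv2

# OpenCV ships a pretrained frontal-face Haar cascade alongside the library.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def blur_faces(frame):
    """Return a copy of a BGR frame with every detected face blurred beyond recognition."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    out = frame.copy()
    for (x, y, w, h) in cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
        out[y:y + h, x:x + w] = cv2.GaussianBlur(out[y:y + h, x:x + w], (51, 51), 0)
    return out

# Usage: blurred = blur_faces(cv2.imread("crowd.jpg"))
```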

Maybe You Need Biometrics

Other researchers are pulling in the opposite direction by developing tools to recognize individuals, or at least differentiate them from others in a video, using gait analysis. If this technique were applied to surveillance video, it would violate the French Olympics law and sidestep the privacy-preserving effects of face blurring and overhead video capture. "That the law proscribes biometric data processing while permitting algorithmic event detection seems to be nothing more than wishful thinking," says Iwańska. "I cannot imagine how the system is supposed to work as intended without necessarily processing biometric data."
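To see why gait counts as biometric data, consider this toy sketch: even the periodicity of a single tracked ankle keypoint carries a signal distinctive enough to help re-identify a person. The synthetic walker and the frequency analysis here are illustrative assumptions; real systems extract keypoints with pose estimators and use far richer features.

```python
# Estimate a walker's stride frequency from a tracked ankle keypoint --
# one simple example of a gait feature that functions as a biometric.
import numpy as np

def stride_frequency(ankle_y: np.ndarray, fps: float) -> float:
    """Return the dominant oscillation frequency (Hz) of an ankle-height time series."""
    centered = ankle_y - ankle_y.mean()
    spectrum = np.abs(np.fft.rfft(centered))
    freqs = np.fft.rfftfreq(len(centered), d=1.0 / fps)
    return float(freqs[spectrum[1:].argmax() + 1])  # skip the DC bin

# Synthetic walker: ankle height oscillating at ~2 strides per second.
t = np.arange(0, 10, 1 / 30)                                # 10 s of 30-fps video
print(stride_frequency(np.sin(2 * np.pi * 2 * t), fps=30))  # ~2.0 Hz
```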

Surveillance Creep

Another question that troubles Olympics security watchers is how long the system should remain in place. "It is very common for governments that want more surveillance to use some inciting event, like an attack or a big event coming up, to justify it," says Matthew Guariglia, senior policy analyst at the Electronic Frontier Foundation, a civil-society organization in San Francisco. "The infrastructure stays in place and very easily gets repurposed for everyday policing."

The French Olympics law includes an expiration date, but Iwańska calls that arbitrary. She says it was made "without any assessment of necessity or proportionality" to the two months of the Olympics and Paralympics.

Other historians of security technology and the Olympics have pointed out that countries often treat the Olympics like a security trade fair. And even if France stops using its video-processing algorithms in public places after the Olympics law expires, other countries may buy them from French companies for domestic use. Indeed, after the 2008 Beijing Olympics, Ecuador and other countries with mixed human-rights records purchased surveillance equipment based on systems displayed there. The surveillance industry, in France and elsewhere, stands to gain a lot from the exposure. Human rights in other countries may suffer.

The Olympics have also served as a testbed for ways to subvert annoying security measures. When officials installed a fence around the Lake Placid Olympics Village in 1980, athletes kept leaning against the fence, setting off alarms. After some time, security officials noticed the alarms weren't working at all. It turned out that somebody, perhaps even a security official, had unplugged the alarm system.

This article appears in the January 2024 print issue.
