Spain Is Using AI For Domestic Violence Risk Assessment And It’s Going As Well As You’d Expect
There are some decent uses of AI. And then there's everything else.
While those with horses in the AI race continue to proclaim the miracles their horses will soon be performing at some indefinite point in the future, a whole lot of entities just see AI as a way to shed human capital and replace "faulty" people with equally faulty "intelligence."
Law enforcement is just as susceptible to being suckered in by unfulfilled promises and shiny tech as anyone else. In Spain, the shift from humans to AI isn't quite as dramatic as replacing workers with rooms full of GPUs. It's a blend of people and processes - one that values processes more than people and delivers exactly the results anyone should expect from outsourcing intuition and compassion to a string of 1s and 0s.
This report from the New York Times opens with the story of Spanish resident Lobna Hemid, who reported an attack by her husband to police. During this attack, her husband (Bouthaer el Banaisati) smashed a wooden shoe rack and beat her with one of the pieces while calling her a "worthless whore."
This is how things went for Lobna Hemid:
Before Ms. Hemid left the station that night, the police had to determine if she was in danger of being attacked again and needed support. A police officer clicked through 35 yes or no questions - Was a weapon used? Were there economic problems? Has the aggressor shown controlling behaviors? - to feed into an algorithm called VioGen that would help generate an answer.
VioGen produced a score:
LOW RISK - Lobna Hemid, Madrid, 2022

The police accepted the software's judgment and Ms. Hemid went home with no further protection. Mr. el Banaisati, who was imprisoned that night, was released the next day. Seven weeks later, he fatally stabbed Ms. Hemid several times in the chest and abdomen before killing himself. She was 32 years old.
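For readers unfamiliar with how these actuarial tools generally work, here's a minimal sketch of the pattern the Times describes: a fixed yes/no checklist, weighted answers summed into a score, and the score bucketed into a risk band an officer can accept or override. To be clear, VioGen's actual questions, weights, and thresholds aren't public - every question, weight, and cutoff below is a hypothetical stand-in, included only to show how coarse this kind of instrument is.

```python
# Purely illustrative sketch - NOT VioGen's actual model, which is not public.
# It shows the general shape of the approach described in the NYT report:
# a fixed yes/no checklist, weighted and summed, then bucketed into a risk
# band. All questions, weights, and thresholds here are hypothetical.

# Hypothetical subset of checklist items, with made-up weights.
QUESTIONS = {
    "weapon_used": 3,
    "economic_problems": 1,
    "controlling_behavior": 2,
    "prior_incidents": 3,
    "threats_to_kill": 4,
}

def risk_band(answers: dict[str, bool]) -> str:
    """Sum the weights of every 'yes' answer and map the total to a band."""
    score = sum(weight for item, weight in QUESTIONS.items() if answers.get(item))
    if score >= 8:
        return "HIGH"
    if score >= 4:
        return "MEDIUM"
    if score >= 2:
        return "LOW"
    return "NEGLIGIBLE"

# A victim can answer "no" to most checklist items and still be in serious
# danger - which is exactly the failure mode the article describes.
print(risk_band({"economic_problems": True}))  # -> NEGLIGIBLE
```

The design choice worth noticing is the hard thresholds sitting on a handful of binary inputs: anything the checklist doesn't ask about, or anything a frightened victim underreports, simply vanishes from the score.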
As the report notes, Spain has relied so heavily on this combination of questionnaire and software for so long that it's difficult to tell just how much direct involvement police officers actually have in these assessments. The caseload is large - 92,000 cases - but the outcomes are far from optimal.
[R]oughly 8 percent of women who the algorithm found to be at negligible risk and 14 percent at low risk have reported being harmed again, according to Spain's Interior Ministry, which oversees the system.
At least 247 women have also been killed by their current or former partner since 2007 after being assessed by VioGen, according to government figures.
This is definitely ugly. And it's an indictment of a system put in place by law enforcement and legislators who think domestic violence is a problem that can be solved by a short series of questions and the disinterested output of an algorithm - one that tells cops whether or not they can go back to ignoring the problem.
There will be no rant against AI here. This isn't an AI problem. It's a law enforcement problem. There's no reason to believe the outcomes would be any better if this were handled by police officers alone, without the aid of pre-written questions and/or algorithms Spanish citizens are definitely overpaying for.
The problems with police and domestic violence run much deeper than underperforming software. The simple fact is that most cops don't care much about domestic violence and care even less about what happens to women.
For most of its existence, law enforcement has been almost exclusively male. Even the relatively recent addition of women to police forces hasn't really changed the underlying current - one that is predominantly male and one that overwhelmingly protects male officers.
There are plenty of stories about female police officers who have been forced outside of the Thin Blue Line, considered less deserving of the protections given to male officers. Female officers have been harassed, marginalized, mistreated, even raped by fellow officers.
And things get no better on the home front. Data (what little there is of it) suggests law enforcement officers engage in domestic violence at a higher rate than the rest of the population. Even if this actually isn't the case (the data is old and disputed), it's undeniable that law enforcement agencies and unions do their best to protect the abusers in their midst from being punished - much less fired - for engaging in acts of domestic violence.
When it's just a female citizen asking for help, the sad truth is most officers couldn't care less. If anyone doubts this assertion, I would not-so-kindly point them in the direction of multiple agencies with months- or years-long backlogs of untested rape kits.
There's a problem here that can't be solved the way we're handling things now. Sprinkling AI on top of the underlying issues doesn't make them go away.
One of the biggest problems is that we expect cops to not only give a shit about domestic violence but actually do something about it. But they're not trained to handle this job - one that would be better served by counselors, social workers, and people actually invested in saving someone from perpetual violence. Cops can certainly arrest perpetrators, but for the rest of the job - actually protecting someone from future violence - they're pretty much useless. Nearly anyone else would be better equipped.
But that's not the system we have in place. Lobna Hemid's case clearly indicates that what's in place doesn't work, even when sprinkled with AI pixie dust. Hemid wanted to be protected from further abuse. The officer she spoke to just wanted to complete a report. Providing a questionnaire may make it easier for cops to ask questions they wouldn't think to ask on their own, but it's incapable of making them actually care about the person they're speaking to, much less what might happen to them if their risk assessment is wrong.
All this combination of questions and AI does is allow officers to believe whatever horrific violence is inflicted on victims like Hemid isn't their fault. They did all they could... or, at least, all they were trained to do.
Hopefully, this reporting will open the eyes of those in the Spanish government with the power to change things. This can't be handled by AI and yes/no check boxes. And it damn sure can't be handled by cops, who not only give every indication they don't really care what happens to domestic violence victims, but actively protect fellow officers who perpetrate this sort of violence on their own families. If anyone's serious about protecting people from domestic violence, they need to turn the job over to people who actually care, rather than those who just view it as another crime on par with vandalism or a stolen stereo.
And in cases like these - where intuition and compassion are a must - AI should never be allowed to replace those under-utilized aspects of humanity.