Cops bogged down by flood of fake AI child sex images, report says
Law enforcement is continuing to warn that a "flood" of AI-generated fake child sex images is making it harder to investigate real crimes against abused children, The New York Times reported.
Last year, after researchers uncovered thousands of realistic but fake AI child sex images online, attorneys general across the US quickly called on Congress to set up a committee to squash the problem. But so far, Congress has moved slowly, while only a few states have specifically banned AI-generated non-consensual intimate imagery. Meanwhile, law enforcement continues to struggle to figure out how to confront bad actors found to be creating and sharing images that, for now, largely exist in a legal gray zone.
"Creating sexually explicit images of children through the use of artificial intelligence is a particularly heinous form of online exploitation," Steve Grocki, the chief of the Justice Department's child exploitation and obscenity section, told The Times. Experts told The Washington Post in 2023 that the risks of realistic but fake images spreading included normalizing child sexual exploitation, luring more children into harm's way, and making it harder for law enforcement to find actual children being harmed.