Thousands of realistic but fake AI child sex images found online, report says
Child safety experts are growing increasingly powerless to stop thousands of "AI-generated child sex images" from being easily and rapidly created, then shared across dark web pedophile forums, The Washington Post reported.
This "explosion" of "disturbingly" realistic images could help normalize child sexual exploitation, lure more children into harm's way, and make it harder for law enforcement to find actual children being harmed, experts told the Post.
Finding victims depicted in child sexual abuse materials is already a "needle in a haystack problem," Rebecca Portnoff, the director of data science at the nonprofit child-safety group Thorn, told the Post. Now, investigations will be further delayed by the added effort of determining whether materials are real or AI-generated.