Study uncovers presence of CSAM in popular AI training dataset
LAION-5B contains 1,008 verified instances of illegal imagery of children, and likely many more, researchers say
A massive public dataset that served as training data for popular AI image generators, including Stable Diffusion, has been found to contain more than a thousand instances of child sexual abuse material (CSAM)....