Cannibal AIs Could Risk Digital 'Mad Cow Disease' Without Fresh Data
A new article in ScienceAlert describes research into the dangers of "heavily processed sources of digital nourishment" for generative AI:

A new study by researchers from Rice University and Stanford University in the US offers evidence that when AI engines are trained on synthetic, machine-made input rather than text and images made by actual people, the quality of their output starts to suffer. The researchers are calling this effect Model Autophagy Disorder (MAD). The AI effectively consumes itself, hence the parallel with mad cow disease, a neurological disorder in cows that are fed the infected remains of other cattle. Without fresh, real-world data, content produced by AI declines in quality, in diversity, or both, the study shows. It's a warning about a future of AI slop from these models.

"Our theoretical and empirical analyses have enabled us to extrapolate what might happen as generative models become ubiquitous and train future models in self-consuming loops," says computer engineer Richard Baraniuk, from Rice University. "Some ramifications are clear: without enough fresh real data, future generative models are doomed to MADness."

The article notes that "faces began to look more and more like each other when fresh, human-generated training data wasn't involved," and that in tests using handwritten numbers, "the numbers gradually became indecipherable." It adds: "Where real data was used but in a fixed way without new data being added, the quality of the output was still degraded, merely taking a little longer to break down. It appears that freshness is crucial."

Thanks to long-time Slashdot reader schwit1 for sharing the news.
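The self-consuming loop Baraniuk describes can be sketched with a toy statistical analogue. The Python snippet below is purely illustrative and is not the researchers' actual experiment: a 1-D Gaussian stands in for a generative model, and each "generation" is fit only to samples drawn from the previous generation's model. Because every refit adds estimation noise and slightly underestimates spread, the modeled diversity (the standard deviation) tends to drift downward when no fresh real data is injected.

import numpy as np

rng = np.random.default_rng(42)
n = 50  # small training sets make the degradation visible sooner

# Generation 0: fit the "model" (a Gaussian) to real, human-made data.
real_data = rng.normal(loc=0.0, scale=1.0, size=n)
mu, sigma = real_data.mean(), real_data.std()

for gen in range(1, 61):
    # Self-consuming loop: train each new model purely on the
    # previous model's synthetic output, with no fresh real data.
    synthetic = rng.normal(loc=mu, scale=sigma, size=n)
    mu, sigma = synthetic.mean(), synthetic.std()
    if gen % 10 == 0:
        print(f"generation {gen:2d}: mean={mu:+.3f}, std={sigma:.3f}")

Running this, the standard deviation typically wanders well below the original 1.0 after a few dozen generations, a crude analogue of the diversity collapse (faces converging, digits blurring together) that the study reports for real generative models.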
Read more of this story at Slashdot.