New Hutter Prize Awarded for Even Smaller Data Compression Milestone
Since 2006 Baldrson (Slashdot reader #78,598) has been part of the team verifying "The Hutter Prize for Lossless Compression of Human Knowledge," an ongoing challenge to compress a 1-GB excerpt of Wikipedia (approximately the amount a human can read in a lifetime). "The intention of this prize is to encourage development of intelligent compressors/programs as a path to Artificial General Intelligence," explains the project's web site.

15 years ago, Baldrson wrote a Slashdot post explaining the logic (titled "Compress Wikipedia and Win AI Prize"):

"The basic theory, for which Hutter provides a proof, is that after any set of observations the optimal move by an AI is to find the smallest program that predicts those observations and then assume its environment is controlled by that program. Think of it as Ockham's Razor on steroids."

The amount of the prize also increases based on how much compression is achieved. (So if you compress the 1-GB file x% better than the current record, you'll receive x% of the prize...) The first prize was awarded in 2006. And now Baldrson writes:

"Kaido Orav has just improved on the Hutter Prize for Lossless Compression of Human Knowledge by 1.38% with his 'fx-cmix' entry. The competition seems to be heating up, with this winner coming a mere six months after the prior winner. This is all the more impressive since each improvement in the benchmark approaches the (unknown) minimum size, called the Kolmogorov complexity of the data."
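As a rough illustration of that payout rule, here is a minimal Python sketch: an x% size reduction over the current record earns x% of the prize pool. The record sizes and the 500,000-euro pool used below are illustrative assumptions chosen to reproduce a 1.38% improvement, not official figures.

    def prize_share(previous_record_bytes, new_record_bytes, prize_pool_euros):
        # Relative improvement over the standing record; by the stated rule,
        # the same fraction of the prize pool is paid out.
        improvement = (previous_record_bytes - new_record_bytes) / previous_record_bytes
        return improvement, improvement * prize_pool_euros

    # Illustrative numbers only (roughly the right order of magnitude for enwik9 entries):
    improvement, payout = prize_share(113_000_000, 111_440_600, 500_000)
    print(f"{improvement:.2%} improvement -> about {payout:,.0f} euros")
    # prints: 1.38% improvement -> about 6,900 euros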