
The post-exponential era of AI and Moore’s Law

by Jon Evans
from Crunch Hype on (#4TYZG)

My MacBook Pro is three years old, and for the first time in my life, a three-year-old primary computer doesn't feel like a crisis that must be resolved immediately. True, this is partly because I'm waiting for Apple to fix their keyboard debacle, and partly because I still cannot stomach the Touch Bar. But it is also because three years of performance growth ain't what it used to be.

It is no exaggeration to say that Moore's Law, the mind-bogglingly relentless exponential growth in our world's computing power, has been the most significant force in the world for the last fifty years. So its slow deceleration, or outright demise, is a big deal, and not just because the repercussions are now making their way into every home and every pocket.

Of course, we've all lived in hope that some other field would go exponential and usher in another, similar era. AI/machine learning was the great hope, especially the distant dream of a machine-learning feedback loop: AI improving AI at an exponential pace for decades. That now seems awfully unlikely.

In truth it always did. A couple of years ago I was talking to the CEO of an AI company who argued that AI progress was basically an S-curve, and that we had already reached its top for sound processing, were nearing it for image and video, but were only halfway up the curve for text. No prizes for guessing which one his company specialized in - but his argument seems to have been entirely correct.
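
For readers unfamiliar with the shape, an S-curve (a logistic curve) looks exponential through its middle stretch and then flattens as it approaches a ceiling. A minimal sketch, with parameters chosen purely for illustration:

```python
import math

def logistic(t, ceiling=1.0, midpoint=0.0, rate=1.0):
    """An S-curve: near-exponential growth around the midpoint,
    flattening out as progress approaches the ceiling."""
    return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

# Early on, the curve is hard to distinguish from an exponential;
# near the top, each additional step buys almost nothing.
for t in range(-4, 5):
    print(f"t={t:+d}  progress={logistic(t):.2f}")
```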

Earlier this week OpenAI released an update to their analysis from last year regarding how the computing power used by AI¹ is increasing. The outcome? It "has been increasing exponentially with a 3.4-month doubling time (by comparison, Moore's Law had a 2-year doubling period). Since 2012, this metric has grown by more than 300,000x (a 2-year doubling period would yield only a 7x increase)."
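
Those figures hang together arithmetically. A quick back-of-the-envelope check in Python (the time span below is derived from the quoted numbers, not stated in the post):

```python
import math

# Figures quoted from OpenAI's analysis.
GROWTH_SINCE_2012 = 300_000       # overall increase in training compute
AI_DOUBLING_MONTHS = 3.4          # observed doubling time
MOORE_DOUBLING_MONTHS = 24        # classic Moore's Law doubling time

doublings = math.log2(GROWTH_SINCE_2012)       # ~18.2 doublings
span_months = doublings * AI_DOUBLING_MONTHS   # ~62 months, ~5.2 years

# What a Moore's Law pace alone would deliver over the same span.
moore_growth = 2 ** (span_months / MOORE_DOUBLING_MONTHS)

print(f"doublings: {doublings:.1f}")
print(f"span:      {span_months / 12:.1f} years")
print(f"Moore's:   {moore_growth:.0f}x")   # ~6x, in the ballpark of OpenAI's 7x
```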

That's " a lot of computing power to improve the state of the AI art, and it's clear that this growth in compute cannot continue. Not "will not"; can not. Sadly, the exponential growth in the need for computing power to train AI has happened almost exactly contemporaneously with the diminishment of the exponential growth of Moore's Law. Throwing more money at the problem won't help - again, we're talking about exponential rates of growth here, linear expense adjustments won't move the needle.

The takeaway is that, even if we assume great efficiency breakthroughs and performance improvements to reduce the rate of doubling, AI progress seems to be increasingly compute-limited at a time when our collective growth in computing power is beginning to falter. Perhaps there'll be some sort of breakthrough, but in the absence of one, it sounds a whole lot like we're looking at AI/machine-learning progress leveling off, not long from now, and for the foreseeable future.

¹ It measures "the largest AI training runs," technically, but this seems indicative of the broader trend.
