Algorithms vs Moore’s Law
I once saw an impressive chart of how the efficiency of numerical linear algebra algorithms has improved over time. I haven't been able to find that chart, but here is something similar. Thanks to Naveen Palli for pointing this out.
Even more remarkable [than Moore's law] - and even less widely understood - is that in many areas, performance gains due to improvements in algorithms have vastly exceeded even the dramatic performance gains due to increased processor speed.
" Here is just one example " Gritschel, an expert in optimization, observes that a benchmark production planning model solved using linear programming would have taken 82 years to solve in 1988, using the computers and the linear programming algorithms of the day. Fifteen years later - in 2003 - this same model could be solved in roughly 1 minute, an improvement by a factor of roughly 43 million. Of this, a factor of roughly 1,000 was due to increased processor speed, whereas a factor of roughly 43,000 was due to improvements in algorithms! Gritschel also cites an algorithmic improvement of roughly 30,000 for mixed integer programming between 1991 and 2008.
From a federal report with too long a title, page 71.
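As a quick sanity check on those numbers (a back-of-the-envelope sketch of my own, not something from the report), 82 years is roughly 43 million minutes, and the report's factors of 1,000 for hardware and 43,000 for algorithms multiply out to the same overall speedup:

```python
# Back-of-the-envelope check of the quoted speedup figures.

# 82 years expressed in minutes: roughly 43 million.
minutes_in_82_years = 82 * 365.25 * 24 * 60
print(f"82 years ~ {minutes_in_82_years:,.0f} minutes")

# The report's decomposition: hardware gains times algorithmic gains.
hardware_factor = 1_000     # increased processor speed
algorithm_factor = 43_000   # better linear programming algorithms
print(f"combined speedup ~ {hardware_factor * algorithm_factor:,}")
```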
As I mentioned in my previous post, many of the speed-ups in optimization are the result of speed-ups in numerical linear algebra, because, like so much other applied math, it all boils down to linear algebra.