Progress In Algorithms Beats Moore's Law (agtb.wordpress.com)
10 points by yarapavan on Dec 25, 2010 | 8 comments



I'm a fan of Proebsting's Law:

Proebsting's Law: Compiler Advances Double Computing Power Every 18 Years

I claim the following simple experiment supports this depressing claim. Run your favorite set of benchmarks with your favorite state-of-the-art optimizing compiler. Run the benchmarks both with and without optimizations enabled. The ratio of those numbers represents the entirety of the contribution of compiler optimizations to speeding up those benchmarks. Let's assume that this ratio is about 4X for typical real-world applications, and let's further assume that compiler optimization work has been going on for about 36 years. These assumptions lead to the conclusion that compiler optimization advances double computing power every 18 years. QED.

This means that while hardware computing horsepower increases at roughly 60%/year, compiler optimizations contribute only about 4%/year. Basically, compiler optimization work makes only marginal contributions.
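
For anyone who wants to sanity-check the arithmetic, here's a quick sketch; the 4x, 36-year and 60%/year figures are the ones quoted above, and everything else is just compounding:

    # Implied annual gain from "4x over 36 years" vs. ~60%/year hardware growth
    compiler_total, years = 4.0, 36
    compiler_annual = compiler_total ** (1 / years) - 1   # ~3.9% per year
    hardware_annual = 0.60

    print(f"compilers: ~{compiler_annual:.1%}/year")
    print(f"hardware:  ~{hardware_annual:.0%}/year")
    # Over the same 18 years a compiler "doubling" takes, hardware at 60%/year
    # compounds to roughly (1.6)**18, i.e. several thousand-fold.
    print(f"hardware over 18 years: ~{(1 + hardware_annual) ** 18:,.0f}x")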


Moore's Law doesn't just bring processor speed, but also vast quantities of RAM. How much of the improvement credited to algorithms came from newly-affordable time/space tradeoffs brought by Moore's Law? I like the question this post is raising, but would like to see it approached in a more careful manner.
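
To make "newly-affordable time/space tradeoff" concrete, here's a generic sketch (my own illustration, not from the linked post): caching results in RAM instead of recomputing them, which only becomes attractive once memory is plentiful.

    # Classic time/space tradeoff: memoize an exponential recursion.
    # The cache holds O(len(a) * len(b)) entries, trading RAM for speed.
    from functools import lru_cache

    @lru_cache(maxsize=None)              # unbounded cache
    def edit_distance(a: str, b: str) -> int:
        if not a: return len(b)
        if not b: return len(a)
        cost = 0 if a[-1] == b[-1] else 1
        return min(edit_distance(a[:-1], b) + 1,          # deletion
                   edit_distance(a, b[:-1]) + 1,          # insertion
                   edit_distance(a[:-1], b[:-1]) + cost)  # substitution

    print(edit_distance("kitten", "sitting"))   # 3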


Also with more RAM, more and more datasets are fitting in memory. You can have all of Wikipedia in memory on a machine that costs as much as a Venti Latte per day.
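
Rough back-of-envelope, with hedged figures of my own (the dump size and RAM price are guesses, not from the post):

    # Hypothetical numbers, just to show the order of magnitude
    dump_gb       = 10          # assumed size of a compressed article-text dump, circa 2010
    ram_per_gb    = 20.0        # assumed $/GB of server RAM
    lifetime_days = 3 * 365     # amortize the hardware over three years

    ram_cost_per_day = dump_gb * ram_per_gb / lifetime_days
    print(f"~${ram_cost_per_day:.2f}/day for the RAM alone")   # well under a latte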


Good point: Moore's law has quadratic effects on our ability to use computers, since it delivers faster processors and more memory at the same time. We could take this a step further and say there's an "email law" whose growth dwarfs both algorithms and clock speeds: a staggering number more emails are sent now than in 1990 :P


Perhaps Moore's law is itself a consequence of people's ability to design algorithms. Hardware design draws on similar intellectual resources: circuit routing, simulation, benchmarking, design re-use, HDL code bases ( http://en.wikipedia.org/wiki/Hardware_description_language )...


Digging deeper, you will hit the characteristic contrast between the platonic and the physical. Chips are built from matter, and the limiting factors are ultimately physical. Algorithms are embodied on physical machines, but the bottlenecks are primarily conceptual. In that sense, the takeaway for me is confirmation of the intuition that conceptual models offer greater degrees of freedom. (duh.)


This is really interesting, since a lot of the fundamental algorithms you learn about in school (sorting, Dijkstra's, etc.) haven't been improved for decades, and in some cases are even theoretically impossible to improve (comparison-based sorting, for instance, can't beat Ω(n log n)). What kinds of algorithms have steadily improved over time?


> What kinds of algorithms have steadily improved over time?

What? Are you requesting evidence for their claim? How dare you!



