
Quicksort is the exception; in my experience big-O usually refers to worst-case running time rather than average-case or expected running time. CLR(S) [1] sticks to the worst case because 1) the worst case is a guarantee; 2) the worst case can occur frequently; 3) the average case is often roughly as bad as the worst case. This is a widely used textbook, so I generally assume a lot of people follow its conventions.

[1] https://mitpress.mit.edu/books/introduction-algorithms (Introduction to Algorithms)
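A quick sketch of why quicksort is the classic exception: with a naive first-element pivot, an already-sorted input forces Θ(n²) comparisons while a random permutation stays near n log n. This is a hypothetical illustration (the `quicksort` and comparison counter here are my own, not from any of the sources above):

```python
import random

def quicksort(a, counter):
    # Naive quicksort with a first-element pivot; counter[0] accumulates
    # the number of element comparisons performed.
    if len(a) <= 1:
        return a
    pivot, rest = a[0], a[1:]
    counter[0] += len(rest)
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    return quicksort(left, counter) + [pivot] + quicksort(right, counter)

n = 300
c_avg, c_worst = [0], [0]
quicksort(random.sample(range(n), n), c_avg)   # "average" input
quicksort(list(range(n)), c_worst)             # sorted input: worst case

# Sorted input makes every partition maximally unbalanced, giving exactly
# n(n-1)/2 comparisons; the random input lands near n log n.
print(c_avg[0], c_worst[0])
```

With a randomized pivot the worst case becomes vanishingly unlikely, which is why quicksort is usually quoted at its expected O(n log n) rather than its worst-case O(n²).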




It's also not obvious what "average case" means. For example, if we're talking about the asymptotic behavior of the function

  A(n) = avg { RunningTime(Algo, x) : x is valid input to Algo and len(x) = n }
are we assuming that the inputs are drawn uniformly, and therefore have equal weight, or are we drawing them from a non-uniform distribution? What if some inputs never occur in practice?

Worst case and best case are totally unambiguous and don't need any additional clarification. There are just so many knobs one can adjust when defining "average".
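To make the ambiguity concrete, here is a minimal sketch (my own toy example, not from the thread) of linear search over a list of n items: the "average" cost depends entirely on which distribution you put over the queries.

```python
# Cost model: linear search that finds the element at index i inspects
# i + 1 slots, so cost(i) = i + 1.
def search_cost(i):
    return i + 1

n = 100

# Uniform distribution over targets: every index equally likely,
# expected cost = (n + 1) / 2.
uniform_avg = sum(search_cost(i) for i in range(n)) / n

# Skewed (Zipf-like) distribution: early indices queried far more often,
# as is common in practice (hot keys, caches).
weights = [1 / (i + 1) for i in range(n)]
total = sum(weights)
skewed_avg = sum(w * search_cost(i) for i, w in zip(range(n), weights)) / total

# Same algorithm, same inputs, two very different "average cases".
print(uniform_avg, skewed_avg)
```

Both numbers are legitimate "averages" of the same A(n); the definition only becomes meaningful once the input distribution is fixed.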





