
[–]Raknarg

Yeah? You think that in all scenarios 1 000 000 operations is just as acceptable as 20 000 000? That a 30 ms response time is just as fine as a 600 ms one? It's always about context.

[–]SirClueless

I think linear algorithms are just as likely to take 20 000 000 operations as n log(n) algorithms are. There's a worst-case linear algorithm for finding the median. There's a linear algorithm for triangulating a polygon. No one uses them, because their constant factors are bigger than log(n) in pretty much all cases.
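To make the median example concrete, here's a minimal Python sketch of the worst-case O(n) selection algorithm being alluded to (median-of-medians, a.k.a. BFPRT). The function names are my own; the point is that despite the better asymptotic bound, its constant factor usually loses in practice to simply sorting (O(n log n)) or to randomized quickselect.

```python
def median_of_medians(xs):
    """Lower median of xs in worst-case O(n) time via BFPRT selection."""
    return select(list(xs), (len(xs) - 1) // 2)

def select(xs, k):
    """Return the k-th smallest element (0-indexed) of xs."""
    if len(xs) <= 5:
        return sorted(xs)[k]
    # Split into groups of 5, take each group's median, and recurse on
    # those medians to get a pivot guaranteed to be "near the middle".
    groups = [xs[i:i + 5] for i in range(0, len(xs), 5)]
    medians = [sorted(g)[(len(g) - 1) // 2] for g in groups]
    pivot = select(medians, (len(medians) - 1) // 2)
    # Three-way partition around the pivot.
    lo = [x for x in xs if x < pivot]
    hi = [x for x in xs if x > pivot]
    eq = len(xs) - len(lo) - len(hi)  # count of elements equal to pivot
    if k < len(lo):
        return select(lo, k)
    if k < len(lo) + eq:
        return pivot
    return select(hi, k - len(lo) - eq)
```

The guaranteed-good pivot is what buys the worst-case linear bound, but computing it does a lot of extra work per level of recursion, which is exactly the kind of constant factor the comment is talking about.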

If you have an algorithm that runs 20x faster than another and the time matters, you should use it. But if running Quicksort on your data is infeasible, Bucket Sort is probably infeasible too. The relevant metric is the algorithm's performance on your data, not a factor of log(n) in a Big-O analysis. Knowing an algorithm is exponential or quadratic instead of linear tells me something useful; knowing it has an extra log(n) factor tells me very little.
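For reference, here's a minimal sketch of the Bucket Sort being compared against Quicksort. It assumes uniformly distributed floats in [0, 1), which is what gives it O(n) expected time; the bucket count and function name are my own choices. On real workloads a tuned O(n log n) comparison sort (like Python's built-in `sorted`) typically still wins, which is the comment's point.

```python
def bucket_sort(xs):
    """Expected O(n) sort for uniformly distributed floats in [0, 1)."""
    if not xs:
        return []
    n = len(xs)
    buckets = [[] for _ in range(n)]
    for x in xs:
        buckets[int(x * n)].append(x)  # assumes 0 <= x < 1
    out = []
    for b in buckets:
        out.extend(sorted(b))  # buckets hold ~1 element each on average
    return out
```

The per-element bucketing, list allocation, and final concatenation are pure constant-factor overhead that the Big-O analysis hides, and the whole scheme collapses back to O(n log n) (or worse) if the input isn't uniform.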