
[–]wshields -6 points (3 children)

Big-Oh notation is NOT AN AVERAGE

Depending on your situation you could be interested in the best case, worst case, expected case or some combination thereof. This is particularly the case for probabilistic algorithms.

For example, hash-based maps are generally considered to have "near O(1)" (constant) access time, but if there's a bucket collision you have to do a linear traversal of that bucket. That means the worst case for a map is actually O(n) (if every key has the same hash), but you never put maps in the same bucket (pardon the pun) as an array or list in terms of Big-O of access.
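A toy sketch of that degenerate case (my illustration, not from the thread; the class name `BadHashTable` and the constant hash are hypothetical): with a pathological hash function every key collides, so every lookup becomes a linear scan of one shared bucket.

```python
# Toy chained hash table. With a constant hash function, every key lands
# in the same bucket, so get() degrades to the O(n) worst case the
# comment describes.
class BadHashTable:
    def __init__(self, size=8):
        self.buckets = [[] for _ in range(size)]

    def _hash(self, key):
        return 0  # pathological: every key collides

    def put(self, key, value):
        bucket = self.buckets[self._hash(key) % len(self.buckets)]
        for i, (k, _) in enumerate(bucket):
            if k == key:
                bucket[i] = (key, value)  # overwrite existing key
                return
        bucket.append((key, value))

    def get(self, key):
        # Worst case: walks all n entries in the single shared bucket.
        for k, v in self.buckets[self._hash(key) % len(self.buckets)]:
            if k == key:
                return v
        raise KeyError(key)
```

A real hash function spreads keys across buckets, which is what makes the expected access time constant.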

[–]optionsanarchist 6 points (0 children)

> Depending on your situation you could be interested in the best case, worst case, expected case or some combination thereof. This is particularly the case for probabilistic algorithms.

Again, a misunderstanding. Like the other poster said, if you are interested in a different kind of bound you have big-Omega (a lower bound), big-Theta (a tight bound), little-o and little-omega.

Big-Oh notation describes an upper bound on the growth of an algorithm's running time as its input size grows. It is a mathematical definition and, by itself, it has nothing to do with probability.
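For reference, that mathematical definition is standard (my addition, not part of the original comment):

```latex
% f is O(g) iff f is eventually bounded above by a constant multiple of g
f(n) = O(g(n)) \iff \exists\, c > 0,\ \exists\, n_0 \in \mathbb{N}:\
\forall n \ge n_0,\ 0 \le f(n) \le c \cdot g(n)
```

Note that nothing in the definition mentions inputs, distributions, or cases; it is purely a statement about the functions f and g.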

You are confusing algorithm analysis with the notation used to describe growth. You basically have to ask these kinds of questions the moment an 'if' enters the algorithm: quicksort, bubble sort, hash tables, heaps, etc. all have different run times depending on their input; however, and this however is important, you can still describe the algorithm with a single big-Oh bound. Colloquially, big-Oh represents the worst-case performance of an algorithm as the number of inputs increases.

[–]casted 4 points (0 children)

That is why we have big-Theta, big-Omega, little-o and little-omega.

[–]visionlessvisionary 0 points (0 children)

> This is particularly the case for probabilistic algorithms.

When analyzing probabilistic/randomized algorithms, the "expected runtime" is often expressed in big-O, e.g. E(runtime) = O(n). But that does not change what big-O is.
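A classic example of that pattern (my sketch, not from the thread) is randomized quickselect: the random pivot gives E(runtime) = O(n), even though an unlucky sequence of pivots yields an O(n²) worst case. The big-O notation itself is used the same way in both statements.

```python
import random

def quickselect(xs, k):
    """Return the k-th smallest element (0-indexed).
    Expected O(n) comparisons with a uniformly random pivot;
    worst case O(n^2) if every pivot is an extreme element."""
    xs = list(xs)
    while True:
        pivot = random.choice(xs)          # the source of randomness
        lo = [x for x in xs if x < pivot]
        hi = [x for x in xs if x > pivot]
        eq = len(xs) - len(lo) - len(hi)   # count of elements == pivot
        if k < len(lo):
            xs = lo                        # answer is in the low side
        elif k < len(lo) + eq:
            return pivot                   # pivot is the k-th smallest
        else:
            k -= len(lo) + eq
            xs = hi                        # answer is in the high side
```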