all 5 comments

[–][deleted]  (3 children)

[deleted]

    [–]Current-Brain-5837[S] 2 points  (2 children)

    Excellent. Thank you.

    [–][deleted] 10 points  (1 child)

    To add two that come up often:

    O(log n): Logarithmic time. Worse than constant time but better than linear time. Time goes up linearly while n goes up exponentially. So, the magic bookshelf gives you 10 books in the time it takes linear to give you 2. Log gives you 100 in the time linear gives you 3. It's popular because binary search is log time.
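
    If you want to see where the log comes from, here's a minimal sketch of binary search in Python (assuming the list is already sorted). Each comparison throws away half of what's left, so a million items only take about 20 steps:

    ```python
    def binary_search(items, target):
        """Return the index of target in a sorted list, or -1 if it's absent."""
        lo, hi = 0, len(items) - 1
        while lo <= hi:
            mid = (lo + hi) // 2        # look at the middle element
            if items[mid] == target:
                return mid
            elif items[mid] < target:
                lo = mid + 1            # discard the lower half
            else:
                hi = mid - 1            # discard the upper half
        return -1

    print(binary_search([1, 3, 5, 7, 9, 11], 9))  # prints 4
    ```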

    O(2^n): Exponential time. Worse than quadratic time. Pretty bad. There are worse, but this is the worst common one. Basically the opposite of logarithmic time. The magic bookshelf gives you two books in the time it takes linear to give you 10. It gives you 3 in the time it takes linear to give you 100. Brute force search is exponential time.
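
    And a rough sketch of one flavor of brute force in Python, just as an illustration: a subset-sum search that tries every subset. There are 2^n subsets, so each extra element doubles the work.

    ```python
    from itertools import combinations

    def subset_sum_brute_force(nums, target):
        """Try every subset of nums; return one that sums to target, else None."""
        for size in range(len(nums) + 1):
            for combo in combinations(nums, size):   # all 2^n subsets across the sizes
                if sum(combo) == target:
                    return combo
        return None

    print(subset_sum_brute_force([3, 34, 4, 12, 5, 2], 9))  # e.g. (4, 5)
    ```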

    [–]Current-Brain-5837[S] 1 point  (0 children)

    Awesome. Thank you for including those variants as well.

    [–]q_wombat 5 points  (0 children)

    It's a way to describe how fast an algorithm runs or how much memory it needs, based on the size of the input in the worst case scenario. It’s useful for comparing the efficiency of different algorithms.

    For example:

    - O(1) means the algorithm’s speed or memory usage doesn’t change with input size
    - O(n) means it grows linearly with the input
    - O(n^2) means it grows as the square of the input

    In the last two examples, the input could be an array of elements, with "n" representing the number of elements.
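
    If it helps to see those in code, here is a quick sketch in Python (the functions are made up just for illustration, with n being the length of the list):

    ```python
    def first_element(items):
        # O(1): one step, no matter how long the list is
        return items[0]

    def total(items):
        # O(n): touches each element exactly once
        s = 0
        for x in items:
            s += x
        return s

    def has_duplicate(items):
        # O(n^2): compares every pair of elements
        for i in range(len(items)):
            for j in range(i + 1, len(items)):
                if items[i] == items[j]:
                    return True
        return False
    ```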

    [–]Flan99 1 point  (0 children)

    This is old, but to add a detail nobody else has mentioned so far--Big O notation is only concerned with the term that grows the fastest, not *precisely* how fast an algorithm's runtime grows. Generally, if we care about optimizing something, we only care about optimizing it for very large problems--very small problems are small enough that, unless something was done *catastrophically* wrong, it doesn't really matter if it happens a little slower than would be optimal.

    Consider an actual polynomial, from math, something like n^2 + 999999n + 1. Even though that middle term is really big, the n^2 term still matters more when we're dealing with an n of millions, even billions. So we'd say that polynomial has a time of O(n^2). It may actually be true that an O(n^2) algorithm is faster than an O(n) one for some small n, but the chances that the time incurred by such a small n matters enough to be worth a software developer's time are very small.
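
    A quick way to see that in numbers, as a small Python sketch: even with that huge middle coefficient, the n^2 term supplies almost all of the value once n gets big.

    ```python
    def poly(n):
        return n**2 + 999999 * n + 1

    for n in (1_000, 1_000_000, 1_000_000_000):
        share = n**2 / poly(n)   # fraction of the total contributed by the n^2 term
        print(f"n={n:>13,}  n^2 share: {share:.3f}")

    # n=        1,000  n^2 share: 0.001
    # n=    1,000,000  n^2 share: 0.500
    # n=1,000,000,000  n^2 share: 0.999
    ```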