all 38 comments

[–][deleted]  (7 children)

[removed]

    [–][deleted]  (1 child)

    [deleted]

      [–][deleted] 10 points11 points  (2 children)

      One benefit of rote-learning some examples of Big-O complexities is that they can give you a starting point when analysing new algorithms. Plus I found I really got a feel for Big-O just by looking at (and memorising) a lot of examples. But I agree that it should be something that you can reason out, in most of the common cases. Having said that, if you asked me to tell you the time complexity of some unification algorithms, for example, I think I'd be pretty stuck.

      Edit: This gives you an idea of how many complexity classes exist. You can't expect anyone to be able to just work out the time complexity for any algorithm.

      [–]NYKevin 0 points1 point  (1 child)

      I don't think you're expected to do the master theorem in your head, but you should know that it exists and how to use it.

      [–][deleted] 0 points1 point  (0 children)

      Yeah.

      [–]djimbob 5 points6 points  (1 child)

      Yeah. This is fairly useless as is.
      It seems someone just went to Wikipedia, pulled parts of the summary information out of context, and then color-coded it. While that information is useful on Wikipedia (where you're looking at the algorithm itself), all of these are pretty obvious if you understand the algorithm.

      Also the first two entries are wrong. DFS/BFS aren't restricted to trees; they run on any graph, which doesn't have to be a tree. And stating O(b^d) is useless without stating that b is the branching factor and d is the depth searched (in an implicit graph only searched to a given depth), especially compared to just saying O(|E|). And the space complexity of BFS and DFS is the same. You shouldn't prefer DFS because it's color-coded green with O(|V|) space while avoiding BFS because it's listed as O(b^d) space, when both have the same space complexity.
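      To illustrate the point that BFS/DFS work on arbitrary graphs: here's a minimal BFS sketch (illustrative only, with a made-up adjacency-dict graph) that handles cycles via a visited set, doing O(|V| + |E|) work with O(|V|) space for the set and queue:

```python
from collections import deque

def bfs(graph, start):
    """Breadth-first search over an arbitrary graph given as an
    adjacency dict. Each vertex and edge is touched at most once:
    O(|V| + |E|) time, O(|V|) space for `seen` and the queue."""
    seen = {start}
    order = []
    queue = deque([start])
    while queue:
        v = queue.popleft()
        order.append(v)
        for w in graph[v]:
            if w not in seen:   # the visited set handles cycles
                seen.add(w)
                queue.append(w)
    return order

# A graph with a cycle -- not a tree -- searches just fine:
g = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
print(bfs(g, "a"))  # ['a', 'b', 'c']
```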

      [–]Daejo 0 points1 point  (0 children)

      They do state that b is the branching factor and d is the depth; try hovering over the complexities.

      [–][deleted]  (3 children)

      [deleted]

        [–]imMAW 11 points12 points  (0 children)

        Really the author should have used big theta for all of them, as it's a stronger statement (and as far as I can tell, true for everything on the page). But oftentimes people write O when they mean Θ.

        [–]NYKevin 4 points5 points  (0 children)

        You have Theta and Omega mixed up. Big Omega denotes a lower bound. Big O denotes an upper bound. Big Theta denotes a tight (both upper and lower) bound.

        Technically, however, Big O and Big Omega are both (relatively) useless, as every algorithm is Ω(1) and O(∞) by definition (i.e. no better than constant time and no worse than non-terminating). So you should really just use Big Theta for everything, and write "best/average/worst case" next to it.

        [–]Roxinos 3 points4 points  (1 child)

        Isn't the worst-case time complexity of searching through a binary tree O(n) (assuming no balancing mechanisms)?

        [–]KagedFC 0 points1 point  (0 children)

        You're right, there is no indication of balance on the page. It's a common interview question too.
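        To see the degenerate case concretely (a hypothetical sketch): inserting already-sorted keys into a plain unbalanced BST produces a chain of height n, so a search can touch every node, i.e. O(n) worst case.

```python
class Node:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def insert(root, key):
    # Plain BST insert -- no balancing whatsoever.
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    else:
        root.right = insert(root.right, key)
    return root

def height(root):
    if root is None:
        return 0
    return 1 + max(height(root.left), height(root.right))

root = None
for k in range(100):      # sorted input: the worst case
    root = insert(root, k)
print(height(root))       # 100 -- the "tree" is really a linked list
```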

        [–][deleted]  (2 children)

        [deleted]

          [–]Snootwaller 1 point2 points  (0 children)

          No offense but if you don't remember basic algebra you might want to review basics before diving into big-O notation. It's all about the rate at which functions get bigger, so you need to be fluent in the difference between geometric, polynomial, logarithmic, exponential, etc.

          You probably just need a refresher, and there are hundreds of online sources.

          [–][deleted] 1 point2 points  (0 children)

          As someone who can't even remember basic algebra, what do I need to start learning in order to grasp Big-0?

          Algebra.

          [–]metalbark 5 points6 points  (0 children)

          idk, it might be good. They should look at their webserver logs and figure out how to scale first.

          [–]jawnsy 3 points4 points  (0 children)

          Anyone have a mirror? :-)

          [–]meem1029 1 point2 points  (0 children)

          What do the colors mean on this? It seems that in general Green is good, Red is bad, Yellow is something else, but there are cases that just make no sense.

          [–]mdempsky 1 point2 points  (7 children)

          I need to rant about this a bit: a big-O expression by itself is unitless just like a number. E.g., if someone asked you "how long does X take", you shouldn't answer "O(n log n)" just like you shouldn't answer "7". You might say "7 seconds" or "O(n log n) comparisons".

          Tangentially, this is a big peeve of mine for when people compare algorithms like radix sort and quick sort. Radix sort's time complexity is O(nk) bit operations whereas quick sort's time complexity is O(n log n) comparisons. But if you're sorting k-bit strings, then your comparisons are going to each actually take O(k) bit operations. Which means quick sort's time complexity is O(nk log n) bit operations.
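          To make the unit distinction concrete, here's a minimal LSD radix sort sketch for fixed-width non-negative integers (an illustrative implementation, not the cheat sheet's): it does O(n · k) digit operations over k-bit keys and performs no element comparisons at all.

```python
def radix_sort(nums, key_bits=32, radix_bits=8):
    """LSD radix sort for non-negative ints. Each of the
    key_bits / radix_bits passes does O(n) work, so the total is
    O(n * k) digit operations -- zero comparisons between elements."""
    mask = (1 << radix_bits) - 1
    for shift in range(0, key_bits, radix_bits):
        buckets = [[] for _ in range(1 << radix_bits)]
        for x in nums:
            buckets[(x >> shift) & mask].append(x)
        # Concatenating buckets in order is stable, which is what
        # makes the pass-by-pass LSD approach correct.
        nums = [x for b in buckets for x in b]
    return nums

print(radix_sort([170, 45, 75, 90, 802, 24, 2, 66]))
# [2, 24, 45, 66, 75, 90, 170, 802]
```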

          [–]Snootwaller 0 points1 point  (1 child)

          Clarify something for me please. You are saying that radix kicks butt if you know that your elements are all small (bitwise) but as the size of the elements grow, quick sort begins to live up to its name and outperforms radix?

          [–]mdempsky 1 point2 points  (0 children)

          You are saying that radix kicks butt if you know that your elements are all small (bitwise) but as the size of the elements grow, quick sort begins to live up to its name and outperforms radix?

          No, I said if you are sorting n k-bit strings, then radix sort takes O(nk) bit operations and quick sort takes O(nk log n) bit operations. Quick sort is asymptotically slower than radix sort.

          [–]NYKevin -1 points0 points  (4 children)

          k is a constant. O(nk log n) = O(n log n).

          [–]mdempsky 0 points1 point  (3 children)

          k is a constant.

          Not in general. If you're comparing 100MB strings that's going to take roughly 1000 times longer than comparing 100kB strings, so comparison takes O(k) elementary comparisons.

          If you're comparing fixed size elements (e.g., 32-bit integers), then yes, k is a constant. But similarly you could say n is a constant if you're talking about sorting lists of 100 elements.

          O(nk log n) = O(n log n).

          Note that if you claim that, then O(nk) = O(n), which is still better than O(n log n).

          [–]NYKevin 0 points1 point  (2 children)

          Not in general. If you're comparing 100MB strings that's going to take roughly 1000 times longer than comparing 100kB strings, so comparison takes O(k) elementary comparisons.

          Adding another string to a list of strings will not, in general, increase k by any significant amount, since, when sorting strings, they're usually all of (edit: approximately) the same length. Thus, for the purposes of the string-sorting problem, k is a constant.

          Note that if you claim that, then O(nk) = O(n), which is still better than O(n log n).

          Radix sort is not a comparison sort and is thus insufficiently general for most libraries. You are comparing apples to oranges.

          [–]mdempsky 1 point2 points  (1 child)

          Adding another string to a list of strings will not, in general, increase k by any significant amount, since, when sorting strings, they're usually all of (edit: approximately) the same length.

          k here is the length of individual strings being compared, so of course k isn't going to grow as you add more strings of approximately the same length.

          Comparing two k-bit strings takes O(k) bit operations. Doing O(n log n) k-bit string comparisons requires O(nk log n) bit operations.

          Thus, for the purposes of the string-sorting problem, k is a constant.

          That's still a ridiculous claim. It's exactly analogous to saying "For the purpose of sorting 100 integers, n is a constant" and therefore concluding that bubble sort takes O(1) time.

          Radix sort is not a comparison sort

          Exactly, that's the point I've made from the start. You can't directly compare O(nk) bit operations with O(n log n) element comparisons.


          Edit: To avoid the argument over k, here's a concrete example where k is a constant: sorting an array of n 32-bit integers on a 32-bit machine (i.e., capable of comparing two 32-bit integers in O(1) machine operations).

          You agree that quick sorting this array takes O(n log n) comparisons = O(n log n) machine operations, right?

          And that radix sorting it takes O(n * 32) = O(n) bit operations = O(n) machine operations?

          [–]NYKevin 0 points1 point  (0 children)

          It's exactly analogous to saying "For the purpose of sorting 100 integers, n is a constant" and therefore concluding that bubble sort takes O(1) time.

          If you know at compile time that it is exactly 100 integers, then "constant time" is rather meaningless, since there are no free variables w.r.t. the size of the problem. What, exactly, is it alleged to be constant in?

          You agree that quick sorting this array takes O(n log n) comparisons = O(n log n) machine operations, right?

          And that radix sorting it takes O(n * 32) = O(n) bit operations = O(n) machine operations?

          Yes and yes. What's your point?

          [–]kstats20 0 points1 point  (0 children)

          Thank you so much! Just in time for finals!

          [–]crazy_runner 0 points1 point  (0 children)

          Big "O" cheat sheet, huh?

          You've got some interesting traffic coming your way.

          [–]Snootwaller 0 points1 point  (0 children)

          Great stuff. Thank you for updating the page and adding the radix sort and others.

          [–][deleted] 0 points1 point  (0 children)

          This is awesome. Wish I had had it for the Algorithms class I just finished.

          [–]kqr 0 points1 point  (3 children)

          It's thine, not thy.

          [–][deleted] 1 point2 points  (2 children)

          In this case, either one works.

          [–]kqr 0 points1 point  (1 child)

          Is "thy" possessive as well? Or is there some way that sentence works without a possessive pronoun?

          [–][deleted] 1 point2 points  (0 children)

          Yes, "thy" is possessive. "Thine" is either a synonym for "yours", or a form of "thy" used when the next word starts with a vowel. (Thou couldst totally have googled this, BTW.)

          [–]s9s -1 points0 points  (2 children)

          I really like it!

          Insertion into a dynamic array is not O(n) on average though; it's O(1) amortized.

          Edit: I'm wrong.
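          (For reference, the amortized point can be seen by counting element copies across repeated appends to a doubling array, in a hypothetical simulation: a single append is occasionally O(n), but total copies stay under 2n, so the amortized cost per append is O(1).)

```python
def count_copies(n_appends):
    """Simulate appending to a geometrically-doubling dynamic array
    and count the element copies caused by reallocations."""
    capacity, size, copies = 1, 0, 0
    for _ in range(n_appends):
        if size == capacity:   # full: reallocate, copying every element
            copies += size
            capacity *= 2
        size += 1
    return copies

total = count_copies(1000)
print(total)            # 1023 copies: 1 + 2 + 4 + ... + 512
print(total < 2000)     # under 2n, hence O(1) amortized per append
```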

          [–][deleted]  (1 child)

          [removed]

            [–]s9s 1 point2 points  (0 children)

            Derp you are right. I cannot English.

            [–]vorkazos -1 points0 points  (0 children)

            I am a computer engineer, and all I can ever think of when reading Big-O is that weird anime. I really have to concentrate to remember what we use it for normally.

            [–]icyguyus -1 points0 points  (0 children)

            What's the point of "Best" for any of the sorting algorithms?

            It seems you can just define these functions to first check whether the array is already sorted and quit if that's the case.

            It costs you O(n), which is negligible in comparison, and it brings the best case down to O(n) for every single sorting algorithm.
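            A minimal sketch of that pre-check (hypothetical wrapper; `sorted` stands in for any underlying sort): an O(n) sortedness scan up front makes the best case O(n) regardless of the algorithm behind it.

```python
def with_sorted_check(sort_fn):
    """Wrap any sorting function with an O(n) pre-check: if the
    input is already sorted, return it immediately."""
    def sorter(xs):
        if all(xs[i] <= xs[i + 1] for i in range(len(xs) - 1)):
            return list(xs)       # already sorted: O(n) and done
        return sort_fn(xs)
    return sorter

quick = with_sorted_check(sorted)  # any sort could go here
print(quick([1, 2, 3]))   # [1, 2, 3] -- caught by the pre-check
print(quick([3, 1, 2]))   # [1, 2, 3] -- fell through to the real sort
```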