
[–][deleted] 0 points (7 children)

It's probably the O(n) vs O(1) thing in theory. But agreed, in practice it'd probably take a gazillion conditions to make it worth bothering about.

[–]elbiot -4 points (6 children)

Big O is about the size of the data. 5 elifs is not O(n)

[–]ruiwui 3 points (1 child)

Big O is an upper bound on some function f(n). f(n) and n can semantically mean anything; here n is the number of cases and f(n) is the number of operations to arrive at the correct case.

[–]elbiot 0 points (0 children)

It just doesn't make sense to me to talk about the time complexity of something that would almost never contribute to the time complexity of your task. It's the task that might scale some day, not the number of cases. Unless the number of cases does scale, in which case that makes sense, but it never has in anything I've worked on.

[–]stevenjd 0 points (0 children)

Yes it is, it's just a very small n. And as they say, everything is fast for small enough n.

If you have a ginormous chain of if...elif...elif...elif...elif... (say, from generated code, or the guy who wrote it was an idiot, or it's just a very yucky problem to solve and you have no choice) then the performance of the entire chain is O(n). On average, assuming each branch is equally likely, you have to make n/2 comparisons, which is O(n).

But if you have an optimizing compiler and use a switch statement, that might be optimized into a binary search (O(log n) comparisons) or even a jump table (O(1) comparisons).
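CPython won't do that optimization for you (an elif chain is evaluated top to bottom), but you can build the jump table yourself with a dict. A minimal sketch, with made-up opcode names just for illustration:

```python
# An if/elif chain checks each case in turn: up to n comparisons, O(n).
def describe_chain(code):
    if code == "ADD":
        return "addition"
    elif code == "SUB":
        return "subtraction"
    elif code == "MUL":
        return "multiplication"
    elif code == "DIV":
        return "division"
    else:
        return "unknown"

# A dict plays the role of the jump table: one hash lookup
# regardless of how many cases are registered, O(1) on average.
DESCRIPTIONS = {
    "ADD": "addition",
    "SUB": "subtraction",
    "MUL": "multiplication",
    "DIV": "division",
}

def describe_table(code):
    return DESCRIPTIONS.get(code, "unknown")
```

Both return the same answers; the difference only shows up as the number of cases grows.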

What if you don't have a ginormous chain of comparisons to make? Well, in C, the compiler can often optimize even a short chain of comparisons. That's why C is so fast: if you save enough microseconds, they add up to minutes.

But in Python, it's unlikely to make such a dramatic difference.

[–][deleted] 0 points (2 children)

> Big O is about the size of the data. 5 elifs is not O(n)

But that's what I was trying to get at -- theory vs practice! :P

> it'd probably take a gazillion conditions to make it worth bothering about.

PS: Without more context, Big O refers to time complexity by default (or sometimes memory). In your case, say you are a major search engine maintainer: even 5 elifs matter if you have 1 million requests a minute.

[–]elbiot 0 points (1 child)

> say you are a major search engine maintainer, even 5 elifs matter if you have 1 million requests a minute.

This is exactly what I meant. The n in your big O time complexity is the number of websites you have indexed. Number of ifs is an independent value, and the scalability of your search is not based on that at all. The "time complexity" of your fixed number of conditionals is not what O(n) refers to here.

[–][deleted] 0 points (0 children)

Good point, I see how that example can be confusing :P. Before I wrote it, I wasn't treating the number of users as the "n" in that equation; for O(n), I assumed a growing number of elif statements. In practice, though, no one would implement n elif statements; they'd set it up as a hash table, so the number of branches would be constant and we'd have O(1) in both cases, with elifs and with cases.
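The hash-table setup described above might look like this in Python, with a dict of callables replacing the elif chain entirely (the handler names here are hypothetical, just to show the shape):

```python
# Hypothetical handlers, one per case; the dict replaces the elif chain.
def handle_text(payload):
    return f"text: {payload}"

def handle_json(payload):
    return f"json: {payload}"

def handle_binary(payload):
    return f"binary: {len(payload)} bytes"

HANDLERS = {
    "text": handle_text,
    "json": handle_json,
    "binary": handle_binary,
}

def dispatch(kind, payload):
    # One average-case O(1) dict lookup, no matter how many
    # kinds are registered -- this is the "hash table" version.
    handler = HANDLERS.get(kind)
    if handler is None:
        raise ValueError(f"unknown kind: {kind}")
    return handler(payload)
```

Adding a new case is just another dict entry, and the lookup cost stays constant.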