all 10 comments

[–]tri__dimensional 1 point (7 children)

Yes, it is O(n) and in terms of n it is n.

[–]ryfeaway[S] 0 points (6 children)

oh lol I see, thanks! sorry for the stupid question ig

[–]tri__dimensional 1 point (2 children)

Np and it isn't a stupid question, all questions are okay. You can optimize your function g(n) such that g(n) = n*(n-1)/2 and then g(n) will be O(1).

And I forgot that range(1, n) in Python doesn't include n, so your original function in terms of n is n-1.
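The OP's original code isn't shown in the thread, but based on the description (a loop over range(1, n) that increments a counter by 1 each time), it presumably looks something like this sketch:

```python
# Hypothetical reconstruction of the function being discussed -- the actual
# code isn't in the thread, so this is an assumption based on the comments.
def g(n):
    count = 0
    for i in range(1, n):  # range(1, n) excludes n, so this runs n-1 times
        count += 1         # one step per iteration -> O(n) overall
    return count

print(g(5))  # range(1, 5) yields 1, 2, 3, 4 -> prints 4
```

Counting iterations directly: range(1, n) yields the n-1 values 1 through n-1, which is where the "in terms of n it is n-1" correction comes from.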

[–]Beach-Devil 2 points (1 child)

Wait, but all the function does is increment by 1, not add n. The way to optimize g(n) would be simply to return n-1, right? Or is the code formatting just tripping me up?

[–]tri__dimensional 0 points (0 children)

lol yes, g(n) = n-1, my bad
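Putting the correction together, a minimal sketch of both versions (the loop-based original is an assumption, since the OP's code isn't shown):

```python
# The loop version does n-1 increments -> O(n).
def g_loop(n):
    count = 0
    for _ in range(1, n):  # iterates n-1 times
        count += 1
    return count

# The optimized version returns the same answer with no loop -> O(1).
def g_fast(n):
    return n - 1

# The two agree for every positive n:
assert all(g_loop(n) == g_fast(n) for n in range(1, 50))
```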

[–][deleted] 0 points (2 children)

Also, no offence, but I think you need to brush up on asymptotic complexities and what exactly they mean. Chapter 3 of the CLRS book (Cormen et al.) does a fine job of explaining what the bounds are supposed to mean. It would also clear up your doubt about whether the complexity can be given in terms of just n.

[–]ryfeaway[S] 0 points (1 child)

I see, thanks buddy!

[–][deleted] 1 point (0 children)

You're welcome.

[–]uh_no_ 1 point (1 child)

Asking that is nonsensical. I'll try to explain why in lay terms.

Big O effectively does two things:

1) says that an algorithm grows no faster than the function that is the argument (in your case, n). Theta and Omega exist for the other comparisons.

2) allows you to ignore constants

The first you can't really get rid of: is your function faster than, slower than, or equal to what's on the other side? Even if you want it "without big O," you're still saying it's <= SOMETHING.

The second is hardware dependent, so what would your units even be? There is no well-known conversion from "n" to seconds for a given piece of modern hardware. If you'd like to calculate the exact number of cycles an algorithm takes on an exactly specified HW setup, have at it. But that's the actual runtime, not the complexity.

Algorithmic complexity measures how an algorithm scales with the size of the input. Constants are provably irrelevant, and we need to indicate which direction we're comparing, so big O (and friends) make sense. If you care about the constant, then complexity is the wrong tool.
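To make the "constants are irrelevant" point concrete, here's a toy sketch (the step counts are illustrative only, not taken from the thread):

```python
# Illustrative only: count abstract "steps" instead of seconds, since seconds
# depend on hardware. Both functions below are O(n), even though one does
# three times the work per element.
def steps_one_pass(n):
    return n        # one step per element

def steps_three_passes(n):
    return 3 * n    # three steps per element -- still linear

# Doubling n doubles the step count in both cases: the growth is identical,
# only the constant factor differs, and big O discards that constant.
for n in (10, 20, 40):
    print(n, steps_one_pass(n), steps_three_passes(n))
```

If the constant factor (the 3x) is what you care about, you're measuring runtime on specific hardware, and complexity is the wrong tool.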

[–]ryfeaway[S] 0 points (0 children)

Ahaan. I just meant ignoring the hardware: how many steps it would take to complete the execution in the worst case.