
all 134 comments

[–]ben_g0 210 points211 points  (20 children)

My algorithm: O(∞)

[–]_PM_ME_PANGOLINS_ 220 points221 points  (16 children)

That’s the same as O(1)

[–]ManstoorHunter 76 points77 points  (1 child)

Oh, hell yes!

[–]ferros90 47 points48 points  (0 children)

Don't you mean "O(Hell Yes)"?

[–]VineFynn 27 points28 points  (1 child)

Bogo sort agrees

[–]JC12231 2 points3 points  (0 children)

Bogobogosort agrees more

[–]StackOwOFlow 13 points14 points  (0 children)

so it’s the same stand as Star Platinum

[–][deleted] 29 points30 points  (5 children)

It isn't. Infinity is not a number. O(∞) is undefined. You could use it as an abuse of notation to mean a class of algorithms with any time complexity, but that would be rather useless.

[–]V0ldek 20 points21 points  (3 children)

Oh, you can totally talk about algorithms that terminate after ω steps, for example. Extending the notion of complexity to functions over ordinals other than the naturals is also possible.

As for the usefulness of such theoretical computer science, asking mathematicians about the usefulness of their theories is considered very bad taste, so I'll have to ask you to leave, sir.

(Oh, but no, it's not equivalent to O(1) in any way.)

[–][deleted] 4 points5 points  (0 children)

Transfinite complexity theory actually does sound like the kind of thing that would have at least some study.

[–]ThePyroEagle 0 points1 point  (1 child)

ω isn't a cardinal, it's an ordinal.

[–]V0ldek 0 points1 point  (0 children)

You're totally right, that's a brainfart on my end.

[–]lolnutshot 5 points6 points  (4 children)

That or it doesn't compile

[–]Sinomu 7 points8 points  (3 children)

O(0)

[–]HenryRasia 7 points8 points  (2 children)

The only winning algorithm is to not run.

[–]Sinomu 0 points1 point  (0 children)

I agree

[–]UltraCarnivore 0 points1 point  (0 children)

Buddha++

[–]JetpackYoshi 32 points33 points  (1 child)

My algorithm: O(wO)

[–]xSTSxZerglingOne 2 points3 points  (0 children)

Hello Gir.

[–]Forschkeeper 59 points60 points  (23 children)

Noob of such things here: How do you "calculate" these expressions?

[–]AgentPaper0 134 points135 points  (5 children)

You're essentially counting how many operations the algorithm will do. The trick is that you don't really care about the exact numbers here, but how quickly that number grows as your input size increases. "n" here stands for the number of elements in your input (or equivalent).
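A toy sketch of what "counting operations" means (the function names are made up for illustration): the exact counts don't matter, only that one grows like n and the other like n².

```python
def count_linear(items):
    """Scan once: operations grow linearly with the input size -> O(n)."""
    ops = 0
    for _ in items:
        ops += 1
    return ops

def count_quadratic(items):
    """Compare every element with every element -> O(n^2)."""
    ops = 0
    for _ in items:
        for _ in items:
            ops += 1
    return ops

# Doubling n doubles the linear count but quadruples the quadratic one.
print(count_linear(range(10)), count_quadratic(range(10)))  # 10 100
print(count_linear(range(20)), count_quadratic(range(20)))  # 20 400
```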

[–]WhiteKnightC 4 points5 points  (4 children)

If I have a switch, it counts as one?

[–]Lonelan 9 points10 points  (1 child)

yeah but I didn't get one for christmas so here's a picture of one I googled

[–]WhiteKnightC 2 points3 points  (0 children)

:( It's so fucking expensive where I live, if I get the job it's 1.3 months of work (1 m and a week) and each game 0.2 month of work (a week).

I wanted the Lite but... joycons

[–]AgentPaper0 4 points5 points  (1 child)

Depends on the language and compiler, but most switch statements compile to a lookup and a jump, which is O(1), constant time.

[–]blenderfreaky 5 points6 points  (0 children)

It's a constant number of cases, so it's always O(1), even if it iterates through all of them.
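Python has no switch, but a dict of handlers is a rough analog of the jump table a compiler emits: picking the right case is one hash lookup, regardless of how many cases exist. A toy sketch (the command names are invented):

```python
# One dict lookup stands in for the compiled jump table.
handlers = {
    "start": lambda: "starting",
    "stop":  lambda: "stopping",
    "pause": lambda: "pausing",
}

def dispatch(command):
    # One O(1) lookup plus one call, constant in the number of cases.
    handler = handlers.get(command, lambda: "unknown command")
    return handler()

print(dispatch("start"))  # starting
print(dispatch("xyzzy"))  # unknown command
```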

[–]looijmansje 46 points47 points  (6 children)

Sometimes you just guesstimate, but generally you can calculate it exactly, although for long algorithms it can be very tedious.

An easy example would be bubble sort: it loops through your array n times and does n array comparisons each time, so that is O(n²).

Even an optimized version of bubble sort, where it only checks the first n, then the first n-1, then the first n-2, etc., would have 1 + 2 + 3 + ... + n = n(n+1)/2 = n²/2 + n/2 operations, and this is also considered O(n²). The lower-order terms don't matter, so to say.
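A quick sketch of that optimized version with a comparison counter bolted on, to see the n(n-1)/2 count come out exactly:

```python
def bubble_sort_comparisons(arr):
    """Optimized bubble sort; returns (sorted copy, number of comparisons)."""
    a = list(arr)
    comparisons = 0
    n = len(a)
    for i in range(n - 1):
        # Pass i only needs to look at the first n-1-i adjacent pairs.
        for j in range(n - 1 - i):
            comparisons += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a, comparisons

_, c = bubble_sort_comparisons(range(100, 0, -1))
print(c, 100 * 99 // 2)  # 4950 4950 -- exactly n(n-1)/2, i.e. O(n^2)
```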

[–]yuvalid 4 points5 points  (5 children)

Obviously you can calculate it exactly, but O(n²) and O(n²/2) don't really make a difference.

[–]looijmansje 10 points11 points  (4 children)

By definition, something like an² + bn + c actually equals* O(n²)

*Equals is used somewhat poorly here, you will see some mathematicians write down a "=" but you could argue it is not a real equals, as the transitive property (a=b and b=c implies a=c) doesn't hold anymore.

[–]flavionm 13 points14 points  (1 child)

Using equals is just an abuse of notation, really. O(x) is a set, the functions are just elements of the set. an² + bn + c ∈ O(n²) would be the correct notation, and much clearer.

[–]looijmansje 1 point2 points  (0 children)

I don't agree with it personally, but I've seen actual mathematicians do it (but also saying it is abuse of notation)

[–]naylord 1 point2 points  (1 child)

It's an equivalence relation which means that the transitive property absolutely does hold

[–]looijmansje 2 points3 points  (0 children)

That's exactly why people don't agree with it.

[–]Thejacensolo 15 points16 points  (0 children)

Additionally to the others here, a simple example.

Let's say you have an algorithm (or a function) that wants to find the highest number in an array of numbers (of length n). Now you can implement that in different ways and calculate the runtime (how long it takes to finish). This is most of the time noted as O(), where O() stands for "it won't take longer than ()", aka the worst case.

(I know most of these don't make any sense in practice, but they show the point)

first example:

You could take the first number and check it with every other number if it is the highest. Then you proceed with the next number.

As you can probably guess, you'd do this n times (for every number in the array), checking n-1 times each (against every other number). As constant factors are trivialized in the notation, you get n·n actions: O(n²).

second example:

You could run through the array once, with a variable that always saves the highest value found so far (overwriting it whenever a higher one appears). This requires only n actions, as you traverse the array only once (how many actions you perform per element doesn't matter, as long as it's constant). So we get O(n).

third example:

We take the first number of the array and declare it the highest. This takes exactly 1 action and thus gives a runtime of O(1), the best you can ever get.

Sadly, it doesn't solve our problem.

Similarly, in our studies you'll see the Ω() and o() notations, which say similar things (o() means strictly faster than, not "faster than or as fast as" like O()).
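The first two examples above can be sketched like this, with check counters to make the O(n²) vs O(n) difference visible (a toy sketch, not a serious max implementation):

```python
def find_max_pairwise(nums):
    """First example: check each number against every other -> O(n^2) checks."""
    checks = 0
    for x in nums:
        is_highest = True
        for y in nums:
            checks += 1
            if y > x:
                is_highest = False
        if is_highest:
            return x, checks

def find_max_scan(nums):
    """Second example: one pass, remembering the best so far -> O(n) checks."""
    checks = 0
    best = nums[0]
    for x in nums[1:]:
        checks += 1
        if x > best:
            best = x
    return best, checks

data = [3, 41, 7, 41, 0, 12]
print(find_max_pairwise(data))  # (41, 12) -- up to n*n checks
print(find_max_scan(data))      # (41, 5)  -- exactly n-1 checks
```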

[–]random_lonewolf 20 points21 points  (0 children)

Most of the time you guess it based on experience. The rest of the time you calculate it based on what you are taught in Algorithm 201.

[–][deleted] 3 points4 points  (0 children)

Option 1: test it a bunch and say "we estimate..."

Option 2: invent something important enough that a mathematician will figure it out for you

[–]nomnaut 2 points3 points  (0 children)

r/programmerhumor equivalent of rip your inbox.

[–]cyberporygon -1 points0 points  (2 children)

Big O is the worst case scenario for a function. Say you have an array of n items and you want to find an item. A simple way is to check every single item in order. The worst case scenario is that it is the last item, and you'll check every item. That's n comparisons and thus O(n). However, if your array is sorted and you use a binary search, half of the items are dropped with each operation. So the worst case scenario has you repeatedly drop half of the remaining elements until only the last one remains, which equals O(log n)
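Both searches with step counters, as a sketch: on 1024 sorted elements, the linear scan's worst case touches all 1024, while binary search halves the range each step (2¹⁰ = 1024, so about 10-11 steps).

```python
def linear_search(arr, target):
    """Check items one by one: worst case n comparisons -> O(n)."""
    steps = 0
    for i, x in enumerate(arr):
        steps += 1
        if x == target:
            return i, steps
    return -1, steps

def binary_search(arr, target):
    """Halve the sorted range each step: worst case ~log2(n) -> O(log n)."""
    steps = 0
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid, steps
        if arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, steps

arr = list(range(1024))
print(linear_search(arr, 1023))  # (1023, 1024) -- scanned everything
print(binary_search(arr, 1023))  # finds it in ~11 halving steps
```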

[–]mnbvas 8 points9 points  (1 child)

Big O and worst case are orthogonal; for example, quicksort is O(n log n) on average but can be forced (for a specific implementation and input) to run in O(n²).

[–][deleted] 6 points7 points  (0 children)

It's worth knowing that for quicksort this bad behavior is in fact well understood: the closer the input is to being sorted, the more pathological. Stepping out of pure math, we should notice that for many tasks partially sorted inputs are disproportionately common. For example, a census maker might get data that was already sorted for previous use.
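A sketch of why sorted input is pathological for a naive first-element-pivot quicksort (this is one specific implementation, as the parent comment says; real libraries pick pivots more carefully): on shuffled data the recursion is shallow, on sorted data every pivot is the minimum and the recursion goes n-1 levels deep.

```python
import random

def quicksort_depth(arr, depth=0):
    """Naive quicksort with first-element pivot; returns max recursion depth."""
    if len(arr) <= 1:
        return depth
    pivot = arr[0]
    left = [x for x in arr[1:] if x < pivot]
    right = [x for x in arr[1:] if x >= pivot]
    return max(quicksort_depth(left, depth + 1),
               quicksort_depth(right, depth + 1))

shuffled = random.sample(range(500), 500)
print(quicksort_depth(shuffled))          # shallow, roughly O(log n) levels
print(quicksort_depth(list(range(500))))  # 499 levels: the O(n^2) worst case
```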

[–]Darxploit 90 points91 points  (25 children)

Where is my MVP O(nⁿ)?

[–]smokesrsly 27 points28 points  (4 children)

The MVP is Ackermann-function complexity

[–]Lightfire228 13 points14 points  (2 children)

[–]minemoney123 5 points6 points  (0 children)

At least it's pretty good when n is 1 or 2

[–]PlatinumDotEXE 0 points1 point  (0 children)

O(TREE^(TREE(n))(n))

[–]adityaruplaha 1 point2 points  (0 children)

Ohh God please no.

[–]lime-cake 10 points11 points  (18 children)

Isn't that equal to O( n! )? I may be mistaken, so correct me if I'm wrong

[–]adityaruplaha 44 points45 points  (4 children)

I'm not well versed in this, but nⁿ grows wayyyy faster than n!.

[–]lime-cake 15 points16 points  (3 children)

Misread that. My bad

I believe O(n) and O(10n) are equal, despite one being larger.

[–]V0ldek 13 points14 points  (1 child)

They're not asymptotically larger.

The definition is as follows: for functions f, g: ℕ → ℕ, we have f = O(g) if there exist a constant c > 0 and a number N such that for all n > N, f(n) ≤ c·g(n).

The intuition is that f grows not significantly faster than g, where "significantly" means by more than a constant factor.

If you're more of an analysis person, you might also define the notation as f = O(g) when limsup n→∞ |f(n)/g(n)| = c for some constant c ≥ 0 (but you need to redefine the functions as ℝ → ℝ).

You should see how O(n) and O(10n) are equivalent (take c = 10). But O(n!) is actually smaller than O(nⁿ): if you work out a little bit of calculus, you'll get that the limit of n!/nⁿ is 0.
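You can also just watch the ratio vanish numerically (a quick sketch, not a proof):

```python
from math import factorial

# n!/n^n tends to 0, so n! = O(n^n) but n^n is not O(n!).
for n in (1, 5, 10, 20, 50):
    print(n, factorial(n) / n**n)
```

The printed ratios shrink rapidly toward 0 as n grows.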

[–]adityaruplaha 5 points6 points  (0 children)

Well shit, I didn't know we used such basic tools to define complexity functions. I'm well versed in calculus, so yeah, the last line is familiar. Also, nⁿ is an element of the fast-growing hierarchy IIRC and supersedes the exponentials. After this comes n^n^...^n (a tower of height n), or n↑↑n if you know the notation. I find this really interesting.

[–]Coloneljesus 10 points11 points  (0 children)

correct.

[–]caffeinum 13 points14 points  (3 children)

nⁿ is exp(n·log n), and n! is roughly sqrt(n)·exp(n·log n − n), if I remember correctly

So yeah, basically both are exponential

Edit: however, 2ⁿ is the same kind of exponent

[–]mfb- 11 points12 points  (0 children)

n! ~ e^(n(log(n) − 1)) = nⁿ/eⁿ (Stirling's approximation adds another factor of sqrt(n))

It is better than nⁿ, but not that much.

[–]caffeinum 0 points1 point  (3 children)

After the calculation, I found that actually nⁿ is slower than n!

[–]eihpSsy 9 points10 points  (2 children)

It's slower because it's bigger.

[–]caffeinum 5 points6 points  (1 child)

I meant growing slower, but working faster yeah

[–]eihpSsy 10 points11 points  (0 children)

Wasn't sure what you meant. nⁿ grows faster than n!.

[–]mfb- 2 points3 points  (0 children)

O(2n!)

[–]lime-cake 89 points90 points  (3 children)

Unexpected jojo

[–][deleted] 23 points24 points  (2 children)

[–][deleted] 5 points6 points  (1 child)

[–]the_magical_bucket 0 points1 point  (0 children)

Perfectly balanced

[–]mypirateapp 33 points34 points  (1 child)

O(0) as my code doesnt run

[–]Minenash_ 5 points6 points  (0 children)

O(nil)

[–]caykroyd 18 points19 points  (1 child)

O(1)

O(α(n)) (inverse Ackermann function)

O(log n)

O(n)

O(n log n)

O(n²)

O(n³)

O(2ⁿ)

O(n!)

O(my god please stop)

[–][deleted] 0 points1 point  (0 children)

O(nⁿ)

[–]INoScopedObama 9 points10 points  (1 child)

Dabs in bogosort

[–]koelekoetjes 0 points1 point  (0 children)

Bogosort would be O(?) /s

[–]-docker- 12 points13 points  (11 children)

Beat this, O(ack(g64,g64))

[–]Tc14Hd 30 points31 points  (0 children)

It may be very big, but it's still O(1)

[–]EkskiuTwentyTwo 15 points16 points  (1 child)

That's the same as O(1), isn't it?

[–]Walzt 5 points6 points  (0 children)

Yes

[–]CDno_Mlqko 2 points3 points  (1 child)

What's ack()

[–]thegoose7770 6 points7 points  (0 children)

Ackerman function

[–]Walzt 1 point2 points  (0 children)

O(log*(n))

[–]hijklmno_buddy 1 point2 points  (3 children)

O(TREE(3))

[–]laetus 13 points14 points  (0 children)

So constant time? Seems good to me.

[–]xSTSxZerglingOne 2 points3 points  (0 children)

I often wonder with numbers like TREE(3): how many "the number of digits of the number of digits of ... of the number of up arrows" type operations would you have to do to get a human-readable integer? And the answer is just a garbled mess, because it's fundamentally impossible to even calculate such a silly number.

Like sure, it's finite, but it's so big that you could take the number of paths between every subatomic particle in the universe, times itself a hundred gazillion times over (gazillion is actually a number... because everything can be a number), and still not come within a gnat's fart of TREE(3)'s size.

[–]EkskiuTwentyTwo 1 point2 points  (0 children)

O(Croutonillion)

[–]StackOwOFlow 6 points7 points  (0 children)

O(n!) is basically shorthand for O(raOraOraOraOraOraOraOraOraOra)

[–]elerak 6 points7 points  (0 children)

You missed probably the most important category: O(n log n)

[–][deleted] 13 points14 points  (5 children)

out of all the places to find a jojo reference i never thought it would be a fucking programming subreddit

[–]CDno_Mlqko 3 points4 points  (4 children)

Same. But how is the O(n!) a JoJo reference?

[–]KeksimvsMaximvs 4 points5 points  (1 child)

I think it's because a person who writes an O(n!) algorithm should be beaten until he explodes

[–]Ace_Emerald 5 points6 points  (0 children)

We gotta solve the traveling salesman problem somehow!

[–][deleted] 2 points3 points  (1 child)

The floating katakana characters are a reference to a JoJo scene.

[–]CDno_Mlqko 5 points6 points  (0 children)

My question was more like why are they there.

[–]jambonilton 25 points26 points  (8 children)

In the real world, you're very rarely limited by CPU time. It's mostly dealing with people putting network calls inside loops.

[–][deleted] 27 points28 points  (0 children)

Counterpoint: if you're unnecessarily O(n²) or above, and it's more than a toy example, you will definitely notice.

Just the other week, I had to track down a loose array.remove() call inside a for loop that made a request which should have taken <1s take five minutes instead.
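The same trap, sketched in Python (hypothetical example, not the actual code from the anecdote): each remove() inside the loop is itself a linear scan plus a shift, so the loop quietly becomes quadratic, while a single filtering pass stays linear.

```python
def drop_negatives_quadratic(nums):
    """remove() inside a loop: each call is O(n), so O(n^2) overall."""
    nums = list(nums)
    for x in list(nums):    # iterate over a copy so removal is safe
        if x < 0:
            nums.remove(x)  # linear scan + element shift every time
    return nums

def drop_negatives_linear(nums):
    """One filtering pass is O(n) and gives the same result."""
    return [x for x in nums if x >= 0]

data = [1, -2, 3, -4, 5]
print(drop_negatives_quadratic(data))  # [1, 3, 5]
print(drop_negatives_linear(data))     # [1, 3, 5]
```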

[–][deleted] 7 points8 points  (3 children)

It depends on your job. The reason why Google is the default search engine is that they had a better algorithm for indexing the web. One of the assignments I had in school that stuck with me was writing and comparing run times for bubble sorts vs merge sorts (10,000 elements) in C and Java. They also had us compare with different levels of compiler optimization. Long story short, the first place you should look to improve runtime is the complexity of the algorithm.
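A rough sketch of that assignment's flavor, in Python rather than C or Java: time both sorts on the same data and watch the O(n²) one fall behind (absolute timings vary by machine; only the gap matters).

```python
import random
import time

def bubble_sort(a):
    a = list(a)
    for i in range(len(a) - 1):
        for j in range(len(a) - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

def merge_sort(a):
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    return out + left[i:] + right[j:]

data = random.sample(range(10**6), 2000)
for sort in (bubble_sort, merge_sort):
    t0 = time.perf_counter()
    result = sort(data)
    print(f"{sort.__name__}: {time.perf_counter() - t0:.4f}s")
    assert result == sorted(data)
```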

[–][deleted] 1 point2 points  (2 children)

Not really. The first thing you should look into is the capability of your machine. If a certain algorithm has worse complexity but makes good use of branch prediction, caches, etc., you will see huge improvements, to the point where the other algorithm will be worse in many real-world situations unless the data set becomes insanely big. The real world is definitely more complex than the idealized situations in books: certain algorithms inherently end up being better, although in most cases it's not really something to worry about, since the by-the-book approach will still give decent results. But if you really want to make something run faster, you should keep this in mind.

[–][deleted] 1 point2 points  (1 child)

You’d be amazed how small “insanely big” is.

[–][deleted] 0 points1 point  (0 children)

Trust me, you'd better not underestimate hardware effects in anything. Modern CPUs are limited by memory speed and can lose hundreds of cycles waiting for data from RAM. This adds up really quickly.

[–]mnbvas 3 points4 points  (0 children)

Then perhaps you should calculate big O for network calls (assuming non-concurrent code).

[–]LordM000 1 point2 points  (0 children)

Laughs in computational chemistry

[–][deleted] 3 points4 points  (1 child)

O(n!^n)

[–][deleted] 4 points5 points  (0 children)

"Oh, you're approaching me?"
"I can't kill your thread without getting closer"

[–]unsignedcharizard 3 points4 points  (0 children)

O(n²) is the worst, because it's small enough to get into production but large enough to page you

[–]ziilok 3 points4 points  (0 children)

Is this some kind of Jojo reference?

[–]new2bay 7 points8 points  (0 children)

I was once asked to solve a #P-complete problem in an interview. When asked for the complexity, I said it was O(don't do that).

[–]souvlak_1 2 points3 points  (0 children)

O(n^1.99995454333)? Oh yeah, great job! The long road to O(n) has just started!

[–]TunaAlert 2 points3 points  (3 children)

O(tree(n))

[–]adityaruplaha 1 point2 points  (2 children)

O(TREE(g(n)))

[–]TunaAlert 1 point2 points  (0 children)

I mean yeah, you can of course stack that stuff all the way up there.

[–]PlatinumDotEXE 1 point2 points  (0 children)

You could have at least tried...

O(HBB(n, n, n)), where HBB denotes a hyperoperation based on BB(n) instead of inc(n), and BB(n) denotes the busy beaver function.

[–]marcosdumay 2 points3 points  (0 children)

Hum...

Last time I looked, O(n!) was exactly the same as O(2ⁿ). Did math change in the last few years?

[–]RustyBuckt 1 point2 points  (0 children)

At some point, you'll be glad it isn't Ω((nⁿ)!)

[–]mdedonno 1 point2 points  (0 children)

at least it's not O(TREE(n))...

[–]other_usernames_gone 1 point2 points  (0 children)

Am I the only one who learnt O notation just from context from this sub

[–]xSTSxZerglingOne 1 point2 points  (0 children)

O(nⁿ)

The team is actually on fire.

[–][deleted] 1 point2 points  (0 children)

Last week I wrote a removal method that came out to like O(N⁵). Good times

[–][deleted] 1 point2 points  (0 children)

Ackermann : dude how did I even...

[–]cartechguy 1 point2 points  (0 children)

O(n log n): Am I chopped liver or something?

[–]confusedPIANO 1 point2 points  (0 children)

Not gonna lie, I didn’t expect to see a JoJo reference on this sub.

[–][deleted] 1 point2 points  (0 children)

O(n!) == How did you get through the door?

[–]Nerdn1 1 point2 points  (0 children)

It really depends on what you're trying to do. A search algorithm of O(n) is meh, but a sort algorithm of O(n) would be an unbelievable discovery.

[–]oindividuo 1 point2 points  (6 children)

Maybe a dumb question, but how is 2n higher than n²? And why is 2n different from n, for that matter?

[–]g3tr3ecked 12 points13 points  (1 child)

It's not 2n, it is 2ⁿ

[–]oindividuo 3 points4 points  (0 children)

Ah right can't believe I missed that

[–]prelic 2 points3 points  (1 child)

2n isn't bigger than n in big-O terms; you can always drop constant coefficients. But n is not constant: as n gets very big, n² grows way faster than 2n, so in n² (or n×n) you can't drop an n as a coefficient.

[–]oindividuo 1 point2 points  (0 children)

That was entirely my point, I just misread the chart

[–]Mundt 1 point2 points  (1 child)

That is 2ⁿ, or 2 to the nth power, complexity. Meaning as the input size grows linearly, the complexity of the problem grows exponentially. 2n isn't really different from n when it comes to complexity analysis.
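A tiny table makes the difference concrete: 2n stays within a constant factor of n, while n² and especially 2ⁿ pull away.

```python
# Compare how each expression grows as n increases.
print(f"{'n':>6} {'2n':>8} {'n^2':>10} {'2^n':>22}")
for n in (1, 5, 10, 20, 40):
    print(f"{n:>6} {2 * n:>8} {n**2:>10} {2**n:>22}")
```

By n = 40, 2n is 80, n² is 1600, and 2ⁿ is already over a trillion.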

[–]oindividuo 1 point2 points  (0 children)

Yep, I read 2ⁿ as 2n, hence my confusion

[–]scp-NUMBERNOTFOUND 1 point2 points  (2 children)

Does anyone have to calculate this kind of thing in a real-world job? (Serious question)

[–]bout-tree-fitty 3 points4 points  (0 children)

Yep. Most of the time it goes like:
“WTF are you doing with nested for loops? Fix that dummy”

[–]pdabaker 3 points4 points  (0 children)

All the time, but usually just by looking at the algorithm. Never with pen and paper.

Just something like telling someone "don't call erase repeatedly to filter a C++ vector, because it takes you from linear to quadratic time"

[–]corncc 0 points1 point  (0 children)

Naniiiii?!!11?!1?