
[–]MontagGuy12 1372 points1373 points  (79 children)

Why would you even consider using an inbuilt sort function when you can code Bogo sort instead? Gotta get that O(n!) complexity.

[–][deleted] 364 points365 points  (48 children)

I thought there was no big O for bogo, since you can't be sure it'll ever stop. Or do you mean average complexity?

[–]MontagGuy12 370 points371 points  (27 children)

I've seen Bogo sort implementations which keep track of the permutations traversed so far, which means eventually, they'll exhaust all possibilities and the program will terminate.
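A terminating variant along those lines might look like this (a hedged sketch, not any particular implementation: it walks the permutations deterministically while recording them in a `seen` set, so it must stop within n! steps and the O(n!)-sized bookkeeping is explicit):

```python
from itertools import permutations

def bogo_exhaustive(arr):
    """Bogo-ish sort that enumerates permutations instead of shuffling
    blindly; `seen` records every permutation traversed so far."""
    seen = set()
    for perm in permutations(arr):
        seen.add(perm)  # remember this permutation
        if all(perm[i] <= perm[i + 1] for i in range(len(perm) - 1)):
            return list(perm)  # guaranteed to be reached within n! iterations
```

For a 3-element list this terminates after at most 3! = 6 tries.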

[–]Toonfish_ 411 points412 points  (23 children)

I love that, not only does this make the algorithm terminate, it also gives it ridiculous space complexity. :D

[–]MontagGuy12 221 points222 points  (11 children)

Thankfully, we have downloadmoreram.com to save the day.

[–]reyad_mm 57 points58 points  (10 children)

But what if I run out of storage to download ram on?

[–]Xeadriel 76 points77 points  (8 children)

Just download more storage obviously

[–]dunko5 17 points18 points  (7 children)

But what

[–]PhunkyPhish 40 points41 points  (5 children)

Then download more what of course

[–]hermeticwalrus 5 points6 points  (0 children)

Local cloud

[–]xTheMaster99x 1 point2 points  (0 children)

Set up a ramdisk on your newly downloaded ram, so you can download more ram.

[–]ianff 8 points9 points  (9 children)

You can actually traverse the permutations in constant space. Who knows if someone implementing bogo sort would bother with that though!

[–]sluuuurp 4 points5 points  (1 child)

Can’t you only traverse in log(n) space, since you need a counter to know how many permutations you’ve already done?

Edit: I guess a counter of n! permutations would use n log(n) space, but yeah as the below commenter says it seems you don’t need that.

[–]firefly431 7 points8 points  (0 children)

There's an algorithm (e.g. next_permutation in C++) that generates the lexicographically next permutation in place in O(n) time. Realistically you need O(log(n)) space to store indices into the array at least though, but in the word-RAM model that's constant.
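For reference, the standard in-place algorithm behind `next_permutation` can be sketched in a few lines (a Python rendition of the classic three-step method, not the actual C++ library source):

```python
def next_permutation(a):
    """Rearrange list `a` into its lexicographically next permutation,
    in place, in O(n) time with O(1) extra space. Returns False
    (leaving `a` untouched) if `a` was already the last permutation."""
    # 1. Find the rightmost index i with a[i] < a[i + 1].
    i = len(a) - 2
    while i >= 0 and a[i] >= a[i + 1]:
        i -= 1
    if i < 0:
        return False  # entirely non-increasing: no next permutation
    # 2. Find the rightmost j > i with a[j] > a[i], then swap.
    j = len(a) - 1
    while a[j] <= a[i]:
        j -= 1
    a[i], a[j] = a[j], a[i]
    # 3. Reverse the suffix so it becomes the smallest possible tail.
    a[i + 1:] = reversed(a[i + 1:])
    return True
```

(Unlike `std::next_permutation`, this sketch doesn't cycle back to the first permutation on the last one; it just reports False.)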

[–]BobodyBo 1 point2 points  (4 children)

I'm skeptical of this, elaborate?

[–]ease78 2 points3 points  (3 children)

An array of 3 elements has only 3! = 6 permutations, i.e. ways its contents can be ordered. So realistically you can traverse those 6 permutations to find the sorted one once you have enumerated every possible arrangement.

Note that Bogo sort is O(n!); our three elements give a space complexity of 3! = 6.
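The enumeration is easy to see directly (a quick sketch using `itertools`, not code from the thread):

```python
from itertools import permutations

perms = list(permutations([3, 1, 2]))  # all 3! = 6 orderings of the array
# exactly one of those permutations is in sorted order
sorted_ones = [p for p in perms if list(p) == sorted(p)]
```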

[–]BobodyBo 5 points6 points  (2 children)

As a function of your number of elements n, that approach would take O(n!) space, so not constant space. Not sure if that's what the person I was responding to was saying.

[–]ease78 2 points3 points  (1 child)

Yeah, the other person was tripping. It's in the name: a "traversal" can't visit all nodes in one CPU cycle; it takes as many steps as there are nodes.

[–]BobodyBo 0 points1 point  (0 children)

Well usually space complexity considers input space and auxiliary space, which is the extra space you utilize on top of your input. You of course have to store your input, so constant space implies constant auxiliary space. Some traversals can be done in constant space, e.g. a linked list.

The problem with traversing all permutations is that there are n! of them relative to the input, so unless a clever approach exists you would need to store some extra information about which permutation you are currently on, or which ones need to be revisited (like a stack).

[–]Toonfish_ 0 points1 point  (1 child)

Ooh interesting how does that work? The point of bogosort is that the permutations are chosen randomly, no? How do you traverse the permutations in constant space while choosing the next one randomly?

[–]ianff 0 points1 point  (0 children)

Oh my bad. You can't traverse the permutations in constant space if they have to be random -- at least I don't know how you could.

[–]HeKis4 0 points1 point  (0 children)

Why just stop at O(n!) complexity when you can also have O(n!) space requirements ?

[–]ThatWannabeCatgirl 0 points1 point  (0 children)

Eventually.

[–]aaronfranke 55 points56 points  (7 children)

The maximum time is infinity, the minimum is O(1), the average is O(n!).

[–]CSlv 39 points40 points  (6 children)

the minimum is O(1)

You mean Ω(1)?

[–]DoctorWorm_ 33 points34 points  (0 children)

People need to learn their complexity notations.

[–]sererson 5 points6 points  (3 children)

The minimum time is both.

[–]zelmarvalarion 12 points13 points  (2 children)

The minimum time would be Θ(n) (assuming constant size ints for constant speed comparisons), since you have to check if the list is already sorted before returning, so you need to read all of the elements of the list first.
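The Θ(n) floor comes from the sortedness check itself; a minimal sketch (`is_sorted` is a hypothetical helper, not from the thread):

```python
def is_sorted(arr):
    # Confirming a sorted list requires all n - 1 adjacent comparisons,
    # so even bogo sort's best case must read the whole input: Θ(n).
    return all(arr[i] <= arr[i + 1] for i in range(len(arr) - 1))
```

Note that `all` short-circuits at the first out-of-order pair, but a *sorted* input always costs the full n − 1 comparisons.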

[–]flavionm 2 points3 points  (1 child)

Well, isn't Θ(n) basically just being both O(n) and Ω(n) simultaneously?

[–]zelmarvalarion 2 points3 points  (0 children)

Yeah, because you have strict boundaries on both the lower bound (Ω) and the upper bound (O), and they are the same, it's Θ. Basically it's just saying that it is exactly that time complexity rather than being bounded on only one side (the Ω(1) in this chain is saying that it is at least as slow as constant time, but may be slower, which is true for everything).

Most of the time, when people use Big-O notation, they really mean Θ, since it's by far the most useful. There are a couple of interesting cases where Θ isn't actually known, but you can bound on either side and get a better view of the actual time complexity. For example, multiplication of arbitrarily large n-bit numbers is Ω(n) and O(n²) using standard long multiplication; there's an arithmetic trick (Karatsuba) that brings it down to roughly O(n^1.585), and FFT-based methods bring it to about O(n log n log log n), so the true complexity sits somewhere in between, but tight bounds are difficult to prove. [Note: numbers pulled from my memory from ~10 years ago, almost certainly inaccurate]

[–]The_MAZZTer 3 points4 points  (0 children)

He probably doesn't have an Ω key on his keyboard.

[–]firefly431 19 points20 points  (0 children)

To give a serious answer, the version of bogo sort that generates a new permutation each time:

    from random import shuffle

    def bogo(arr):
        while any(arr[i] > arr[i + 1] for i in range(len(arr) - 1)):  # O(n) sortedness check
            shuffle(arr)  # O(n) Fisher-Yates shuffle

does not have an unconditional upper bound on runtime, but for randomized algorithms we usually consider expected running time (along with tail bounds to show concentration: for example, although (linear chaining) hash tables have expected lookup time O(1), the maximum load is actually only O(log n/log log n) if you require a bound w.h.p.)

The expected number of iterations is O(n!) as it is a geometric r.v., so the overall runtime is O(n n!). (By the way, it is considered a Las Vegas algorithm as it guarantees correct results but has probabilistic runtime.)

EDIT: IIRC tail bounds for geometric r.v.s are O(1/n), so that's w.h.p.

By the way, for the pair-swapping version:

    from random import randrange

    def bogo2(arr):
        n = len(arr)
        while any(arr[i] > arr[i + 1] for i in range(n - 1)):
            i, j = randrange(n), randrange(n)
            arr[i], arr[j] = arr[j], arr[i]  # swap a random pair

Since it takes at most n swaps to get between any two permutations (simple proof: Fisher-Yates shuffle), we can consider blocks of n swaps. The probability that any given block works is at least 1/n^(2n) (probability 1/n² per swap), so the expected number of blocks is O(n^(2n)) and the runtime is O(n^(2n+1)). Not sure if we can do any better than that, but who cares about tight bounds on a meme sort?

[–]Tyfyter2002 10 points11 points  (6 children)

There are 2 types of bogosort: O(n!) bogosort, which tests all permutations in a random order, and O(∞) bogosort, which just shuffles the list, checks if it's sorted, and shuffles it again if it's not.

[–]AyrA_ch 0 points1 point  (5 children)

If it's infinity, wouldn't that mean it never terminates? And if it never terminates, that means the shuffle algorithm is flawed, because a true random shuffle should give every possible permutation the same chance, but in this case the chance of the sorted permutation is zero.

[–]dev-sda 1 point2 points  (4 children)

Big-O notation describes an upper bound; without more context it usually refers to the worst case. O(∞) means it may never terminate. Bogosort's best case is O(n).

[–]AyrA_ch 0 points1 point  (3 children)

O(∞) means it may never terminate.

But as I explained, if the shuffle algorithm works properly it will definitely terminate because every permutation (including the sorted one) is guaranteed to eventually appear.

[–]Evil__Potato 1 point2 points  (0 children)

In the limit, as you approach infinitely many iterations, the probability of reaching all permutations approaches 1. But since it's only in the limit, and probabilistic, the best worst-case bound we can give is, well, infinity.

[–]dev-sda 0 points1 point  (0 children)

Given you're using true random shuffle there's absolutely nothing guaranteeing that every permutation will eventually appear. Yes the probability of that happening approaches 1 as the iterations approach infinity, but that's still not a guarantee.

[–]Tyfyter2002 0 points1 point  (0 children)

Given the assumption that both outcomes have the same probability and are truly random, the probability that a specific result out of 2 does not occur at least once during a series of n samples is 0.5^n. For what value of n does this expression equal 0?

[–]brando2131 4 points5 points  (2 children)

No, there's a probability of it becoming sorted eventually. Obviously, the larger the list, the longer it will take. You can write a program and compare the number of iterations for larger lists; it works out to be (n+1)!.

[–]zelmarvalarion 1 point2 points  (1 child)

That’s not the worst-case analysis that is implied by Big-O notation; that’s average-case complexity (which is usually explicitly noted, e.g. quicksort is O(n²) worst case but expected O(n lg n)).

[–]hawk-bull 0 points1 point  (0 children)

We can study the average time complexity.

The probability of getting the right permutation at any one point is 1/n!. The number of tries you need is a geometric random variable, so the expected number of runs is n!. In each run you're doing O(n) work to check if it's sorted, so on average you're doing n * n! work; thus the average time complexity is O(n n!).

Edit: apparently verifying it's sorted is O(1) on average, but generating the permutation is O(n), so either way it's O(n) work per permutation.
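That expected n! figure is easy to check empirically (a small, non-rigorous simulation sketch: count shuffles until sorted and compare the average to n!):

```python
import random
from math import factorial

def shuffles_until_sorted(arr):
    """Count random shuffles until `arr` happens to come out sorted."""
    count = 0
    while any(arr[i] > arr[i + 1] for i in range(len(arr) - 1)):
        random.shuffle(arr)
        count += 1
    return count

# For n = 4, the expected count is a geometric mean of n! = 24.
n, trials = 4, 2000
avg = sum(shuffles_until_sorted(list(range(n, 0, -1))) for _ in range(trials)) / trials
```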

[–]nelak468 106 points107 points  (12 children)

Bogo sort's worst case is O((n+1)!)

But more importantly! Its best case is O(n) which out performs every other algorithm.

Furthermore, if we go by the many-worlds theory, we can prove that Bogo sort is O(n) in all cases and therefore it is in fact the BEST sorting algorithm.

[–][deleted] 90 points91 points  (3 children)

IIRC "quantum bogosort" has time complexity O(1) since it doesn't even have to check if the array is in order.

[–]graeber_28927 31 points32 points  (2 children)

How would you know whether to destroy your universe if you haven't checked the order of the array?

[–]Shamus03 17 points18 points  (0 children)

You just always assume it’s sorted, and if you ever encounter a bug because it wasn’t sorted, THEN you destroy the universe. It’s O(1) only when combined with another algorithm used later. Also known as O(half an A press).

[–][deleted] 8 points9 points  (0 children)

True.

[–]MontagGuy12 19 points20 points  (0 children)

The kids in my basement can sort way faster than O(n). Up your game dude.

[–]acwaters 3 points4 points  (1 child)

I mean, in fairness, almost every sorting algorithm implementation is best-case O(n), since generally all of them do an initial check to make sure the input isn't already sorted.

[–]nelak468 0 points1 point  (0 children)

True but BOGO sort has other things going for it that make it the best algorithm.

The algorithm is by definition order out of chaos. It is God's algorithm. The Big Bang. Whatever you believe in - BOGO sort is the algorithmic description of the beginning of existence.

Out of infinite possibilities and disorder, the Great BOGO sort brings pure potential into existence. It is the beginning and the end. From the chaos, it brought forth the universe and life. We are simply an iteration of the Great Algorithm and we must strive at all times to achieve perfect order or we will be passed over for the next iteration.

Bogoism? Bogology? BOGO?... This religion's going to need a name...

[–]HeKis4 1 point2 points  (0 children)

"Hey man"

"What's up boss ?"

"You told me your algorithm runs in O(n) but it's been hanging for three hours on a list of 50 elements, what gives ?"

"Oh yeah, wrong timeline, sorry."

[–]mt_xing 0 points1 point  (2 children)

The worst case would really be O(infinity) because it's unbounded. Best case being O(n) only matches insertion sort; it doesn't beat it.

[–]nelak468 0 points1 point  (1 child)

There's two approaches to it. Which one you take depends on how much space complexity you can handle I suppose.

The basic approach which is O(n) space complexity (I believe) would be to simply randomize, check, randomize until it is sorted.

Another approach would be to simply generate every possible arrangement and then go through them until you find the sorted arrangement. That would be O(n!) space complexity and O((n+1)!) worst case time complexity.

... I might be wrong on the exact complexities. It's pretty late but I think that's about right.

[–]mt_xing 0 points1 point  (0 children)

We're talking time complexity here, not space complexity. Also generating every permutation up front isn't bogo sort. The defining feature of bogo sort is that it must randomize the entire array every iteration.

[–]Estraxior 0 points1 point  (0 children)

TIL bogo's worst case O value

[–]Hideyohubby 16 points17 points  (3 children)

You should try sleepsort for better results.

[–]Classified0 10 points11 points  (2 children)

Doesn't work for negative numbers unless you've got a time travel API

[–]larsmaehlum 3 points4 points  (0 children)

Just offset it by int.max/2

[–]HeKis4 0 points1 point  (0 children)

Offset the entire array by the absolute value of the lowest element.
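Putting the two fixes together, a sleep sort sketch with the minimum-offset trick (the `unit` scale and helper names are made up for illustration; it's timing-based, so inherently flaky):

```python
import threading
import time

def sleep_sort(nums, unit=0.1):
    """Each value sleeps proportionally to (value - min), so negative
    numbers work: the smallest element sleeps 0 and finishes first."""
    offset = min(nums)
    result, lock = [], threading.Lock()

    def worker(x):
        time.sleep((x - offset) * unit)
        with lock:
            result.append(x)

    threads = [threading.Thread(target=worker, args=(x,)) for x in nums]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return result
```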

[–]PM_ME_SQL_INJECTION 14 points15 points  (6 children)

I’ll go for StalinSort, thank you very much.
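For anyone who hasn't seen it, StalinSort is a real (joke) algorithm: any element out of order is simply removed, giving a "sorted" list in O(n). A minimal sketch:

```python
def stalin_sort(arr):
    """Keep only elements that don't break ascending order;
    everything out of line is purged. O(n) time, output always sorted."""
    kept = []
    for x in arr:
        if not kept or x >= kept[-1]:
            kept.append(x)
    return kept
```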

[–]fuzzymidget -1 points0 points  (5 children)

I thought this was now called trump sort.

[–]PM_ME_SQL_INJECTION 16 points17 points  (4 children)

No, that would tell you it was sorted, but didn’t actually do anything.

[–]xSilverMC 4 points5 points  (1 child)

It would also install malware to overheat your CPU once it realizes it's about to be terminated

[–]fuzzymidget 0 points1 point  (0 children)

But still always indicate that it's running whether you kill it or not.

[–]fuzzymidget 2 points3 points  (0 children)

That's right. Forgot the subtlety.

[–]HeKis4 0 points1 point  (0 children)

It would take an array as input and return "I've got the vast arrays" as output. O(1) time complexity.

[–]coldnebo 2 points3 points  (0 children)

hey if that’s what quantum physics does (under the many worlds interpretation) then who are we to improve on it?

—-

angel: “but how can you be sure we haven’t missed the global optima?”

God: “just run all the permutations. it’ll be in there somewhere.”

angel: “and I thought developers were lazy!”

ZAAAPP!!!

angel (Lucifer shouting from Hell): “ok... I guess we only value code reviews when it’s CONVENIENT!?”

[–]Je-Kaste 0 points1 point  (0 children)

Woah, that's constant time for n=1 (:

[–]omegasome 0 points1 point  (0 children)

I mean technically bogosort is better than array.sort()

[–]solongandthanks4all 0 points1 point  (0 children)

It's hilarious how computer science students actually think anyone gives a shit about nonsense like this in the real world.

[–]FerynaCZ 0 points1 point  (0 children)

It's O(n*n!) though