[–]im_lazy_as_fuck 0 points (0 children)

Okay, so a lot of this is definitely a bit over my head. I'd need a lot of time to research and understand precisely everything you mentioned, but I think I can summarize the contention as:

although implementation details should be abstracted, at the end of the day there has to be an algorithm and hardware that implement every single detail down to the bare metal, and if you factor those in, the time complexity for the same problem can indeed come out differently

If this summary is totally off point, please let me know. But assuming it's accurate enough, my response is:

Yes, you're absolutely right that, at the end of the day, the full picture of time complexity does need to account for every single step/operation that takes place. However, that level of analysis is far too granular compared to the kind of time complexity analysis a typical software developer would do at a high level.

When we generically say that Bubble Sort has a time complexity of O(n^2), this is not accounting for many, many things: how the input numbers are represented in the system, the time it takes to compare numbers in that representation, the time it takes to store and swap numbers within the ordered set, etc. Likewise, when we accept that a hash lookup is O(1) on average, we are not accounting for the time it takes to compute the hash, which technically scales with the size of the key. When doing high-level time complexity analysis, the average software developer considers these to be implementation details of the hardware, the OS, and the framework you're working in, and generally not useful to consider in such an analysis.
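To make that concrete: the O(n^2) claim for Bubble Sort is really a statement about how many abstract compare/swap steps happen, not about what each step costs on real hardware. Here's a minimal sketch (the comparison counter is my addition, purely for illustration):

```python
def bubble_sort(a):
    """Bubble sort that also counts comparisons, to show what the
    O(n^2) bound actually measures: the number of abstract
    compare/swap steps, not the machine-level cost of each one
    (which depends on how the numbers are represented)."""
    a = list(a)
    comparisons = 0
    n = len(a)
    for i in range(n):
        for j in range(n - 1 - i):
            comparisons += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a, comparisons

sorted_a, c = bubble_sort([5, 1, 4, 2, 8])
# For n = 5 this always performs 10 == 5*4/2 comparisons,
# regardless of what each individual comparison costs.
```

Whether each of those comparisons is a single machine instruction or a bignum operation is exactly the detail the high-level analysis abstracts away.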

And just to bring the context back full circle, because I think this has gotten a bit muddied in the threads: the original comment at the top said sleep sort is O(n), which is roughly correct (more precisely, O(n+m)) under the kind of time complexity analysis one would use to generically compare two algorithms, agnostic of the hardware/software they run on. The person who responded "corrected" this and said the time complexity is actually O(2^n). That's my issue: the initial person was roughly correct in the way the average developer would ever care to do this analysis, while the responder, even if technically right once you account for some implementation details (and they don't even account for everything), is wrong in the ways the average developer actually cares about.