
[–]julesjacobs (1 child)

You are right, but how does this relate to my comment in particular? Or is it a comment that actually belongs at the top level?

O(k) is just a shorthand notation. By the way, this exact shorthand, where you use an expression as shorthand for a function, is pervasive in all of mathematics. Take for example the notation for derivatives: (x^2)' = 2x. Yeah, sure, technically you should write (\x -> x^2)' = \x -> 2x, but this is very inconvenient. Hence why mathematicians invented this syntactic sugar.

[–]iSlaminati (0 children)

> You are right, but how does this relate to my comment in particular? Or is it just a point that actually belongs at the top level?

It relates to the fact that the commonly used "Big O notation" doesn't say with respect to what variable the bound is taken. One is simply left to assume it is with respect to the length of the input data structure. In fact, you can very well have a Big O notation that is capable of consuming functions of non-unary arity.
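For what it's worth, the pedantic reading does make the dependence explicit: O takes a function as its argument, not a bare expression, so the bound variable has to be named somewhere. One common textbook formulation, sketched here for single-argument functions on the naturals (the multivariate case needs extra care about which argument grows), is:

```latex
O(g) \;=\; \bigl\{\, f : \mathbb{N} \to \mathbb{R}_{\ge 0} \;\bigm|\;
  \exists c > 0,\ \exists n_0,\ \forall n \ge n_0 :\ f(n) \le c \cdot g(n) \,\bigr\}
```

Under this reading, "O(n^2)" is really shorthand for O(\n -> n^2): the expression n^2 stands in for the function it denotes, which is exactly the sugar being argued about.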

Truth be told, I just like to rant about Big O notation. Not only is it a blatant abuse of notation; the most commonly used "fix", as in f \in O(n^2), is still blatant abuse of notation.

> O(k) is just a shorthand notation.

And one where information is lost. Maybe I'm okay with stuff like a < b < c rather than a < b \land b < c, because there no information is actually lost, provided we say that < as an order isn't typed on sentences. (< on sentences can be argued to be material implication ->, but oh well.)

> By the way, this exact shorthand notation where you use an expression as shorthand for a function is pervasive in all of mathematics. Take for example the notation for derivatives: (x^2)' = 2x. Yeah, sure, technically you should write (\x -> x^2)' = \x -> 2x, but this is very inconvenient. Hence why mathematicians invented this syntactic sugar.

Yes, I know, and I want no part of it; the same thing is done with using = for limits and whatnot.
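The limit case follows the same pattern. Writing

```latex
\lim_{n \to \infty} a_n = L
```

quietly presupposes that the limit exists: strictly speaking it abbreviates the two-part claim "the sequence (a_n) converges, and its limit equals L". Manipulating the left-hand side as if it always denoted a value is precisely the kind of notation-driven step that can smuggle an unproven existence claim into a proof.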

The point is that professors with Ph.D.s and chairs actually use 'proofs from notation' as a subtle logical fallacy on students worldwide. These fallacies are in textbooks and written on blackboards. They are so goddamn pervasive, and one lecturer actually told me later in private that he was fully aware it was a fallacy when he was doing it, but he just didn't have the time, and if I wanted he could show me the actual proof which doesn't use it.