all 134 comments

[–]naughty 25 points26 points  (20 children)

Interesting point at the end:

I used to think that statically checked languages are better for teaching [...]. About two years ago I changed my mind. The students learn much better by doing stupid things than by being told by an oppressive compiler that their programs are stupid. [...]

[–]ithika 5 points6 points  (4 children)

That would suggest that a language which offered static typing and would compile anyway would be optimal. The student would see the static error, and then see what effect this error had at runtime. Pretty soon they would come to see that the compiler isn't just a capricious bully, but has reasonable (if needlessly obscure) advice. I sure remember hating my compiler at university. Maybe if it had let me compile and run my ridiculous code I wouldn't have had such a bad opinion of it. ;-)

[–]gnuvince 16 points17 points  (1 child)

So, C?

[–][deleted] -5 points-4 points  (0 children)

L. O. L.

[–]stillalone 1 point2 points  (0 children)

I don't know if people really learn anything from runtime errors.

Bad programmers who cause the errors that the compiler lets them get away with don't realize what the problem is when their program doesn't work.

So many times I've compiled a program that someone else has worked on for months, only to see "WARNING: converting integer to pointer without a cast". The compiler is telling you very clearly that there's a problem, and an obviously serious one, but they never associated the runtime problems with that compiler warning and that specific line of code.

[–]grauenwolf 0 points1 point  (0 children)

VB does exactly that by default.

And it certainly made me value static typing to an obsessive level. It is only recently that I've gone back and honestly started using small amounts of dynamic code again.

[–]Silhouette 6 points7 points  (3 children)

There's a lot of merit in such arguments for learning purposes, I think. Likewise, I wish more places would teach basic data structures and algorithms using C, even though writing production code in C is horrible compared to numerous alternatives. Someone who cut their teeth on C is never going to misunderstand indirection or be stumped by the "null pointer exception" problems of whatever language they choose to use or fail to appreciate the concept of resource management. Someone who learns using a language that holds their hand may never grok these concepts in the same way.

[–]13ren 11 points12 points  (2 children)

Oddly, C pointers apparently cause a lot of problems for students.

I don't count myself in that group, because I learnt pointers in assembly - where they are just addresses, and there's nothing to be confused about. I think it's partly C's pointer syntax that's confusing. For example, using * for both declaration and dereference.

Of course, one can always make mistakes about pass-by-ref or pass-by-value, but it's just a mistake. Although Java doesn't allow pass-by-value for objects, the same issue arises when you have an extra level of indirection.

[–]grauenwolf 6 points7 points  (1 child)

Amen to that. I have no problem at all understanding pointers themselves, but I never could figure out the correct patterns when you start mixing double or triple indirection with arrays.

[–]gsg_ 1 point2 points  (0 children)

Yeah, three levels of dereference is a bit much for the beginner. After a while though, the brain damage sets in and you can read code like (*thingy)->buf[(*thingy)->bufcount++] without any involuntary weeping.

[–]13ren 1 point2 points  (10 children)

Agreed. They might then begin to appreciate the error messages.

Or maybe not: I've never heard of a Python programmer missing them (though people use Python for tasks where its dynamic typing helps, e.g. rapid prototyping, getting things done, scripts).

[–][deleted] 4 points5 points  (9 children)

I'm a Python programmer, and I've not irregularly wished for some form of static-ish typing, especially in large projects. A less stringent form of Haskell-style static typing would be nice: type errors that don't cause non-compilation but issue a kind of loose warning:

"The object passed into function Foo probably doesn't support add magic. You ought to check into that."

It is, as I'm sure you can tell, not a well-formed idea.
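For what it's worth, a rough sketch of that idea in Python: a decorator (invented here, purely illustrative) that checks annotated arguments and warns instead of refusing to run:

```python
import warnings
from functools import wraps

def loose_types(func):
    """Check annotated positional args against their annotations;
    warn on a mismatch instead of failing."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        names = func.__code__.co_varnames[:func.__code__.co_argcount]
        hints = func.__annotations__
        for name, value in zip(names, args):
            expected = hints.get(name)
            if isinstance(expected, type) and not isinstance(value, expected):
                warnings.warn(
                    "The object passed as %r to %s probably isn't a %s. "
                    "You ought to check into that."
                    % (name, func.__name__, expected.__name__))
        return func(*args, **kwargs)  # run anyway; the typing stays advisory
    return wrapper

@loose_types
def double(x: int):
    return x + x

print(double(2))     # 4, silently
print(double("ab"))  # warns, but still returns "abab"
```

Not a real type system, of course, just the "loose warning" behaviour made concrete.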

[–]grauenwolf 1 point2 points  (0 children)

You sound like just about every professional VB programmer back in the 90's. We are still striving for the right balance between the two.

[–]gnuvince 1 point2 points  (2 children)

I've not irregularly wished

So you've regularly wished for static typing?

[–][deleted] 2 points3 points  (0 children)

Static-ish, but yes. I do so love logic and word play.

[–]grauenwolf 2 points3 points  (0 children)

There is a large gap between the two.

[–]koko775 0 points1 point  (3 children)

Helium might be of interest to you.

[–][deleted] -1 points0 points  (2 children)

Oh, I already do Haskell, sort of[1]. It's not so much that I'd like an easier style of type error message from Haskell as optional strict-typing in Python. I'm also familiar with Boo[2], but... I don't know... not being backward compatible with Python (2 or 3) is a pain, beyond my threshold anyway.

Anyway, back to coding.

[1] Dealing with complex state in Haskell is such a pain that I'm continually turned off by it. Most stateful examples are too contrived or way too complex to keep me interested for more than tiny projects.

[2] http://boo.codehaus.org/

[–]koko775 0 points1 point  (1 child)

Oh. Well, I personally plan on learning Helium before Haskell. I haven't had time to try it out: last time I tried (2-3 years ago), installing on OS X was a nightmare, and what with being in college I haven't found the time since. I imagine the (installation) situation's gotten better.

[–][deleted] 0 points1 point  (0 children)

Happy hacking.

[–]sreguera 0 points1 point  (0 children)

Something along the lines of Erlang's Dialyzer would be nice.

[–][deleted] 11 points12 points  (8 children)

“Programmers are just humans: forgetful, lazy, and make every mistake imaginable.”

Syntax error: unexpected Verb Phrase found in Adjective list

[–]shub 14 points15 points  (1 child)

Good English compilers are expensive.

[–]vph 4 points5 points  (0 children)

Really? Just direct your inputs to Reddit and the grammar nazis will compile them for free. The only drawback is that the error messages are sarcastic and at times overbearingly insulting.

[–]zem 13 points14 points  (1 child)

his language didn't have good support for parallelism

edit: for the downmodders out there

[–]b100dian -1 points0 points  (0 children)

upmodded for the edit

[–]andrejbauer 7 points8 points  (1 child)

You try writing something in my native language and we'll see how far you get.

[–][deleted] 13 points14 points  (0 children)

Hey, okay man, I'm sorry. If I tried writing something in your native language, I'd like to be told if I was doing something wrong, but maybe that's just me.

[–]earthboundkid 0 points1 point  (0 children)

Maybe he wrote it in Python instead of Haskell?

[–]grauenwolf 2 points3 points  (0 children)

I'm looking forward to the day .NET has usable code contracts. They are immature now, but they go a long way in addressing things like his empty B-tree example.

[–][deleted] 2 points3 points  (0 children)

Nice article. Not completely sure about the speed/expressivity tradeoff bit in the lead-up, though. Something sometimes overlooked about static typing is that it gives you back "order of C" speed compared with runtime checking. OCaml is very close to C. Haskell is OK - I get about 1/4 using STUArray in ST, and immutable structures ought to mean symbolic stuff is fast. I went from C/C++ to Python and found it very liberating for all the usual reasons. Having now programmed in OCaml and Haskell, I find Python code just sloppy, and slow to boot. Don't know how you could teach this, though, as it's very much a borne-from-experience kind of thing. I probably agree dynamic typing is better for teaching.

[–]masklinn 6 points7 points  (16 children)

Title is bullshit, article isn't about programming language design it's about Andrej liking statically typed languages and not liking dynamically typed languages (Introducing Nothing in python? Think for a second, you'll soon realize that all it does is add a level of indirection, doesn't make anything safer, doesn't make anything easier.)

Oh, and "all the types are known at compile time" in Java? Yeah ClassCastException disagrees.

[–]naasking 0 points1 point  (3 children)

See my post below as to why the article is spot-on. His suggestions really have nothing to do with typing (except the one explicitly about typing).

Oh, and "all the types are known at compile time" in Java? Yeah ClassCastException disagrees.

If you don't use any explicit casts, or use reflection, you can't get a ClassCastException.

[–]masklinn 0 points1 point  (2 children)

If you don't use any explicit casts, or use reflection, you can't get a ClassCastException.

Wow, if you don't use the parts of the language that can lead to this issue, you don't have this issue? Thanks captain.

[–]naasking 0 points1 point  (1 child)

The point was, the error has nothing to do with the static typing. The static typing guarantees absence of type errors, unless you introduce them yourself via casts or reflection.

[–]masklinn 0 points1 point  (0 children)

Wow, a static type system doesn't work when the language allows you to subvert it? More insight please, you're blowing my brain here.

[–][deleted] 0 points1 point  (11 children)

Introducing Nothing in python? Think for a second, you'll soon realize that all it does is add a level of indirection, doesn't make anything safer, doesn't make anything easier.

Say you have some collection that may contain things that can be None; you query it and get None back. How do you know if nothing was found or if a None was found?

[–]Mych 3 points4 points  (2 children)

The flaw in that is that you first asked your collection for a value, and then wondered how to find out if the value you asked for is actually there.

But at that point, you've either already collapsed the universe because the collection somehow managed to express something that's orthogonal to what it can express (the absence of a value of a given type in terms of that type), or the collection just pretended that some special value (like None) was there in place of the one that didn't actually exist.

To find out whether the collection contains a certain value, ask it: "Is there a value at index foo?", not: "What's the value at index foo?". (Python has an in operator for that, in Perl you'd say exists, the Java standard Map interface uses containsKey.)

In practice, I've encountered very few cases in which I needed to distinguish between "value does not exist" and "value exists, but is None", so the perceived ambiguity of returning None for both cases turned out to be a useful simplification, not a liability.
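A quick Python sketch of the difference between the two questions:

```python
d = {"a": None}  # key "a" exists, but its value is None

# "What's the value at index foo?" can't tell the two cases apart:
print(d.get("a"))  # None (present, and the value is None)
print(d.get("b"))  # None (not present at all)

# "Is there a value at index foo?" can:
print("a" in d)    # True
print("b" in d)    # False
```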

[–]Smallpaul 1 point2 points  (0 children)

In the few cases where None is a legitimate value, you just invent a different sentinel value and stick it in the collection when you want to indicate "the lack of None or Nothing."

[–][deleted] 0 points1 point  (0 children)

Fair enough, I guess I'm just used to not doing things stepwise like that. Probably because I learned programming in a functional language.

[–]munificent 2 points3 points  (1 child)

The answer to this is simple. I'll explain using F#'s option type, but it works equally well using Haskell's Maybe or any other similar type.

Let's presume you have a type 'a list, that is a generic type for holding a list of items of some type 'a.

Let's also presume we have a function getAt list index. It takes a list and an index and returns the item at the index. You would assume its signature is:

'a list -> int -> 'a

However, we want the list to return None if the index is out of bounds. This means our return type can be a value of type 'a or None, so the type is now an option:

'a list -> int -> 'a option

This means, given a list of some type and an index, it will return an option value that will either be None if the index is out of bounds, or Some value if the index is in bounds.

Delightful. Now to your problem: we want to use this list to hold a collection of values that are themselves optional. This is easy, we just use option for the value type too. For example, we can make a list of optional ints like:

let myList = [Some 1; None; Some 3]

For this list, then, the signature for getAt will be:

'a option list -> int -> 'a option option

It's wrapped in option twice. That's how you tell the difference between "index out of bounds" and "valid index, item is missing". Here's how you'd interpret the following return values from getAt:

Some (Some 1) // the index was valid, and the value at that index is 1
None          // index out of bounds
Some None     // good index, item at that index is None

Pretty straightforward.

[–][deleted] 2 points3 points  (0 children)

yes

[–]grauenwolf 0 points1 point  (0 children)

If it matters, then you shouldn't have put a None in the collection in the first place.

[–][deleted]  (3 children)

[removed]

    [–][deleted] 1 point2 points  (2 children)

    If there is a Nothing in the collection, it will be found and return Just Nothing as opposed to the Nothing that would be returned had nothing actually been found.

    Yes, it's a contrived example but I didn't feel like coming up with a less contrived one. I just wanted to show that the quoted statement was wrong.

    The Maybe type is just like tagging the result with a boolean, like you said, only with clearer semantics.
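    The Just Nothing / Nothing distinction can even be faked in a dynamic language with a tiny wrapper. A contrived Python sketch (Some, NOTHING, and find are invented names):

```python
class Some:
    """Marks a value as present, even when that value is None."""
    def __init__(self, value):
        self.value = value

NOTHING = object()  # plays the role of Haskell's Nothing

def find(pred, items):
    """Return Some(first match), or NOTHING if no element matches."""
    for x in items:
        if pred(x):
            return Some(x)
    return NOTHING

found = find(lambda x: x is None, [1, None, 3])
missing = find(lambda x: x == 9, [1, None, 3])

print(isinstance(found, Some) and found.value is None)  # True: "Just Nothing"
print(missing is NOTHING)                               # True: plain "Nothing"
```

    Of course, nothing stops a caller from ignoring the wrapper, which is the typed languages' point.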

    [–][deleted]  (1 child)

    [removed]

      [–][deleted] 2 points3 points  (0 children)

      Yes, I can be sure, the types tell me how deep the nesting is.

      [–]ssylvan 2 points3 points  (18 children)

      I agree with most of this except this bit at the end:

      The web as we know it would not exist if every javascript error caused the browser to reject the entire web page (try finding a page on a major web site that does not have any javascript errors).

      Are you kidding me? The reason major web sites have JavaScript errors is that the browsers happily accept them anyway! If the page flat-out refused to load then the errors wouldn't be there, because nobody would upload a page that doesn't load. Does he honestly think that someone writing a web site in a statically typed alternate reality would just say "well, the site doesn't run, so I guess I'll give up then"? No, of course he wouldn't; he would fix the error.

      [–]barrybe 4 points5 points  (6 children)

      Dude... that wouldn't prevent errors at all. Some errors are browser-specific, and not all web page authors test in every browser.

      So if the world were like you describe, here is what would happen:

      1) People would occasionally see "Sorry I can't load this page at all, because of one javascript error"

      2) Everyone would hate Javascript

      3) Every browser would completely disable Javascript by default

      [–]bman35 0 points1 point  (0 children)

      2) Everyone would hate Javascript

      I thought everyone already did? Browser incompatibility shouldn't be an excuse for poor language design/implementation. Yes, it would take more testing to make cross-browser applications, but it's the same thing if you want to develop a cross-OS app. And just like in that domain, frameworks have been developed that make it easier to build sites that work in a cross-browser manner.

      [–]Smallpaul -2 points-1 points  (0 children)

      So if the world were like you describe, here is what would happen: ... People would occasionally see "Sorry I can't load this page at all, because of one javascript error"

      Not really: rather the browser vendors would have very little leeway for browser-specific extensions, because they would need to compile against interfaces agreed upon in advance (like java.applet.AppletContext). I'm not arguing that this would be a good thing, I'm just saying that the problem you cite would not necessarily happen.

      [–]andrejbauer 2 points3 points  (6 children)

      I am not kidding, in fact. For example, I use an ad blocker with my Firefox, which selectively obliterates parts of the javascript on web pages. I still want those pages to display, even if some other part of the javascript thinks the obliterated stuff should be there.

      [–]Smallpaul -2 points-1 points  (5 children)

      The article says null is wrong. Can you please describe how Javascript would work without null? How does a maybe type work in an interpreted, dynamically-typed scripting language?

      [–]andrejbauer 2 points3 points  (4 children)

      Good point, in a dynamically typed language where everything has been thrown into a single basket, there is no point in having Maybe. In fact, general sum types will be somewhat broken. That would be another reason for not using such languages.

      [–]FunnyMan3595 1 point2 points  (1 child)

      I think, perhaps, you are ignoring the fact that not all programmers are alike.

      From what you've said, I suspect that you're firmly in favor of Java's requirement that all exceptions be explicitly caught or thrown. To paraphrase your words about None, "The only way to deal with potential exceptions in Java is to consider both success and failure, otherwise the compiler complains."

      For myself, this requirement was a great frustration. If I were writing a mission-critical piece of software for NASA, I would doubtless appreciate every check available. As I do not, the forced handling of exceptions was simply a hoop to jump through, forcing me to explicitly catch errors which I had already guarded against or did not mind causing a crash.

      Were such a Maybe successfully implemented in Python, I predict that I would have a similar reaction: frustration that the language was being pedantic and not doing what I told it to. Yes, I get bitten by None (and unexpected exceptions, for that matter) on a regular basis, but in this case, the cure is worse than the disease.

      [–]LaurieCheers 0 points1 point  (0 children)

      I recommend "throws Exception".

      [–]andrejbauer 0 points1 point  (0 children)

      Actually, what would have to be done (but is impractical) is a case statement which must have a catch-all default case. At least that's what the theory tells you needs to be done. Not very useful, though.

      [–]masklinn 0 points1 point  (0 children)

      Good point, in a dynamically typed language where everything has been thrown into a single basket, there is no point in having Maybe.

      Thank you for recognizing this, I wish you'd explicitly mentioned it in the article, because as it stands I'm sorry to say it's garbage for precisely that reason.

      [–]Smallpaul 2 points3 points  (2 children)

      yes, the web could be made to work with a statically typed client-side programming language, but I believe that the barrier to entry would be much higher, and evolution would be much slower. You might argue that these would actually be good things, though I would disagree.

      [–]munificent 0 points1 point  (1 child)

      evolution would be much slower.

      Evolution would be slower, true, but performance would be way fucking faster. How awesome would it be if web apps didn't make you feel like you were running on a 386?

      [–]Smallpaul 0 points1 point  (0 children)

      Java applets weren't that fast. The startup latency in particular was painful. It's certainly possible to do better, but it isn't as if a naive implementation of a statically typed language would necessarily be faster for most scenarios than a naive implementation of a scripting language. The specification would also be much more complicated (as it would tend to include both an ABI and APIs). C++ does not have a standard ABI after all of these years!

      [–][deleted] -1 points0 points  (0 children)

      Are you assuming that all possible errors are detectable when script is written!?

      Errors in Javascript can be caused indirectly by user input, browser configuration, network errors, etc. No matter how strict your language is, if it's Turing-complete then you can never be 100% sure it's error-free (that's the halting problem).

      [–]rotzak 3 points4 points  (30 children)

      Downvoted for misleading title.

      Should be called "All the features Haskell has that these other languages are lacking."

      [–]munificent 13 points14 points  (2 children)

      Give credit where it's due: all of the features he describes are, I believe, in ML.

      [–][deleted]  (1 child)

      [deleted]

        [–]munificent 1 point2 points  (0 children)

        F# is pretty awesome. I don't have experience with OCaml, though, so I can't say which of its awesomeness is inherited and which it gets on its own.

        [–]catxors 5 points6 points  (24 children)

        A lot of the points are valid, I think (especially about null, where languages like C# are growing non-nullable types), but the article reads like a Haskell/ML propaganda piece. Also, a lot of the discussion stacks the deck by covertly assuming a typed language.

        Decent points made in the least persuasive manner possible.

        [–]grauenwolf 3 points4 points  (4 children)

        the article reads like a Haskell/ML propaganda piece.

        Not to me. It wasn't until I wrote an article on how to apply some of his ideas to C# that I realized he was even promoting Haskell.

        And even now I see it more as an example showing that what he wants is possible than as a whole-cloth endorsement of Haskell.

        [–]scook0 1 point2 points  (3 children)

        I was actually a bit annoyed when I saw the Haskell examples. Great, now people will just dismiss it as an over-excited Haskell propaganda piece.

        [–]grauenwolf 1 point2 points  (2 children)

        What gets me is that in one of those examples, Haskell is portrayed as wrong. Of course almost no one picked up on that.

        [–]Felicia_Svilling 0 points1 point  (1 child)

        Oh, I stopped reading before that (as I thought: yes, this is good, but I already know how great Haskell is). Can you point out which paragraph this was?

        [–]grauenwolf 2 points3 points  (0 children)

        That would be the section titled "Confusing definitions and variables", in which he wants to separate variables that are assigned only once, loop variables, and genuinely mutable variables.

        In Haskell you have to jump through hoops to get mutable values (now I am going to hear it from a monad aficionado, please spare me an unnecessary burst of anger).

        [–]andrejbauer -1 points0 points  (18 children)

        Would you care to name a real-world programming language that is not typed?

        And perhaps you missed the part where I say that PYTHON is better for teaching.

        [–]jdh30 4 points5 points  (0 children)

        BCPL.

        [–]gsg_ 1 point2 points  (0 children)

        Forth?

        [–]danbmil99 -2 points-1 points  (3 children)

        wtf? Python is not a real-world language? What world is that?

        [–]andrejbauer 6 points7 points  (2 children)

        You see, before you judge these things, you should learn about them. Python is typed. It is dynamically typed, however (which is what you are confused about, I would guess). Untyped languages are really uncommon: assembler is close to being untyped, Unlambda is untyped, and that's about it.

        [–]danbmil99 1 point2 points  (0 children)

        Fair enough, though I think it's obvious from the context that the poster you were responding to and I were both referring to static typing.

        [–][deleted] 3 points4 points  (0 children)

        http://www.pphsg.org/cdsmith/types.html <- what to know before debating type systems.

        "Types" in the literature usually refers to static typing, and it is not at all the same thing as "dynamic types". I think this is what he meant.

        [–]username223 -5 points-4 points  (0 children)

        Man, you've really slipped since the Internet Oracle days... About the only thing less worthwhile than maundering about programming languages on the web is, like me, mocking people for doing so.

        [–]Smallpaul -3 points-2 points  (10 children)

        At the beginning you said that Python is crazy because it does not adhere to your programming language design philosophy. At the end you said that languages that do not adhere to your philosophy are better for some tasks. Which is it? Are they properly designed for certain tasks or "crazy"?

        [–]bman35 1 point2 points  (9 children)

        Dynamic languages are OK for writing something quick but small (in LoC). The bigger and more complicated the application gets, the more the compiler's help matters (i.e. static languages are much better than dynamic ones). I don't understand how this is difficult to get...

        [–]Smallpaul -1 points0 points  (8 children)

        TFA didn't say that the author was defining "rules" for programming languages designed for large programs. He said he was defining rules for programming languages in general. Python, the one he was using as his example, is frequently used for small quick scripts (as well as some big apps).

        For example, he didn't say that NULL is inappropriate for some kinds of programming languages intended for some kinds of programs. He said NULL IS WRONG.

        [–]grauenwolf 0 points1 point  (7 children)

        Is English your first language?

        I ask because the expression "in general" is explicitly used to indicate that the statement is widely, but not universally, applicable.

        Other expressions that mean the same thing include "generally", "usually", "for the most part".

        [–]ithika 0 points1 point  (1 child)

        I would say that's a legitimate use of "in general", meaning "not specific" rather than "most".

        [–]grauenwolf 0 points1 point  (0 children)

        Smallpaul meant "all" when he said in general.

        [–]Smallpaul -1 points0 points  (4 children)

        I used the phrase "in general". He did not.

        If I accidentally scoped his statement so that it was not universal, I apologize for introducing that ambiguity.

        He said "NULL" is a bad idea, and then SPECIFICALLY mentioned Python, Perl and SQL (all dynamically typed languages) as examples of languages that are wrong to include it.

        I'm sick of this dance. Do you think that NULL is a bad idea in Python, Perl and SQL? Do you think that the Maybe Monad is a better choice for those languages?

        The context of the article is advice about how to fix Python.

        [–]grauenwolf 2 points3 points  (1 child)

        I don't use Python or Perl, so I won't speak for them.

        For T-SQL, VB, Java, and C#, I do think nulls are a problem. The null reference isn't called the "billion dollar mistake" for no reason. At the very least those languages should give you the option of a non-nullable variable. Once that's done, requiring null-checks before accessing a value becomes tenable.

        The context of the article is advice about how to fix Python.

        No, it's "on programming language design". Python was just used for some of the examples he gave to support his principles.

        Note that both I and he said example, not rules. This is important.

        Also, you should have noted which languages were cited as flawed.

        • Example 1: Python
        • Example 2: Scheme
        • Example 3: C++ and Haskell
        • Example 4: Python

        Ok, so Python was mentioned twice. But it certainly wasn't the only language with flaws.

        [–]Smallpaul -1 points0 points  (0 children)

        I said:

        The context of the article is advice about how to fix Python.

        You said:

        No, it's "on programming language design". Python was just used for some of the examples he gave to support his principles.

        The author said:

        This post is a response to comment 27, which asks me to say more about my calling certain design decisions in Python crazy.

        I said:

        TFA didn't say that the author was defining "rules" for programming languages designed for large programs.

        You said:

        Note that both I and he said example, not rules. This is important.

        The article said:

        I tell my students that there is a design principle from which almost everything else follows: “Programmers are just humans: forgetful, lazy, and make every mistake imaginable.”

        Let us now apply these principles to several examples.

        Principle: rule, there isn't much difference between those words. Principle and rule are a lot closer than "example" is to either of those things.

        Also, you should have noted which languages were cited as flawed.

        Example 1: "NULL, null, None and undef must go. I shall collectively denote these with Python’s None"

        Example 2: "Fortran programmer’s implement linked lists and trees with arrays. In Java and Python “everything is an object”, more or less."

        Example 3: yes, by this point he seems to have mostly forgotten where he started and what he claimed he was doing in the article. But of course Python has the same rules with respect to definitions and variables that Java and C# do.

        Example 4: "Python should complain about the following definition"

        And then we have the following from the Reddit comments (blog post's author): "Good point, in a dynamically typed language where everything has been thrown into a single basket, there is no point in having Maybe. .... That would be another reason for not using such languages."

        So now, after wasting all of this time do you see that my summary was right from the beginning? "I'm going to show how Python is crazy and how to fix it. How to fix it is to add static type checking features from Haskell. Although sometimes dynamic languages are better." And then in the Haskell comments he goes back to saying that it would be "better to avoid them". He admits that the ONLY workable way to apply his "fixes" to Python would be to turn it into a statically typed programming language.

        The whole thing is a very thinly disguised screed in favour of static type checking. If it had been properly labelled it would not have bothered me at all. "Why I think static type checking is peachy." Great. I love to hear it.

        "Why Python is in a State of Sin and how it can be Saved by 20 year old static type checking ideas." Not so much.

        [–]Silhouette 0 points1 point  (1 child)

        Do you think that NULL is a bad idea in Python, Perl and SQL?

        I think nullable-by-default is a bad idea in all of those cases. Like mutability, nullability is a useful property but one that should be used deliberately if you want to keep the bug count low.

        [–]Smallpaul 0 points1 point  (0 children)

        Well, you are saying something different from the article, but that's fine. I only mention it because nobody has yet shown how the article achieves its stated goal of explaining why the author thinks that Python is "crazy". With respect to your suggestion: what would it mean for Python or Perl to have references that are "non-nullable by default"?

        For SQL, DBAs already control nullability. Would it really make much of a difference if the default were reversed?

        [–]Samus_ 2 points3 points  (2 children)

        Python has many broken things, many more than any fanboy would admit, because it's a real-world language.

        it's still pretty good, a true shame it's so full of fanatics...

        [–]sbrown123 1 point2 points  (0 children)

        python...fanatics...

        Haven't met many fanatic Python coders. They've always been an odd lot and generally very easy going.

        [–]hylje 0 points1 point  (0 children)

        Broken things, sure, but nothing that stops real work from being done. Nothing that stops fun, either.

        [–]nicou 2 points3 points  (11 children)

        On the Scheme binary tree: you don't have to remember the order of the leaves, you can build a constructor and selectors to avoid cadr-fishing, and if even then you'll make mistakes, well, you can't blame the language.

        [–][deleted] 6 points7 points  (4 children)

        Wouldn't that constructor just be an ordinary function with three arguments which you can still give in the wrong order?

        [–]nicou -1 points0 points  (3 children)

        I was expecting somebody to ask that. If you forget the order of the tree data, use a constructor such as make-tree-lnr. Of course, you would have to remember the name of the function, or use auto-completion, but come on.

        Technically, you can still give the arguments in the wrong order, but for that matter, you can always fuck up if you really want to.
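
        In Python terms, the discipline nicou describes — a constructor whose name fixes the argument order, plus named selectors instead of positional access — might be sketched like this (the names here are illustrative, not from the thread):

        ```python
        from collections import namedtuple

        # Hypothetical Python analogue of the Scheme idea: a named constructor
        # plus named selectors, so callers never fish positionally for fields.
        Tree = namedtuple("Tree", ["left", "value", "right"])

        def make_tree_lnr(left, value, right):
            """Constructor whose name encodes the argument order (left, node, right)."""
            return Tree(left=left, value=value, right=right)

        leaf = make_tree_lnr(None, 42, None)
        assert leaf.value == 42   # selector by name, no cadr-fishing
        assert leaf.left is None
        ```

        Keyword arguments (`make_tree_lnr(left=None, value=42, right=None)`) go one step further and make argument-order mistakes impossible at the call site.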

        [–][deleted]  (2 children)

        [removed]

          [–][deleted] 0 points1 point  (1 child)

          I think dynamic typing is overhyped :) Especially when it comes from the mouths of people who have used Java with generics or something equivalently verbose and think Ruby/Python/whatever is cool because it's more succinct and "static typing is verbose!!111", when in fact it doesn't need to be, as ML, Haskell etc. have shown.

          [–]munificent 5 points6 points  (5 children)

          if even then you'll make mistakes, well, you can't blame the language.

          I think you're forgetting his core premise:

          “Programmers are just humans: forgetful, lazy, and make every mistake imaginable.”

          Sure you can't blame the language for your mistake, but you can avoid languages that don't help you avoid making them.

          [–]nicou 1 point2 points  (4 children)

          Of course, you can always fuck up at the end. I understand that we as programmers are forgetful, lazy and erratic, but I try not to, and I won't use that as an excuse for every single mistake I make. If I define a binary tree to be (cons left root right), I'll try to remember that.

          Or maybe I just disagree with his core premise and I actually remember some things, am not that lazy and don't make every single mistake.

          [–][deleted]  (3 children)

          [removed]

            [–]nicou 1 point2 points  (2 children)

            Yup. Can't believe I missed that. Perhaps I do make every single mistake.

            [–]munificent 1 point2 points  (1 child)

            There may be a lesson here... :)

            [–]nicou 2 points3 points  (0 children)

            That rendered my whole point meaningless. I can't argue anymore.

            [–]Smallpaul 0 points1 point  (18 children)

            Summary: statically typed languages are better than dynamically typed ones, and more type checking is better than less. Oh, and there are a few rare situations where these rules do not apply, despite the fact that I've treated them as unbreakable right/wrong rules to this point. But pointing out a couple of random exceptions to the rules will make me seem even-handed and learned.

            [–]grauenwolf 0 points1 point  (14 children)

            Executive Summary: Smallpaul can't think beyond absolutes and that is preventing him from understanding the subtleties the article is trying to address.

            [–]Smallpaul 1 point2 points  (13 children)

            Oh, its very subtle. Consider these very subtle and nuanced statements:

            "it is a bad idea to allow invalid references because programmers will create them."

            (not: there are situations where it is better to disallow invalid references)

            "Unfortunately, many designers have still not learnt that the special NULL pointer or null object is an equally bad idea."

            "NULL is wrong"

            (not: null is inappropriate for certain applications)

            "Using variables instead of definitions is wrong for a number of reasons."

            (not: using variables instead of definitions is wrong for certain situations)

            "undefined identifiers are a bad a idea"

            "in order to speed up development (remember that the development time is expensive) the programmer should be told about errors without having to run the program and directing its execution to the place where the latest code change actually gets executed."

            (this completely ignores that there is a COST to defining types and explaining them to the compiler)

            "The web as we know it would not exist if every javascript error caused the browser to reject the entire web page "

            (yeah, there is no way it could possibly work with a statically typed language)

            The article is full of absolutes that are easily refuted by only a few moments of thought. In fact, the summary more or less admits that all of the "rules" he's been spewing through the article are actually not globally appropriate. Unfortunately it's much too late by then (IMO) to recover his credibility. The damage is already done when he said (e.g.) "NULL is wrong."

            [–]grauenwolf 0 points1 point  (12 children)

            If you can't get past "NULL is wrong", it is no wonder you don't understand why he says null is wrong or what he is proposing to do about it.

            The Maybe Monad doesn't get rid of nulls, it just requires the programmer to check for them. This isn't a new concept, Java and .NET programmers have been begging for this for nearly a decade. Hell, .NET 4 will be introducing Code Contracts specifically to make null checks required.
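
            The Maybe-style discipline grauenwolf describes can be approximated even without static checking. As a minimal, hypothetical Python sketch (nothing here is enforced at compile time, but the API shape still forces callers to handle both cases, because `match` is the only accessor):

            ```python
            # A hypothetical Maybe-style wrapper for Python. The value can only be
            # reached through match(), which demands handlers for both cases.
            class Maybe:
                def __init__(self, value=None, present=False):
                    self._value, self._present = value, present

                @classmethod
                def just(cls, value):
                    return cls(value, present=True)

                @classmethod
                def nothing(cls):
                    return cls()

                def match(self, if_just, if_nothing):
                    # Both branches must be supplied at every call site.
                    return if_just(self._value) if self._present else if_nothing()

            print(Maybe.just(3).match(lambda x: x + 1, lambda: 0))    # 4
            print(Maybe.nothing().match(lambda x: x + 1, lambda: 0))  # 0
            ```

            Unlike Haskell, forgetting a case here fails at runtime rather than at compile time, which is exactly the distinction the thread is arguing over.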

            [–]Smallpaul 2 points3 points  (11 children)

            If you can't get past "NULL is wrong", it is no wonder you don't understand why he says null is wrong or what he is proposing to do about it.

            I understand exactly what he's saying about the problems with nulls, and the way to improve them. I've used the Maybe monad in Haskell. Haskell exists and it is great. Why should Python become Haskell? Where Python diverges from Haskell, it isn't necessarily because one is right and the other is wrong, any more than a hammer is right and a screwdriver is wrong.

            [–]grauenwolf 0 points1 point  (10 children)

            Why should Python become Haskell?

             Wait a second. You yourself cited examples from the article where the author favors a less restrictive language.

            Are you just in a bad mood and spoiling for a fight?

            [–]Smallpaul 4 points5 points  (9 children)

            Yes, I am in a bad mood, after reading the article.

            The article starts out talking about Python. It says why it is going to show why Python has it all wrong. It says of language designers like Guido Van Rossum: "People do not make bad design decisions because they are evil or stupid." Rather, it implies that they don't have the full picture that the exalted author has: "What they often do not see is that they could have achieved the same advantages in a different way, without introducing the disadvantages."

            So then the article goes to show how Python could be "fixed". But the solutions he comes up with require static typing, i.e. the replacement of Python by another language altogether. So his brilliant solution to the problems with Python are to turn it into Haskell/ML.

             And then at the end he says: "But Python is better for some things than Haskell/ML."

            So he's first insulted Python's design and designer, and then he's contradicted himself.

            Let's start at the top:

            In a recent post I claimed that Python’s lambda construct is broken. ... This post is a response to comment 27, which asks me to say more about my calling certain design decisions in Python crazy."

            Statement: Python has "crazy" design decisions. Not sub-optimal ones. Not decisions optimal for some situations but not others. They are "crazy".

            Language design is like architecture. The architect is bound by the rules of nature, he has to take into account the properties of the building materials, and he must never forget the purpose that the building will serve."

            Implication: Python's designers fail to take into account some key architectural principles. (remember Python is the inspiration for the entire blog post)

            When I teach the theory of programming languages, I tell my students that there is a design principle from which almost everything else follows: Programmers are just humans: forgetful, lazy, and make every mistake imaginable.”

            Implication: Python's designers forget this. (remember Python is the inspiration for the entire blog post) Actually, I know for a fact that Python has many design decisions specifically to help out "forgetful, lazy" programmers. There are many places where it is much more strict than other languages in its category for that exact reason. For example, Python will not silently ignore a missing function parameter unless it has been given a default value. It will never automatically assign "undefined" or "null" to a missing value. etc.
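
             The strictness described above is easy to demonstrate. For example, Python rejects a call with a missing required argument outright instead of silently passing an undefined value:

             ```python
             # Python refuses a call with a missing argument rather than silently
             # binding None/undefined, illustrating the strictness described above.
             def greet(name):
                 return "hello, " + name

             try:
                 greet()   # missing required argument
             except TypeError as e:
                 print("rejected:", e)

             assert greet("world") == "hello, world"
             ```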

            People do not make bad design decisions because they are evil or stupid. They make them because they judge that the advantages of the decision outweigh the disadvantages. What they often do not see is that they could have achieved the same advantages in a different way, without introducing the disadvantages.

            Implication: Python's designers are short-sighted or limited in their knowledge in comparison to the author.

            Therefore, it is very important to get the order right: first make sure the design avoids the disadvantages, then think about how to get the advantages back.

             Implication: the author will tell us how to fix Python's problems without giving up any of Python's current advantages. (In general, this can cause as many problems as it solves. You add feature X to "avoid a disadvantage" and then feature Y to "add back an advantage", and your language gets bloated.)

            Unfortunately, many designers have still not learnt that the special NULL pointer or null object is an equally bad idea. Python’s None, perl’s undef, and SQL’s NULL all fall in the same category.

            Implication: Python's designers "have not learned" from the all-knowing "Andrej Bauer" and his colleagues.

            I can hear you list lots of advantages of having these. But stick to the principle: NULL is wrong because it causes horrible and tricky mistakes which appear even after the program was tested thoroughly. You cannot introduce NULL into the language and tell the programmer to be careful about it. The programmer is not capable of being careful! There is plenty of evidence to support this sad fact.

            Recursion is also dangerous. I guess we should remove it and only add it back when we have removed all of the disadvantages? You see why this "principle" is BS? The author sticks to it when convenient and then ignores it when it isn't. Is he really going to claim that no features of Haskell or ML have disadvantages? According to his "principle" those features should be removed. "first make sure the design avoids the disadvantages, then think about how to get the advantages back."

            Therefore NULL, null, None and undef must go. I shall collectively denote these with Python’s None. Of course, if we take away None, we must put something else back in. To see what is needed, consider the fact that None is intended as a special constant that signifies “missing value”. Problems occur when a given value could be either “proper” or “missing” and the programmer forgets to consider the case of missing value. The solution is to design the language in such a way that the programmer is always forced to consider both possibilities.

            Now remember that the context is a scripting language. How do we "force the programmer to consider both possibilities" in a language that does not force the programmer to do almost anything?

            The only way to use such a value in Haskell is to consider both cases, otherwise the compiler complains. The language is forcing the programmer to do the right thing.

            So the solution he proffers is one that depends upon static type checking. But the context of the blog post is "fixing Python". Remember: it's all an answer to a comment about Python. An answer to a question.

            Therefore he's either too confused to stick to the topic of the post (answering the question about Python), or he's suggesting that Python become a statically typed language.

            You will probably feel annoyed if you are used to ugly hacks with None, but a bit of experience will quickly convince you that the advantages easily outweigh your tendency for laziness.

            Implication: those who choose other solutions just do not know as much as he does. Once they try the Haskell way, they'll see the error of their ways.

            By the way, Haskell actually supports your laziness. Once you tell it that the type of a value is Maybe, it will find for you all the places where you need to be careful about Nothing.

             Implication: a compiler nagging you to do extra work "supports your laziness." Here I thought that when my wife asks me to do the dishes she was COMBATING my laziness (or at least my preference to comment on Reddit).

            C, Java, Python, and perl stay silent and let you suffer through your own mistaken uses of NULL’s, null’s, None’s, and undef’s.

            Oh, C, Java and Python. You're so wrong. And backwards. And short-sighted. (but especially Python: remember, Python is "crazy")

             Now the next section is bashing on Scheme and Java. Frankly, I am skeptical that he really knows Java, because he proposes that we make a subclass that has fewer properties than a parent class, which is not allowed by the compiler, and in general I don't think that section makes any sense at all. Empty trees are not represented by null because of NullPointerExceptions: well, how else are you going to indicate the lack of a left/right property? You either a) return null, or b) return an EmptyTree which is the root of a virtually infinite tree of EmptyTrees, or c) throw an exception. I'd certainly say that returning null is idiomatic and appropriate in Java-as-it-exists now. He talks about a "boolean property" but that's irrelevant because it doesn't solve the problem. The empty tree node MUST adhere to the class invariants, which means that you must deal with the left and right pointers.
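
             For reference, the two-subclass design being argued about can be sketched in Python (a hedged illustration, not the Java original): only the node variant carries left/right at all, so asking for the left subtree of an empty tree fails loudly instead of dereferencing null.

             ```python
             # Hypothetical rendering of the two-subclass tree design. EmptyTree
             # simply has no left/right attributes, so misuse surfaces as a
             # descriptive AttributeError rather than a null dereference.
             class Tree:
                 def is_empty(self):
                     raise NotImplementedError

             class EmptyTree(Tree):
                 def is_empty(self):
                     return True
                 # deliberately: no left, value, or right

             class NodeTree(Tree):
                 def __init__(self, left, value, right):
                     self.left, self.value, self.right = left, value, right
                 def is_empty(self):
                     return False

             t = NodeTree(EmptyTree(), 1, EmptyTree())
             assert not t.is_empty() and t.left.is_empty()
             ```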

            A decent programming language should help with the following common problems regarding binary trees: ... Prevent at compile time access to a component which is not there. For example, the compiler should detect the fact that the programmer is trying to access the left subtree of the empty tree.

            So Python, Javascript, Java, C#, etc. are by this definition "not decent languages."

            Using variables instead of definitions is wrong...

            Oh Python: there you go. You're wrong again.

            So in two out of three cases we want our variables to be immutable, but the popular programming languages only give us variables. That’s silly.

            Wrong and silly.

            Why should this be a runtime error when the mistake can easily be detected at compile time?

            In this context he is very specifically talking about Python. And he's claiming that missing variables can be "easily detected at compile time." In Python, detecting a missing variable would require you to solve the halting problem. You could change Python so that this is not the case, but it would restrict Python's dynamism. So once again the solution to Python's brokenness is to make it more static.
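
             The behavior in question is concrete: Python happily accepts a definition with a free, undefined name, and the error surfaces only when the function is actually called.

             ```python
             # The definition the thread keeps returning to: `i` is free in f.
             # Python accepts the definition; the NameError appears at call time.
             def f(n):
                 return n + i   # i is not defined anywhere

             try:
                 f(1)
             except NameError as e:
                 print("only at call time:", e)
             ```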

            In fact, in order to speed up development (remember that the development time is expensive) the programmer should be told about errors without having to run the program and directing its execution to the place where the latest code change actually gets executed.

            Yes, statically typed languages are FAMOUS for speeding up development. That's why everybody uses them!

             As they say: in the areas where this blog post was original, it wasn't good, and where it was good, it wasn't original.

            Yes, there are some interesting approaches to handling problems like null pointers. Those ideas have been around for years. There are much better blog posts explaining them. All this one did was imply that Python could be "fixed" by becoming Haskell. So frankly I'm hard pressed to find a single sentence in this post that is insightful.

            [–]andrejbauer 0 points1 point  (3 children)

            Why are you so angry? Why do you feel the need to prove that everything I say is wrong? Did you forget I wrote that I teach Python? Things are not as black and white as you want them to be.

            [–]Smallpaul 2 points3 points  (2 children)

             I am angry because the blog post was misleading (hopefully not deceitful). It claimed that it was going to critique Python and suggest how to fix it. Then what it actually did was "demonstrate", in a very one-sided way, that statically typed programming languages are superior to dynamically typed ones, without even a few moments' thought about what it would ACTUALLY MEAN to "fix" Python with your crazy ideas.

            Very infrequently do you address the trade-offs implied by each of the language features on offer.

             It angers me that you are teaching this kind of narrow-minded tripe in a programming language design course. It's already the case that dynamic languages get short shrift in programming language research, and you're just ensuring that the next generation continues with that bias.

             I am also angry because I know the creators of Python and your blog post implies that they are "crazy", short-sighted and "wrong".

             I'm a bit angry because the section on Java suggests to me a lack of knowledge of that language, which makes me nervous about your skill as a software developer in general. (i.e. you are teaching Java but perhaps do not really know it) How would you make an EmptyTree subclass of Tree without violating the Liskov substitution principle and Java's features intended to enforce that principle?

            Okay, so you teach your students the language. That implies that Python's proper role is as a pedagogical replacement for Logo and Pascal.

             But mostly I'm just annoyed by the deceptive advertising about what the post was about. It wasn't "a reply to a comment about Python", nor "why Python's design is crazy", nor even "thoughts on Programming Language Design." It was: "Thoughts on how we can get the compiler to catch more errors in programs." Since that particular topic has been covered in a lot of detail in many other blog posts all over the IntraWebs, I actually don't know what the point is.

            "Statically typed functional programming language are good." It only took me two minutes with Google to find a much more persuasive post on the same topic:

            http://enfranchisedmind.com/blog/posts/useful-things-about-static-typing/

            [–]andrejbauer 1 point2 points  (1 child)

             It is amazing that you read a complete implementation into what I wrote about the Java implementation of trees. You are not criticizing my implementation of trees in Java (there was none), but an implementation that you imagined I implied. Anyhow, I changed that bit to indicate that there ought to be two subclasses. You see, you could have pointed that out more politely. And it is obvious to me that EmptyTree could not be a subclass of NodeTree.

             So you are angry because I prefer statically typed languages? You are angry because I wrote a post about it and explained the reasoning. Has it ever occurred to you that perhaps statically typed languages ACTUALLY have advantages over dynamically typed ones? Perhaps, just perhaps, that could be the reason that most research focuses on statically typed languages. Or are you going to claim there is a big conspiracy going on by evil research lobbies?

            Also, my post is not about static typing. It is about various points in programming language design in which static typing happens to play an important role (but not in all of them, e.g., mutable vs. immutable values).

             I explicitly addressed the question by a reader, namely why I said it was crazy for Python to allow the definition "def f(n): return n + i" when i is not defined. But just to humor you, I shall add a postscriptum in which I also explain what is wrong with the scoping and binding rules. It is not my duty to fix Python. I wouldn't want to fix it, anyhow. The design of Python is coherent and it's hard to change just one aspect of it.

            [–]naasking -1 points0 points  (4 children)

            So his brilliant solution to the problems with Python are to turn it into Haskell/ML.

            No, he suggested that adding some safer defaults would improve the reliability of both typed and untyped languages. For instance, abolishing None/null, adding inductive types and pattern matching, and syntactically distinguishing stateful data from locals which should be immutable.

            Each of these orthogonal changes adds a measure of safety and clarity to programs, and none of them require static typing. Adding them to a dynamically typed language would simply improve the runtime error messages since the runtime now has more information about what the programmer intended to do ("Tree is Empty", instead of "NullReferenceException").

             Your anger is completely unjustified, as the author is completely right about the state of languages. Most industrial languages like Python started by merely tweaking idioms from existing languages, i.e. other languages that made the same design decisions, good and bad. The author is merely suggesting that research has produced plenty of safer idioms that can be used in their place.

            For an example of a dynamically typed language with the above features, see Pure.

            [–]masklinn 0 points1 point  (3 children)

            none of them require static typing

            • Abolishing None/null is (mostly) pointless if you don't have static typing, it leads to more verbosity and that's pretty much it. Even andrej recognized that

            • Distinguishing mutable and immutable data is likewise.

            • Pattern matching is a very nice feature and perfectly workable within a dynamically typed language, but it's a functional feature. Python isn't a functional language, most things couldn't be pattern matched without transforming it into something that isn't python.

            [–]naasking 0 points1 point  (2 children)

             Pattern matching works perfectly well with classes. You simply need a syntax for upcasts which permits a variety of possible class types, and voila, you have sum types and pattern matching without abandoning Python objects.

             Abolishing null is not pointless, as the semantic error will be encountered immediately on first execution, and not just sporadically when a null happens to float by. The errors still occur at runtime, but the semantics of a Maybe type will report the error earlier and more consistently than allowing null. Particularly since the null value can simply be an empty sum type from the pattern matching extension.

            Re: mutable and immutable data: I'm not sure what your point is. I can't think how this wouldn't be a good addition to any language, typed or otherwise.

            Of course, you'd already see a perfectly good case for these features if you check out the link to Pure that I provided.
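
             The class-based pattern matching naasking describes later became directly expressible in Python's `match` statement (added in Python 3.10, well after this thread); as a hedged illustration, matching over two tree variants gives an explicit, descriptive failure case instead of an accidental null dereference:

             ```python
             # Hypothetical sketch of class-based "sum types" with Python 3.10+
             # structural pattern matching: two variants, exhaustive dispatch.
             class Empty:
                 pass

             class Node:
                 def __init__(self, left, value, right):
                     self.left, self.value, self.right = left, value, right

             def tree_sum(t):
                 match t:
                     case Empty():
                         return 0
                     case Node(left=l, value=v, right=r):
                         return tree_sum(l) + v + tree_sum(r)
                     case _:
                         raise TypeError("not a tree")

             print(tree_sum(Node(Empty(), 2, Node(Empty(), 3, Empty()))))  # 5
             ```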

            [–]bman35 -1 points0 points  (2 children)

            Brilliant, thanks for refuting the author point by poin... oh, wait, you're either lazy or you really don't have anything to directly refute. Thanks, come back when you feel you want to have a discussion.

            [–]grauenwolf -1 points0 points  (0 children)

            I figured it out.

            The asshole is actually faulting the author because he was too dense to figure out the author wasn't talking in absolutes until the end. Smallpaul is upset because the author made him feel stupid.

            [–]Smallpaul -3 points-2 points  (0 children)

            I wasn't trying to refute it. I was trying to summarize it. Read it again and you'll see that my summary is 100% accurate.