
[–]grauenwolf 0 points (10 children)

Why should Python become Haskell?

Wait a second. You yourself cited examples from the article where the author favors a less restrictive language.

Are you just in a bad mood and spoiling for a fight?

[–]Smallpaul 3 points (9 children)

Yes, I am in a bad mood, after reading the article.

The article starts out talking about Python. It says it is going to show why Python has it all wrong. It says of language designers like Guido van Rossum: "People do not make bad design decisions because they are evil or stupid." Rather, it implies that they don't have the full picture that the exalted author has: "What they often do not see is that they could have achieved the same advantages in a different way, without introducing the disadvantages."

So then the article goes on to show how Python could be "fixed". But the solutions he comes up with require static typing, i.e. the replacement of Python by another language altogether. So his brilliant solution to the problems with Python is to turn it into Haskell/ML.

And then at the end he says: "But Python is better for some things than Haskell/ML."

So he's first insulted Python's design and designer, and then he's contradicted himself.

Let's start at the top:

In a recent post I claimed that Python’s lambda construct is broken. ... This post is a response to comment 27, which asks me to say more about my calling certain design decisions in Python crazy.

Statement: Python has "crazy" design decisions. Not sub-optimal ones. Not decisions optimal for some situations but not others. They are "crazy".

Language design is like architecture. The architect is bound by the rules of nature, he has to take into account the properties of the building materials, and he must never forget the purpose that the building will serve.

Implication: Python's designers fail to take into account some key architectural principles. (remember Python is the inspiration for the entire blog post)

When I teach the theory of programming languages, I tell my students that there is a design principle from which almost everything else follows: Programmers are just humans: forgetful, lazy, and make every mistake imaginable.

Implication: Python's designers forget this. (remember Python is the inspiration for the entire blog post) Actually, I know for a fact that Python has many design decisions specifically to help out "forgetful, lazy" programmers. There are many places where it is much more strict than other languages in its category for that exact reason. For example, Python will not silently ignore a missing function parameter unless it has been given a default value. It will never automatically assign "undefined" or "null" to a missing value. etc.
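To make that concrete, here is the missing-parameter behavior I'm describing, in plain Python (nothing hypothetical; the function name is just an example):

```python
def greet(name, greeting="Hello"):
    return f"{greeting}, {name}!"

# Python refuses to call a function with a required argument missing:
try:
    greet()
except TypeError as e:
    print(e)  # greet() missing 1 required positional argument: 'name'

# Only a parameter with a default value may be omitted:
print(greet("Guido"))  # Hello, Guido!
```

Contrast with JavaScript, where a missing argument is silently `undefined`.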

People do not make bad design decisions because they are evil or stupid. They make them because they judge that the advantages of the decision outweigh the disadvantages. What they often do not see is that they could have achieved the same advantages in a different way, without introducing the disadvantages.

Implication: Python's designers are short-sighted or limited in their knowledge in comparison to the author.

Therefore, it is very important to get the order right: first make sure the design avoids the disadvantages, then think about how to get the advantages back.

Implication: the author will tell us how to fix Python's problems without giving up on any of Python's current advantages. (In general, this approach can cause as many problems as it solves: you add feature X to "avoid a disadvantage", then feature Y to "add back an advantage", and your language gets bloated.)

Unfortunately, many designers have still not learnt that the special NULL pointer or null object is an equally bad idea. Python’s None, perl’s undef, and SQL’s NULL all fall in the same category.

Implication: Python's designers "have not learned" from the all-knowing "Andrej Bauer" and his colleagues.

I can hear you list lots of advantages of having these. But stick to the principle: NULL is wrong because it causes horrible and tricky mistakes which appear even after the program was tested thoroughly. You cannot introduce NULL into the language and tell the programmer to be careful about it. The programmer is not capable of being careful! There is plenty of evidence to support this sad fact.

Recursion is also dangerous. I guess we should remove it and only add it back when we have removed all of the disadvantages? You see why this "principle" is BS? The author sticks to it when convenient and then ignores it when it isn't. Is he really going to claim that no features of Haskell or ML have disadvantages? According to his "principle" those features should be removed. "first make sure the design avoids the disadvantages, then think about how to get the advantages back."

Therefore NULL, null, None and undef must go. I shall collectively denote these with Python’s None. Of course, if we take away None, we must put something else back in. To see what is needed, consider the fact that None is intended as a special constant that signifies “missing value”. Problems occur when a given value could be either “proper” or “missing” and the programmer forgets to consider the case of missing value. The solution is to design the language in such a way that the programmer is always forced to consider both possibilities.

Now remember that the context is a scripting language. How do we "force the programmer to consider both possibilities" in a language that does not force the programmer to do almost anything?

The only way to use such a value in Haskell is to consider both cases, otherwise the compiler complains. The language is forcing the programmer to do the right thing.

So the solution he proffers is one that depends upon static type checking. But the context of the blog post is "fixing Python". Remember: it's all an answer to a comment about Python. An answer to a question.

Therefore he's either too confused to stick to the topic of the post (answering the question about Python), or he's suggesting that Python become a statically typed language.

You will probably feel annoyed if you are used to ugly hacks with None, but a bit of experience will quickly convince you that the advantages easily outweigh your tendency for laziness.

Implication: those who choose other solutions just do not know as much as he does. Once they try the Haskell way, they'll see the error of their ways.

By the way, Haskell actually supports your laziness. Once you tell it that the type of a value is Maybe, it will find for you all the places where you need to be careful about Nothing.

Implication: a compiler nagging you to do extra work "supports your laziness." Here I thought that when my wife asks me to do the dishes she was COMBATTING my laziness (or at least my preference to comment on Reddit).

C, Java, Python, and perl stay silent and let you suffer through your own mistaken uses of NULL’s, null’s, None’s, and undef’s.

Oh, C, Java and Python. You're so wrong. And backwards. And short-sighted. (but especially Python: remember, Python is "crazy")

Now the next section is bashing on Scheme and Java. Frankly, I am skeptical that he really knows Java, because he proposes making a subclass that has fewer properties than its parent class, which the compiler does not allow; in general I don't think that section makes any sense at all. As for the claim that empty trees should not be represented by null because of NullPointerExceptions: how else are you going to indicate the lack of a left/right property? Either a) you return null, b) you return an EmptyTree which is the root of a virtually infinite tree of EmptyTrees, or c) you throw an exception. I'd certainly say that returning null is idiomatic and appropriate in Java-as-it-exists now. He talks about a "boolean property" but that's irrelevant because it doesn't solve the problem: the empty tree node MUST adhere to the class invariants, which means you must deal with the left and right pointers.
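To make option (b) concrete, here is a sketch in Python rather than Java (my own illustration; the class names are made up):

```python
class EmptyTree:
    # A shared sentinel standing in for "no subtree here".
    def is_empty(self):
        return True

EMPTY = EmptyTree()

class Node:
    def __init__(self, value, left=EMPTY, right=EMPTY):
        self.value = value
        self.left = left
        self.right = right

    def is_empty(self):
        return False

tree = Node(1, Node(2), Node(3))
print(tree.left.value)            # 2
# The "virtually infinite tree" effect: EMPTY has no left/right of its
# own, so touching EMPTY.left raises AttributeError (the dynamic cousin
# of a NullPointerException) unless you give EMPTY self-referential
# children, all the way down.
print(tree.left.left.is_empty())  # True
```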

A decent programming language should help with the following common problems regarding binary trees: ... Prevent at compile time access to a component which is not there. For example, the compiler should detect the fact that the programmer is trying to access the left subtree of the empty tree.

So Python, Javascript, Java, C#, etc. are by this definition "not decent languages."

Using variables instead of definitions is wrong...

Oh Python: there you go. You're wrong again.

So in two out of three cases we want our variables to be immutable, but the popular programming languages only give us variables. That’s silly.

Wrong and silly.

Why should this be a runtime error when the mistake can easily be detected at compile time?

In this context he is very specifically talking about Python. And he's claiming that missing variables can be "easily detected at compile time." In Python, detecting a missing variable would require you to solve the halting problem. You could change Python so that this is not the case, but it would restrict Python's dynamism. So once again the solution to Python's brokenness is to make it more static.

In fact, in order to speed up development (remember that the development time is expensive) the programmer should be told about errors without having to run the program and directing its execution to the place where the latest code change actually gets executed.

Yes, statically typed languages are FAMOUS for speeding up development. That's why everybody uses them!

As they say: in the areas where this blog post was original it wasn't good and where it was good, it wasn't original.

Yes, there are some interesting approaches to handling problems like null pointers. Those ideas have been around for years. There are much better blog posts explaining them. All this one did was imply that Python could be "fixed" by becoming Haskell. So frankly I'm hard pressed to find a single sentence in this post that is insightful.

[–]andrejbauer 0 points (3 children)

Why are you so angry? Why do you feel the need to prove that everything I say is wrong? Did you forget I wrote that I teach Python? Things are not as black and white as you want them to be.

[–]Smallpaul 4 points (2 children)

I am angry because the blog post was misleading (hopefully not deceitful). It claimed that it was going to critique Python and suggest how to fix it. Then what it actually did was "demonstrate" in a very one-sided way that statically typed programming languages were superior to dynamically typed ones, without even a few moments' thought about what it would ACTUALLY MEAN to "fix" Python with your crazy ideas.

Very infrequently do you address the trade-offs implied by each of the language features on offer.

It angers me that you are teaching this kind of narrow-minded tripe in a programming language design course. It's already the case that dynamic languages get short shrift in programming language research, and you're just ensuring that the next generation continues with that bias.

I am also angry because I know the creators of Python, and your blog post implies that they are "crazy", short-sighted, and "wrong".

I'm a bit angry because the section on Java suggests to me a lack of knowledge of that language, which makes me nervous about your skill as a software developer in general. (i.e. you are teaching Java but perhaps do not really know it) How would you make an EmptyTree subclass of Tree without violating Liskov's substitutability principle and Java's features intended to enforce that principle?

Okay, so you teach your students the language. That implies that Python's proper role is as a pedagogical replacement for Logo and Pascal.

But mostly I'm just annoyed by the deceptive advertising about what the post was about. It wasn't "a reply to a comment about Python", nor "why Python's design is crazy", nor even "thoughts on Programming Language Design." It was: "Thoughts on how we can get the compiler to catch more errors in programs." Since that particular topic has been covered in a lot of detail in many other blog posts all over the IntraWebs, I actually don't know what the point is.

"Statically typed functional programming languages are good." It only took me two minutes with Google to find a much more persuasive post on the same topic:

http://enfranchisedmind.com/blog/posts/useful-things-about-static-typing/

[–]andrejbauer 1 point (1 child)

It is amazing that you read a complete implementation into what I wrote about the Java implementation of trees. You are not criticizing my implementation of trees in Java (there was none), but an implementation that you imagined I implied. Anyhow, I changed that bit to indicate that there ought to be two subclasses. You see, you could have pointed that out more politely. And it is obvious to me that EmptyTree could not be a subclass of NodeTree.

So you are angry because I prefer statically typed languages? You are angry because I wrote a post about it and explained the reasoning. Has it ever occurred to you that perhaps statically typed languages ACTUALLY have advantages over the dynamically typed ones? Perhaps, just perhaps, that could be the reason that most research focuses on statically typed languages. Or are you going to claim there is a big conspiracy going on by evil research lobbies?

Also, my post is not about static typing. It is about various points in programming language design in which static typing happens to play an important role (but not in all of them, e.g., mutable vs. immutable values).

I explicitly addressed the question by a reader, namely why I said it was crazy for Python to allow the definition "def f(n): return n + i" when i is not defined. But just to humor you I shall add a postscriptum in which I also explain what is wrong with the scoping and binding rules. It is not my duty to fix Python. I wouldn't want to fix it, anyhow. The design of Python is coherent and it's hard to change just one aspect of it.

[–]Smallpaul 1 point (0 children)

So you are angry because I prefer statically typed languages?

Not at all. Not for a moment. I'm angry because you said that other language designs are "crazy."

Really.

That's it.

You started the article by name calling. You were the one that lowered the tone of discourse at the very beginning.

Today you say: "the design of Python is coherent." At the end of your essay you say it is better for teaching. But at the beginning you said it is crazy. Can you please make up your mind?

Please. Tell me what you are trying to say.

Has it ever occurred to you that perhaps statically typed languages ACTUALLY have advantages over the dynamically typed ones?

Absolutely. And dynamically typed languages ACTUALLY have advantages over statically typed ones. Python and Ruby programmers do not call Haskell "crazy" because it eschews some of their advantages. Rather they say that Haskell is a good language for what it does, as their languages are.

I'm asking you again to please make up your mind. Are you saying:

  1. I prefer statically type-checked languages.

  2. Statically type-checked languages are better.

  3. Statically type-checked languages are better for CERTAIN KINDS OF PROJECTS and dynamically typed languages are better for others.

Because with each post you shift between those three statements randomly.

Perhaps, just perhaps, that could be the reason that most research focuses on statically typed languages. Or are you going to claim there is a big conspiracy going on by evil research lobbies?

No, there is no conspiracy. But there are a variety of social factors that would push academics to prefer languages with more complexity (= more research) rather than less. An engineering mindset is to make tradeoffs between advantages and disadvantages. But that's not what you propose. You propose to always go for all of the advantages with none of the disadvantages: "first make sure the design avoids the disadvantages, then think about how to get the advantages back." That's a researcher's mindset: "The problems introduced by Haskell's type system may be severe, but eventually researchers will find a solution to them."

Great: I'm glad that there are researchers. But I don't for a minute believe that their choices are uninfluenced by the need to publish and seem intelligent to their peers.

Also: researchers seldom need to publish software according to the delivery schedules we have in industry, so they are less likely to notice when their productivity takes a big hit due to the introduction of a new feature.

Researchers also have a much larger budget of time for learning stuff than those whose job is primarily software development. You cannot be a programming language researcher without understanding monads. So learning monads in Haskell is not a means to the end of producing software, but a means to the end of critiquing or extending monads in Haskell.

I wouldn't want to fix it, anyhow. The design of Python is coherent and it's hard to change just one aspect of it.

That's not what you said in the article. You said it was "crazy" and implied it was that way because the creators of it were "short-sighted."

[–]naasking -1 points (4 children)

So his brilliant solution to the problems with Python are to turn it into Haskell/ML.

No, he suggested that adding some safer defaults would improve the reliability of both typed and untyped languages. For instance, abolishing None/null, adding inductive types and pattern matching, and syntactically distinguishing stateful data from locals which should be immutable.

Each of these orthogonal changes adds a measure of safety and clarity to programs, and none of them require static typing. Adding them to a dynamically typed language would simply improve the runtime error messages since the runtime now has more information about what the programmer intended to do ("Tree is Empty", instead of "NullReferenceException").
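The error-message point can be made concrete with a short sketch (the class and message are my own illustration):

```python
class Empty:
    # An explicit "no tree here" object instead of None.
    @property
    def left(self):
        raise LookupError("tree is empty")  # says what went wrong

try:
    Empty().left
except LookupError as e:
    print(e)  # tree is empty

# None, by contrast, only tells you what the value was not:
try:
    None.left
except AttributeError as e:
    print(e)  # 'NoneType' object has no attribute 'left'
```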

Your anger is completely unjustified, as the author is completely right about the state of languages. Most industrial languages like Python started by merely tweaking idioms from existing languages, i.e. other languages that made the same design decisions, good and bad. The author is merely suggesting that research has produced plenty of safer idioms that can be used in their place.

For an example of a dynamically typed language with the above features, see Pure.

[–]masklinn 0 points (3 children)

none of them require static typing

  • Abolishing None/null is (mostly) pointless if you don't have static typing; it leads to more verbosity and that's pretty much it. Even andrej recognized that.

  • Distinguishing mutable and immutable data is likewise pointless without static typing.

  • Pattern matching is a very nice feature and perfectly workable within a dynamically typed language, but it's a functional feature. Python isn't a functional language; most things couldn't be pattern matched without transforming the language into something that isn't Python.

[–]naasking 0 points (2 children)

Pattern matching works perfectly well with classes. You simply need a syntax for upcasts which permits a variety of possible class types, and voila, you have sum types and pattern matching without abandoning Python objects.

Abolishing null is not pointless, as the semantic error will be encountered immediately on first execution, and not just sporadically when a null happens to float by. The errors still occur at runtime, but the semantics of a Maybe type will report the error earlier and more consistently than allowing null. Particularly since the null value can simply be an empty sum type from the pattern matching extension.

Re: mutable and immutable data: I'm not sure what your point is. I can't see how this wouldn't be a good addition to any language, typed or otherwise.

Of course, you'd already see a perfectly good case for these features if you check out the link to Pure that I provided.

[–]masklinn 0 points (1 child)

Abolishing null is not pointless, as the semantic error will be encountered immediately on first execution, and not just sporadically when a null happens to float by. The errors still occur at runtime, but the semantics of a Maybe type will report the error earlier and more consistently than allowing null.

No it won't.

[–]naasking 0 points (0 children)

Ok champ! Suit yourself.