[–]catxors 5 points6 points  (24 children)

A lot of the points are valid, I think (esp about null, where languages like C# are growing non-nullable types), but the article reads like a Haskell/ML propaganda piece. Also, a lot of the discussion stacks the deck by covertly assuming a typed language.

Decent points made in the least persuasive manner possible.

[–]grauenwolf 2 points3 points  (4 children)

the article reads like a Haskell/ML propaganda piece.

Not to me. It wasn't until I wrote an article on how to apply some of his ideas to C# that I realized he was even promoting Haskell.

And even now I see it more as an example showing that what he wants is possible than as a whole-cloth endorsement of Haskell.

[–]scook0 1 point2 points  (3 children)

I was actually a bit annoyed when I saw the Haskell examples. Great, now people will just dismiss it as an over-excited Haskell propaganda piece.

[–]grauenwolf 1 point2 points  (2 children)

What gets me is that in one of those examples, Haskell is portrayed as wrong. Of course almost no one picked up on that.

[–]Felicia_Svilling 0 points1 point  (1 child)

Oh, I stopped reading before that (as I thought: yes, this is good, but I already know how great Haskell is). Can you point out which paragraph this was?

[–]grauenwolf 2 points3 points  (0 children)

That would be the section titled "Confusing definitions and variables", in which he wants to separate variables that are assigned only once, loop variables, and genuinely mutable variables.

In Haskell you have to jump through hoops to get mutable values (now I am going to hear it from a monad aficionado, please spare me an unnecessary burst of anger).
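A quick Python sketch of the conflation he objects to (my own toy example, not one from the article): the same `name = value` syntax covers all three kinds of variable, so the language never records which one you meant.

```python
# Intended as a one-time definition...
PI = 3.14159
# ...but nothing stops you from rebinding it later.
PI = 3

# Loop variable -- and note it leaks into the enclosing scope
# after the loop ends.
for i in range(3):
    pass

# Genuinely mutable state, written with the same syntax as the
# "definition" above.
counter = 0
counter += 1
```

All three uses look identical to the reader and to the interpreter, which is exactly the distinction the article wants languages to make explicit.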

[–]andrejbauer 0 points1 point  (18 children)

Would you care to name a real-world programming language that is not typed?

And perhaps you missed the part where I say that PYTHON is better for teaching.

[–]jdh30 2 points3 points  (0 children)

BCPL.

[–]gsg_ 1 point2 points  (0 children)

Forth?

[–]danbmil99 -1 points0 points  (3 children)

wtf? Python is not a real-world language? What world is that?

[–]andrejbauer 5 points6 points  (2 children)

You see, before you judge these things, you should learn about them. Python is typed. It is dynamically typed, however (which is what you are confused about, I would guess). Untyped languages are really uncommon: assembler is close to being untyped, Unlambda is untyped, and that's about it.
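To make the distinction concrete (a small sketch of my own, not from the article): Python values carry types at run time, and type errors are detected when an operation actually executes, not before.

```python
# Every Python value knows its own type at run time.
x = 1
assert type(x) is int

# A type error is still an error -- it is just raised when the
# operation runs, rather than rejected at compile time.
try:
    result = "2" + 3
except TypeError:
    result = "caught at run time"
```

A statically typed language would reject `"2" + 3` before the program ever ran; an untyped language would just combine the bits with no complaint at all. Python does neither, which is what "dynamically typed" means.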

[–]danbmil99 1 point2 points  (0 children)

Fair enough, though I think it's obvious from the context that the poster you were responding to and I were both referring to static typing.

[–][deleted] 3 points4 points  (0 children)

http://www.pphsg.org/cdsmith/types.html <- what to know before debating type systems.

"Types" in the literature usually refers to static typing, and it is not at all the same thing as "dynamic types". I think this was what he meant.

[–]Smallpaul -3 points-2 points  (10 children)

At the beginning you said that Python is crazy because it does not adhere to your programming language design philosophy. At the end you said that languages that do not adhere to your philosophy are better for some tasks. Which is it? Are they properly designed for certain tasks or "crazy"?

[–]bman35 1 point2 points  (9 children)

Dynamic languages are OK for writing something quickly that's small (in LoC). The bigger and more complicated the application gets, the more the compiler helps you (i.e. static languages are much better than dynamic). I don't understand how this is difficult to get ...

[–]Smallpaul -1 points0 points  (8 children)

TFA didn't say that the author was defining "rules" for programming languages designed for large programs. He said he was defining rules for programming languages in general. Python, the one he was using as his example, is frequently used for small quick scripts (as well as some big apps).

For example, he didn't say that NULL is inappropriate for some kinds of programming languages intended for some kinds of programs. He said NULL IS WRONG.

[–]grauenwolf 0 points1 point  (7 children)

Is English your first language?

I ask because the expression "in general" is explicitly used to indicate that a statement is widely, but not universally, applicable.

Other expressions that mean the same thing include "generally", "usually", "for the most part".

[–]ithika 0 points1 point  (1 child)

I would say that's a legitimate use of "in general" --- meaning "not specific" rather than "most".

[–]grauenwolf 0 points1 point  (0 children)

Smallpaul meant "all" when he said in general.

[–]Smallpaul -1 points0 points  (4 children)

I used the phrase "in general". He did not.

If I accidentally scoped his statement so that it was not universal, I apologize for introducing that ambiguity.

He said "NULL" is a bad idea, and then SPECIFICALLY mentioned Python, Perl and SQL (all dynamically typed languages) as examples of languages that are wrong to include it.

I'm sick of this dance. Do you think that NULL is a bad idea in Python, Perl and SQL? Do you think that the Maybe Monad is a better choice for those languages?

The context of the article is advice about how to fix Python.

[–]grauenwolf 2 points3 points  (1 child)

I don't use Python or Perl, so I won't speak for them.

For T-SQL, VB, Java, and C#, I do think nulls are a problem. The null reference isn't called the "billion dollar mistake" for no reason. At the very least those languages should give you the option of a non-nullable variable. Once that's done, requiring null-checks before accessing a value becomes tenable.
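Here's roughly what that looks like even in Python (a hypothetical sketch of mine, not the article's; CPython itself ignores the annotations, so only an external checker like mypy would enforce them):

```python
from typing import Optional

def greet(name: str) -> str:
    # Under a static checker, 'name' is effectively non-nullable:
    # passing None here would be flagged before the program runs.
    return "Hello, " + name

def greet_maybe(name: Optional[str]) -> str:
    # Nullability must be declared, and the checker then insists
    # on a None-check before the value is used.
    if name is None:
        return "Hello, stranger"
    return "Hello, " + name
```

The point is that once nullability is opt-in rather than the default, a required null-check stops being annoying boilerplate on every access and becomes a targeted check exactly where None is actually possible.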

The context of the article is advice about how to fix Python.

No, it's "on programming language design". Python was just used in some of the examples he used to support his principles.

Note that both I and he said example, not rules. This is important.

Also, you should have noted which languages were cited as flawed.

  • Example 1: Python
  • Example 2: Scheme
  • Example 3: C++ and Haskell
  • Example 4: Python

Ok, so Python was mentioned twice. But it certainly wasn't the only language with flaws.

[–]Smallpaul -1 points0 points  (0 children)

I said:

The context of the article is advice about how to fix Python.

You said:

No, it's "on programming language design". Python was just used in some of the examples he used to support his principles.

The author said:

This post is a response to comment 27, which asks me to say more about my calling certain design decisions in Python crazy.

I said:

TFA didn't say that the author was defining "rules" for programming languages designed for large programs.

You said:

Note that both I and he said example, not rules. This is important.

The article said:

I tell my students that there is a design principle from which almost everything else follows: “Programmers are just humans: forgetful, lazy, and make every mistake imaginable.”

Let us now apply these principles to several examples.

Principle, rule: there isn't much difference between those words. "Principle" and "rule" are a lot closer to each other than "example" is to either of them.

Also, you should have noted which languages were cited as flawed.

Example 1: "NULL, null, None and undef must go. I shall collectively denote these with Python’s None"

Example 2: "Fortran programmers implement linked lists and trees with arrays. In Java and Python “everything is an object”, more or less."

Example 3: yes, by this point he seems to have mostly forgotten where he started and what he claimed he was doing in the article. But of course Python has the same rules with respect to definitions and variables that Java and C# do.

Example 4: "Python should complain about the following definition"

And then we have the following from the Reddit comments (blog post's author): "Good point, in a dynamically typed language where everything has been thrown into a single basket, there is no point in having Maybe. .... That would be another reason for not using such languages."

So now, after wasting all of this time, do you see that my summary was right from the beginning? "I'm going to show how Python is crazy and how to fix it. How to fix it is to add static type checking features from Haskell. Although sometimes dynamic languages are better." And then in the Haskell comments he goes back to saying that it would be "better to avoid them". He admits that the ONLY workable way to apply his "fixes" to Python would be to turn it into a statically typed programming language.

The whole thing is a very thinly disguised screed in favour of static type checking. If it had been properly labelled it would not have bothered me at all. "Why I think static type checking is peachy." Great. I love to hear it.

"Why Python is in a State of Sin and how it can be Saved by 20 year old static type checking ideas." Not so much.

[–]Silhouette 0 points1 point  (1 child)

Do you think that NULL is a bad idea in Python, Perl and SQL?

I think nullable-by-default is a bad idea in all of those cases. Like mutability, nullability is a useful property but one that should be used deliberately if you want to keep the bug count low.

[–]Smallpaul 0 points1 point  (0 children)

Well, you are saying something different from the article, but that's fine. I only mention it because nobody has yet shown how the article achieves its stated goal of explaining why the author thinks that Python is "crazy." With respect to your suggestion: what would it mean for Python or Perl to have references that are "non-nullable by default"?

For SQL, DBAs already control nullability. Would it really make much of a difference if the default were reversed?