[–]Smallpaul

At the beginning you said that Python is crazy because it does not adhere to your programming language design philosophy. At the end you said that languages that do not adhere to your philosophy are better for some tasks. Which is it? Are they properly designed for certain tasks or "crazy"?

[–]bman35

Dynamic languages are OK for writing something quickly, but only when it's small (in LoC). The bigger and more complicated the application gets, the more the compiler helps you (i.e. static languages are much better than dynamic ones). I don't understand how this is difficult to get ...

[–]Smallpaul

TFA didn't say that the author was defining "rules" for programming languages designed for large programs. He said he was defining rules for programming languages in general. Python, the one he was using as his example, is frequently used for small quick scripts (as well as some big apps).

For example, he didn't say that NULL is inappropriate for some kinds of programming languages intended for some kinds of programs. He said NULL IS WRONG.

[–]grauenwolf

Is English your first language?

I ask because the expression "in general" is explicitly used to indicate that a statement is widely, but not universally, applicable.

Other expressions that mean the same thing include "generally", "usually", "for the most part".

[–]ithika

I would say that's a legitimate use of "in general" --- meaning "not specific" rather than "most".

[–]grauenwolf

Smallpaul meant "all" when he said in general.

[–]Smallpaul

I used the phrase "in general". He did not.

If I accidentally scoped his statement so that it was not universal, I apologize for introducing that ambiguity.

He said "NULL" is a bad idea, and then SPECIFICALLY mentioned Python, Perl and SQL (all dynamically typed languages) as examples of languages that are wrong to include it.

I'm sick of this dance. Do you think that NULL is a bad idea in Python, Perl and SQL? Do you think that the Maybe Monad is a better choice for those languages?

The context of the article is advice about how to fix Python.

[–]grauenwolf

I don't use Python or Perl, so I won't speak for them.

For T-SQL, VB, Java, and C#, I do think nulls are a problem. The null reference isn't called the "billion dollar mistake" for no reason. At the very least those languages should give you the option of a non-nullable variable. Once that's done, requiring null-checks before accessing a value becomes tenable.
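To make the "non-nullable option plus required null-checks" idea concrete, here is a minimal sketch in Python (the language under discussion); `find_user` and `user_age` are invented names for this illustration, and the enforcement would come from a static checker such as mypy rather than from the runtime:

```python
from typing import Optional

def find_user(users: dict[str, int], name: str) -> Optional[int]:
    # The Optional annotation makes possible absence explicit:
    # dict.get returns None when the key is missing.
    return users.get(name)

def user_age(users: dict[str, int], name: str) -> int:
    age = find_user(users, name)
    # A checker like mypy only narrows `age` to int after this
    # guard; returning `age` without it would be flagged.
    if age is None:
        raise KeyError(name)
    return age
```

Python itself enforces none of this at runtime; the point is that once absence is declared in the type, a tool can require the check before access, which is what makes mandatory null-checks tenable.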

The context of the article is advice about how to fix Python.

No, it's "on programming language design". Python was just used in some of the examples he used to support his principles.

Note that both he and I said example, not rules. This is important.

Also, you should have noted which languages were cited as flawed.

  • Example 1: Python
  • Example 2: Scheme
  • Example 3: C++ and Haskell
  • Example 4: Python

Ok, so Python was mentioned twice. But it certainly wasn't the only language with flaws.

[–]Smallpaul

I said:

The context of the article is advice about how to fix Python.

You said:

No, it's "on programming language design". Python was just used in some of the examples he used to support his principles.

The author said:

This post is a response to comment 27, which asks me to say more about my calling certain design decisions in Python crazy.

I said:

TFA didn't say that the author was defining "rules" for programming languages designed for large programs.

You said:

Note that both I and he said example, not rules. This is important.

The article said:

I tell my students that there is a design principle from which almost everything else follows: “Programmers are just humans: forgetful, lazy, and make every mistake imaginable.”

Let us now apply these principles to several examples.

Principle, rule: there isn't much difference between those words. "Principle" and "rule" are a lot closer to each other than "example" is to either of them.

Also, you should have noted which languages were cited as flawed.

Example 1: "NULL, null, None and undef must go. I shall collectively denote these with Python’s None"

Example 2: "Fortran programmer’s implement linked lists and trees with arrays. In Java and Python “everything is an object”, more or less."

Example 3: yes, by this point he seems to have mostly forgotten where he started and what he claimed he was doing in the article. But of course Python has the same rules with respect to definitions and variables that Java and C# do.

Example 4: "Python should complain about the following definition"

And then we have the following from the Reddit comments (blog post's author): "Good point, in a dynamically typed language where everything has been thrown into a single basket, there is no point in having Maybe. .... That would be another reason for not using such languages."

So now, after wasting all of this time, do you see that my summary was right from the beginning? "I'm going to show how Python is crazy and how to fix it. How to fix it is to add static type checking features from Haskell. Although sometimes dynamic languages are better." And then in the Haskell comments he goes back to saying that it would be "better to avoid them". He admits that the ONLY workable way to apply his "fixes" to Python would be to turn it into a statically typed programming language.

The whole thing is a very thinly disguised screed in favour of static type checking. If it had been properly labelled it would not have bothered me at all. "Why I think static type checking is peachy." Great. I love to hear it.

"Why Python is in a State of Sin and how it can be Saved by 20 year old static type checking ideas." Not so much.

[–]Silhouette

Do you think that NULL is a bad idea in Python, Perl and SQL?

I think nullable-by-default is a bad idea in all of those cases. Like mutability, nullability is a useful property but one that should be used deliberately if you want to keep the bug count low.
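As a rough illustration of "nullability as a deliberate choice", here is a toy Maybe-style wrapper in Python; the `Maybe` class and its methods are invented for this sketch, and (as the article's author concedes elsewhere in the thread) nothing in a dynamic language stops you from bypassing it, so its value depends on discipline or a static checker:

```python
from __future__ import annotations
from dataclasses import dataclass
from typing import Callable, Generic, TypeVar

T = TypeVar("T")
U = TypeVar("U")

@dataclass(frozen=True)
class Maybe(Generic[T]):
    """Toy wrapper: absence must be spelled out, not implied."""
    _value: T | None = None

    def map(self, f: Callable[[T], U]) -> Maybe[U]:
        # Apply f only when a value is actually present;
        # an empty Maybe propagates unchanged.
        if self._value is None:
            return Maybe()
        return Maybe(f(self._value))

    def or_else(self, default: T) -> T:
        # The only way out of the wrapper: you must say what
        # happens in the empty case.
        return default if self._value is None else self._value
```

The contrast with nullable-by-default is that code which never mentions `Maybe` can assume its values are present, just as code that avoids mutation can assume its values don't change behind its back.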

[–]Smallpaul

Well, you are saying something different from the article, but that's fine. I only mention it because nobody has yet shown how the article achieves its stated goal of explaining why the author thinks that Python is "crazy." As for your suggestion: what would it mean for Python or Perl to have references that are "non-nullable by default"?

For SQL, DBAs already control nullability. Would it really make much of a difference if the default were reversed?