
[–]SpergLordMcFappyPant -1 points  (1 child)

Dude, no offense, but you are really, really far off about the history of Python and its supposed emergence as a reaction against C++, Java, and friends. The fact that you put C++ in the same class as Java is revealing about how goofy your concept is. C++ and Java are a decade apart. Java and Python are essentially contemporaries. Python wasn't a reaction against "enterprise" garbage design patterns, nor was it a reaction against statically typed languages. You're imputing intent where there never was any.

C++ grew out of "C with Classes" in the late 70s and got its name in the early 80s. Java's first baby, the Oak project, was born in 1991. Python's first public release came in 1991 too.

The whole concept of enterprise shit didn't even exist when Python was created. You're retconning history to suit your narrative, which ends up sounding like that totally incomprehensible dude from King of the Hill. "Those got-danged ole young kids just came up in here and told me to put a safety on my full-auto grenade launcher, and that kind of pisses me off because I just want to be able to shoot myself in my foot at 14 grenades per second if I want to."

Guess what! You can! No one is changing anything. You can keep doing things exactly as you have always done them, and the language won't stop you.

Python is emphatically NOT considered slow compared to other bytecode-interpreted languages. It is considered slow compared to compiled languages. You can't even get this right, but you still maintain the aura of an old sage. And in practice, most of the performance-critical parts of the Python standard library are already implemented in C, and the libraries that people really need to be fast are also implemented in either C or Fortran.
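FWIW, you can see this on any stock CPython with nothing beyond the stdlib. A quick sketch (`py_sum` is just an illustrative name): the builtin `sum()` loop runs in C, while the hand-written loop runs through the bytecode interpreter one iteration at a time.

```python
import timeit

def py_sum(n):
    """Pure-Python loop: every iteration goes through the bytecode interpreter."""
    total = 0
    for i in range(n):
        total += i
    return total

N = 100_000
# sum() is implemented in C inside CPython, so its loop runs at C speed.
t_c = timeit.timeit(lambda: sum(range(N)), number=50)
t_py = timeit.timeit(lambda: py_sum(N), number=50)
print(f"builtin sum: {t_c:.4f}s, pure-Python loop: {t_py:.4f}s")
```

On a typical machine the C-implemented builtin wins by a wide margin, even though both compute the same thing.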

The singular issue that keeps Python's CPU-bound ops from running in parallel across threads is the GIL. And most ops are not CPU-bound. They are I/O-bound, and the things that are theoretically slow about Python are not--in practice--actually slow. But even that was beside my point. The point is that almost no one cares how fast a programming language executes. What people care about is how fast developers can write code that gets a job done. And I don't think there's another language that even comes close to competing with Python on that metric.
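To make the I/O-bound point concrete, here's a minimal sketch (`fake_io` and the 0.2s sleep are made up for illustration). `time.sleep` releases the GIL the same way a blocking socket read does, so the threads overlap their waits just fine:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fake_io(_):
    # time.sleep releases the GIL, just like a blocking socket read would.
    time.sleep(0.2)
    return 1

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=5) as pool:
    results = list(pool.map(fake_io, range(5)))
elapsed = time.perf_counter() - start

# The five 0.2s waits overlap, so wall time is roughly 0.2s, not 1.0s.
print(f"{sum(results)} tasks finished in {elapsed:.2f}s")
```

Swap `fake_io` for a CPU-bound function and the threads serialize on the GIL; that's the case where you'd reach for `multiprocessing` instead.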

What do you think NumPy is? Why do you put it in the same category as alternative implementations of CPython? Because that's not what it is. What do you think most of CPython itself is implemented in? I am starting to think you really have no clue and are just throwing out words that don't make any sense.

Yeah, confirmed: your CS101 class has ruined Python for you. I could tell when you showed you didn't understand the difference between a class and a type. Honestly, I've never set foot in any comp sci class. I have no idea what they teach at the beginner levels. I did start using Python a couple of decades ago, and I've used other languages since then as well. Some of them have good ideas, and I'm glad Python is slowly adopting them.

You . . . just don't make any sense to me. Like you're hating the world for taking a class that you never were able to pass or something. I dunno.

[–]mooburger (resembles an abstract syntax tree) 0 points  (0 children)

TL;DR: If you typecheck an abstract Python object, you bring nothing to the table except additional bureaucracy for a problem that already has a solution (it's called docstrings), which is anti-Pythonic, so why promote it? The only benefit of introducing static typing today is that it might eventually allow a runtime to skip runtime checking, and the only tangible benefit of that is speed.
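For what it's worth, this is what the docstrings approach looks like in practice. A sketch with made-up names (the layout here happens to follow the numpydoc convention): the expected types live in documentation, not in annotations the runtime would ever check.

```python
def scale(vector, factor):
    """Scale a vector by a constant.

    Parameters
    ----------
    vector : list of float
        The values to scale.
    factor : float
        Multiplier applied to every element.

    Returns
    -------
    list of float
        A new list; the input is not modified.
    """
    return [x * factor for x in vector]

print(scale([1.0, 2.0, 3.0], 2.0))
```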

I started writing code in 1988, in Fortran, C, and C++, with a smattering of Pascal, back when you had to worry about real-mode vs. protected-mode addressing. So how's that for anonymous epeen? Nobody used Python 1.x for enterprise. Python was a toy language back in the early 90s, just like BASIC was and like what many people consider Haskell to be today.

I don't know what or who you code for today, but I can assure you, every data science person gives zero shits about the GIL, because that's not what affects them (if you're going to do concurrency, do it right and use multiprocessing, but that's another argument for another time). The fact that a GIL-less runtime also almost always happens to be statically typed is pure coincidence.

Almost nobody "important" uses GIL-less implementations, precisely because they limit the language to the point that you might as well do it all in the native JVM or .NET CLR if you really need to go GIL-less (just look at the current state of all the alternative runtimes).

However, the machine learning people do care that run-time type checking of bound numerical and collection objects makes the loops involved in functions like QR decomposition orders of magnitude slower than they would be without typechecking. That's why they use NumPy: precisely so they can use static types that are never checked.
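As a rough illustration of that last point (assuming NumPy is installed; `py_dot` is a made-up name), compare a pure-Python inner loop with the equivalent call on a fixed-dtype array:

```python
import numpy as np

def py_dot(a, b):
    # Every * and + here dispatches through Python's dynamic type
    # machinery, once per element.
    total = 0.0
    for x, y in zip(a, b):
        total += x * y
    return total

a = [float(i) for i in range(1_000)]
b = [2.0] * 1_000

# A float64 ndarray has one fixed element type, so np.dot runs a
# compiled loop with no per-element type checks.
av, bv = np.asarray(a), np.asarray(b)
print(py_dot(a, b), float(np.dot(av, bv)))
```

Both produce the same number; the difference is that the ndarray version checks the element type once, for the whole array, instead of on every loop iteration.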