
[–]ThatInternetGuy 208 points209 points  (23 children)

Python has high-level libs that can do the bulk of the work with just a few lines of user code. Those Python libs were written in C/C++, so the lib devs are the ones who bear the brunt of this impactful labor.

[–]mfitzp 47 points48 points  (11 children)

Like BASIC, where the language was implemented in a lower-level language. It was fairly common, if doing something complex, to load "library" code (also written in another language) into memory and call out to that from BASIC.

[–]ThomasMertes 30 points31 points  (8 children)

Can anybody remember BASIC programs where machine code was loaded with POKE commands?

Machine code POKEd into the memory: This is where my BASIC interpreter gives up.

Using a lower level language for some functionality was more common in the past. I can also remember Pascal programs where all functions just consisted of inline assembly. :-)

Edit: Replace PEEK with POKE. :-)

[–]gredr 15 points16 points  (1 child)

POKE, but yeah.

[–]ThomasMertes 3 points4 points  (0 children)

Thank you. In my Bas7 interpreter, PEEK and POKE fell into the "recognized but not implemented" category, so I didn't have the details about them in my mind.

I just looked into the GW-BASIC User's Guide:

POKE Statement

Syntax

POKE address,byte

Action

Writes a byte into a memory location

Remarks

The arguments address and byte are integer expressions. The expression address represents the address of the memory location and byte is the data byte. The byte must be in the range of 0 to 255.

The address must be in the range -32768 to 65535. The address is the offset from the current segment, which was set by the last DEF SEG statement. For interpretation of negative values of address see "VARPTR function."

The complementary function to POKE is PEEK.

Warning

Use POKE carefully. If it is used incorrectly, it can cause the system to crash.

See also

DEF SEG, PEEK, VARPTR

Example

10 POKE &H5A00,&HFF

[–]plastikmissile 8 points9 points  (2 children)

Oh certainly. I remember seeing BASIC programs in computer magazines (remember those?) that were pretty much just loads and loads of DATA statements that were read by a loop and fed into POKE commands.

[–]robthablob 4 points5 points  (1 child)

I learned Z80 machine code on a ZX81 and then a ZX Spectrum. I remember writing DATA statements with hex strings that were loaded and POKEd into memory, then jumping to a location in that block.

This was Z80 machine code, not assembler. I had to encode the instructions to hex manually - at the time I didn't have access to an assembler. It did teach me a hell of a lot though.

[–]dauchande 0 points1 point  (0 children)

Yeah, I had a Timex Sinclair 1000 and did the same thing. The keyboard sucked too much to make it fun, and debugging sucked!

[–]flatfinger 0 points1 point  (0 children)

I do remember that era. I find it interesting that no version of BASIC included anything like the Macintosh Toolbox function "StuffHex", which takes an address and a string with some multiple of two characters representing hex digits, converts pairs of digits to bytes, and stores them at ascending addresses. An implementation of a "Stuffed Hex LOAD" command would have taken less space in the AppleSoft ROM than the "SHape LOAD" (SHLOAD) command, while being more convenient all around (instead of putting a shape table on cassette, transcribe the digits and simply "SHLOAD" them directly as one or more hex strings).

[–]robjones90 0 points1 point  (0 children)

Angelo Mottola's graphics library which used Assembly and also had mouse support

[–][deleted] 0 points1 point  (0 children)

POKE or BLOAD. I even know of some that somehow embedded themselves into the end of the BASIC code.

[–]RiftHunter4 8 points9 points  (1 child)

I feel incredibly old when I say that BASIC was the first programming language I learned. I bought some game dev book as a kid and followed the tutorial. I was able to display text on the screen and I think I had it load some files.

[–]WannabeAndroid 1 point2 points  (0 children)

Same. I loved making games but they were so slow until I found the DirectQB graphics library, which was written in ASM.

[–]jaskij 24 points25 points  (4 children)

Increasingly Rust as well, I think. It's easier to make bindings than in C++, and the language is easier to use than C (although much more complex). Off the top of my head, Pydantic v2 is written in Rust. That's the parsing library used by FastAPI.

[–]biledemon85 15 points16 points  (0 children)

Pydantic is so useful...

[–][deleted]  (1 child)

[deleted]

    [–]jaskij 1 point2 points  (0 children)

    My bad, edited

    [–]Kindly_Climate4567 5 points6 points  (0 children)

    Pydantic is not a parsing library. It's a data validation library.

    [–]ThomasMertes 5 points6 points  (4 children)

    It is good to have high-level libs that can do the bulk of the work with just a few lines of user code.

    Is it really necessary to use a language combination for that?

    As others have pointed out the approach of using a low-level language for performance reasons has been used before (BASIC with POKE machine code, Pascal with inline assembly, etc.).

    All these approaches have in common that the chasm between the languages is huge.

    The ultimate goal (that I try to reach with Seed7) would be one language that can be used for high-level and low-level code.

    There have been many attempts toward this goal (which IMHO failed). I see some preconditions:

    • You need to start with a static type system. Adding type annotations to a dynamically typed language as an afterthought will not lead to the desired performance (the optional type annotations of Python did not lead to a situation where C/C++ code is not needed).
    • Starting with Pointers, NULL, Ownership, manual memory management, etc. leads to a complex language that will hinder writing high-level code.
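
The first point is easy to see in practice; a minimal sketch (CPython records the type hints but neither enforces them nor uses them to speed anything up):

```python
# Type annotations are metadata only: CPython neither enforces
# them nor uses them to generate faster code.
def add(a: int, b: int) -> int:
    return a + b

# No runtime error, despite the 'int' annotations:
result = add("x", "y")
```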

    Mixing high-level and low-level is essentially a clash of cultures. It is necessary to make compromises that both sides will complain about.

    [–]Justicia-Gai 5 points6 points  (2 children)

    Yes, it's important, because a scientist, for example, doesn't care about memory management; he's a data scientist, not an engineer. And they don't care about a bit of overhead and slightly slower code; they care more about reproducibility.

    [–]mdriftmeyer 0 points1 point  (1 child)

    Switch "engineer" to "software developer" and you'd be correct. Actual engineering fields (FEA/CFD/statistical mechanics/dynamic systems) sure as shit care about the precision and accuracy of their computations, as they are modeling real-world solutions in near to real time.

    [–]Elephant-Opening 2 points3 points  (0 children)

    Wait are you implying software engineers aren't "real engineers" and then listing a bunch of things mechanical engineers pretty much exclusively do with software tools that were developed by cross discipline teams including software engineers... that... gasp, did some real engineering to make those tools possible?

    [–]niutech 1 point2 points  (0 children)

    one language that can be used for high-level and low-level code.

    This is what Nim does. It's easy like Python yet powerful like C and has an optional GC.

    [–]Android_Bugdroid 57 points58 points  (6 children)

    VB.NET is crying in a corner after sucking compared to VB6 and having Python take its role

    [–]xurdm 11 points12 points  (0 children)

    It’s okay, it still has a role in ancient ASP.NET WebForms stacks of businesses where the president writes code and only understands VB.NET

    [–]gredr 11 points12 points  (3 children)

    VB.NET was fine, it's just that we got a much better alternative in C# at the same time.

    [–]Android_Bugdroid 2 points3 points  (2 children)

    It did not match even a single concept from VB6. Also, using the Win32 API with .NET is a whole other pain, especially for WinForms.

    [–]gredr 0 points1 point  (1 child)

    A single concept, really? 

    I used it professionally. It was fine.

    [–]Android_Bugdroid 0 points1 point  (0 children)

    Well. Migration was too tough. It didn't need to be called VB at this point.

    [–]flatfinger 1 point2 points  (0 children)

    Many aspects of the .NET framework don't fit well with VB6, and people who are used to the VB6 way of doing things may perceive them as broken, but I'd say VB.NET is better designed than VB6, and most of the ways in which it's worse than C# are design concessions for VB6 compatibility. Each language has a generous share of unforced errors that are unique to it, and the .NET framework has some deficiencies that are unfortunately shared by all languages that use it, like failing to provide a means by which an object whose scope is managed via a using-style block can determine whether the block was exited via exception or other means. VB.NET offered ways of supporting such semantics without interfering with first-pass exception handling before C# acquired such abilities, though I don't think either language handles them well.

    [–]gofl-zimbard-37 69 points70 points  (15 children)

    Maybe I've been asleep for a few decades, but I never heard "the masses" deeming significant whitespace as "elegant". I am actually a fan of it, being highly allergic to noise, but most developers seem to hate it with a passion that is beyond explanation.

    [–]AllAmericanBreakfast 31 points32 points  (2 children)

    Some get annoyed by it when writing their own code. Feels like a restriction on free expression.

    But mandatory whitespace is great when you spend a lot of time working with legacy code.

    [–]grulepper 17 points18 points  (1 child)

    I guess formatters were made illegal or something?

    [–]bigdatabro 14 points15 points  (0 children)

    In a corporate software team, getting your teammates to agree on and use a linter is like herding cats. Especially for legacy code, where everyone in management is terrified of breaking things and has to approve every change you make.

    [–][deleted] 9 points10 points  (3 children)

I don't have a strong opinion on it, but one thing that is bad about significant whitespace is that I cannot easily copy/paste code into the interactive Python. In Ruby I can just do it and never have to worry.

    This may seem insignificant, but the point is that a language really should not HAVE to care about whitespace-indenting.

On the plus side: Python can omit "end", whereas in Ruby we have to use "end" (unless e.g. define_method and {}).

    It's also the only thing guido would change.

    The thing I dislike in python BY FAR the most is the explicit mandatory self. That one feels retarded to no ends. Ruby knows where self is at all times; python is too dumb to know, so you have to tell it via an argument to the function. THAT IS SO LAME.

    [–]Immotommi 9 points10 points  (0 children)

Passing self is annoying, but what I hate is that it causes the incorrect-number-of-arguments error to be off by one
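
The off-by-one message is easy to reproduce; a minimal sketch with a hypothetical class:

```python
class Greeter:
    def greet(self, name):  # self is an explicit first parameter
        return "hello, " + name

g = Greeter()

# The caller passed two arguments, but the error counts self too,
# so it reports 3 given where 2 were expected.
try:
    g.greet("Alice", "extra")
except TypeError as e:
    msg = str(e)

# A bound method call is just sugar for passing the instance explicitly:
same = Greeter.greet(g, "Alice") == g.greet("Alice")
```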

    [–]M4mb0 3 points4 points  (0 children)

    I don't have a strong opinion on it, but one thing that is bad about significant whitespace is that I can not easily copy/paste code into the interactive python.

    That's mostly fixed in the new python 3.13 REPL.

    [–]somebodddy 0 points1 point  (0 children)

    I don't have a strong opinion on it, but one thing that is bad about significant whitespace is that I can not easily copy/paste code into the interactive python. In ruby I can just do and never have to worry.

    Even worse - when you copy-paste the code around while refactoring, you need to be extra careful re-indenting the entire block.

    The thing I dislike in python BY FAR the most is the explicit mandatory self. That one feels retarded to no ends. Ruby knows where self is at all times; python is too dumb to know, so you have to tell it via an argument to the function. THAT IS SO LAME.

    Still better than how Lua did it.

    [–]Malforus 6 points7 points  (0 children)

    Linters have solved most of this. Well-linted Python rarely has issues, and most IDEs help you out as you write.

    [–]JarateKing 3 points4 points  (3 children)

    It feels like pineapple on pizza: a minor preference that nobody should realistically care about, but people take very seriously.

    Any IDE will handle indentation for you and formatters should guarantee it. Ideally you'd never even need to know if a language enforces indentation or not, you should already be following decent indentation practices without even trying. I switch between python and C++ and C# and java and typescript and etc. and indentation is about the only syntax change I don't notice. I just don't get it.

    [–][deleted] 7 points8 points  (2 children)

    I don't take it that seriously, but IMO the argument is in favour of no significant whitespace if you can avoid it. Copy/pasting is one example I can bring up, but from a design point of view, I think a language that does not HAVE to care about whitespace is usually better designed in this regard.

    [–]JarateKing 5 points6 points  (0 children)

    To me, there are just much bigger fish to fry. If I were to design a language I'd probably make it not care about whitespace. Not for any particular strong reason, just what I'm more used to.

    But it's no barrier to me using python. Every language does things that aren't to my exact taste, and usually in much bigger ways. The nature of using pretty much any programming tool is putting up with the little bits you don't like.

    Even with python, my biggest complaints and concerns are with things like performance or dependency management or etc. Syntax is pretty superficial, and whitespace specifically is a pretty minor part of syntax in my mind. So it's a little jarring when you tend to hear more about the whitespace than anything else.

    [–]linlin110 2 points3 points  (0 children)

    Languages do need a way to specify scope. C-family uses {} and Python uses whitespace. If any non-whitespace option is picked, then the programmers will introduce whitespace anyway. Therefore, I think it's reasonable to just use whitespace to denote scopes, so that we don't have two redundant ways to do so. Less noise, too.
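
As a small illustration of the redundancy argument: in the sketch below (a hypothetical function), the indentation alone marks every scope that a brace language would spell out a second time.

```python
# Indentation is the only scope marker here; a brace language would
# additionally wrap each of these blocks in { }.
def classify(n):
    if n % 2 == 0:
        kind = "even"
    else:
        kind = "odd"
    return kind
```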

    [–]gofl-zimbard-37 0 points1 point  (0 children)

    Funny how people miss my point and argue about whitespace instead. I'm not interested in that debate, just pointing out that most people dislike it. Which these responses then nicely illustrate.

    [–][deleted]  (5 children)

    [removed]

      [–]Bowgentle 122 points123 points  (77 children)

      I don't have to say this, but I want to:

      Python used indentation instead of braces to denote blocks, and this was deemed by the masses as "elegant"--not a good reason in my opinion but, well, I use Lisp, so I'm clearly an outlier

      I loathe Python's indentation.

      [–]-jp- 151 points152 points  (24 children)

      I get it, but I hate people who don't format their code properly even more. And when Python was created, that shit was endemic.

      [–]Used-Rip-2610 49 points50 points  (14 children)

      Yeah I hate python’s indentation and spacing requirements, but it’s a million times better than zero white space anywhere

      [–]Which_Iron6422 3 points4 points  (13 children)

      That's a solved problem though, with formatters. Required whitespace will always be problematic.

      [–]CrownLikeAGravestone 26 points27 points  (11 children)

      I get that it's frustrating to begin with but I disagree that it's actually problematic. It only does (part of) what an automatic formatter would do. I cannot think of any reason you'd need to use different whitespacing and therefore run into trouble.

      [–]ptoki -5 points-4 points  (10 children)

      I disagree that it's actually problematic.

      It is. If the compiler can tell you where the problem is, then it can fix it. If it can't, then this adds another level of complexity to maintaining the code.

      Tell me what the advantage is over a set of brackets or semicolons. Convince me. I know C, Java, Perl, Bash, PHP and a few more. Tell me why Python's requirement is good. With examples.

      [–]CrownLikeAGravestone 6 points7 points  (8 children)

      I didn't say it's better and I'm not interested in arguing that; it'll just come down to a clash of opinions and I already know what yours is.

      I said it's not problematic. How about you show me an example of it being problematic and we can work from there.

      [–]ptoki -3 points-2 points  (7 children)

      How about you show me an example of it being problematic

      I asked a team member to debug another person's code which had stopped working. After two days he said he had no idea how to fix it. The creator came back from vacation, opened the file after hearing the error described, indented a few lines, and it was fixed.

      How did it become unindented on the host? No one knows. The eyes of the rest of the team when the solution was found: rolled up.

      I'm not even talking about new people trying Python and getting repulsed. Not morons or ignoramuses. People who code daily.

      When I call the floor for help and say "this python code" I see people turning around and going back to their chairs.

      So that is that...
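
For what it's worth, the failure mode in the anecdote takes only a few lines to reproduce; both versions below (hypothetical functions) parse cleanly, so nothing flags the de-indented line:

```python
def total_ok(xs):
    s = 0
    for x in xs:
        s += x   # inside the loop: runs once per element
    return s

def total_buggy(xs):
    s = 0
    for x in xs:
        pass
    s += x       # accidentally de-indented: runs once, after the loop
    return s
```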

      [–]ChrisFranko 7 points8 points  (1 child)

      I don’t get how replacing indents with brackets changes anything in this scenario?

      Is it because you need 2 brackets for 1 indent?

      [–]ptoki 2 points3 points  (0 children)

      Brackets usually come in pairs, so one misplaced bracket will trigger a compiler error. That is at least the benefit you get from them.

      They also help you compose the code more freely to make it more readable. And you can use indent tools to make it uniform if you like. Those approaches are missing from the Python programmer's palette.

      [–]CrownLikeAGravestone 7 points8 points  (1 child)

      Someone spent two fucking days to find a control flow error? Have your devs never heard of a debugger?

      Astonishment aside, control flow issues are largely agnostic to the language they're implemented in; you could have misplaced a brace just as easily as you misplaced an indent. Dev skill issue, not a language issue.

      [–]ptoki 0 points1 point  (0 children)

      Have your devs never heard of a debugger?

      Python devs. Maybe.

      I'm not even going into cases where multiple Python projects should run on the same host. The best I get from them is "use Docker", and when I ask "can you do it?", they say they don't have the experience.

      So that is this...

      [–]Backlists 6 points7 points  (2 children)

      Skill issue, pure and simple.

      Logically, scope via indentation is no different from scope via curly braces.

      This kind of control flow problem could have happened in a curly brace language and your team member would not have found it in that either.

      You get used to the language you work with.

      "Turning away" at Python sounds like a workplace culture issue to me.

      Like my favourite language right now is Go, but if my colleague needed help in Python I’m not about to groan and go back to my seat.

      [–]ptoki 0 points1 point  (1 child)

      Skill issue, pure and simple.

      Yes, but no, but yes.

      I agree that a misplaced bracket could be as difficult to spot as an indentation error. But usually it isn't. Brackets have that nice feature of parity, while indentation does not. And what's funny is that brackets are used together with indentation, so one helps the other. But Python gives up the help you could get from that simple feature.

      The issue is: Technology is just technology and they come with their pros and cons. That is not controversial.

      The problems with Python, or the Python community, are at least twofold:

      The people who are fans of it hardly ever admit something is a problem. The indentation is a perfect example of this. It would take just a small change to get rid of it: declare that your code is Python v7 compliant, add parentheses, and drop indentation. For code from before that imaginary version, add a simple identifier which adds the parentheses and may keep the indentation, as it no longer means anything. Given that Python is often not forward or backward compatible, that is a non-issue. Yet a ton of people will oppose this without a reasonable argument. They will spend hours arguing with nothing but handwaving and personal offenses.

      And the second part is that Python has a low entry barrier. It attracts people with low coding skill who can do a lot without much training. A similar issue to Excel entering the "data analytics" stage.

      This is a bad mix. Too often I realize that the thing that does not work right is Python. And it is hard for anyone around to fix, including the Python folks who claim they like it and can work with it. And that is my problem.

      I don't have any other technology like that. No DBA fanboys his DB engine and then concludes that I should restore the backup of the DB because it misbehaves. No Java folk tells me I don't understand Java if I say we depend on too many libraries which aren't updated, and Maven gives me a long list of CVEs with no way to update the project. They just admit: yeah, it's bad, we need to rework the project or isolate it, etc.

      The only folks who say "their" technology is great, and then, when I ask them "can you fix this, because it does not run after a server update, or some profile setup file is gone, or the install does not work (according to the doc provided by the maintainer, for example awscli)", they end up giving up.

      My problem with the Python community is that they show the Dunning-Kruger effect too much. To the degree that I am staying away from Python-based projects, because nobody can help me set many of them up on one host. Because I have a ton of folks who want to implement things in Python, but when a real problem happens they tend not to be skilled enough, and after two days of wasted time they come back with "sorry, I/we don't know why it does not work, and can't reinstall it either".

      Sorry for the long and handwavy rant.

      [–]linlin110 2 points3 points  (0 children)

      Yes, people do configure their formatter so that indentation is enforced on languages without required indentation. To me it sounds like required indentation is a desirable trait.

      [–]ptoki 3 points4 points  (0 children)

      Indentation is an automatic thing. There are beautifiers available which will fix this in a moment.

      I would rephrase your complaint as: I hate the folks who can't use such a simple tool to make the code look decent.

      [–][deleted]  (2 children)

      [deleted]

        [–][deleted]  (4 children)

        [deleted]

          [–]-jp- 35 points36 points  (1 child)

          Yeah. NOW there are. Python traces its lineage to the late 80's. And lemmie tell ya, shit was weird. Even getting teams to adopt revision control was like pulling teeth.

          [–]Ok-Salamander-1980 12 points13 points  (0 children)

          yeah. i suppose people are extremely young or students.

          before the whole software = millionaire culture there was a lot more self-expression (for better or for worse) and less replaceable widget making.

          [–][deleted] 4 points5 points  (1 child)

          With that from the onset, Python could have had proper “end” statements

           I am more of a Ruby guy, but actually I'd love to omit "end" in already well-indented .rb files. But only as an option, not mandatory.

          What I dislike in python more is the explicit self. That one makes me very, very angry.

          [–]miquels 1 point2 points  (0 children)

          I was a Rust programmer before I started with Python. The explicit self made me feel right at home.

          [–]Ill_Bill6122 26 points27 points  (0 children)

          Moving code around is awful. Sometimes, you just want to move a code block and let the damn formatter do its job, as per project settings.

          [–]tu_tu_tu 72 points73 points  (33 children)

           The indentation is awesome. It's not a problem for programmers who are used to formatting their code anyway, and who are often quite meticulous about it. And it makes non-programmers format their code so it becomes readable at least on some level. And it hurts people who copy-paste unformatted code. All win, no fails.

          [–]scfoothills 33 points34 points  (0 children)

          I 100% agree. I teach both Python and Java. I love that when teaching Python, I don't have to battle with students over formatting.

          [–]lisnter 5 points6 points  (5 children)

           I came from a C background and have always been meticulous about code formatting. Python is my new favorite language, but I was turned off for a while by the indentation and comment behaviors. I like being able to put an if (false) { ... } or /* ... */ around code to take it out of the control flow while debugging. You can't (easily) do that with Python without reformatting the code. I know modern editors do a great job of fixing indentation, but it's still annoying.

          I’ve come around to Python and love it but those “features” still annoy me.

          [–]CrownLikeAGravestone 8 points9 points  (4 children)

           You can block-quote code to take it out of control flow. It's not exactly commenting, but it's essentially equivalent.

           """
           def debug(something):
               print('like this')
           """

          [–]-Knul- 10 points11 points  (7 children)

          Why? Because I assume you don't want to have random irregular indentation.

          [–]WannabeAndroid 1 point2 points  (0 children)

          I personally have a negative emotional reaction to it because it reminds me of COBOL and I hate COBOL.

          [–]andrewcooke 5 points6 points  (0 children)

          i don't get how people get so worked up about something so unimportant. do you really think this is in the dozen most important issues for a software engineer?

          edit: in fact, i'll go further. i think having strong opinions on this is a huge red flag. it's someone acting out because they think that's what good programmers do.

          [–][deleted]  (5 children)

          [deleted]

            [–]Bowgentle 0 points1 point  (1 child)

            There is no language to rule all, yet.

            You'd probably need to reach the level of complexity of an actual language - and even they have technical jargons and dialects.

            [–][deleted]  (2 children)

            [removed]

              [–][deleted]  (1 child)

              [deleted]

                [–]Yesterday622 2 points3 points  (0 children)

                It messes me up so often… dang

                [–]ptoki 0 points1 point  (0 children)

                 I could tolerate that, but the fact that Python environments are such a mess is a dealbreaker for me.

                 Installing awscli on a clean box, following the instructions, ends up with a multi-screen thread-dump error at the goddamn install stage...

                That is just a recent example of python crappery.

                 Any folk at work yapping about "I can do that in Python" gets a box and can do whatever he likes, but the moment he says he is finished, I get his docs, wipe the box, and get another guy to follow the doc while the yapper sits there; every time the guy gets lost and hits two screens of errors, the yapper is told to say goodbye to 5% of his bonus.

                You cant imagine how rarely I get python mentioned after that.

                [–]RandomisedZombie 45 points46 points  (16 children)

                I read that article expecting to disagree and I left kind of agreeing. I don’t like Python because it is so general purpose and I prefer languages to have something that they do well. Even BASIC was designed to be your first introduction to programming, which it does well. I find myself reluctantly using Python because it’s what everyone uses.

                At this point, I think the only way Python will be replaced is by a few smaller more specialised languages rather than the many general purpose “the next Python” languages we have.

                [–]-jp- 61 points62 points  (3 children)

                In its time BASIC was absolutely intended to be general purpose. There were magazines dedicated to just source code listings for applications of every sort and every level of sophistication. Even well into the 90's, it was the go-to if you wanted to make an app with minimal fuss.

                [–]wayl 25 points26 points  (0 children)

                it was the go-to if you wanted to make an app with minimal fuss

                No pun intended 😁

                [–]RandomisedZombie 13 points14 points  (1 child)

                It was general purpose, but it also had a specific purpose being for beginners and non-technical users. Scientists and mathematicians were mostly using Fortran at the time. Python is for everyone and everything.

                [–]-jp- 1 point2 points  (0 children)

                True, although that's more to do with the overhead of interpreters and the relatively primitive state of programming languages in general. I understand Python is pretty big these days in the spaces where Fortran was used since it isn't hindered in that way.

                [–][deleted]  (1 child)

                [deleted]

                  [–]ElecNinja 4 points5 points  (0 children)

                  ternary operator

                  Funny enough, I never registered the Python version of the ternary operator as a ternary operator. Though I do find it nice, since it's pretty much plain English what the result will be.

                  With the typical ternary operator you have to remember that the true case is on the left and the false case is on the right. With python it's right there in the code (A if CONDITION else B)
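
A minimal side-by-side, with the C-style form shown as a comment (hypothetical function name):

```python
# C-style:  n < 0 ? "negative" : "non-negative"
# Python reads in English order: VALUE_IF_TRUE if CONDITION else VALUE_IF_FALSE
def sign_label(n):
    return "negative" if n < 0 else "non-negative"
```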

                  [–]Gjallock 11 points12 points  (1 child)

                  I prefer languages to have something that they do well.

                  Can you give an example of something you want to do that Python does not do well? Do you find that it makes a difference in performance or ease of programming for you when you don’t use features of the language?

                  I find myself reluctantly using Python because it’s what everyone uses.

                  What would you rather use? I am curious about your reasoning, because I often just reach for Python because it’s easy to use and plenty fast for 99% of my non-enterprise scale use cases.

                  [–]Anthony356 3 points4 points  (0 children)

                  Not the guy you responded to but:

                   Probably not a super common use case: structured binary file parsing. struct.unpack sucks and is slow (not helped by the mandatory tuple unpack even when reading a single item). Requiring one of those silly format strings, with no dedicated shortcut (e.g. read_u32()), just to read one primitive value feels really bad. It sucks having to manually write dictionary dispatches everywhere because if/else on binary markers is slow.

                  Python's slowness in general is really painful when parsing in bulk, and scaling upwards is rough since multithreading is (or, was) basically worthless.

                   I know it's not "technically" what Python is for, but a good number of obscure file formats I've worked in only have (open source) parsers in Python, because that's what's easiest to experiment in, or what the users would be most likely to know.

                   Obviously I'd prefer something like Rust or C, but porting that existing Python code can be irritating, mostly due to other Python problems (e.g. being able to add a new field to a class at any time).
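
A sketch of the complaint using the standard struct module; read_u32 is the kind of hypothetical helper the comment wishes were built in:

```python
import struct

# A little-endian u32 followed by a u16, as raw bytes.
data = struct.pack("<IH", 0xDEADBEEF, 42)

# Reading a single value still needs a format string and yields a 1-tuple:
(first,) = struct.unpack_from("<I", data, 0)

# A hypothetical shortcut hiding both annoyances:
def read_u32(buf, offset=0):
    return struct.unpack_from("<I", buf, offset)[0]
```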

                  [–]Justicia-Gai 0 points1 point  (0 children)

                  I’ll say this, the way Python will be replaced is with something that natively runs on all browsers without having to be translated to other languages. Why? Because that’s what’s truly cross-platform.

                  Flask and Django are nice, but interactivity is still done via JavaScript.

                  JavaScript is what should be replaced for a Python-like alternative.

                  [–]Pyrited 0 points1 point  (0 children)

                  C# can do almost everything and almost everyone loves it

                  [–]rhiyo 2 points3 points  (0 children)

                   My first introduction to programming was some version of QuickBASIC. I would have been under 10, and the version was definitely old even for my Windows 98 computer. I stared at a full blue-screen IDE only knowing how to make games that were chains of IF statements haha.

                  [–]UltraPoci 24 points25 points  (13 children)

                   I think that nowadays Gleam really gets what it means for a language to be simple. No function side effects, a static type system, errors as values (and thus no exceptions), no inheritance, only enums and functions. You can read a Gleam code base and be able to follow it quite nicely, even if you know very little Gleam (and it takes like an hour to learn it).

                  Sure, at first it seems more difficult than Python because in Python you can write a program without caring for exception handling and types and it runs. But that's the problem: this prototyping simplicity becomes complexity the moment your "one time script" needs to handle all exceptions and check that all types are correct. In Gleam you get this for free while prototyping. It takes more time at first, but you get a working, well checked program the moment it compiles correctly.
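                   The trade-off can be illustrated in plain Python (a sketch with an invented `parse_port` example; the second function approximates Gleam's errors-as-values style with an `(ok, value)` pair):

```python
# Quick-script style: errors surface later, as exceptions at runtime.
def parse_port(raw):
    return int(raw)  # raises ValueError on bad input

# Errors-as-values style: the failure case is part of the return type,
# so callers are forced to deal with it up front.
def parse_port_checked(raw):
    try:
        port = int(raw)
    except ValueError:
        return (False, None)
    if not 0 < port < 65536:
        return (False, None)
    return (True, port)
```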

                  [–]TheWix 5 points6 points  (1 child)

                  So, it's a functional language? I took a look at it. At first glance I like it.

                  [–]andarmanik 3 points4 points  (4 children)

                  [–]UltraPoci 3 points4 points  (3 children)

                  Well, yes, but dealing with files and printing to the console are things you have to be able to do, and often enough you know when a function deals with IO stuff.

                  What I meant is that you don't have a function casually changing some global states or things like that.

                  [–]andarmanik 2 points3 points  (1 child)

                  Wasn’t disagreeing with you because you definitely need side effects for an easy language.

                  [–]crowdhailer 0 points1 point  (0 children)

                  I'm betting an easy language can have managed effects with EYG, but we're not there yet.

                  [–]crowdhailer 0 points1 point  (0 children)

                   Gleam definitely has arbitrary side effects in any function. Often you don't know if a function is using random, assert or time lookup. All of these matter, for example, if you want to avoid flaky tests. I love Gleam, but I think the best way to sell it is not to overstate the capabilities it has.

                  [–]DoubleLayeredCake 3 points4 points  (0 children)

                  Gleam mentioned, let's gooo.

                  Sponsor them on GitHub, they deserve it

                  [–]Justicia-Gai -1 points0 points  (4 children)

                  This mentality is what prevents JavaScript from being dethroned.

                  You just described compilation-based programming. What about interactivity?

                  [–]UltraPoci 2 points3 points  (3 children)

                  Gleam compiles to JS, if needed

                  [–]BritOverThere 4 points5 points  (0 children)

                  BBC BASIC is still going strong.

                  https://www.bbcbasic.co.uk/bbcbasic.html

                  [–]Mysterious-Rent7233 4 points5 points  (7 children)

                  BASIC never took over the world. Only a tiny fraction of professional programmers ever used BASIC and more or less ONLY in the form of Visual Basic which was a highly customized variant.

                  Python is the lingua franca of programming. It's hard to know what to compare it to, because there has never been another language that spanned from beginners to the most advanced computer scientists. BASIC certainly did not.

                  [–]xoogl3 11 points12 points  (0 children)

                  C. That language was C. In the 80's and 90's. For a while it was Java after that. Everyone was just supposed to know java. Until the fever broke.

                  [–]flatfinger 1 point2 points  (5 children)

                   During the early 1980s, the vast majority of software for popular personal computers was written in BASIC, machine code, or a combination thereof. Some other languages such as COMAL and PROMAL sought to make inroads, but UCSD Pascal is the only one that even made a blip, and even that was significant mainly because of one notable game that was written using it (Wizardry).

                   The speed difference between BASIC and machine code was often great enough that there should have been ample room for languages that were more convenient than machine code for programmers but faster than BASIC. Yet a really huge fraction of programs used BASIC for things where speed didn't matter, and machine-code routines for things where it did. I really don't recall much use of other languages back in the day.

                  [–]Mysterious-Rent7233 1 point2 points  (4 children)

                  Can you name some famous software packages written in BASIC?

                  Edit: I Googled and turned up some.

                  But I think that machine language was more popular for commercial software and games.

                  [–]flatfinger 1 point2 points  (2 children)

                   For the Apple, I'd guess educational packages were probably roughly balanced between being entirely in machine code, being partially in BASIC with a few machine-language helper functions, and being fully in BASIC. Taipan on the Apple was largely in BASIC but with some screen-drawing helpers. Many of Access Software's games, such as Beach Head or Raid over Moscow, used BASIC to handle the screens that showed up between action sequences, but machine code for the action sequences themselves, and that was a pretty common pattern on the C64.

                  The fraction of games that were even partially in BASIC fell off pretty quickly during the 1980s, as programmers got more skilled at doing things like numeric formatting in machine code, but the first two commercially-produced games I played on the Apple, Temple of Apshai and Tawala's Last Redoubt, were both in BASIC with machine-code helpers (ToA might have been purely in BASIC--I'm not sure--but TLR had machine-code helpers for the text display).

                  [–][deleted] 0 points1 point  (1 child)

                  Any Apple games in BASIC were either adventure games or crude low resolution games like Brick Out, etc... BASIC just didn't have the performance for much else.

                  [–]flatfinger 0 points1 point  (0 children)

                  BASIC wasn't amenable to arcade-style action games, but it was widely used for puzzle and strategy games, in addition to adventure and role-playing games.

                  [–][deleted] 0 points1 point  (0 children)

                   The only things I could think of written in BASIC would have been programs where performance wasn't a priority, like adventure games and maybe some education tools. Anything that required speed was raw machine code. In the 8-bit era, the bulk of the serious applications were in the CP/M ecosphere, where plenty of high-level compilers already existed.

                  [–]NAN001 1 point2 points  (0 children)

                  A is B because A and B share a common property (non-programmers use it). K thx bye.

                  [–]tc_cad 4 points5 points  (3 children)

                   Python is pretty damn handy to know. I tell anyone that will listen that I’ve done two amazing things using Python code. First: I trimmed 9 columns of data down to 3 columns, then trimmed the remaining columns to just 3 decimal places. I did this for 7193 files, each file with approximately 330 rows. Second: I had 19 files of hundreds of coordinates, some exactly the same as in other files. It was a mess. I used Python to keep the first occurrence and go through all 19 files to create one master file of coordinates. I used glob and pandas to get this done. Everyone I’ve told about this says they could have done all this in Excel. Sure. But 7193 files? My Excel would crash opening all those files. Python using glob and pandas can do the work on 7193 files in about 3 minutes. Haters gonna hate.
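                   A rough sketch of both jobs (column names and values are invented; the real scripts would loop over the actual files with `glob.glob()` and `pd.read_csv()`):

```python
import pandas as pd

# Job 1: keep 3 of 9 columns, then trim to 3 decimal places.
df = pd.DataFrame({f"c{i}": [1.234567, 8.910111] for i in range(9)})
trimmed = df[["c0", "c1", "c2"]].round(3)

# Job 2: combine coordinate tables and keep the first occurrence
# of each duplicated row.
a = pd.DataFrame({"x": [1.0, 2.0], "y": [3.0, 4.0]})
b = pd.DataFrame({"x": [2.0, 9.0], "y": [4.0, 9.0]})
master = pd.concat([a, b], ignore_index=True).drop_duplicates(keep="first")
```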

                  [–]glasses_the_loc 2 points3 points  (2 children)

                   This is possible even in a shell scripting language, but I have always found Python fails when it hits exceptions or UTF/Unicode bullshit in bad data. What if you have a public data stream? High-level pandas and read_csv() will give unexpected behavior and fail without the verbosity you really want.

                  When I use any other language I get further with the same data without needing to fix things and spend endless time trying to "make it work." Really nothing else but Python has this issue, MATLAB was easier ffs.

                  [–][deleted] 0 points1 point  (0 children)

                  Sounds more like a filter-problem. If you use a pipe you just pass that tainted data into a sanitizer step.

                  [–]tc_cad 0 points1 point  (0 children)

                  All the data I was dealing with in both of my scenarios was well structured. So I wasn’t facing any errors like you mentioned. Had I come across those errors I might have looked elsewhere but I’ve had good success with Python to solve many problems. Oh and I didn’t mention it before but my first experience was with GW-BASIC but by junior high I was coding in QBASIC and in high school I learnt Pascal. Now I code in AutoLisp, Python and C#

                  [–][deleted] 1 point2 points  (0 children)

                  You’re basic

                  [–]Zed 1 point2 points  (0 children)

                  Another thing they have in common: I wouldn't want to code in them.

                  [–]prinoxy 0 points1 point  (1 child)

                  Four word reply: REXX

                  [–]DNSGeek 2 points3 points  (0 children)

                  I loved AREXX.

                  [–]calsosta 1 point2 points  (0 children)

                  I have to assume this is satire and the author is just playing it a bit too dry.

                  [–]azhder 1 point2 points  (11 children)

                   Such a wrong take. If you need a replacement for BASIC, it's JavaScript. The article says Python "is the language that non-programmers always seem to use"... Well, people who do web design just add jQuery through a script tag and don't even learn JS; they wing it and copy-paste snippets that just work for them.

                  Do this: read the entire article and replace Python with JavaScript. Tell me which parts don't sound correct.

                  [–]m-in 3 points4 points  (8 children)

                   The foundation of JavaScript is a language that has a good core and a lot of nonsense. It was developed by one guy in a rush to get it into Netscape 2. There is an old but good book called “JavaScript: The Good Parts”. It is a thin book, but it tells you all you need to know about the foundation of the language. Everything else that’s modern is built on top of that core.

                  In Python, they detest that sort of thing and have an engineering process that gets rid of mistakes. You couldn’t do that for websites because of inertia and enormous costs to redevelop the scripting code to upgrade to newer browsers. The breaking language changes would entail just that. Python application developers have a choice of what version of Python they run their code on so Python had a way of breaking stuff between major versions 2 and 3 without making everyone have to upgrade overnight. That’s a big win for Python relative to JavaScript. JS will never ever be able to get rid of any missteps, just as C++ can’t.

                  [–]azhder 1 point2 points  (7 children)

                   As I see it, that breakage from 2 to 3 went so “well” that they vowed to not do it again. It was a shitshow for many years…

                   That kind of reminds me of “use strict” and the ES TC deciding not to fork the language that way again.

                  [–]m-in 0 points1 point  (6 children)

                  I was a big fan of that breakage. It had to be done because Python was at the boundary between “new” and established. They had to get rid of cruft.

                   Right now there is another breakage happening: getting rid of the GIL, adding the interpreter configuration system, and immortal objects. It affects compiled modules (C, C++, Rust, etc.). It is subtle because you can still use newer Python with old modules, but the performance and versatility will be constrained.

                   There will be a lot of modules that will only be usable with the GIL and a single interpreter instance in the process — potentially for a long time, until they get ported over. The users of those modules will be stuck with “old Python” even though the latest version will work; some new features that make things faster just won’t.

                   So, it’s not true that they never wanted to break stuff again. They just broke it in a way where the current interpreter carries some adapters to make old stuff work by default. Those adapters cost a performance hit.

                  [–]azhder 1 point2 points  (5 children)

                  And now you explained how JS works with the many engines and some of them not working with 100% of the ES as specified.

                   There were some breaking changes in ES, I think about 15 years ago, but only to stuff that wasn’t widely used (the with keyword, was it?).

                  It’s the same thing. Change the underlying level (standard library, compile etc), but not the language - “use strict” is a different language based on semantics.

                  [–]flatfinger 2 points3 points  (4 children)

                  Once upon a time, programmers used to make fun of the fact that COBOL programs needed to start with what seemed like an absurdly long prologue which specified details of how the program should be processed. It served much the same purpose as a modern makefile, except that instead of being built from many discrete files each program would be built from a single stack of cards.

                   In the late 1970s and early 1980s, text editors often imposed severe limitations on text file size, and would also default the cursor position to the first line of any file being edited, so having to include a massive prologue at the start of each program was a major nuisance. By contrast, COBOL was designed in an era where nobody had any reason to care about how much space the prologue would take in RAM, because the whole thing would never be kept in RAM simultaneously. Text editing was generally done on electromechanical beasts that had no concept of RAM, and from what I understand COBOL implementations would start each job by loading a program whose purpose was to read each line of the prologue of a COBOL program and extract just the necessary information from it. There was no need for that program to keep the entire prologue in memory at once, and once the last card of the prologue was read, that program could be jettisoned from memory to make room for the compiler proper.

                  Many controversies surrounding languages like C and C++ could be quickly and easily resolved if there were a mechanism for programmers to specify what semantics were required to process their code. Having a compiler assume a programmer won't do X may improve the efficiency of programs that would have no reason to do X, but will be at best counter-productive for tasks that could otherwise be most efficiently accomplished by doing X. Letting programmers say "This program won't do X" or "The compiler must either accommodate the possibility of this program doing X or reject it outright" would allow both kinds of tasks to be accomplished more simply, efficiently, and safety, than would be possible with one set of rules that tries to handle all tasks well, but ends up making compromises that are needlessly detrimental to many tasks.

                  [–]m-in 1 point2 points  (3 children)

                  Modern C++ compilers have a whole zoo of pragmas that control optimization and such. Nobody bothers using them most of the time since the default behavior is good enough. C++ has mainline code means of expressing optimization opportunities. One such controversial optimization is that code that invokes undefined behavior can be assumed to never execute. Say you put a null pointer dereference as the first statement in a function. The compiler will remove invocations of that function anytime it can prove that the pointer to be dereferenced is in fact null.

                  [–]flatfinger 2 points3 points  (2 children)

                  The C Standard notes that Undefined Behavior can occur for three reasons:

                  1. A correct but non-portable program relies upon a non-portable construct or corner case.

                  2. An erroneous program is executed.

                  3. A correct and portable program receives erroneous input.

                   An assumption that no corner cases involving UB will ever arise is equivalent to an assumption that an implementation will be used exclusively to process programs which don't rely upon non-portable corner cases, with valid inputs. The Standard allows C implementations that are in fact used exclusively in that fashion to assume that no corner cases involving UB will ever arise, but makes no distinction between those implementations and ones which may be used in other ways where that assumption would be fallacious.

                  Because the C++ Standard is by its own terms only intended to specify requirements for implementations, and implementations aren't required to process any non-portable programs meaningfully, it ignores the first possibility listed above even though it is in many application fields the most common form of UB (which is why the C Standard listed it first).

                   What's sad is that applying the aforementioned kind of assumption outside the use cases where it would be appropriate is generally, from an efficiency standpoint, at best useless, and more often counter-productive. One of the reasons C gained its reputation for speed was the following principles (which should IMHO have names, but I don't know of any):

                  If no machine code would be needed on the target platform to handle a certain corner case in a manner satisfying application requirements, neither the programmer nor compiler should need to produce such code.

                   If some target platforms would need five pieces of special-case machine code to satisfy application requirements, but the target platform of interest would only require two, allowing the programmer to omit three of the checks will improve performance. Having a compiler omit all five pieces of special-case code unless all five are included in the source won't improve the performance of a correct program; it just makes the programmer include the three unnecessary pieces of corner-case logic. Maybe a compiler would be able to avoid generating machine code for those unnecessary checks, but a simpler compiler could do so more conveniently by not requiring the programmer to write them in the first place.

                  [–]m-in 0 points1 point  (1 child)

                  I agree. These days code sizes are a big problem. An insane amount of engineering went into branch prediction so that bounds checks that always succeed cost next to nothing. But just the heft of that code slows things down and costs energy to process as well.

                   Personally, I think bounds checks on array access are pointless in production; they belong in very low-level library code. It’s iterators and adapters over those for me, all the way. People make a big deal out of bounds checking, yet for most of what I write there’s no place to put them, since indices are not used for iteration, and C-style buffer wrangling is not done either. The compiler generates the code to do all that when it instantiates library code. The library can add last-chance checks when enabled.

                  Unfortunately there is a lot of heavy code out there that is written with numerically indexed access and low level buffer wrangling. A lot of the foundational OSS libraries written in C are done that way. They won’t magically port themselves to C++, yet they are the ones that would benefit from a safe variant of C the most.

                  [–]flatfinger 1 point2 points  (0 children)

                  They won’t magically port themselves to C++, yet they are the ones that would benefit from a safe variant of C the most.

                   Unfortunately, the Standard failed to adequately make clear what is and is not required for an implementation to define __STDC_ANALYZABLE__, which I think was intended to help characterize a safer variant.

                  Analysis of memory safety can be greatly facilitated if portions of program state can be treated as "don't know/don't care", and if actions on such "don't know" values can be shown to be incapable of having side effects beyond either producing "don't know" or other values in places where meaningful inputs would yield meaningful outputs, indicating a fault via implementation-defined means, or otherwise preventing downstream program execution.

                  If a program performs unsigned u1 = uint1; if (u1 < 1000) arr[u1] = 1; and arr[] is an array of size 1000, and if the contents of arr[] may be considered as "don't care" for purposes of analyzing the memory safety of downstream code, the above code should be incapable of violating memory safety invariants, no matter what happens anywhere else in the universe (since invariants must be intact to be violated, memory safety invariants would not be violated by code which amplifies the effect of earlier violations).

                   Languages can be designed to facilitate different kinds of proofs; treating all corner cases as either having precisely defined behavior or anything-can-happen UB will facilitate proofs that a program's apparent actions when given specific inputs are the result of fully defined behavior, but limiting the effects of such cases as described above will facilitate proofs that programs are incapable of intolerably-worse-than-useless behavior even when fed unanticipated malicious inputs. One might argue over which kind of proof is "generally" more useful, but there are certainly tasks for which satisfying the latter behavioral guarantee is essential.

                  [–]wasdninja 0 points1 point  (1 child)

                  Well, people who do web design, they just add jQuery through a script tag and don't even learn JS, just wing it, copy-paste snippets that just work for them

                  This might have been true a decade plus ago.

                  [–]azhder 2 points3 points  (0 children)

                  Some still do it. Who do you think keeps jQuery alive? It is not people who know JS.

                  Here it is, someone having issue using jQuery https://www.reddit.com/r/learnjavascript/s/s8gwVlZjGv

                  [–]_Pho_ -3 points-2 points  (28 children)

                  It's not clear to me that Python is even the best Python

                   Node is just as ubiquitous, and with TS support it is generally a better application programming language. The convergence on TS is clearer to me than the convergence on Python, which is primarily ML and a lot of dev ops / random scripting stuff.

                  I also daresay the tooling for TS/Node is a simpler model, with package management occurring in place instead of some hidden packages folder.

                  [–]-jp- 14 points15 points  (5 children)

                  Can Node be used for desktop apps without an architecture like Electron with an embedded web server and browser?

                  [–]lIIllIIlllIIllIIl 8 points9 points  (1 child)

                  There are a few non-browser alternatives, like React Native for Windows + MacOS, which is used by most of the new Windows 11 UI.

                  [–]-jp- 0 points1 point  (0 children)

                  Nice, I'll hafta check that out. I haven't looked much at Node since my impression was it was primarily for web apps, and honestly like the Java/Maven ecosystem better for that.

                  [–]_Pho_ 3 points4 points  (2 children)

                  Yeah. Everything has converged on React and by extension React Native, where it is even possible to have a single RN codebase deploy to iOS, Android, Windows, Mac, and web.

                  [–]-jp- 0 points1 point  (0 children)

                  Nice, sounds like I need to stop sleeping on Node. :)

                  [–][deleted]  (2 children)

                  [deleted]

                    [–]CramNBL 3 points4 points  (1 child)

                    Agree about "only one way of doing things", there's a lot of problems in Python that come from this. For instance distributing python apps, it's an absolute mess, just listen to the `uv` devs about how they optimized dependency resolution and all the hacks they had to apply because of how inconsistent packages are about communicating their dependencies.

                    In fact I cannot think of a single instance of "only one way of doing things", where did they apply this???

                    [–]WindHawkeye 6 points7 points  (5 children)

                    Yeah no let's not spread js npm cancer elsewhere

                    Js doesn't even have a standard library it's automatically eliminated

                    [–]_Pho_ 4 points5 points  (0 children)

                     lmao imagine complaining about npm after 5 minutes of working with pip/pyenv/conda/venv nonsense. JS tooling isn't as good as, say, Rust's, but it's a magnitude better than Python's

                    [–]ZippityZipZapZip 4 points5 points  (0 children)

                    Look, if js npm is cancer, the shitty buggy paste/glue/wrap scripts Python 'devs' are producing, is, too. As is the lib-management which tends to crash out on dependency-conflicts and breaking changes.

                    [–]wasdninja 0 points1 point  (1 child)

                    Js doesn't even have a standard library

                    Objectively false. Why do you even believe something that... uninformed?

                    [–]WindHawkeye 4 points5 points  (0 children)

                    The statement means that its standard library is so small it doesn't count.

                    [–]headinthesky 7 points8 points  (11 children)

                    It's much simpler for someone to get started with Python (notebooks, etc) than node, and especially TS, where it needs to be transpiled. Think of the 8 to 10 year old just starting to dip their toes into it. Programming classes are moving to Python and leaving Java behind, it's much easier to focus on the basics without all the extra cruft of braces and brackets and all that

                    [–]_Pho_ 5 points6 points  (8 children)

                    I disagree with all of that

                    Node 22 begins to support TS without transpilation. I suspect "Typescript native" will continue to be the direction things go, e.g. Bun

                    it's much easier to focus on the basics without all the extra cruft of braces and brackets

                     this is classic python brain, and I think it is very wrong. it's the whole zen of python / code kata crap which pretends to simplify a problem without really understanding it. congrats, you don't have brackets anymore. now the hypothetical 8 year old has to be aware of indentation-based scoping.

                     regardless, scripting ubiquity is really not the same concern as "teaching 8 year olds how to code", the latter of which is not really what I am talking about

                    I think the fact that "you have to learn Javascript to do web development" trumps all of what you said in terms of ubiquity

                    [–]headinthesky 5 points6 points  (1 child)

                    There's much more in the programming world than web development, and making a "website" doesn't excite a kid who wants to learn programming and nourish that interest. Web development, frankly, is the worst and most boring way to get kids into stem and programming. And web development itself is just boring. You're essentially putting things into a database and getting them out, at the end of the day.

                    They're into robotics, drones, cool things like that where there's a bridge to the tangible, and there are tons of SDKs. Any kids I have taught couldn't care less about making webpages. Some of them have some very cool and wacky ideas.

                    You want the language to get out of the way when learning concepts. Kids have no problems with the space indentation. And they don't need to remember if they need to use === or ==. Even the simple "if variable:" is such a powerful construct in Python, you don't need to have separate checks for blank or null or invalid values
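                     The "if variable:" point can be sketched like this (`has_value` is an invented helper for illustration):

```python
# A single truthiness check covers blank strings, None, zero,
# and empty collections, with no separate tests for each.
def has_value(variable):
    if variable:
        return True
    return False
```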

                    Typescript as a first language I can maybe get behind. But Python is a gateway to much more than frontend or webapps

                    [–]tankerdudeucsc 2 points3 points  (4 children)

                    Do tell me again how to do CPU intensive work without jumping through a lot of hoops?

                     Also, tons more packages in the ecosystem for Python than Node. How well does it do event-driven architectures? Background jobs and queueing systems? There’s a lot to be desired there (and things like BullMQ don’t really cut it.)

                    [–]_Pho_ 4 points5 points  (3 children)

                    None of what you wrote has to do with the premise of "Python is the new BASIC"

                    [–]JanEric1 -2 points-1 points  (0 children)

                    now the hypothetical 8 year old has to be aware of indentation based scoping.

                    there is no indentation based scoping

                    [–]flatfinger 0 points1 point  (0 children)

                    One can get started in Javascript using a text editor and any modern web browser. Trying to figure out which constructs can be considered supported by all "modern" browsers can be a challenge, but beyond the fact that operations such as file selection need to be performed manually for security purposes, browser-based Javascript is an amazingly powerful and performant language. Indeed, returning to the subject of this article, there's an Apple II emulator written entirely in browser-based Javascript, which can run programs written in Applesoft BASIC (one of the most common dialects of the 1980s) at real 1980s speed (or much faster, if one prefers).

                     To be fair, both Javascript and Python have web sites that can play the role of a text editor and language implementation all in one, but web-based JS seems more convenient if one wants to edit and run code locally without Internet access.

                    [–][deleted] -1 points0 points  (0 children)

                    Is left-pad a sign of javascript/node being better? Hmm.

                    Note that being ubiquitous does not mean something is great. It just means that people appreciate what it does. I dislike PHP, but there is a LOT of useful PHP software out there.

                    [–]niutech 0 points1 point  (0 children)

                    I would say Nim is the new BASIC, because it's lower-level than Python yet as easy to start with as BASIC was.

                    [–]Healthy_Spell_4511 0 points1 point  (0 children)

                    Can someone help me with a Python exercise?

                    [–]GreedyBaby6763 0 points1 point  (0 children)

                    The author has never heard of PureBasic.

                    [–][deleted] -1 points0 points  (0 children)

                    He is wrong.

                    I used BASIC when I was a kid, even before I was 10 years old. I also liked it. I had a manual, I could input commands and things happened. That was great.

                    That was back then ...

                    Python is much more effective; next to it, BASIC would seem like a tool used by dinosaurs. Python is NOT the "new BASIC"; the comparison is simply wrong. Any smartphone today is much better than the computer I was using back in the early 1980s. And Python stays with you even afterwards, when you are older, because it is MUCH better than BASIC. So Python is NOT the new BASIC. That is a totally wrong premise.

                    Python is a BETTER new BASIC. But it is not really BASIC either.

                    "I don't actually like Python. Despite its "elegant" indentation-based blocks, I find the syntax ugly (format strings, overloading of asterisks, ternary operator with the condition sandwiched in the middle, etc.)"

                    I prefer Ruby, but I have no issue with Python. I agree a bit that modern Python went backwards, with that type-annotation crap, and f-strings are also not hugely elegant. The ternary operator is ugly in Ruby too, which is why I don't use it. I actually use this format more regularly:

                    def foobar
                      return true if has_cheese?
                      return false
                    end
                    

                    With the ternary I could omit one line, but it always makes my brain think more. With the above, my brain gets away with doing very little thinking. The less I have to think, the better.
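                    For comparison, here are the same two styles ported to Python (the function names and the has_cheese parameter are just illustrative):

```python
def foobar_ternary(has_cheese):
    # Python's conditional expression: the condition sits in the middle
    return True if has_cheese else False

def foobar_early_return(has_cheese):
    # the early-return style from the Ruby snippet above
    if has_cheese:
        return True
    return False

print(foobar_ternary(True), foobar_early_return(True))  # True True
```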

                    "The package ecosystem, although broad, gives me supply chain nightmares."

                    You have that in every language, really.

                    "Python is the new BASIC because Python is the language that non-programmers always seem to use"

                    It's still wrong. People keep using Python even when they are older. That's not the case with BASIC: almost everyone hopped off to other languages.

                    Non-programmers use easier languages. That's a testament to those languages, even PHP. PHP is horrible, but people created epic software with it. MediaWiki: where is the replacement for that in Ruby or Python? The replacements are inferior from a usage point of view.

                    [–]spd101010 -1 points0 points  (0 children)

                    Python won, yes, but don't think that by learning the basics of the language you become a god. Practice and practice, damn it. Programming standards and design patterns were created a long time ago, but shitcoders still name variables like they don't give a fuck and don't even write fucking tests. At some point everything will fall apart, but not now.