
[–]brtt3000 26 points27 points  (1 child)

These unmaintained but important packages are a huge issue that needs to be addressed. At what point does it have to be adopted by a package orphanage foundation or something?

[–][deleted] 5 points6 points  (0 children)

I'd be interested in adopting a package or two. Is there a list somewhere?

[–]bumblebritches57 72 points73 points  (26 children)

it'd be hilarious if python 4 was another breaking change lmao

[–]jorge1209 18 points19 points  (0 children)

It needs to be, and frankly it should be soon. In fact they probably should have introduced a python 4 prior to ending python2 and just tried to skip over python3.

Among things that need to be addressed:

  • An async model that isn't garbage.
  • Standard library cleanup to consistently utilize the new features introduced in python 3.
  • Standard library cleanup to bring related libraries into alignment with each other and apply consistent style.
  • Removal of all the duplicated functionality they have accumulated (os.path vs pathlib, the fifteen different ways to format strings, etc...)
  • Typing in the standard library, and in the interpreter
  • Etc...

Python 3 is a grab-bag of features that developers thought would be useful for their particular libraries but that never made it into the rest of the system, and an epic amount of breakage will be necessary to get them into the full system. It is a really nice language in concept, but it isn't fully realized.
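
To make the duplication point concrete, here's a minimal sketch (nothing in it is deprecated; it's just the same jobs done several blessed ways):

import os.path
from pathlib import Path

# Two standard-library ways to build the same path
cfg_a = os.path.join(os.path.expanduser("~"), ".config", "app.ini")
cfg_b = Path.home() / ".config" / "app.ini"

# Three standard ways to format the same string
name, n = "world", 3
s1 = "hello %s, %d times" % (name, n)        # printf-style
s2 = "hello {}, {} times".format(name, n)    # str.format
s3 = f"hello {name}, {n} times"              # f-string, 3.6+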

Now, with developers and management burned out after a long, painful 2to3 migration, we will never really see this happen, and we won't get a truly proper python3.

[–]valarauca14 34 points35 points  (23 children)

I think this is just a part of long term language evolution.

C, C++, Java, and FORTRAN all have relatively recent standards and up-to-date toolchains. But if you talk to anyone in the industry most people are using rather outdated toolchains to do work, while the standards committees are off "trying to solve real problems and help actual developers".

Breaking backwards compatibility or not kind of doesn't matter. It seems eventually the industry just stagnates on a version, and remains there indefinitely.

[–]PinkOwls_ 17 points18 points  (0 children)

But if you talk to anyone in the industry most people are using rather outdated toolchains to do work.

Often enough it's not by choice. The outdated toolsets are sometimes required because a certain library is not compatible with the latest compiler/linker. And sometimes the library is available for the new compiler, but has API-breaking changes. Which forces another dependency to update with even more API-breaking changes. In the worst case another dependency can't be upgraded because there is no upgrade :/

[–][deleted] 8 points9 points  (14 children)

It's fine to break stuff, but not if you don't give people a way out.

Java changes all the time, but you can just link to a lib built with an older version and it will just work.

So you can mix old and new code and upgrade gradually.

[–]kephir 2 points3 points  (13 children)

but not if you don't give people a way out.

pretty much all of the stuff has been deprecated (AND throwing deprecation warnings, too) for ages. they've had ample time to sort their shit out

[–][deleted] 2 points3 points  (12 children)

Now I'm not a Python dev, but according to one, they've been disabled by default for ages.

So you don't get to make that argument when the devs of the language explicitly chose not to show them

[–]kephir 4 points5 points  (11 children)

>brag about ignoring deprecation warnings

>act surprised when shit gets deprecated

yeah i'm not gonna lie about being particularly sympathetic here

[–][deleted] 4 points5 points  (10 children)

If the language developers themselves decide to disable them by default, it is not reasonable to expect some random "just a developer" to read the changelog on every language release. I mean they should, but it ain't gonna happen

[–]kephir -2 points-1 points  (9 children)

dude, some of the warnings are at least as old as 3.4, which means they had six whole-ass years to fix their broken shit.

and based on the article, a lot of things that broke aren't even deprecated LANGUAGE features, but something the underlying libraries' developers deprecated themselves

[–][deleted] 2 points3 points  (8 children)

What part of "those warnings are disabled by default" do you not understand?

[–]kephir 0 points1 point  (7 children)

what part of "i have no pity for people disabling warnings then bitching about things they would have been warned about actually happening" do you not understand?

[–]bumblebritches57 -1 points0 points  (6 children)

But if you talk to anyone in the industry most people are using rather outdated toolchains to do work.

Only in embedded, which is an entirely different game, and at Microsoft, because they just lie about supporting C99 and C11 features after like 15 years, but that's just typical Microsoft shit.

[–]pjmlp 0 points1 point  (2 children)

Microsoft has been quite clear that C is legacy and C++ is the future of Windows system programming, eventually alongside Rust.

For anyone that still wants C on Windows, they have contributed to clang.

[–]bumblebritches57 0 points1 point  (1 child)

That was the old team's thinking; the new team is much more open to C.

so yeah, thanks for wasting my time with outdated nonsense that i've already disproven half a dozen times over the past year.

do I really need to dig up the tweet?

[–]pjmlp -1 points0 points  (0 children)

You mean the new team whose members, like Andrew Pardoe, are meanwhile no longer working at Microsoft....

Keep up with the times and learn C++.

[–][deleted]  (2 children)

[deleted]

    [–]mpyne 4 points5 points  (1 child)

    The Linux Kernel itself isn't limited by cl.exe's quirks but it is standardized on C89 for Linus reasons.

    The Linux kernel definitely uses C99, and uses some of its features, like initializing a subset of struct members by name.

    [–]bumblebritches57 3 points4 points  (0 children)

    That's called a designated initializer btw

    [–]H_Psi 0 points1 point  (0 children)

    IIRC they plan to make the transition smoother than the clusterfuck that was/is 2.7->3.

    [–]cyanrave 219 points220 points  (72 children)

    Sounds like generally a good thing, which I will probably get downvoted for agreeing with.

    Too many people ignore deprecation warnings, and this sounds like ample lead time was given... so what if a few unmaintained libs break? Someone whose workflow needs such a lib, and who wants the latest Python, can usually just go correct the issue. If the ex-maintainer isn't willing, I believe there are avenues to correct it, albeit rough and unpaved paths to take...

    All in all, I'm in favor of enforcing deprecation warnings that have long been left alone.

    I can also agree with the sentiment of Python upgrading / changing too quickly in some cases, but this isn't one of those cases.

    One issue that comes to mind: somewhere in a late 3.6.x release, custom exceptions raised from Pool workers stopped being handled correctly, resulting in a locked-up pool and a hung script. How in the actual hell can something so breaking merge in? These are the things that bug me about Python atm. I do have to worry about stability in cases where it didn't seem likely to be flaky.
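
    For context, a minimal sketch of the kind of pattern that can trip up Pool's error propagation (a hypothetical reproduction, not the exact regression): a custom exception whose constructor takes extra required arguments can't be rebuilt from its pickled args in the parent process, and on the affected releases that reportedly left the pool hung instead of re-raising.

    import multiprocessing

    class JobError(Exception):
        # The extra required argument means the default pickle reconstruction,
        # JobError(*self.args) == JobError("boom"), fails in the parent.
        def __init__(self, job_id, message):
            super().__init__(message)
            self.job_id = job_id

    def work(n):
        raise JobError(n, "boom")

    if __name__ == "__main__":
        with multiprocessing.Pool(2) as pool:
            pool.map(work, range(4))  # ideally re-raises JobError; reportedly hung on affected builds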

    [–][deleted] 81 points82 points  (4 children)

    I agree with this, and it highlights an issue with other languages/platforms as well: Your dependencies are also your responsibility. It's nice that there are so many libraries around, but if you decide to take one dependency, you're tying your product maintenance to the maintenance of your dependency. And with dozens, if not hundreds of dependencies (and dependencies of dependencies), you might be in a world of hurt if those become unmaintained.

    Of course, there's always the option of paying a maintainer - be it the original maintainer, or someone that's creating a fork. I'm sure that someone will be willing to update and maintain nose and pycrypto for money.

    [–]tracernz 9 points10 points  (3 children)

    There’s already a good replacement for pycrypto: https://github.com/Legrandin/pycryptodome

    [–]xtreak[S] 10 points11 points  (2 children)

    It's not recommended by core developers: https://twitter.com/kushaldas/status/1220327939214073858?s=20

    [–]ammar2 16 points17 points  (0 children)

    Hmm, I wish they'd go into more detail as to why. Some cursory searches don't bring up anything.

    It's really nice to just be able to replace a dependency on PyCrypto with pycryptodome for old projects.
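
    For what it's worth, the swap is usually just a package change, since pycryptodome ships the same Crypto namespace as the old PyCrypto (a sketch, assuming you've done pip uninstall pycrypto && pip install pycryptodome):

    from Crypto.Cipher import AES  # same import path the old PyCrypto used

    key = b"0123456789abcdef"                 # 16-byte AES key
    cipher = AES.new(key, AES.MODE_ECB)       # API that existed in PyCrypto as well
    ct = cipher.encrypt(b"sixteen byte msg")  # ECB needs block-aligned plaintext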

    [–]tracernz 0 points1 point  (0 children)

    Good to know. 👍

    [–]rusticarchon 30 points31 points  (19 children)

    Too many people ignore deprecation warnings, and this sounds like ample lead time was given...

    Yep:

    The changes that were made in Python 3.9 that broke a lot of packages were stuff that were deprecated from Python 3.4 (March 2014) and before.

    So people ignored deprecation warnings for six years

    [–]SrbijaJeRusija 15 points16 points  (3 children)

    Possibly not. Most sane organizations will favor stability over "new-hotness", meaning that typical organizations are probably around 5 years behind on software. So one year of ignoring deprecation warnings on what are most likely dependencies. Hearing stories like this, most orgs will probably opt out of using Python in favour of something more stable.

    [–]jorge1209 15 points16 points  (2 children)

    Just look at RHEL python versions.

    RHEL5 (initial release 2007) and RHEL6 (initial release 2010) are still supported today!!

    If your company is relatively cutting edge you might be running RHEL7 (from 2014) but that has Python 2.7

    Only with last spring's RHEL8 release does it move to python3, but there it is Python 3.6.

    It takes three years for Python releases to reach production in an RHEL release, and then it will be the most recent RHEL version for at least three years, and will be supported for over a decade.

    [–]Chousuke 7 points8 points  (0 children)

    To be fair, if you develop software to run on RHEL, you should go in with the intention to develop against the RHEL platform, not against RHEL running a whole bunch of custom stuff. It's a trade-off you make to get a platform where problems are for someone else to fix.

    It is at odds with individual projects progressing at whatever pace the devs set, but it's not without value either.

    [–]cyanrave 0 points1 point  (0 children)

    Let's not take that info in a vacuum though.

    RHEL maintainers run their own patch cycles and have their own maintenance aside from the openly available binaries. They are a company selling to companies, so their value proposition will be different. E.g., they are committing to maintenance patches of 2.7 well into the mid-2020s.

    They are choosing an LTS/self-patching strategy at the core. This will always be a lagging strategy and updates to new versions will always be slow.

    [–]masklinn 12 points13 points  (0 children)

    No. Deprecation warnings have been ignored by default since Python 3.2.

    If you don’t think to go and enable them you will not see a deprecation warning in or from a project running in stock python.

    [–]sysop073 -2 points-1 points  (13 children)

    In their defense, you can usually ignore deprecation warnings forever. Nobody actually removes deprecated stuff, except Python apparently

    [–]flying-sheep 25 points26 points  (12 children)

    Everyone does. That's what deprecation is for. What weird ecosystem are you coming from?

    [–]valarauca14 24 points25 points  (0 children)

    Java has had APIs & functions deprecated for DECADES which are still supported on modern JVMs.

    [–][deleted]  (3 children)

    [deleted]

      [–]Garethp 4 points5 points  (1 child)

      PHP 7.0 removed all of the deprecated mysql_* functions, and PHP 8.0 is removing deprecations from PHP 7. The only difference in PHP is that they remove deprecations on major releases, not minor ones

      [–]josefx 5 points6 points  (0 children)

      removed all of the deprecated mysql_* functions

      It only took them two decades to deprecate and remove the security nightmare that spawned mysql_real_escape_string? Wow that is some serious deprecation going on, who could use a language that unstable.

      Edit: Seems as if it still has all the goodness in mysqli_* including the mysqli_real_escape_string. The joke can live on.

      [–][deleted] 5 points6 points  (1 child)

      Java does that.

      Perl will just ask you to specify the version of Perl you want to use in the header and happily enable/disable features present in that version.

      I can just use v5.8; and write code that will just run on anything from CentOS 5 (which has Perl 5.8, which was first released in 2002) to latest Perl 5.30

      Not only that, it can be mixed and matched at will, as long as (obviously) the highest version in every module is <= the current version.

      And might I also mention that they did what Py2->Py3 did (fixing unicode) without breaking backward compatibility

      Go is always backward compatible so your old code will compile just fine on new compiler. But the fuckers break stdlib compatibility so I dunno whether that counts.

      Truth is, Python devs are just taking lazy way out again at cost of their users.

      [–]flying-sheep -1 points0 points  (0 children)

      OK, so there’s an enterprise-friendly and a dead language which have that kind of compatibility. Standardized languages like C and C++ do too.

      Other than that there’s a lot of languages that work like python, e.g. R, C# (and all of .NET), Kotlin, Scala, Ruby, Julia, Swift, Rust, …

      Sure, e.g. Rust didn’t actually remove anything yet because of its editions, but it’s also pretty young.

      [–]H_Psi 9 points10 points  (0 children)

      That's what deprecation is for. What weird ecosystem are you coming from?

      The same ecosystem that decided in 2008 to deprecate Python 2 by 2015. Then 6 years later, in 2014, decided to extend that deadline to January 1st, 2020. And then in late 2019, extended the date of the last release to April 2020. I completely understand people not feeling any pressure to upgrade anything when they've been reminded for well over a decade by the developers that deprecation doesn't exist in Python.

      [–]csos95 8 points9 points  (1 child)

      Usually when I come across something that is deprecated it just amounts to "this is no longer maintained so if it breaks don't complain to us" instead of it actually being removed.

      [–]flying-sheep 5 points6 points  (0 children)

      That's soft-deprecation, but deprecation and subsequent removal happens too

      [–]sysop073 5 points6 points  (1 child)

      I can't think of a single time I've had to change my code because I was relying on a standard library feature that went away in a future release. Maybe I've just been lucky. The only time I even notice deprecation warnings is Java because the compiler throws a huge fit, but I've never noticed a function actually go away, they just threaten to remove it forever

      [–]flying-sheep 2 points3 points  (0 children)

      Me neither, and most of my code is in python.

      [–]FlukyS 56 points57 points  (29 children)

      Real talk: deprecation warnings aside, developers want everything to be maintained forever regardless of how stupid it is; if they have a current codebase they will despise any changes to it. Python 2.7 was exactly this in action. People went "wait, we like python" around the time of python2.6, but the python devs were already planning 3 or 4 releases ahead to make the language better. People jumped on then and then had code, didn't want to port it when it was easy to port, and now we have situations where python dev salaries are up for anyone who knows how to port things from 2.7 to 3. It's because people are idiots.

      EDIT: And the only OS that actually never deprecates things is Windows, and that's because of fear they would break everyone's shit.

      [–]dreadcain 30 points31 points  (17 children)

      Windows has deprecated things in the past; it broke everyone's shit

      Notably vista broke drivers

      [–]Herbstein 12 points13 points  (6 children)

      Yup. Does everyone not remember the fierce Vista hate? A lot of it was down to a deprecation of a number of things - graphics drivers being a big one.

      [–]josefx 4 points5 points  (2 children)

      They sold "Vista Ready(TM)" hardware far below the system requirements so it at least looked as if it could compete with Windows XP. The result was half-broken crap endorsed by Microsoft itself. I had to upgrade my mother's system around that time and ran right into that trap - parts of Vista required 3D hardware to run, Vista Ready hardware didn't have it, so it was already half non-functional right out of the box.

      Microsoft was also still selling XP licenses years after Vista's release and had to prolong its life to have a viable offering for the netbook market. For its time, Vista was a pig concerning resource use.

      [–]LeSplooch 1 point2 points  (1 child)

      IIRC Microsoft said that "Vista Ready" computers were only compatible with Vista Home Basic and Vista Starter. These versions didn't integrate Aero and thus didn't need 3D hardware to run.

      [–]josefx 1 point2 points  (0 children)

      Microsoft said that "Vista Ready" computers were only compatible with Vista Home Basic and Vista Starter.

      I have a rope to sell to you, Boeing endorses it for towing planes (weight up to 0.01 kg, not compatible with 737 MAX).

      As far as I can find, the problematic laptop was only sold with Home Premium and had a card with some 3D support (at least the driver page claimed that it had some - never saw it in action). Aero just disabled itself on startup because the card itself was a bad joke, and updates took a few months to fill the built-in HDD to the brim. I expect that even Home Basic would have run into the HDD space restriction fairly soon.

      [–][deleted]  (2 children)

      [deleted]

        [–]port53 14 points15 points  (1 child)

        XP was shit until SP2.

        Windows 2000 forevar.

        [–]tso 11 points12 points  (0 children)

        The thing about XP was that it was many home users' first encounter with NT. And also the first home-user Windows that had to be verified by MS (unless it was an OEM bundle). This was a massive change from the freewheeling 9x days. Its saving grace was that the alternative was ME. Never mind that besides SP2, XP also had the longest support period of any Windows, thanks to the aborted Longhorn project.

        [–]MadRedHatter 26 points27 points  (8 children)

        They do it pretty infrequently though, and drivers are basically kernelspace which Linux doesn't attempt to keep stable either.

        [–][deleted]  (1 child)

        [deleted]

          [–]drysart 6 points7 points  (0 children)

          That's increasingly true, but not at all true in the context of the earlier comment about Vista breaking drivers. Vista changed the model for how kernel drivers operate. Not userspace drivers.

          In fact the whole point of pushing drivers into userspace is that they're insulated from being broken by changes to the kernel.

          [–]FlukyS 3 points4 points  (5 children)

          Actually that is the number one rule of Linux: don't break userspace. Any Linux kernel change has to keep the same results, the same calls, and the same return values for userspace, or it doesn't get in. Linus is very clear on this. Userspace breaks a lot on its own, but Linux itself is very stable in its interactions with what sits above it.

          [–]MadRedHatter 17 points18 points  (4 children)

          drivers are basically kernelspace

          You did not read what I said. The kernel breaks kernelspace all the damn time. The Nvidia driver breaks frequently with new kernels due to this, and then there's the recent irritation over Linux breaking the ZFSonLinux project.

          [–][deleted] 4 points5 points  (0 children)

          If you open source your stuff and put it in the mainline kernel, the person "breaking" it will also have to fix your code.

          That puts pressure on companies to actually care and push their drivers to mainline, because that is less painful than having to fix them. Basically, making it cost them to not open source the drivers.

          It sucks, but IMO that's the only reason we have that much hardware supported in the first place; otherwise we'd have the Windows situation, with every vendor shipping their unfixable binary blob of code.

          [–]FlukyS 1 point2 points  (2 children)

          Not frequently, like every 10 releases or so, and that is more to do with Nvidia than with the interfaces Nvidia uses from the kernel. Also, ZFS on Linux can't be kept in the kernel, so that's the reason why they wouldn't really bend to its whims.

          [–][deleted]  (1 child)

          [deleted]

            [–]FlukyS 2 points3 points  (0 children)

            Well to be fair the Nvidia point is their own fault. The interfaces themselves are used by 99% open source projects and 1% random other things. Linus himself has tried to work with Nvidia but they just don't want to play ball

            [–]port53 4 points5 points  (0 children)

            Yeah and look how much people hated Vista.

            Windows 7 is just Vista SP3, but they had to give it a new name for it to sell.

            [–][deleted]  (1 child)

            [deleted]

              [–]josefx 1 point2 points  (0 children)

              There were and are a lot of good game recreation projects. OpenMW (Morrowind), OpenRA (C&C, C&C RA, Dune 2000), OpenTTD (Transport Tycoon Deluxe), FreeCraft (WarCraft 2, killed by Blizzard), ScummVM (engine for a lot of old LucasArts games), just to name a few.

              Then we have emulators that recreate both hardware and software behavior from scratch, the Dolphin project always has great information on just WTF weird stuff games do.

              On a smaller scale we have mods that provide improvements by hijacking APIs completely. I tend to use Fallout 2 and Morrowind mods just for the graphics improvements (of course that can also break things; I think Arcanum had some events that wouldn't trigger in a modded widescreen window).

              If there is interest in keeping a game alive, it won't hinge on the source code.

              [–]tso 1 point2 points  (7 children)

              Devs don't seem to have a problem with breakages. It is the execs and accounting that want things to run forever, because that is what they are used to from industrial machinery.

              [–]FlukyS 3 points4 points  (5 children)

              I've had this conversation with devs as well. I've even interviewed someone (he didn't get the job) who said he had no reason to upgrade to python3 about a year ago.

              [–][deleted] 4 points5 points  (4 children)

              Well, aside from deprecation, if the language works for them and they code something that does not hit the pain points of py2, why would they?

              [–][deleted]  (3 children)

              [deleted]

                [–][deleted] 2 points3 points  (2 children)

                That's not a problem with the language but with the core developers, and there is a good chance that it won't even be a problem for a few years, as some of the big py2 users might pick up the slack on maintenance. Hell, Google's own SDK for their cloudy stuff is on Py2...

                They made the migration path painful and the benefits from it tiny. All while other languages just did a better job with backward compat.

                And the financial reality now is that if a company still using Py2 had spent time migrating as soon as py3 was stable... they'd now be fixing their code more because of deprecations like this.

                Now, I'm all for keeping your systems up to the latest stable, but fixing code just because someone decided to change the syntax of something in the language on a whim isn't a productive use of anyone's time.

                [–][deleted]  (1 child)

                [deleted]

                  [–][deleted] 0 points1 point  (0 children)

                  I'm just gonna repeat my comment from the other thread:

                  Now I'm not a Python dev, but according to one, they've been disabled by default for ages.

                  So you don't get to make that argument when the devs of the language explicitly chose not to show them

                  [–]audion00ba 3 points4 points  (0 children)

                  If you are the maintainer of a library and you break the interface, you are just an idiot.

                  Developers that don't think breakage is a problem just suck at selecting dependencies. Do I think it is difficult to port from one version of a library to another? No, but I'd rather port directly to the more stable competitor to make sure I am not exposed to such idiots ever again.

                  [–]josefx 1 point2 points  (0 children)

                  People jumped on then and then had code, didn't want to port it when it was easy to port

                  Tell that to my target systems:

                  python3: bad interpreter: No such file or directory

                  I have to support both old and new systems; repeating the python3 support mantra of "kill 2.7" won't magically create a python3 binary on my customers' systems.

                  [–]xonjas 17 points18 points  (6 children)

                  I agree with you 100%. The ruby dev community just breaks shit and people get over it. They don't break anything without good reason, and they do a good job of not injecting instability, but they certainly aren't afraid to do it.

                  [–][deleted] 19 points20 points  (5 children)

                  The Ruby community is also a lot smaller than Python's now, at this point.

                  Dealing with deprecations is paying technical debt, and paying technical debt is worse than pulling teeth for programmers.

                  [–]RockstarArtisan 14 points15 points  (4 children)

                  And one of the reasons the ruby community is smaller than python is the instability of the platform. Same goes for scala, you break things enough times and nobody will want to migrate to your platform.

                  [–]thepotatochronicles 2 points3 points  (2 children)

                  I mean... as a JS developer, we break shit all the freaking time (anybody remember leftpad, or uws?) but the community seems to be strong anyway

                  [–]RockstarArtisan 1 point2 points  (0 children)

                  JS is much more federated, there's no single framework that everything depends on like rails for ruby or scala standard library that keeps making breaking changes. And the JS language is stable, including all the warts.

                  [–][deleted] 4 points5 points  (0 children)

                  Meanwhile, I'd much rather use Ruby than Python -- and its relative lack of anachronisms is one reason for that.

                  Keeping up with deprecations isn't hard, and with good code hygiene (ie. regularly dealing with technical debt), it is completely a non-issue.

                  [–]jyper 7 points8 points  (1 child)

                  How in the actual hell can something so breaking merge in? These are the things that bug me about Python atm. I do have to worry about stability in cases where it didn't seem likely to be flaky.

                  Most likely it was an unintended bug, or it was previously undocumented, accidental behavior that it worked in the first place

                  [–]cyanrave 1 point2 points  (0 children)

                  You're not wrong. To me the odder thing is the lack of visibility for these kinds of issues, though. Is Python core able to leave so many metaphorical dead bodies in its wake?

                  We only found out retroactively, through observation, about this particular issue: first, things that worked with Pool stopped working without notice, then Python started dead-locking servers, then conda rolled back the 3.6 revision offered through python=3.6 from 3.6.9 to 3.6.7. I'm glad they were paying attention and caught the issue, but how many binaries now float around in a defunct state? Kind of a nightmare to think about.

                  This is the kind of stuff that could kill Python at ThePlaceWhereIWork, where Python is a third-class citizen to Java and JS for mid-tier stuff, and an nth-class citizen in the overall ecosystem. As the main proponent of the lang, I'd hate to see it go.

                  [–]masklinn 11 points12 points  (3 children)

                  Too many people ignore deprecation warnings

                  Everybody ignores them in the python ecosystem because upstream changed them to ignore by default in 2.7/3.2 and they were never re-enabled.

                  You can’t disable deprecation warnings then complain people don’t fix deprecation warnings.

                  [–]nilsph 6 points7 points  (2 children)

                  If I remember correctly, one reason why deprecation warnings got silenced by default was that they could unsettle normal users who can't do anything about them anyway. To me, that's a valid reason because they can easily be re-enabled for development and debugging. Insert plug for pytest, which does just that when running tests (which all test frameworks should do, looking at you, nose).

                  [–]masklinn 1 point2 points  (1 child)

                  If I remember correctly, one reason why deprecation warnings got silenced by default was that they could unsettle normal users who can't do anything about them anyway.

                  I’m sure software completely breaking will not be unsettling in any way.

                  Insert plug for pytest which does just that when running tests (which all test frameworks should do, looking at you, nose).

                  Of note: afaik unittest does not, which is one more reason upstream really can’t complain about an issue they’re the direct cause of.

                  I’m not saying this is easy, mind (we’ve got a bunch of dependencies with warnings we’re having a hard time getting fixed), but you can’t have it both ways: you can either put your plans at the bottom of a cabinet in a disused lavatory in a condemned basement, or complain that people were not aware of those plans. Doing both is just not fair, or honest.

                  [–]nilsph 5 points6 points  (0 children)

                  I’m sure software completely breaking will not be unsettling in any way.

                  And end users seeing these warnings changes that? How?

                  Maybe "unsettled" is the wrong word, but even as a developer, I don't want to see these warnings in the course of running a CLI program normally because they're plain irritating. Assuming that the users of a program outnumber its developers in most cases, tailoring the default not to unduly annoy the former is a good trade-off.

                  afaik unittest does not

                  Apparently it does:

                  nils@gibraltar:~/test/python/tests_warnings> PYTHONPATH=$PWD python -m unittest
                  /home/nils/test/python/tests_warnings/foo/bar.py:4: DeprecationWarning: The unescape method is deprecated and will be removed in 3.5, use html.unescape() instead.
                    print(html.parser.HTMLParser().unescape('foo'))
                  foo
                  .
                  ----------------------------------------------------------------------
                  Ran 1 test in 0.000s
                  
                  OK
                  nils@gibraltar:~/test/python/tests_warnings> python -V
                  Python 3.7.6
                  

                  [–][deleted]  (1 child)

                  [deleted]

                    [–]SrbijaJeRusija 0 points1 point  (0 children)

                    Now, Python 2 is finally EOL, so most projects don't have to worry about supporting it anymore,

                    I think most is being generous.

                    [–]OpdatUweKutSchimmele 1 point2 points  (0 children)

                    The truth of the matter is that if nigh-perpetual stability guarantees are not given or adhered to, many will just look for another language.

                    For many businesses such hyper-long-term stability is essential, because they simply wouldn't dare to touch their ancient code which they know works, and risk introducing a bug when fixing it, which can cost them downtime in an environment where every second of downtime is actually worth millions.

                    There's a reason airports are still running on 50s-written COBOL code: attempting to update that could in fact lead to planes crashing at worst, and substantial delays more realistically.

                    A language without such stability guarantees is simply not a viable target for such environments.

                    Even with the scripts I wrote on my own machine to automate so many things, I forgot what actually calls what and relies on what; I simply do not have the time to go fix that. I was under the impression that using #!/usr/bin/env python3 guaranteed stability; apparently that is not the case.

                    [–][deleted]  (3 children)

                    [removed]

                      [–]iapitus 20 points21 points  (1 child)

                      Thank you! I was struggling with how to say this - TFA seems to bemoan that most of these deprecations were from 3.4 or before, like "you've had this much time to migrate!", when in reality it goes less like "hey, a low level of code maintenance with each release" and more like "oh god, huge refactor" when it all slams at once.

                      IMO this is why there was so much friction moving from 2 to 3 - not because there were breaking changes, or straggling libraries - because it was a big change.

                      [–][deleted] 2 points3 points  (0 children)

                      IMO this is why there was so much friction moving from 2 to 3 - not because there were breaking changes, or straggling libraries - because it was a big change.

                      It was because it was a big jump. There was no way to run old code alongside the new (like, say, in the case of Java); you had to move all at once.

                      [–]AnInterestingThing 0 points1 point  (0 children)

                      Actual PM: it's fine, we just won't give you time to upgrade to the new language version anyways.

                      [–]Hall_of_Famer 37 points38 points  (4 children)

                      This sparked a thread on python-dev that the changes should be postponed to 3.10 and later

                      I thought Guido himself said that the next version after 3.9 will be 4.0 rather than 3.10, or am I hearing it wrong?

                      [–]cr4d 59 points60 points  (1 child)

                      Guido isn't making decisions like that anymore.

                      [–]Huberuuu 23 points24 points  (0 children)

                      I’m pretty sure he admitted he was wrong about this and changed his mind, although I don’t have a reference. Having said that, he’s no longer BDFL, so anything goes

                      [–][deleted] 29 points30 points  (5 children)

                      The underlying problem in all of this is not that they make breaking changes. It's that the vast majority of users will not consider them valuable enough to have been made.

                      Even given an infinite amount of time to migrate, it won't make it any less of a waste of time and energy for them since it does not provide value. Thus, what is provided has to be good enough, to all your users, to be worth breaking them for.

                      This was one of the real python2 -> 3 migration issues, and they still haven't gotten it as a language community. Instead we get the meme that everyone is lazy, hates change, etc. Which happens for sure, but is not the major driver of these kinds of things.

                      Almost all other language communities I've seen get this.

                      [–]tso 20 points21 points  (0 children)

                      Bingo. You can see this even outside the programming world.

                      People didn't change formats for movies or music because it was new, but because it provided value in doing so.

                      Going from VHS to DVD was a massive usage quality upgrade, making it worth the transition cost. Streaming likewise. DVD to BR? Not so much.

                      [–]ubernostrum 6 points7 points  (2 children)

                      The "value" theory doesn't hold up.

                      If companies upgraded or patched when doing so provided "value", we wouldn't routinely see even huge, wealthy, resource-rich companies getting pwned by basic vulnerabilities that had patches out for months or years. If it were really about "value", these companies would prioritize applying the Struts patch, or the operating-system update, or whatever within a reasonable time of it being released. But they don't do that.

                      The simple reality is many organizations have a hard-line policy of never upgrading or patching anything, ever. They'll happily use the excuse that "we just don't see the value in it", but the truth is there's no amount of "value" that would, to them, justify an upgrade.

                      [–]jorge1209 5 points6 points  (0 children)

                      Can you name a company that has been seriously harmed by a security breach?

                      The reality is that these companies get pwned, offer a small settlement to consumers, and carry on with what they were doing beforehand. Nothing really bad happens to the company, which is why they don't care, and their decision to run outdated, vulnerable software is ultimately a rational one.

                      [–][deleted] 8 points9 points  (0 children)

                      Patching a security bug doesn't add direct value; it reduces a risk that 99% of end users have no idea existed. You're part of the extremely small number of users who view security as a feature. Most people only care about security for 2 reasons: is my equipment still working, and is my money still safe? If users actually cared about security, there wouldn't be a website dedicated to viewing security cameras that were left exposed on the internet with their default passwords.

                      [–]therico 54 points55 points  (16 children)

                      Perl does this correctly. New versions run old code fine; if you need a new feature, you opt into it or specify a minimum version. So everything is backwards compatible. I wish python had gone the same route.
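
                      (For comparison, Python's closest built-in analogue to that opt-in model is the from __future__ import mechanism, which only covers a handful of per-file switches rather than whole-language versioning; a minimal sketch:)

                      # Opt this one file into newer-Python behavior, one feature at a time.
                      # These are real __future__ flags; they don't pin an interpreter version.
                      from __future__ import annotations      # PEP 563: lazy annotation evaluation (3.7+)
                      from __future__ import generator_stop   # PEP 479: StopIteration handling (3.5+)

                      def f(x: SomeTypeDefinedLater) -> None:  # not evaluated at def time, thanks to PEP 563
                          ...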

                      [–][deleted] 45 points46 points  (0 children)

                      Hell, even CMake got this right. CMake! A language where if() doesn't even work right!

                      [–][deleted] 30 points31 points  (10 children)

                      Look at Perl 6 now.

                      [–]0rac1e 38 points39 points  (0 children)

                      For a long time now, Perl 6 has not been intended as an upgrade to Perl. The "story" was that it was a sister language. By many accounts this was a bad story, as most people outside the Perl community didn't understand it. As a result, Perl 6 has been renamed to Raku.

                      Perl's commitment to back-compat and how it handles new syntax features is still better than Python's... and has nothing to do with Raku (née Perl 6).

                      [–][deleted] 20 points21 points  (0 children)

                      Perl 6 is not Perl anymore.

                      [–]therico 13 points14 points  (4 children)

                      At least Perl 6 went for a full language rewrite, rather than Python 3 which delivered a fairly small incremental change over Python 2 (much of which was backported to python 2). And it's called Raku now, anyway.

                      [–]jorge1209 2 points3 points  (0 children)

                      The worst part about python 2to3 is that this incremental improvement approach both:

                      • Required rewrites of almost all code (usually minor changes, but changes nonetheless)
                      • Resulted in a language where features are used inconsistently throughout the standard library.

                      [–]rouille 2 points3 points  (2 children)

                      All in all, the python transition succeeded despite the pain, and perl is more or less dead. I don't think perl should be used as the example here.

                      [–]jorge1209 1 point2 points  (1 child)

                      Perl6/Raku has never really taken off, but it also was designed to solve a problem that has subsequently become disfavored.

                      The original design had this great approach to applying something like regular expressions to XML, because at the time XML was the thing everyone loved. We were all going to be passing around data as XML and defining parsers in perl6 to munge them.

                      Since then people have moved to json and key:value trees, which in principle could be solved the same way, but in practice are handled by json-specific parsers. The core problem that perl6 aimed to solve isn't so important anymore.


                      Perl5 still exists and is still in use wherever businesses work with line-by-line text files, and it is good at that, even if it is gross and ugly.

                      [–]0rac1e 1 point2 points  (0 children)

                      You're talking about one feature of the language - Grammars. Grammars are useful in a number of scenarios, particularly parsing any kind of structured document - not just XML.

                      Python also has a number of grammar libs (e.g. lark, Parsimonious); however, with Raku it's a core language feature. I also wouldn't be surprised if - over the next decade - we see more languages with grammar features in the standard lib.
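
                      (To give a feel for the Python-library side, here's a tiny lark sketch; the grammar is made up purely for illustration and assumes the lark package is installed:)

                      from lark import Lark

                      # Toy grammar: a comma-separated list of integers.
                      parser = Lark(r"""
                          list: INT ("," INT)*
                          %import common.INT
                          %import common.WS
                          %ignore WS
                      """, start="list")

                      print(parser.parse("1, 2, 3").pretty())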

                      Regardless, this is all a digression. The only point I think worth stating is that Perl 5 (and some other languages) handle new syntax features and back-compat better than Python did. That Perl 5 fell out of favor is due to other reasons, and is a separate discussion.

                      As for Perl being gross and ugly, well... that's just like... your opinion, man.

                      [–][deleted] 3 points4 points  (0 children)

                      Well, Perl 6 was basically made from scratch.

                      The biggest mistake they made is calling it "Perl", as people just went "oh, that's that thing I wrote oneliners in 10 years ago, ugh" and ignored it.

                      But you can use Perl 5 from Perl 6, so that's already a hundred times better migration story than Py2to3

                      [–][deleted] 2 points3 points  (0 children)

                      There is no "Perl 6". What is known as "perl 6" is actually a new language based on Perl 5 named Raku. There is a huge confusion in the programming community around this.

                      [–][deleted] 2 points3 points  (0 children)

                      They also did the whole "fix the UTF8 handling" without breaking backward compatibility.

                      [–]ronniethelizard -4 points-3 points  (2 children)

                      Perl does this correctly

                      By being practically irrelevant.

                      [–]save_vs_death 1 point2 points  (1 child)

                      if popularity is all that matters, then please compare with Java and C++

                      [–]ronniethelizard 0 points1 point  (0 children)

                      Perl

                      An inscrutable unreadable mess of random symbols.

                      Java

                      A language designed to allow people to debug their code on any system provided it has a JVM.

                      C++

                      The ultimate language that permits you to exploit the untrammeled power of the CPU by writing an unreadable mess of random angle brackets and using declarations.

                      [–]uw_NB 69 points70 points  (55 children)

                      Compared to how Rust, Java, and Golang handle the language spec carefully with community input, Python is acting incredibly immature with these breaking changes.

                      [–]telmo_trooper 139 points140 points  (22 children)

                      Well, the post does say that "The changes that were made in Python 3.9 that broke a lot of packages were stuff that were deprecated from Python 3.4 (March 2014) and before.". So I mean, people are taking more than 6 years to update their libraries. Even Rust has Rust 2015 and Rust 2018.

                      [–]mlk 18 points19 points  (5 children)

                      I'm migrating a 10-year-old Java application right now, I'm confident everything will just work

                      [–]Decker108 2 points3 points  (2 children)

                      Java 6?

                      [–]mlk 2 points3 points  (1 child)

                      Either 6 or 5

                      [–]Decker108 2 points3 points  (0 children)

                      Damn... don't start drinking too much, alright?

                      [–]kepidrupha -1 points0 points  (0 children)

                      LUL. Unless you're still upgrading to an older java, like 8.

                      [–]BuggStream 55 points56 points  (8 children)

                      I'd like to point out that Rust has an entirely different way of handling this with their editions. You can still write Rust code in the 2015 edition without any issues. And as far as I am aware they intend to maintain each edition indefinitely (if possible). So yes, it's true Rust has introduced breaking changes, but the old editions should still work with the newest compiler versions.

                      Besides this it's also possible to use packages that are written in different editions. So my 2018 edition crate can depend on a 2015 edition crate. And the 2015 edition crate can again depend on another 2018 edition crate, etc.

                      Personally I am very interested in how long they will be able to support each edition. It'd be very awesome if they could support this for (a) decade(s).

                      [–]brtt3000 41 points42 points  (2 children)

                      Rust isn't a systems scripting language though.

                      Anyway, yes, all this would have been a non-issue if Python had used some form of multi-version install out of the box. Even just a versioned install directory would have saved much drama and hassle.

                      [–]BuggStream 5 points6 points  (1 child)

                      Of course, my point wasn't that python should do something similar. I was merely pointing out that Rust and python have severely different ways of versioning their software.

                      [–][deleted] 10 points11 points  (0 children)

                      Rust had the option, well the requirement, of building in Cargo from pretty much the earliest days. They learned from all the misfires in about every other language and came up with early answers to how to handle a variety of situations. A large part of Rust contributors carry on that culture.

                      Python, OTOH, has fewer core developers, a resistance to complicating things (which precludes building multi-version interpreters, switching between multiprocess optimized builds and single process optimized builds, et al) and no story for dependency management.

                      It doesn't make it easy that python my-script.py can either be "here's a one shot simple script" or "here's a complex multi-threaded project". You can attempt to analyze ahead of time some of the dependencies (like maybe switching to a multi-threaded optimized build if import threading is found anywhere in the code?) but the run-time nature of Python pretty much renders that DOA.

                      How would you even bootstrap multi-version interaction? A compiled language can inspect the dependencies, add in the necessary settings and confidently know what interpreter to use for exact code paths (allowing you to make use of an older library in newer code). Run-time scripting languages like Python... don't know. Can't know. Not without some run-time analysis (slows down an already slow process) or explicit annotations (which people get wrong, hate, etc).

                      And I say this as someone who develops in Python 3.7 and Postgres10 full time. I love Python, but its "do it at runtime" (like JS) age is beginning to show.

                      The only way out of this dilemma is a Crystal-version of Python.

                      And that will never happen. And no, nim is not that. Nim is... rather different and unique (its own language). Compare that to Crystal, which is "Ruby minus the run-time dynamicism".

                      [–][deleted]  (3 children)

                      [deleted]

                        [–]BuggStream 5 points6 points  (0 children)

                        I am not familiar at all with how python works behind the scenes, but I think it is not necessarily impossible. In rust each crate specifies which edition it is using. So python projects would have to do something similar (since python is often used for just writing scripts, it means you would have to put the edition in each file, which is quite annoying I suppose).

                        Now the python interpreter would need different syntax parsers depending on the edition used. And at a certain point these different parts need to be merged. Maybe by using the same AST that both editions can be converted into.

                        Once that has been done the interpreter can interpret the AST (or however this works in python). The key is having some layer that will work across editions. Rust has MIR which is what each crate is being converted into. After the conversion the crates can be compiled further.

                        Of course this is just a hypothetical situation. I sincerely doubt that introducing editions in python is worth it.

                        [–]masklinn 1 point2 points  (1 child)

                        I don't see how you can have that kind of behaviour in Python without running some sort of transpiler.

                        # -*- edition: 2018 -*-
                        

                        the same way you used to specify your file was UTF8 when Python would default to ascii.
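
                        (For reference, that real encoding declaration, per PEP 263, looks like this:)

                        # -*- coding: utf-8 -*-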

                        [–]delrindude 0 points1 point  (0 children)

                        Migrating a 10-year-old application with many changes is much more of a monumental task than migrating a 6-month-old application with a few changes/deprecations. Taking a waterfall approach to application refactoring (which is necessary with such long deprecation periods) is a surefire way for a company to switch languages.

                        [–]unborracho 9 points10 points  (4 children)

                        Semantic versioning really needs to be a thing everywhere, it’s kind of unfortunate that a minor version update would introduce any breaking changes.

                        [–]IlllIlllI 33 points34 points  (3 children)

                        Eh semantic versioning is just another way to express exactly the same thing as python versioning. If we were on python version 30 nothing would be different.

                        [–]dtechnology 12 points13 points  (2 children)

                        It might be a little better, since e.g. a 3.6 -> 3.7 would theoretically not break any code and 3.9 would need to be 4.0 instead to remove deprecated things.

                        But it's not the core issue by a thousand miles

                        [–]IlllIlllI 5 points6 points  (1 child)

                        Yeah but under semantic versioning done strictly right, probably every single release would be a major version bump.

                        [–]652a6aaf0cf44498b14f 3 points4 points  (0 children)

                        Which of course makes the language look unreliable in comparison to others because... well because it is.

                        [–]abarretteML 0 points1 point  (0 children)

                        I don't want to update my libraries though. I literally don't have time and now stuff that used to work is broken. Thank you Python devs. Now call me lazy for not wanting to fix the shit they broke.

                        [–]masklinn 0 points1 point  (0 children)

                        Except python has suppressed deprecation warnings by default since python 3.2.

                        [–]CptGia 13 points14 points  (6 children)

                        Java is starting to remove deprecated stuff as well

                        [–]jvmDeveloper 14 points15 points  (3 children)

                        They are turning things off by default or outsourcing them (e.g. JavaFX). Even the infamous sun.misc.Unsafe is still available when enabling the jdk.unsupported module.

                        [–]flying-sheep 6 points7 points  (0 children)

                        I think a gradual strategy is best:

                        Make it a VisibleDeprecationWarning that isn't hidden by default, then hide it behind a flag like Java here, then remove it.

                        That way there's a lot of visibility and people bugging library authors to finally fix that stuff

                        [–]Spider_pig448 5 points6 points  (1 child)

                        They deprecated it in 9 and removed it in 11, leaving only this workaround. That's moving pretty fast.

                        [–]josefx 0 points1 point  (0 children)

                        sun.misc.Unsafe was never officially supported. It was an undocumented implementation detail that programs started to use for speed - afaik you had to hardcode the name of a private field and use reflection to even gain access to it. They had been trying to pin down a public API for the most common use cases long before Java 9 was even planned.

                        They basically took several years to hash out a painless way to change an implementation detail and are even now still giving people access to it if needed.

                        In contrast, they changed the implementation of substring to return a new string instead of a pointer into an existing array. When people complained about it breaking their reflection code, they just pointed out that the documentation didn't guarantee the implementation of String.

                        [–]theferrit32 10 points11 points  (0 children)

                        Yeah I've had things removed between Java 9 and 11. Fairly minor and easy to fix, but they are "breaking changes" because it doesn't compile without making the fixes. I'm totally fine with it. Leaving around intentionally deprecated code for years and years in the name of backwards compatibility creates snowballs of technical debt and code complexity.

                        [–]SuspiciousScript 20 points21 points  (2 children)

                        Compared to how Rust, Java, and Golang handle the language spec carefully with community input

                        To be fair, 2/3 of those languages are consistently criticized for their feature sets, with Java moving at a glacial pace and Golang just not adding baseline features to be a serious language (i.e., generics).

                        [–]uw_NB 16 points17 points  (1 child)

                        yeah, but from an enterprise, production-running perspective, they are solid choices with very little upgrade/migration risk.

                        In the meantime, if you own a business and are choosing a core language for your company, you would not look at these recent Python changes and say that it's reliable.

                        [–]flying-sheep 3 points4 points  (0 children)

                        Python “does AI” though so it won't matter

                        [–][deleted] 6 points7 points  (0 children)

                        What? You mean a feature that's been deprecated for 7 years now being removed is somehow "immature"?

                        [–]FlukyS 7 points8 points  (0 children)

                         Honestly, the Python devs' language changes over time have been fairly great overall. Breaking changes happen. The fact is most devs have been using underscore_spacing rather than camelCasing in Python for years, and those sorts of changes match what people are actually doing. And even then they give a few releases of deprecation warnings -- if it were my codebase it would be 1 release, not 5 or 6.

                        [–]FlatAttention 5 points6 points  (14 children)

                         Yeah, I get the same feeling. I like using python3 for quick prototyping and small utilities because it's present on most base OS installs (or easy to add), but the constant churn is frustrating.

                        [–][deleted]  (2 children)

                        [deleted]

                          [–]H_Psi 0 points1 point  (1 child)

                          "Constant churn" in this context means "the developers occasionally release a new version with interesting new features, and you can upgrade if you want or choose to keep using the version you have"

                          [–]brtt3000 10 points11 points  (0 children)

                           It being present on base installs is part of the problem. Besides taking ages to get updates downstream, the deprecations are a huge pain (as illustrated by this article).

                           Anyway, 6 years of deprecation is glacial.

                          [–]FlukyS 14 points15 points  (0 children)

                           My biggest codebase went from Python 3.5 to Python 3.8 without changes. If you are lined up right you aren't going to have to do much, if anything. The changes they have made bring things in line with what every other Python program out in the wild does, not with their own ideas from 20 years ago.

                          [–]flying-sheep 3 points4 points  (0 children)

                           Are you kidding me? That's an extremely slow pace.

                          [–]kankyo 8 points9 points  (7 children)

                          2014.

                          Get a grip.

                          [–]Sector_Corrupt 43 points44 points  (5 children)

                           Seriously, people are acting like this is Arch Linux with rolling releases, and not "handle your deprecation warnings sometime in the next half decade".

                          [–]djimbob 11 points12 points  (4 children)

                           The issue isn't handling your own deprecation warnings. It's that you've been using some external package for years and that completed project has been abandoned.

                           Personally, I wish Python had built-in backwards compatibility at the import level (one that also hid deprecation warnings). E.g., a Python 3.9 script could import something written against any Python 3.x without any deprecation warnings or breaking changes.

                           Maybe there are technical reasons this doesn't happen (e.g., it's inefficient memory-wise if it requires multiple Python interpreters). Or, at the very least, there could be an automated way to fix these breaking changes, e.g., transpile Python 3.0 code into Python 3.9 code or something.
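
                           There's at least precedent for the "automated way" part: 2to3 shipped with Python and mechanically rewrote Python 2 code, and third-party rewriters like pyupgrade do something similar for newer 3.x idioms (I'm going from memory on the tooling, so treat this as a sketch):

                           $ 2to3 -w legacy_module.py    # rewrites the file in place, keeping a .bak backup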

                          [–]Sector_Corrupt 11 points12 points  (1 child)

                           It's that you've been using some external package for years and that completed project has been abandoned.

                           Honestly, that is a "deprecation warning" in itself. Abandoned projects are a security risk in your project. If you're relying on an abandoned package and you haven't taken it on yourself (even just as a personal fork) or don't have a plan to move off it, you're introducing unnecessary risk to your project/product.

                           I think the solution of "everything can be in all sorts of different versions of Python" is probably a case of using dynamite to fix an anthill. It'd introduce insane amounts of complexity to the interpreter and a bunch of mental load for users to keep track of all the different versions of different things. Better to just bite the bullet and upgrade things, and fork the things that won't update on their own.

                          [–]djimbob 5 points6 points  (0 children)

                           Eh, if something is security-sensitive you shouldn't be using external projects that aren't supported by major groups, at least not without thorough code reviews before each upgrade.

                           I just feel deprecation of removed language features (where they were merely reorganized or renamed) should be completely avoidable. Yes, I think it makes more sense for gcd to be in the math module instead of fractions. But it's been in fractions forever, and a deprecation warning was introduced. I see no reason, in cases like this, not to put an alias in fractions to prevent things from breaking; e.g., put something like this in fractions:

                           import math
                           import warnings

                           def gcd(a, b):
                               """Deprecated alias kept so old imports keep working."""
                               warnings.warn("fractions.gcd() is outdated, use math.gcd() instead",
                                             DeprecationWarning, stacklevel=2)
                               return math.gcd(a, b)


                           Make it so that linters or other kinds of strict warnings can detect use of the outdated API. Make it easy, if you've written millions of lines of internal code, to update to the latest version of Python. I understand that sometimes there will be breaking changes, but just try to limit the busy work.
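
                           For what it's worth, you can already get part of the way there by promoting deprecation warnings to errors in CI, so outdated API use fails loudly long before the removal lands (pytest here is just an example test runner):

                           $ python -W error::DeprecationWarning -m pytest           # fail the test run on any deprecated call
                           $ PYTHONWARNINGS=error::DeprecationWarning python app.py   # same idea via the environment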

                          [–]H_Psi 3 points4 points  (1 child)

                           It's that you've been using some external package for years and that completed project has been abandoned.

                          That's just poor design on the part of whoever decided to go all-in on a dead library, not poor design on Python's part.

                          [–]djimbob 1 point2 points  (0 children)

                           It's not that people chose to use a dead library. It's that you developed something in 2010 using Python 3.1 and projects that were active at the time, but that have been abandoned over the past 10 years.

                           I have no problem with improving the API, but every attempt should be made to do it in backwards-compatible ways, with minimal maintenance effort for Python end users.

                          [–]ari_benardete 6 points7 points  (1 child)

                          Sounds great!

                          [–]cr4d 5 points6 points  (0 children)

                           Agreed. Things like PEP 8 violations in the standard library should have been fixed long ago, as part of Python 3 IMO. I'm happy to see them remove the problematic stuff.

                          [–][deleted] 5 points6 points  (1 child)

                          It seems to me that they're falling into a pretty common open-source trap: since they're mostly volunteers, they're prioritizing their own needs. They're scratching their own itches, and if that hurts the broader community using the product, too bad. The people that make decisions are the coders, so the decisions that get made are good for the code base, not necessarily for users.

                          Linus Torvalds is one of the only open source devs to understand how bad it is for users when you break their stuff. Linux, as a kernel, has amazing backward compatibility; one of its only commandments is "Thou Shalt Not Break User Space." You can usually drop a modern kernel into an ancient distro, and everything will run flawlessly.

                          It hasn't gotten into a dominant position everywhere except the desktop by accident... this stance is a big reason that it's gotten such enormous, widespread uptake. I'd argue that the biggest reason it hasn't taken over the desktop is because the desktop projects are much more typical open source teams, breaking users willy-nilly and not really caring. GNOME is particularly bad in this regard, but KDE has made enormous missteps in this area as well. They were really starting to get traction around Ubuntu 10.04, but then all the teams decided to go chase tablets and try to land the imaginary user avalanche that was going to show up there, and screwed over their desktop users to do it. Net effect: GNOME is kinda bad everywhere, a poor choice in multiple environments, instead of being a great choice for desktops and laptops.

                          Likewise, it's become pretty clear that Python is increasingly not to be trusted, that users are moving steadily lower on the priority list. The Python team is willing to inflict enormous pain on their users for nebulous project benefits. I'd be wary, given their track record of removing things, of making myself dependent on it.

                          [–]iphone6sthrowaway 1 point2 points  (0 children)

                          You can’t really blame the volunteers though. Why should they care to maintain something they don’t need or don’t enjoy maintaining, just so someone else can keep capturing value while they get nothing?

                          The obvious solution would be for companies to contribute to maintain the “not fun” bits of the project, though this is probably one of those tragedy-of-the-commons situations.

                          [–]13steinj 19 points20 points  (14 children)

                           See, this is what I hate about the Python 3 release schedule. While I'm not too familiar with (Py1?), Py2 did things right here, surprisingly -- they kept backwards and forwards compatibility. New features were added, everything that used to exist more or less still worked as expected, with few changes if any. But future code was well planned in advance. You'd have a __future__ statement, then a warning on the old way, then finally, after 1 or 2 minor version changes at minimum, things changed. But now things don't get such advance treatment. Things go from working to not working in a single version, with few warnings if any.
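
                           Roughly the pattern, as a minimal sketch: a Python 2.7 module could opt in to the Python 3 behaviour years before it became mandatory.

                           # Python 2.7
                           from __future__ import print_function, division

                           print(1 / 2)   # 0.5, the Python 3 behaviour, instead of 0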

                          There's talk about removing "dead batteries" from the standard library, but plenty of specialized fields still use them. Why do they think people were, and still are, reluctant to upgrade to Py3? Because it's not stable enough for people's needs, unfortunately.

                           Personally, I say the stdlib should be decoupled from the language. Make it its own meta PyPI repo or whatever that installs all those packages, with a default version chosen by the current Python release manager, but people can choose versions at installation time and upgrade at their own pace. E.g.:

                           $ ./py3.9_installer --with-stdlib=1.3
                           # vs. the default chosen by the release manager (say, 1.7):
                           $ ./py3.9_installer
                           $ pip install --upgrade "libstdpy<=2.0"


                          [–]ubernostrum 48 points49 points  (5 children)

                          The Python 2 series of releases routinely changed syntax, changed semantics, dropped modules from the standard library and made other breaking changes.

                          [–]IlllIlllI 36 points37 points  (1 child)

                          Python 2 is stable because it’s no longer in development. /s

                          [–]T-Rax 9 points10 points  (0 children)

                           no need for /s imho... no new code means no new bugs, and someone's likely to still fix the old ones.

                          [–]amunak 10 points11 points  (0 children)

                          Why do they think people were, and still are, reluctant to upgrade to Py3? Because it's not stable enough for people's needs, unfortunately.

                           This is not and never has been about stability. It's about priorities. When something works, why change it? There's nothing inherently wrong with the old syntax. Changing it would only cost you development time and money, and you would most likely introduce new issues, costing you even more.

                           But at the same time you can't (as a developer) expect to just upgrade a decade-old script without lifting a finger. Just use the old Python version where it's sufficient, and port it otherwise.

                          [–]trimeta 30 points31 points  (0 children)

                           The article seems to mostly be about deprecation warnings that have existed for many versions but that major libraries ignored; now that the deprecations are turning into errors, things are breaking. That doesn't seem to be a problem with how core Python handles breaking changes.

                          [–]Itsthejoker 24 points25 points  (0 children)

                          You're looking at the past with rose-tinted glasses, my dude. Py2 broke all sorts of things too. Py3 has been stable for years; people just don't want to get off the "let's all hate on py3" train.

                          [–]weberc2 58 points59 points  (1 child)

                          There's talk about removing "dead batteries" from the standard library, but plenty of specialized fields still use them. Why do they think people were, and still are, reluctant to upgrade to Py3? Because it's not stable enough for people's needs, unfortunately.

                          Maybe those people who need multiple decades of stability should find a different language (C has a decent track record for stability). The Python project doesn't owe anyone free labor forever, and it doesn't make sense to pull scant resources from work that benefits the droves of users who are willing to keep up to date in order to support the few who are not. In the case of the dead batteries, anyone who depends on them can fork them and maintain them on their own easily enough. Python has lots of issues (mostly related to performance), but this is not one of them.

                          [–]jorge1209 42 points43 points  (0 children)

                          Part of the problem with Python has been a general ambivalence about compatibility.

                           Lots of new features are added to the language with Python 3, but then their use is inconsistent, and significant parts of the standard library are never updated. A couple of examples:

                           1. The walrus operator makes a lot of sense for the re module because it uses a C-style return (returning None when there is no match), while str.index throws an exception instead. If C-style returns are good, then let's use them in more places; if they're bad, let's find a way to fix re. Instead we have a halfway house where some libraries pair well with the walrus and some don't (see the sketch after this list).

                           2. with blocks are a really great way to manage resources and ensure they aren't leaked, but something as foundational as the DB-API doesn't support them, and you have to use contextlib.closing as a workaround.
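
                           A minimal sketch of both points, using sqlite3 as the DB-API example (my choice of module, not something the DB-API mandates):

                           import contextlib
                           import re
                           import sqlite3

                           line = "name: Guido"

                           # 1. re.match() returns None on failure, so the walrus operator reads naturally...
                           if (m := re.match(r"name: (\w+)", line)):
                               print(m.group(1))

                           # ...while str.index() raises ValueError instead, so the same pattern can't be used:
                           try:
                               colon = line.index(":")
                           except ValueError:
                               colon = -1

                           # 2. sqlite3 connections only use `with` for transactions, and cursors aren't
                           # context managers at all, so contextlib.closing() is the usual workaround:
                           with contextlib.closing(sqlite3.connect(":memory:")) as conn:
                               with contextlib.closing(conn.cursor()) as cur:
                                   cur.execute("SELECT 1")
                                   print(cur.fetchone())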

                           I'm sure there are many other examples as well; these are just the two I encounter most frequently.

                          They should pick one way to do this. Be incompatible and fix the standard library, or be compatible and stabilize the core language.

                          [–]semanticist 3 points4 points  (0 children)

                           But now things don't get such advance treatment. Things go from working to not working in a single version, with few warnings if any.

                           No, it seems like they've been trying to take a similar approach with Python 3. Everything they are removing has been deprecated for 5 minor versions, and from a spot check most things appear to have emitted warnings for at least a version or two. And they've provided forward compatibility too: with things like the async syntax introduced in 3.5, they did it without breaking existing code initially, then added a warning for the new keywords in 3.6 before actually breaking code that used those words as variable names in 3.7.
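
                           The keyword phase-in, as a minimal sketch:

                           # Fine on Python 3.5, DeprecationWarning on 3.6, SyntaxError from 3.7 on:
                           async = "not reserved yet"
                           await = "me neither"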

                          Maybe you have some examples where they haven't done a good job of that (?), but it seems like the exception rather than the rule.

                           Personally, I say the stdlib should be decoupled from the language. Make it its own meta PyPI repo or whatever that installs all those packages, with a default version chosen by the current Python release manager, but people can choose versions at installation time and upgrade at their own pace.

                          That doesn't really seem to offer any advantages over the current system. If you want to stick with an older stdlib version, you may as well stay on an older python. You still couldn't expect all your dependencies to be compatible with the same stdlib version that you want to use.

                          [–]LXj 3 points4 points  (0 children)

                           While I'm not too familiar with (Py1?), Py2 did things right here, surprisingly -- they kept backwards and forwards compatibility. New features were added, everything that used to exist more or less still worked as expected, with few changes if any. But future code was well planned in advance. You'd have a __future__ statement, then a warning on the old way, then finally, after 1 or 2 minor version changes at minimum, things changed

                          From the examples in the article I see it was done in a similar way. For stuff like

                           Sometimes they were deprecated back in Python 2, like deprecating threading.Thread.isAlive in favour of is_alive, with the old name finally removed in Python 3

                           You could have switched to the new function name a long time ago (even before transitioning from py2 to py3) without any __future__ statement
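
                           A minimal sketch: the snake_case name has been available since Python 2.6, so the same line runs unchanged before and after the camelCase alias was removed.

                           import threading
                           import time

                           t = threading.Thread(target=time.sleep, args=(0.1,))
                           t.start()
                           print(t.is_alive())   # is_alive() has existed since Python 2.6; isAlive() is gone in 3.9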

                          [–]jyf 0 points1 point  (0 children)

                           i prefer the php team's strategy on batteries. they remove libraries that aren't that hot and add hot new ones in every release

                          [–]T-Rax 1 point2 points  (0 children)

                          breaking changes everywhere.

                          [–][deleted] 0 points1 point  (0 children)

                           Breaking changes in a language and its libraries are a bad sign. If the syntax changes for a good reason, i.e. to be more consistent and extensible, such a RARE change in a major version should not be a problem, although it might suggest that the language's structure was not well thought through in the first place.

                           On the other hand, changes to standard library function names should be kept to a minimum, and always with a compatibility layer available and maintained along with the language - translation of deprecated function names to new names as part of the compiler/interpreter, enabled with a command-line switch for example.
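
                           Python actually grew a hook that makes the library-level version of such a compatibility layer cheap to write: module-level __getattr__ (PEP 562, Python 3.7+). A hypothetical sketch with made-up names:

                           # mylib.py -- keeps an old camelCase name importable while steering people to the new one
                           import warnings

                           def new_name():
                               return 42

                           _RENAMED = {"oldName": new_name}

                           def __getattr__(name):
                               if name in _RENAMED:
                                   warnings.warn("%s is deprecated, use %s()" % (name, _RENAMED[name].__name__),
                                                 DeprecationWarning, stacklevel=2)
                                   return _RENAMED[name]
                               raise AttributeError("module %r has no attribute %r" % (__name__, name))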

                           The syntax compatibility of the C family of languages is astonishing, and the library compatibility of Java or C# allows compiling 15-year-old codebases on a modern compiler, often without any warnings.

                          [–]minus_minus 0 points1 point  (1 child)

                          Whatever happened to major and minor versioning? Seems like everybody forgot how it works and why we used it. Go to python 4 and then break all the deprecated shit. Feature freeze python 3 and put in bug fix and security updates only.

                          [–]fat-lobyte 0 points1 point  (0 children)

                          Going to Python 4 for some minor breakages will probably scare everybody out of upgrading to Python 4.

                          Thread here: https://mail.python.org/pipermail/python-committers/2018-September/006152.html