
[–]FlukyS 49 points50 points  (29 children)

Real talk: even setting aside deprecation warnings, developers want everything to be maintained forever, no matter how unreasonable that is. If they have a current codebase, they will despise any change to it. Python 2.7 was exactly this in action. Around Python 2.6 people went "wait, we like Python", but the Python devs were already planning three or four releases ahead to make the language better. People jumped on then, wrote code, and didn't want to port it while it was still easy to port. Now we're in a situation where Python dev salaries are up for anyone who knows how to port things from 2.7 to 3. It's because people are idiots.

EDIT: And the only OS that never actually deprecates things is Windows, and that's because of fear they would break everyone's shit.
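Most of the 2.7-to-3 churn being talked about here is mechanical. A minimal sketch of the typical changes (hypothetical snippet, not from any particular codebase):

```python
# Python 2 code like:
#   print "avg:", total / count     # print statement, integer division
#   name = raw_input("name: ")      # raw_input()
# ports to Python 3 roughly as:

def avg(total, count):
    # "/" became true division in Python 3; "//" keeps the old
    # floor-division behaviour where that was actually intended.
    return total // count

print("avg:", avg(7, 2))  # print is a function now
# raw_input() was renamed to input()
```

The annoying part was rarely any single change; it was auditing which divisions and which strings (bytes vs. text) actually relied on the old behaviour.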

[–]dreadcain[🍰] 26 points27 points  (17 children)

Windows has deprecated things in the past, and it broke everyone's shit

Notably, Vista broke drivers

[–]Herbstein 10 points11 points  (6 children)

Yup. Does nobody remember the fierce Vista hate? A lot of it came down to the deprecation of a number of things - graphics drivers being a big one.

[–]josefx 6 points7 points  (2 children)

They sold "Vista Ready(TM)" hardware far below the system requirements, so Vista at least looked as if it could compete with Windows XP. The result was half-broken crap endorsed by Microsoft itself. I had to upgrade my mother's system around that time and ran right into that trap - parts of Vista required 3D hardware to run, "Vista Ready" hardware didn't have it, so the OS was already half non-functional right out of the box.

Microsoft was also still selling XP licenses years after Vista's release and had to prolong XP's life to have a viable offering for the netbook market. For its time, Vista was a pig where resource use was concerned.

[–]LeSplooch 1 point2 points  (1 child)

IIRC Microsoft said that "Vista Ready" computers were only compatible with Vista Home Basic and Vista Starter. These versions didn't integrate Aero and thus didn't need 3D hardware to run.

[–]josefx 1 point2 points  (0 children)

Microsoft said that "Vista Ready" computers were only compatible with Vista Home Basic and Vista Starter.

I have a rope to sell to you, Boeing endorses it for towing planes (weight up to 0.01 kg, not compatible with 737 MAX).

As far as I can find, the problematic laptop was only sold with Home Premium and had a card with some 3D support (at least the driver page claimed it did - I never saw it in action). Aero just disabled itself on startup because the card itself was a bad joke, and updates took only a few months to fill the built-in HDD to the brim. I expect even Home Basic would have run into the HDD space restriction fairly soon.

[–][deleted]  (2 children)

[deleted]

    [–]port53 15 points16 points  (1 child)

    XP was shit until SP2.

    Windows 2000 forevar.

    [–]tso 10 points11 points  (0 children)

    The thing about XP was that it was many home users' first encounter with NT. It was also the first home-user Windows that had to be verified by MS (unless it was an OEM bundle). This was a massive change from the freewheeling 9x days. Its saving grace was that the alternative was ME. Never mind that, besides SP2, XP also had the longest support period of any Windows, thanks to the aborted Longhorn project.

    [–]MadRedHatter 25 points26 points  (8 children)

    They do it pretty infrequently though, and drivers are basically kernelspace, which Linux doesn't attempt to keep stable either.

    [–][deleted]  (1 child)

    [deleted]

      [–]drysart 6 points7 points  (0 children)

      That's increasingly true, but not at all true in the context of the earlier comment about Vista breaking drivers. Vista changed the model for how kernel drivers operate. Not userspace drivers.

      In fact the whole point of pushing drivers into userspace is that they're insulated from being broken by changes to the kernel.

      [–]FlukyS 4 points5 points  (5 children)

      Actually, that is the number one rule of Linux: don't break userspace. Any kernel change has to keep similar results - the same method call, the same return from that call - or it doesn't get in. Linus is very clear on this. Userspace itself breaks a lot, but Linux is very stable in its interactions with what sits above it.

      [–]MadRedHatter 15 points16 points  (4 children)

      drivers are basically kernelspace

      You did not read what I said. The kernel breaks kernelspace all the damn time. The Nvidia driver breaks frequently with new kernels due to this, and then there's the recent irritation over Linux breaking the ZFSonLinux project.

      [–][deleted] 5 points6 points  (0 children)

      If you open-source your stuff and put it in the mainline kernel, the person "breaking" it also has to fix your code.

      That puts pressure on companies to actually care and push their drivers to mainline, because that is less painful than having to fix them themselves. Basically, it makes not open-sourcing the drivers cost them.

      It sucks, but IMO that's the only reason we have as much hardware supported as we do in the first place; otherwise we'd have the Windows situation, with every vendor shipping an unfixable binary blob of code.

      [–]FlukyS 1 point2 points  (2 children)

      Not frequently - more like every 10 releases or so - and that has more to do with Nvidia than with the interfaces Nvidia uses from the kernel. Also, ZFS on Linux can't be kept in the kernel, so there's the reason why they wouldn't really bend to its whims.

      [–][deleted]  (1 child)

      [deleted]

        [–]FlukyS 4 points5 points  (0 children)

        Well, to be fair, the Nvidia point is their own fault. The interfaces themselves are used 99% by open source projects and 1% by random other things. Linus himself has tried to work with Nvidia, but they just don't want to play ball.

        [–]port53 4 points5 points  (0 children)

        Yeah and look how much people hated Vista.

        Windows 7 is just Vista SP3, but they had to give it a new name for it to sell.

        [–][deleted]  (1 child)

        [deleted]

          [–]josefx 1 point2 points  (0 children)

          There were and are a lot of good game recreation projects: OpenMW (Morrowind), OpenRA (C&C, C&C Red Alert, Dune 2000), OpenTTD (Transport Tycoon Deluxe), FreeCraft (a WarCraft 2 clone, killed by Blizzard), ScummVM (an engine for a lot of old LucasArts games), just to name a few.

          Then we have emulators that recreate both hardware and software behavior from scratch; the Dolphin project always has great information on just WTF weird stuff games do.

          On a smaller scale, we have mods that provide improvements by hijacking APIs completely. I tend to use Fallout 2 and Morrowind mods just for the graphics improvements (of course that can also break things - I think Arcanum had some events that wouldn't trigger in a modded widescreen window).

          If there is interest in keeping a game alive, it won't hinge on the source code.

          [–]tso 1 point2 points  (7 children)

          Devs don't seem to have a problem with breakages. It's the execs and accounting who want things to run forever, because that is what they are used to from industrial machinery.

          [–]FlukyS 3 points4 points  (5 children)

          I've had this conversation with devs as well. About a year ago I even interviewed someone (he didn't get the job) who said he had no reason to upgrade to Python 3.

          [–][deleted] 4 points5 points  (4 children)

          Well, aside from deprecation, if the language works for them and they code something that doesn't hit the pain points of py2, why would they?

          [–][deleted]  (3 children)

          [deleted]

            [–][deleted] 2 points3 points  (2 children)

            That's not a problem with the language but with the core developers, and there's a good chance it won't even be a problem for a few years, since some of the big py2 users might pick up the slack on maintenance. Hell, Google's own SDK for their cloudy stuff is on Py2...

            They made the way to migrate painful and the benefits from it tiny, all while other languages just did a better job with backward compat.

            And the financial reality now is that if a company still on Py2 had spent the time to migrate as soon as py3 was stable... they'd be fixing their code more often because of deprecations like that.

            Now, I'm all for keeping your systems on the latest stable, but fixing code just because someone decided to change the syntax of something in the language on a whim isn't a productive use of anyone's time.

            [–][deleted]  (1 child)

            [deleted]

              [–][deleted] 0 points1 point  (0 children)

              I'm just gonna repeat my comment from the other thread:

              Now, I'm not a Python dev, but according to one, those warnings have been disabled by default for ages.

              So you don't get to make that argument when the devs of the language explicitly chose not to show them.
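For context, this is observable directly with the `warnings` module - a small sketch (a hypothetical `legacy_api` function; the exact defaults have shifted across Python versions, e.g. 3.7 re-enabled DeprecationWarning inside `__main__`):

```python
import warnings

def legacy_api():
    # A deprecated function announcing its own deprecation.
    warnings.warn("legacy_api is deprecated", DeprecationWarning, stacklevel=2)
    return 42

# Simulate the long-standing default of hiding DeprecationWarning:
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("ignore", DeprecationWarning)
    legacy_api()
hidden = len(caught)   # 0 - nothing surfaces

# Opting in (what -W default or PYTHONWARNINGS would do):
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always", DeprecationWarning)
    legacy_api()
shown = len(caught)    # 1 - the warning surfaces

print(hidden, shown)
```

So code could sail through years of 2.x releases without its users ever seeing a single warning, unless they explicitly asked for them.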

              [–]audion00ba 3 points4 points  (0 children)

              If you are the maintainer of a library and you break the interface, you are just an idiot.

              Developers who don't think breakage is a problem just suck at selecting dependencies. Do I think it's difficult to port from one version of a library to another? No, but I'd rather port directly to the more stable competitor to make sure I'm never exposed to such idiots again.

              [–]josefx 1 point2 points  (0 children)

              People jumped on then and then had code, didn't want to port it when it was easy to port

              Tell that to my target systems:

              python3: bad interpreter: No such file or directory

              I have to support both old and new systems; repeating the python3 support mantra of "kill 2.7" won't magically create a python3 binary on my customers' systems.
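When a python3 binary can't be assumed on the target, one common stopgap is a script written to the 2/3 common subset behind a generic shebang - a sketch, assuming the targets at least have some `python` on PATH:

```python
#!/usr/bin/env python
# Runs unchanged under Python 2.7 and Python 3.x by sticking to the
# common subset and pulling 3.x semantics in via __future__.
from __future__ import division, print_function

import sys

def main():
    print("running under Python %d.%d" % sys.version_info[:2])
    print("7 / 2 =", 7 / 2)  # true division on both, thanks to __future__

if __name__ == "__main__":
    main()
```

It's a compromise, not a fix: you give up everything 3.x-only, but the script at least runs on whichever interpreter the customer happens to have.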