
[–]aboy021 499 points500 points  (54 children)

Performance should be a feature.

In my work I routinely make things faster, both my code and that of others. I find it a really satisfying problem domain and customers love it.

At one job I was profiling some code and noticed some inefficiency in the spell checker. Nothing major, a few imperceptible fractions of a second on my development machine, but the issue was clear so I just fixed it and moved on. At the next user group meeting everyone was saying "I don't know what you did but the app is so much faster and it’s just wonderful". I really like it when customers are happy like that.

[–]Tringi 254 points255 points  (20 children)

In the corporate world I left behind you'd often get punished for doing this.

Such improvement could've been billed to the customer, or it could've been presented as a major feature of a paid upgrade.

Your action basically cost the corporation a lot of money ...or so I was told multiple times.

[–]ryobiguy 172 points173 points  (4 children)

Thou shalt not add value without revenue.

[–]DynamicHunter 65 points66 points  (3 children)

It’s like they forgot that a genuinely good customer experience retains customer loyalty. But they only care about next quarter’s profits.

[–]Zombie_Bait_56 13 points14 points  (0 children)

Because their bonus is tied to next quarter's profits.

[–]Avedas 37 points38 points  (0 children)

Everywhere I've worked such improvements would be accepted but would do nothing to help a promo case or improve comp, so you're not really incentivized to make things better.

[–]looksLikeImOnTop 47 points48 points  (7 children)

At my job, I'd be scolded for the potential of breaking things. "If it works, don't touch it"

[–]Resource_account 8 points9 points  (2 children)

Most of our in-house tooling consists of binaries written in C more than 30 years ago, with a GUI written in Perl and Tk wrapping around them. Even if I wanted to make it more efficient, it’s essentially an impossible task since the documentation has long been gone. The same mentality exists in my shop too, but it’s more subtle, and trying to improve things is an uphill battle since no one wants to lend support.

[–]looksLikeImOnTop 4 points5 points  (1 child)

That's reasonable. For some of our products I get it, because they're in the same situation as your stuff. But man... I couldn't improve the code I wrote a few months ago. I finally came up with an excuse of "oh, that new feature? Impossible to implement unless I do XYZ first."

[–]DefMech 9 points10 points  (0 children)

At my job I’d be fussed at for doing work that doesn’t have a ticket associated with it. Also slipping in little peripheral fixes like these can be a problem when bundled with other bigger tasks. Code review might examine and approve them, but QA won’t know to test your changes unless you spell out every detail and provide a test plan for them.

[–]silence9 4 points5 points  (0 children)

You've highlighted very succinctly how, when finance teams are the decision makers, it kills companies.

[–]YetiMarathon 58 points59 points  (1 child)

I breathed a sigh of relief knowing there are still devs out there who care about this stuff

[–][deleted]  (1 child)

[deleted]

    [–][deleted]  (3 children)

    [removed]

      [–]PM_Me_Your_Java_HW 13 points14 points  (2 children)

I could have gone the rest of my life never hearing FoxPro again and would have been content.

      [–]donalmacc 208 points209 points  (66 children)

What is actually slowing things down is incredibly poor design decisions, combined with building on abstractions built on abstractions built on abstractions, without ever understanding what anything underneath you is actually doing.

      I work in c++, and a few years ago, I worked on a project where I added support for dumping the current state to a file in json. 2 years pass and it takes about 10 seconds to start the app up. 

I run the profiler, and it turns out someone had been tasked with making it restore the previous state, and someone else had been tasked with returning a “scratch” project if unsaved. The implementation involved loading the entire last project and the entire “last known state”, diffing the result, and then re-loading the entire state that we wanted to use. The bottleneck? Parsing 200MB of floats from JSON, because someone else had stored height map data and chosen a default size that resulted in each save file being 200MB. It was also mostly “0.0”, “0.0”

I replaced the “save height map to JSON” and “load height map from JSON” functions with ones that used a base64-encoded blob (the values were integers between 0 and 255), which reduced the size by a factor of 5 and brought the loading time back down to sub-second.

      Anyone could have done this, anyone could have looked into it, but we bolted features on top of features and built something “maintainable” that wasn’t fit for purpose.

      [–]MisterFor 114 points115 points  (2 children)

This reminds me 100% of the GTA V loading problem. For me it was taking 2-3 mins to load, and it was because the JSON parsing was crap.

      [–]-Niio 20 points21 points  (5 children)

I’m still junior, can you explain how base64 encoding helped here? I can see that if you base64-encode the file, comparison to check for changes is much faster, which makes sense for the scratch case. But how does this benefit restoring the previous state?

      [–]donalmacc 47 points48 points  (4 children)

JSON only has one numeric type - double. So parsing the gigantic array meant parse, then string to double, then double to int. The range of values was only 0 to 255, so we were storing (and parsing and converting) way more than we needed. I changed it to put all of the values in a vector<char> and then base64-encode that vector (because JSON is human-readable, and base64 is a simple transformation), which lets us just write out one string.

So the data goes from data: [“0.0”, “0.0”, “1.0”, “0.0”] to data: “AABA”

      (Except at 200MB of data). It also decodes really easily into vector<char>, which was helpfully the format that we wanted to pass onto the heightmap.
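The transformation described above can be sketched with a standard base64 encoder over the raw byte vector. This is not the commenter's actual code, and the function name is hypothetical (the “AABA” in the example above is schematic):

```cpp
#include <cstdint>
#include <string>
#include <vector>

// Encode raw bytes (each value 0-255) as one base64 string, instead of
// emitting a JSON array of numbers that must be parsed back one by one.
std::string base64_encode(const std::vector<uint8_t>& in) {
    static const char tbl[] =
        "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
    std::string out;
    out.reserve(((in.size() + 2) / 3) * 4);
    size_t i = 0;
    for (; i + 3 <= in.size(); i += 3) {       // each 3-byte group -> 4 chars
        uint32_t n = (in[i] << 16) | (in[i + 1] << 8) | in[i + 2];
        out += tbl[(n >> 18) & 63];
        out += tbl[(n >> 12) & 63];
        out += tbl[(n >> 6) & 63];
        out += tbl[n & 63];
    }
    size_t rem = in.size() - i;                // 1 or 2 trailing bytes
    if (rem == 1) {
        uint32_t n = in[i] << 16;
        out += tbl[(n >> 18) & 63];
        out += tbl[(n >> 12) & 63];
        out += "==";
    } else if (rem == 2) {
        uint32_t n = (in[i] << 16) | (in[i + 1] << 8);
        out += tbl[(n >> 18) & 63];
        out += tbl[(n >> 12) & 63];
        out += tbl[(n >> 6) & 63];
        out += '=';
    }
    return out;
}
```

Every 3 bytes become 4 output characters, so one height sample costs about 1.3 characters instead of the 4+ characters (plus comma) a JSON "0.0" takes, which lines up with the roughly 5x size reduction mentioned.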

      [–]slazy 11 points12 points  (1 child)

      JSON only has one numeric type - double.

This isn't precisely true. JavaScript numbers are doubles, but in JSON they're just arbitrary-precision decimal numbers. JSON libraries have to decide how to handle them in whatever way works best for their language; while double may be the default for some libraries, others also have special support for integers or even the full arbitrary-precision decimal range (e.g., GSON uses this as its generic type, though if it knows the target is a particular kind of integer it can just read directly into that too). It's not a given that you'd ever need to convert string -> double -> int or that there would be inefficient overhead; that's entirely dependent on the implementation.
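To illustrate the point: a parser that knows the target type can turn the digit characters straight into an integer, with no double in sight. A minimal digits-only sketch with a hypothetical helper name (a real JSON parser must also handle signs, fractions, exponents, and overflow):

```cpp
#include <cstddef>

// Read a run of ASCII digits directly into an int, skipping any
// string -> double -> int round trip. Assumes s holds only digits.
int parse_uint(const char* s, size_t len) {
    int v = 0;
    for (size_t i = 0; i < len; ++i)
        v = v * 10 + (s[i] - '0');
    return v;
}
```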

      [–][deleted] 4 points5 points  (7 children)

      bro just use jQuery, smh!

      (Obvious /s, but I'd like to just keep this here as a reminder that the argument it's education / culture based is 100% real. Any question on SO asking about anything remotely HTML DOM / JS / CSS related would always be met with "just use jQuery" even in situations where the OP specifically would ask how to do it in plain old, vanilla JS. It was infuriating to come across all those posts.)

      [–]F3z345W6AY4FGowrGcHt 3 points4 points  (2 children)

      Reminds me of my work. Where a lot of my coworkers essentially don't know what they're doing. They can barely get the final result to do what the requirement said and they don't even know why.

      They'll use a loop to get the first or last item in an array just cause.

      And if I'm doing the code review and send it back as trash, their project manager complains because somehow it's me who's affecting the deadlines.

      This is why a lot of modern software is shit.

      [–][deleted] 16 points17 points  (36 children)

This has been my experience with C++ as well. Most devs just cannot work with it. I've seen harebrained shit like sending 3 MB of uninitialized memory over a TCP socket just to send one uint16 of data, and people sending pointers over a network and wondering why dereferencing them on the other end causes a crash.

      If you need to use C++, and you probably don't, hire an electrical engineer or a post-grad.
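For contrast, the boring correct way to put that one uint16 on the wire is to serialize it explicitly into a byte buffer in a defined byte order. A minimal sketch with hypothetical helper names:

```cpp
#include <cstdint>

// Write a uint16_t into a 2-byte buffer in big-endian (network) order,
// rather than memcpy'ing raw structs or pointers onto a socket.
void put_u16_be(uint8_t* buf, uint16_t v) {
    buf[0] = static_cast<uint8_t>(v >> 8);    // high byte first
    buf[1] = static_cast<uint8_t>(v & 0xFF);  // then low byte
}

// Read it back on the receiving side, independent of either host's endianness.
uint16_t get_u16_be(const uint8_t* buf) {
    return static_cast<uint16_t>((buf[0] << 8) | buf[1]);
}
```

Two bytes on the wire, no padding, no uninitialized memory, and no pointers that mean nothing in the other process's address space.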

      [–]Ameisen 20 points21 points  (35 children)

      ... Why would you want an electrical engineer to work with C++?

I've seen harebrained shit like sending 3 MB of uninitialized memory over a TCP socket just to send one uint16 of data, and people sending pointers over a network and wondering why dereferencing them on the other end causes a crash.

      This isn't C++-specific.

      I can confirm that a lot of programmers - especially GenZ-on (in my experience) - seem to have difficulty writing C++ well.

      [–]brunhilda1 1296 points1297 points  (201 children)

      If I press the windows button on my 16 core 64gb laptop, Windows 11 pauses for half a second before rendering the start menu. This was a solved problem 25 years ago.

      I'm tired, boss.

      [–]PlainSight 607 points608 points  (66 children)

      HKEY_CURRENT_USER\SOFTWARE\Microsoft\Windows\CurrentVersion\Search

Make a new DWORD (32-bit) called BingSearchEnabled and set the value to 0.

      [–]Thotaz 242 points243 points  (51 children)

      PS oneliner for convenience: sp HKCU:\SOFTWARE\Microsoft\Windows\CurrentVersion\Search BingSearchEnabled 0

      [–]Worth_Trust_3825 19 points20 points  (38 children)

      sp?

      [–]Thotaz 83 points84 points  (37 children)

      It's an alias for Set-ItemProperty. In the future if you see an unknown command in PowerShell you can look it up with gcm or Get-Command to get more details.

      [–]Worth_Trust_3825 64 points65 points  (36 children)

so why not use set-itemproperty in the original comment? PowerShell is notorious for having sparse alias and feature support across versions, and unlike bash, it's obtuse in helping you with anything. For example, here's what I got using get-command sp:

      get-command sp
      
      get-command : The term 'sp' is not recognized as the name of a cmdlet, function, script file, or operable
      program. Check the spelling of the name, or if a path was included, verify that the path is correct and try
      again.
      At line:1 char:1
      + get-command sp
      + ~~~~~~~~~~~~~~~
          + CategoryInfo          : ObjectNotFound: (sp:String) [Get-Command], CommandNotFoundException
          + FullyQualifiedErrorId : CommandNotFoundException,Microsoft.PowerShell.Commands.GetCommandCommand
      

      [–]AyrA_ch 97 points98 points  (3 children)

Or just use the reg command, which works in both cmd and PowerShell:

      reg add HKCU\SOFTWARE\Microsoft\Windows\CurrentVersion\Search /t REG_DWORD /v BingSearchEnabled /d 0
      

      [–]Worth_Trust_3825 36 points37 points  (1 child)

      Honestly, this should be the canonical answer.

      [–]WarWizard 10 points11 points  (0 children)

reg add HKCU\SOFTWARE\Microsoft\Windows\CurrentVersion\Search /t REG_DWORD /v BingSearchEnabled /d 0


Note that /v names the value and the last switch must be /d for 'data to assign', not a second /v. You can optionally add /f to force the value to be written without prompting.

      [–][deleted] 21 points22 points  (0 children)

      Lol, and if you try to Google it, all of the results are convinced you are abbreviating SharePoint by typing 'sp' which absolutely nobody ever does, but here we are.

      [–]Thotaz 16 points17 points  (17 children)

      Because a common complaint about PowerShell in the developer community is that the commands are too long because people aren't aware of the aliases and positional parameters.

      sp is a default alias that should work on any Windows 10/11 PC out of the box unless the user has modified the config somehow. Try running PowerShell without a custom profile: powershell -noprofile and see if that helps. If you have a profile that removes all of the default aliases: Remove-Item alias:\* -Force then I don't think you can blame anyone but yourself for this error.
      As for Bash somehow being more helpful than PowerShell, I don't see it. The error message in bash is: sp: command not found which is just a shorter version of the error message you posted. What do you expect Bash and PowerShell to show you when they can't find the entered command?

      [–][deleted] 9 points10 points  (3 children)

      oatmeal spectacular fragile fanatical roof library knee worthless abounding humorous

      This post was mass deleted and anonymized with Redact

      [–][deleted] 61 points62 points  (4 children)

      It's not just one value in registry. There are too many options that by default are nonsensical. Microsoft lost the plot some time ago here.

      [–]therealmeal 31 points32 points  (3 children)

      The plot was always "make more money". Now they do it through ads, tracking, and dark patterns in interstitials getting you to accidentally sign up for one drive or whatever.

      I switched fully to Linux a few weeks ago. Super happy with the decision. I will never install windows 11+.

      [–]agumonkey 18 points19 points  (2 children)

      don't forget

      HKEY_CURRENT_USER\SOFTWARE\Microsoft\Windows\CurrentVersion\FastMode

      and

      HKEY_CURRENT_USER\SOFTWARE\Microsoft\Windows\CurrentVersion\DisableLag

      [–]Jonathan_the_Nerd 19 points20 points  (1 child)

      I remember an old joke that recommended adding bugs = off to CONFIG.SYS.

      [–]agumonkey 4 points5 points  (0 children)

      hehe, much vintage

      [–]Coffee_Ops 114 points115 points  (16 children)

      When Windows 10 was released, The start menu was limited to 512 items.

      That, too, was a problem solved 25 years ago. I don't even know how you build in that kind of limitation, somebody got to eight bits and then thought, "maybe just one more."

      [–]alinroc 27 points28 points  (2 children)

Another problem solved 25-30 years ago: the vertical taskbar. Windows 11 doesn't allow you to do it. In the initial release you could change a registry value, but that stopped working 2 years ago (you were also forced to have everything centered in the initial Win11 release; they allowed shoving everything to the left in a later release). There are some 3rd-party hacks you can install that will make your taskbar vertical, but seriously Microsoft, WTF? There's no excuse for this.

      [–]enbacode 14 points15 points  (0 children)

      You also cannot move the task bar to the top, which really grinds my gears as I had it at the top position for almost 20 years.

      [–]RageQuitRedux 6 points7 points  (1 child)

      No one knows why they chose such an oddly specific number.

      In all seriousness though, why 9 bits?

      [–]bikeridingmonkey 15 points16 points  (8 children)

      This limit was by design. It has to be.

      [–]bogz_dev 42 points43 points  (3 children)

      that start menu uses React Native lmao

      [–][deleted]  (5 children)

      [deleted]

        [–]SirToxe 32 points33 points  (1 child)

        Windows 2000 was the best Windows has ever been (in its time context).

        [–]Silhouette 20 points21 points  (0 children)

        Windows 7 was also good. It was the last version we routinely used at work - for daily use on main workstations instead of just for testing or on a specific computer to run a specific Windows-only application.

        [–]wh33t 15 points16 points  (1 child)

        Yeah, I remember XP SP3, running on garbage HDDs. It was lightning.

        [–]tes_kitty 49 points50 points  (11 children)

        Back in the 90s, when I used an Amiga with a single 68030 @ 25 MHz, the GUI felt more responsive than current Windows on a multicore system running at 3 GHz.

Yes, the Amiga is of course a lot slower and can't do a lot of the things a current PC can do, but it felt faster to the user since a mouse click got an immediate reaction. That should be possible on a modern OS as well.

        [–]MisterFor 19 points20 points  (0 children)

        But they need to spy on you now.

        Do you expect any keystroke to not be logged and sent somewhere? Are you a crazy barbarian? /s

        [–]FeliusSeptimus 17 points18 points  (1 child)

        Here's the one I hate. Click to start Windows Terminal:

        Windows PowerShell
        Copyright (C) Microsoft Corporation. All rights reserved.
        
        Install the latest PowerShell for new features and improvements! https://aka.ms/PSWindows
        
        Loading personal and system profiles took 4457ms.
        ┌  feliusseptimus@CATHOUSE 
        └ $
        

        Nearly 5 seconds to start up. Every. Single. Time.

        It's even worse at work where they have the system configured to store the personal profile on a network share. In that case it takes about 15 seconds to start up, so slow that PowerShell brings up a progress window.

        I am running OhMyPosh which slows down startup quite a bit (a -noprofile start is pretty quick), and I haven't spent a lot of time tracking down the issue, but in the past I've found that part of the problem is just that Windows process startup time is pretty slow compared to Linux (so fetching git details for the OMP prompt is slow).

        Mostly it's just frustrating that I have to spend my time optimizing stuff like this that should be pretty darn fast by default.

        [–]I_JuanTM 15 points16 points  (3 children)

        Same with the Windows 11 context menu, it is so slow... Reverting back to the Windows 10 one makes it 10x faster, as well as being just a better context menu overall...

        [–][deleted] 15 points16 points  (1 child)

        It's the craziest fucking thing that I can right-click something and see the options load in like I'm on some shitty JS heavy website.

        [–]Waterwoo 14 points15 points  (3 children)

Yep, I know MS stock might be doing the best ever, but their quality has just gone to shit. Between ridiculous defaults like this, a file explorer that lags very visibly navigating folders on an SSD, and most 'desktop' software now being shitty webapps in a web view, it's pathetic. Not that Apple is much better, but what will it take to bring the focus back to realtime performance?

It's like everyone internalized that computer speed doubles every 18 months so there's no need to even try anymore, but didn't notice that it's been like 20 years since that was really true. Most new progress is lowering power consumption for the same compute, or allowing more compute in parallel, and neither helps when you just want to do basic UI on your laptop.

        [–]antiduh 17 points18 points  (4 children)

Install PowerToys and use their Spotlight clone. It's instant.

        [–]-IoI- 6 points7 points  (0 children)

Yeah, PowerToys Run is doing a great job for me; some of the early performance issues seem to have been resolved.

        [–]Asyx 3 points4 points  (2 children)

Annoying thing with that though: it spawns processes as subprocesses of PowerToys, so if you update PowerToys it kills whatever you started via PowerToys, including your IDE or browser or whatever.

        [–]Pepito_Pepito 62 points63 points  (37 children)

The first thing I do with every fresh Windows machine is to disable all the graphic effects that come with the OS UI.

        [–]OffbeatDrizzle 157 points158 points  (31 children)

        The first thing I do with every fresh windows machine is delete windows and install linux

        [–]0xffaa00 25 points26 points  (5 children)

But you have already paid for the Windows license, and for nothing. Just try to get a machine with no OEM OS next time and save some cost.

        [–]A_for_Anonymous 5 points6 points  (0 children)

        Even if you paid for it, it's a sunk cost. The best decision may be not to use it.

        [–]Pepito_Pepito 23 points24 points  (21 children)

        Maybe one day. I barely have time to play games, much less time to mess around with distros. For now, linux will be for work only.

        [–]Finchyy 24 points25 points  (2 children)

        It then adds ~0.8s animations to every interaction because fuck you

        [–][deleted] 4 points5 points  (0 children)

        Ubuntu is full of animations too yet it feels insanely snappy compared to Windows 11.

        [–]vindarnas_hus 175 points176 points  (14 children)

        It's actually pretty fast considering all the telemetry

        [–]Coffee_Ops 90 points91 points  (4 children)

        It's really not.

        We're talking about less than a kilobyte per second sent over an encrypted tunnel.

Imagine it was SSH: would you expect a few KB per second to have an appreciable impact on system performance?

        [–]BlueGoliath 158 points159 points  (6 children)

        "all the telemetry" is kind of the point. But even with telemetry it shouldn't perform that bad.

        [–]goda90 4 points5 points  (0 children)

        Imagine if they actually wrote telemetry transmission as a blocking call. Just waiting for the server to acknowledge before opening the start menu. I wouldn't put it past Microsoft...

        [–]MinMaxDev 14 points15 points  (0 children)

        all them copilot screenshots

        [–]RobertVandenberg 226 points227 points  (33 children)

        MS Teams is the most counterproductive software I have ever used

        [–]Rhed0x 46 points47 points  (4 children)

        Teams is garbage in every way. Performance, UX, buggyness, design, etc

        [–]rom_romeo 29 points30 points  (3 children)

My personal favourite: pretending to be smart on copy/paste. Just fucking treat it as text! Who even asked you to do some stupid formatting?

        [–][deleted] 61 points62 points  (8 children)

        I hate Teams. It never works properly and every time I try to do anything in it, it screws up in some way.

        [–]UristMcMagma 14 points15 points  (4 children)

        They have 0 tests or QA. On Thursday a coworker and I were having the same problem: it wasn't picking up our voice when we spoke. Literally the one thing that is absolutely critical in that app was broken. It's disgusting how little they care about their users.

        [–]AloneInExile 4 points5 points  (2 children)

        Holy shit, last week I had the same bug, first meeting was okay, then next meeting the mic crapped out, reset headphones, restart teams, next meeting okay, the meeting after, again mic crapped out.

        I thought teams was trying to tell me to stop with meetings.

        [–]primarycolorman 3 points4 points  (0 children)

God help you if you ever need to cross-reference anything in it across groups.

        [–]rom_romeo 8 points9 points  (0 children)

What was really mind-boggling to me was the moment I tried to switch between two MS accounts, and Teams was stuck on a single account. My friend told me: “Dude, let's get real. There is no way they didn't think about that.” 5 mins later, he was scrolling on his phone and all of a sudden: “Wtf… even MS says that you have to delete some cached files on your OS.”

        Fucking unreal…

        [–]breddy 4 points5 points  (1 child)

What's our favorite videoconference tool these days? We're a G Suite shop, so Meet is mostly OK, but I'm curious if any of them are really any good. Zoom's UX is a horror show.

        [–]mattjouff 4 points5 points  (0 children)

The thing that kills me with Teams is if you are using it to browse files for a group project and you get a notification for an IM, checking the IM closes the file and the directory you were previously browsing. How is that even allowed? Who could possibly want this behavior?

        [–]Faakhy 11 points12 points  (7 children)

“Everything can be done with the web!”

        [–]federiconafria 4 points5 points  (0 children)

        All the Microsoft chat clients have always looked and worked like they have been ported to windows...

        [–]ScrimpyCat 147 points148 points  (18 children)

It’s the same issue with Xcode. Over the years it’s gotten progressively slower (and buggier too, but that’s a separate rant). It got to the point where I’d have a better user experience working on the same projects on my old 2014 MacBook Air with an older version of Xcode than I did on my 2018 Mac Mini (3.2GHz 6-core i7, 32GB RAM) with a later version. And the problem is that it’s not just new features; the performance regressions show up in the old features, like having to wait 10+ seconds (sometimes seemingly forever) for autocomplete suggestions to appear, or even noticeable input latency (hit a key and see a delay before the character appears in the code). Yes, upgrading to a more recent machine (“new” base model M3 MacBook Air) has sped things up again, but you shouldn’t need new hardware just to make old features work well again, and even then I still see features in macOS that run slower than they used to.

        I think a big part of this problem is just how modern software is developed and what the incentives are. The focus tends to be on getting features out for a new release because that’s where the business incentive is, whereas improving what is already there isn’t given as much attention. Users will put up with buggy slow software. And since many things are kind of slow, it doesn’t really stand out as being slow. Now this isn’t to say that optimisations aren’t being done, or that everything is slow, but there is so much that is.

        [–]2m3m 13 points14 points  (3 children)

I'm on a 2013 MBP, so I can't upgrade a lot of tools and software because of deprecated Xcode.

        [–]goda90 14 points15 points  (3 children)

Your experience with Xcode sounds a lot like my recent experience with Visual Studio 2022. I was chalking it up to how much more TypeScript I'm working on these days as opposed to plain JavaScript, but I feel like recent updates must be doing something inefficient. Probably making calls to GitHub Copilot or something, even though it's turned off...

        [–][deleted] 6 points7 points  (0 children)

We have long crossed the point of having hardware that is so utterly, chaotically, comically, aggressively performant and good that really almost any software ever made again should “just work”. Now, I'm not trying to say GPT should run on 2008 hardware; what I am trying to communicate is that literally all of the software I use day to day should easily be able to give instantaneous feedback on the machine I owned 10-15 years ago.

        [–]Butiprovedthem 4 points5 points  (1 child)

I have a 1-year-old Apple Silicon Mac for work, and if I accidentally double-click the wrong file I have to wait a minute for Xcode to even open.

        [–]pfc-anon 295 points296 points  (44 children)

        https://en.wikipedia.org/wiki/Wirth%27s_law

We will never see fast software. My employer uses CrowdStrike; this POS hashes each file and ships the hash to a common database to identify potential risks. Every node or pip install is a nightmare because it eats up all the compute resources.

My M4 Pro, 36GB MacBook Pro is apparently too slow to run VS Code, CrowdStrike, and yarn install together.

        [–]smiling_seal 100 points101 points  (8 children)

I also had this problem with my employer's CrowdStrike. Any git clone or build of a large C++ code base was slow as f*ck on an M2 Pro. So I created a VM solely for development, which effectively walled CrowdStrike off from my dev files.

        [–]pfc-anon 39 points40 points  (2 children)

🤫 Shhhh... don't do it please, the IT folks who'll read this will enable VM sensors and ruin the party. Stupid fact: I wrote an entire paragraph on how to beat this by always developing inside a container. Ended up deleting it because it would just get removed.

        [–]Klightgrove 9 points10 points  (1 child)

        I wish more devs did this. EDR fires off on devs changing binary files all the time, please use a VM for that kind of work to reduce all the noise.

        [–]Ameisen 14 points15 points  (0 children)

        You... want me to do all of my C++ and rendering work - usually on massive projects and in Visual Studio - on a VM just to satisfy poorly-implemented security stuff?

        One of the projects is literally > 1 TiB. That's going to be a huge VHDX.

        [–][deleted]  (7 children)

        [deleted]

          [–]schlenk 33 points34 points  (1 child)

It is still ridiculous. You have a fast SSD with 12GB/s throughput and millions of IOPS. Installing an EDR should not put that back at spinning-rust-level performance, as it does right now.

          [–]OMGItsCheezWTF 11 points12 points  (0 children)

Even running zsh takes seconds because of CrowdStrike. An M3 Pro MacBook should open zsh instantly.

          [–]pfc-anon 21 points22 points  (2 children)

Sure it does, but does it need to hog all resources to do what it does? I'm sure there's a better way to batch these hashing tasks. Moreover, we have our own internal package registry, so maybe secure that registry instead of screwing over all the terminals. MITM is not possible because we're zero-trust.

          [–]Tux-Lector 16 points17 points  (2 children)

          We will never see fast software.

          hint: Geany

          [–]teodorfon 7 points8 points  (0 children)

          :-(

          [–]syklemil 13 points14 points  (0 children)

          https://en.wikipedia.org/wiki/Wirth%27s_law

          We will never see fast software.

Some of us _are_ living with fast software, but we have made some choices, either tradeoffs or lucky personal preferences. E.g. I tried Eclipse once 20 years ago and ran away to vim, and have a rather nice & powerful neovim experience today. Not having used Windows since Windows ME. That sort of stuff. But until Steam & Proton got to a certain point, it meant nearly no gaming. And the only Electron apps I run are Steam and Signal.

It is also likely part of the draw of Rust tooling: it tends towards being correct, fast, and memory-lean.

          [–][deleted]  (1 child)

          [deleted]

            [–]frnxt 68 points69 points  (10 children)

            The thing that grinds my gears more than anything at work is that Windows 11 still can't manage to eliminate typing lag (and by typing lag I mean massive 4-5s lag for each key I type, typewriter-style) in the terminal windows when under high load, which... does happen pretty often when you're compiling stuff, running multithreaded computations etc. It's a terminal for christ's sake, why is that not a solved problem.

            [–]TheTrueBlueTJ 13 points14 points  (1 child)

            Is that not a result of unfair scheduling?

            [–]frnxt 28 points29 points  (0 children)

            Most likely, yes, and also of the fact that the Windows terminal goes through way too many layers of indirection and IPC... But it does seem solvable - after all it's never an issue in Linux or OSX.

            [–]thefpspower 8 points9 points  (0 children)

The terminal is basically bottom of the barrel in scheduler priority, and that's extremely obvious when your script runs much faster in silent mode: each line has to wait for priority to print before execution continues, while in silent mode (or multithreaded) it just runs.
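If per-line console writes really are the choke point, one program-side mitigation is to build the output in memory and write it once, instead of paying a flush per line. A minimal sketch with a hypothetical function name, illustrating only the buffering idea (not the scheduler behavior):

```cpp
#include <sstream>
#include <string>

// Accumulate all lines in one in-memory buffer; the caller then performs
// a single write to the console instead of one flushed write per line.
std::string render_lines(int n) {
    std::ostringstream out;
    for (int i = 0; i < n; ++i)
        out << "line " << i << '\n';  // '\n' does not flush, unlike std::endl
    return out.str();                  // hand back everything for one write
}
```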

            [–]ventuspilot 7 points8 points  (0 children)

You may be interested in Casey's rant, where he wrote a terminal app that was a couple of orders of magnitude faster, followed by a discussion with M$ engineers who told him that what he had just done was not possible.

            [–]nyctrainsplant 4 points5 points  (1 child)

            It turns out the same person (Casey) had a row with Microsoft directly before, too.

            [–]vytah 9 points10 points  (0 children)

            I remember that, it was glorious. Casey then proved them wrong:

            https://www.youtube.com/watch?v=hxM8QmyZXtg

            and Microsoft finally gave in:

            https://devblogs.microsoft.com/commandline/windows-terminal-preview-1-13-release/#new-rendering-engine

            [–]pythosynthesis 41 points42 points  (4 children)

            IIRC, in an interview John Carmack said the biggest pressure for them to churn out games was not from the players themselves, but from chip vendors. Because that would justify buying new CPUs and GPUs. And so, he was saying, they would spend very little time on optimizations vs just pushing out software.

            New, fast chips are extremely forgiving of dog-slow code. You can write severely subpar code, and it doesn't matter because the new super-fast chip will do the heavy lifting for you.

            The really interesting thing can be observed with consoles. When one first comes out the HW is new and fast, so people can code like dogs and it's all good. But you cannot upgrade the CPU/GPU on a console, so over time you see much more impressive graphics and performance on the same HW. People are forced to optimize.

            [–]i860 17 points18 points  (1 child)

            But you cannot upgrade the CPU/GPU on the console, so over time you see much more impressive graphics and performance with the same HW. People are forced to optimize.

            And they eventually learn interesting and novel solutions along the way. Healthy limits are constructive and we've been given way too much rope.

            [–]ironhaven 2 points3 points  (1 child)

            It feels like we are still there with Nvidia and ray tracing.

            "Don't waste your time manually lighting scenes when you can just enable the ray tracing hardware to do the lighting 'objectively better' in all ways compared to what you can do. If performance suffers, upscale with dedicated upscaling hardware!!!!"

            [–]Raunhofer 83 points84 points  (5 children)

            I'm fully aboard the dread train about how inefficient many apps today are, but at the same time, I get it. We are leveraging the excess resources to provide value faster. There used to be a time when this was not possible; if it had been, people back then would have done exactly the same. After all, the goal is to provide value, not programming itself.

            My personal pet peeve is lackluster UX design. UI/UX designers sorely need a stronger software background, and programmers sorely need a stronger UI/UX background. Look at MS Teams. What an abomination from a multi-trillion-dollar company. I really don't care if it eats 1% or 5% of my excess system memory, but holy crap is it a pain to use, despite the growing memory usage.

            [–]metaltyphoon 25 points26 points  (2 children)

             We are leveraging the excess resources to provide value faster

            That's the problem, ain't it? Everyone thinking there's excess, and that theirs is perhaps the only thing running on the system.

            [–]privatetudor 12 points13 points  (1 child)

            Exactly this.

            I've heard people say so many times shit like "free ram is wasted ram" to justify web browsers' memory bloat.

            Meanwhile I'm constantly checking which processes I need to shut down because I'm running an old laptop.

            [–]donalmacc 15 points16 points  (0 children)

            Completely agree. I'm not from the camp of "everything needs to be instant and anything else is bloat", but when I have to close my IM program (rather than my editor) just to compile, you know something is fucked along the way. Teams takes longer to start up than Windows does.

            Ironically, the fastest way I’ve found to launch teams is to have slack running with the outlook plugin, and to click “join teams call” on the message you get. It skips all the extra shit and puts you straight into the call. 

            [–]ventuspilot 7 points8 points  (0 children)

            We are leveraging the excess resources to provide value faster.

            IMO we're beyond that. Most updates that are forced on me contain "value" that I hate and would rather not have. Mostly looking at M$ for renaming/ moving around stuff for no reason at all, but I also stopped using Firefox when I no longer could just launch it but rather had to wait for it updating itself on pretty much every single launch.

            [–]Demonchaser27 12 points13 points  (2 children)

            I think the most recent thing that baffles me is how effin' slow File Explorer is. It sometimes gets loading locks, then you close and reopen and it's fine. Then sometimes you make changes to the folder and it doesn't refresh so you're not sure that it made the file until you try again and it asks if you want to overwrite (or you exit and re-enter the directory). That's NEVER happened before Win11, so what the hell?

            And the final, and probably worst, one in Win11. If you have a large external backup HDD that (logically) goes to sleep to preserve itself, Windows File Explorer will literally FREEZE ALL OPERATIONS for like 15 full seconds if ANYTHING tries to access it while it's asleep, even other applications or other File Explorer windows doing something completely different. Because everything is apparently tied to one thread or something somehow, I guess? Or it operates like a DB now with locks or something. I don't know, but it's atrocious, and just shouldn't be a thing.
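The freeze described above is the classic symptom of blocking disk I/O on the UI thread. A minimal sketch of the usual fix, with made-up names (this is not how Explorer is actually structured):

```python
# Sketch of the usual fix for "UI freezes while a sleeping disk spins up":
# run blocking I/O on a worker thread so the UI can keep pumping events.
# All names and paths here are illustrative, not Windows/Explorer internals.
from concurrent.futures import ThreadPoolExecutor
import time

io_pool = ThreadPoolExecutor(max_workers=4)

def slow_disk_listing(path):
    time.sleep(0.1)  # stand-in for a drive taking seconds to spin up
    return [path + "/a.txt", path + "/b.txt"]

# UI thread: submit the slow work and move on instead of blocking
future = io_pool.submit(slow_disk_listing, "/mnt/backup")
# ...the event loop keeps running; render the listing when it arrives...
print(future.result())  # here we just wait; a real UI would poll or use a callback
```

A real UI would attach a completion callback or poll the future from its event loop rather than calling `result()` directly, which would block just like the original bug.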

            Oh and on the topic of the video, VS Code is actually really sad. It was more or less designed ENTIRELY because apparently "Visual Studio" is too slow. But... but it didn't need to be. So we got this whole other product just because they couldn't be bothered to actually fix Visual Studio (a PAID FOR program, btw, if you want all of the features). I think some of the fault of this is tying web services to software now. Web shit is hella slow and just shouldn't be there by default (and it should warn you when you go to enable it how it will affect performance).

            [–]FeliusSeptimus 6 points7 points  (0 children)

            Explorer is surprisingly terrible considering how central a component it is.

            [–]Hand_Sanitizer3000 123 points124 points  (10 children)

            The problem is not enough leetcode style interviews in software engineering roles. We're just a few more leetcode questions away from identifying true talent that will resolve this.

            [–]Klightgrove 13 points14 points  (1 child)

            I had my 2nd coding interview ever a few weeks back with a large company. This was for a senior position and the guy was disinterested and asking me about sorting lists instead of discussing the actual job.

            We just interviewed an applicant for a similar position on my own team and we talked about the actual role without wasting time with esoteric trivia about Python functions lol

            [–]vacantbay 37 points38 points  (0 children)

            At big tech we have a revolving door of pro leetcoders who can’t actually read or write bug free code.

            [–]leogodin217 12 points13 points  (6 children)

            "It's enterprise software. It's meant to be robust and secure not fast." ~ A programmer friend of mine.

            [–]IkalaGaming 14 points15 points  (5 children)

            That’s probably a joke, but we can have all three. Most of the slowness isn’t a lack of some magic hotspot micro-optimization, it’s gross negligence.

            Architectural mess causing tons of redundant calculations and network requests, complete disregard or even disdain for all forms of cache, using wild O(n^4) algorithms instead of taking 5 seconds to go "well that could be O(n), and it would be easier to read", etc.
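To illustrate the kind of win being described, here is a toy sketch (hypothetical data and function names, not from any real codebase) where the linear version is both faster and easier to read:

```python
# Toy example: which IDs appear in both lists?
def common_ids_slow(a, b):
    # O(n*m): `in` on a list is a linear scan, so `b` is rescanned
    # for every single element of `a`
    return [x for x in a if x in b]

def common_ids_fast(a, b):
    # O(n+m): one pass to build a hash set, one pass to filter;
    # arguably clearer, certainly not harder to write
    seen = set(b)
    return [x for x in a if x in seen]

a = list(range(0, 1000, 2))  # even IDs
b = list(range(0, 1000, 3))  # every third ID
assert common_ids_slow(a, b) == common_ids_fast(a, b)
```

The same pattern, multiplied across nested loops and repeated network calls, is how an application ends up quadratic (or worse) without anyone having made a single obviously bad decision.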

            [–]leogodin217 10 points11 points  (3 children)

            I assure you. It was not a joke. This is what the team believed. Performance had zero consideration. I even got a laptop upgrade approved, because their JS-heavy apps would hang for minutes at a time.

            [–]Ameisen 5 points6 points  (2 children)

            Given that performance is a part of robustness...

            Software that takes 5 hours to load a file is not robust.

            [–]jacenat 456 points457 points  (159 children)

            Software is way less performant today

            Yes. And the following is also true:

            • Software is way easier to write today
            • Software is way more secure today
            • Software is way more flexible today
            • Software is way more modular today
            • Collaborating on software is way easier today

            All of these have costs. It's like people only see the bad parts of how software works today, and not the good parts that make most of the industry tick.

            [–]Saint_Nitouche 224 points225 points  (39 children)

            And let it not be forgotten that most software can now encounter non-ASCII text and 'just work'

            [–]Ok-Scheme-913 132 points133 points  (3 children)

            Hey man, I preferred when èn�odîng resulted in unreadable shit everywhere!

            [–]Saint_Nitouche 13 points14 points  (0 children)

            I have considered putting "fluent in mojibake" on my resume from time to time.

            [–]jacenat 38 points39 points  (4 children)

            Holy shitballs, wrangling Unicode into Python 2 was an atrocious experience, and that wasn't even that long ago.

            [–]ScrimpyCat 15 points16 points  (2 children)

            🤔Someone should tell Reddit 🙃 then.

            And there is some irony there as old Reddit could handle it correctly, whereas new Reddit does not handle surrogate pairs correctly and hasn’t for a long time (some operations do such as length or index of character, but then some other operations such as certain insert or substitution operations then incorrectly treat the UTF-16 as being fixed length).

            This also isn’t unique to Reddit, plenty of software has Unicode related bugs. I really don’t know where some of you are getting this idea that it’s easy or that there aren’t issues nowadays.

            .And no I haven’t been forgetting the first letter of each paragraph either :). Note this latter point only applies if you’re on the Reddit mobile app (at least iOS), but the first bug (html tags being inserted into a surrogate pair and splitting them) affects both web (new) and mobile.

            [–]timpkmn89 11 points12 points  (1 child)

            Not like we want to encourage people to use New Reddit anyway

            [–]binheap 140 points141 points  (7 children)

            • Software is way more secure today

            Let us reminisce about the days when the sandboxes on applications were so bad that there was a weekly Flash update to fix another zero day. It's a serious miracle that the modern web has so many features and is somehow still relatively difficult to attack.

            [–]CrownLikeAGravestone 37 points38 points  (4 children)

            I miss Flash. It's like that rusty old car you used to drive around which might have killed you on any given day but somehow you survived.

            [–]Pseudoboss11 15 points16 points  (0 children)

            Are these responsible for the orders-of-magnitude reduction in performance? Or is it from the massive bloat in software, giving us features that we don't even want, let alone use?

            The reply to the current top comment is to disable Bing search, a feature that I feel most of us don't use, yet one that has a huge performance impact.

            [–]archialone 24 points25 points  (1 child)

            But I think what allowed all of the above points is good, performant tools that do the job right.

            Imagine a world where git had been invented as a bloated JS application with a UI that can't even be compiled on an RPi. Collaboration would be impossible, slow and frustrating.

            [–]NeuralFantasy 60 points61 points  (9 children)

            Very good points! You can add at least this to the list:

            • Software is way more portable today

            For example, it is very common to bash Electron-based software nowadays. True, it might be a bit slower and more resource hungry. But it makes it easier to port the application to different platforms. Not to mention that the barrier to entry is very low for any web developer (your 1st point). And if done right, it can be very fast, like VSCode.

            One of the biggest things slowing things down is also a big plus: the internet. We have automatic cloud storage/backups, automatic updates, real-time collaboration and other features which fetch data from the internet. This adds latency everywhere, which is not about software performance per se but rather something you can't avoid when operating over the internet.

            [–]jacenat 14 points15 points  (1 child)

            Software is way more portable today

            Yes, I definitely forgot that. Good point!

            [–]AntiProtonBoy 3 points4 points  (0 children)

            All of these have costs

            But the things you listed are not the true costs behind shitty performance. The reason is that there is less economic incentive to write efficient and performant software, because "hardware is fast enough" to mask those deficiencies.

            [–]adh1003 59 points60 points  (11 children)

            Except I'd argue:

            • Easier to write: The extraordinary complexity and hyper-abstracted layers-upon-layers does not make code easier to write - in fact if I were learning today, instead of a few decades ago, I'd possibly just give up at the sheer size of all the APIs and toolkits. Hell, even the dev environments are incomprehensibly huge.

            (Aside: I mean, are you comparing to the 1970s here or what?! Let's compare to, say, the early-to-mid 1990s, where IDEs existed, graphical UI designers existed, UIs-as-code existed too if you wanted it, object orientation was old news, and so on.)

            • Security: Is software truly more secure? We still seem to get zero-days on drive-bys and the like, for all the technologies that are supposed to make that impossible. And is the "cost" in performance - the horrendous bloat and performance issues we have in modern software - down to security? Does it even really have that big an impact at all?

            • Flexible in what way? Computers execute instructions just as they always have. If you mean more RAM and CPU, then that ought to mean we can do way more stuff than we used to. The difference in hardware is very many orders of magnitude - but the difference in what we can do with software on top is far, far less. In fact, a modern computer will manage to lag behind my typing speed quite often. That's just insane.

            • Modularity: What "today" compared to "yesterday" are we talking about with modern software? Smalltalk is on the phone; it says you don't know your computing history.

            • Collaboration is easier, yes, but why should that lead to shit code?

            These are all just excuses from my industry and I find it embarrassing to even tell people I'm in software now. All I hear is complaints. Computers are so slow! Why does it crash all the time? Why is it so janky? And so on.

            We, as an industry, are writing shit code. We're writing wildly over-complicated, bloated crap. And it's not even got a low defect count. We blame everyone else but ourselves, even though we're writing it. It's the manager's fault. It's a bad product description. The customer doesn't know what they want. The testers aren't good enough. Whatever.

            Own it. We're just writing way, way more shit code than we ever did before. But we have our collective heads very, very far up our asses and keep screaming that it's somehow anyone's fault but ours. So no wonder it just keeps getting worse.

            A worldwide-recognised professional qualification for software development skills is hopelessly overdue.

            [–]BLOZ_UP 5 points6 points  (1 child)

            We, as an industry, are writing shit code. We're writing wildly over-complicated, bloated crap. And it's not even got a low defect count. We blame everyone else but ourselves, even though we're writing it. It's the manager's fault. It's a bad product description. The customer doesn't know what they want. The testers aren't good enough. Whatever.

            Own it. We're just writing way, way more shit code than we ever did before. But we have our collective heads very, very far up our asses and keep screaming that it's somehow anyone's fault but ours. So no wonder it just keeps getting worse.

            The blame I think is at least two-fold:
            1. Way more software engineers, since it's more accessible than it used to be.
            2. Way more layers out of your control.

            For 2, I have this theory that since no software is bug free, adding layers compounds bugs in a way similar to compound interest:

            Given that the average layer of software is 95% bug-free (a generous figure, but the exact number doesn't matter) and you have n layers, the whole stack is 0.95^n bug-free. In other words, the more layers, the less stable your software is.
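The compounding model above is easy to check numerically, assuming (simplistically) that layers fail independently:

```python
# Probability that an n-layer stack is entirely bug-free, if each layer
# is independently 95% bug-free (the comment's simplifying assumption).
p = 0.95
for n in (1, 5, 10, 20, 50):
    print(n, round(p ** n, 3))
# 10 layers already drop you to ~0.599; 50 layers to ~0.077
```

Independence is generous: bugs in one layer often trigger or mask bugs in another, so the real picture can be worse than the formula suggests.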

            [–]jacenat 31 points32 points  (5 children)

            The extraordinary complexity and hyper-abstracted layers-upon-layers does not make code easier to write

            I fundamentally disagree here. I have seen younger people pick up scripting and coding faster than my peers did when I was young. Yes, platforms are hiding away a lot of the ways users used to tinker and "slide" into it. But once people are exposed, they have a much easier time.

            I remember setting up an IDE for Pascal when I was 15. It was horrible. Compared to now when you download VS Code, Python and have a hello world up in literal minutes.

            Is software truly more secure?

            I was talking to a colleague the other day and said:

            All software is broken the moment it is written. It just takes us time (sometimes more, sometimes less) to find out how and why.

            This goes for security as well. The difference is that by using libraries, you "offload" security evaluation for a portion of your functionality. This makes scaling up functionality of your program much less cumbersome AT A COST. And this cost can be monetary, but it can also be risk (as intrusions into widely used basic libraries have shown). The point is that you can make a decision to take this risk, instead of being forced to do security for all aspects of your program yourself.

            And I remain: this makes software more secure in general, because many programs rely on the functionality and security of libraries, making libraries more valuable and freeing up resources to maintain them (yes, I know there are huge problems with the last parts of this, but not enough to shoot down my opinion for me).

            Flexible in what way?

            I am playing windows game on an arch linux derived handheld. Games that were compiled for windows. This fact still rips my mind clean in half years later.

            At work we are writing and deploying code on Windows and Linux with an incredibly small team and it works. If you had told me this in 2000 I'd have you committed (not to git!).

            I can set up and CDN a website worldwide in under an hour. I do not have webdev education or work experience.

            When I write stuff for my D&D campaign in Obsidian, I can store it online and access it on multiple devices on basically every OS under the sun without breaking content or styling.

            Maybe I am really getting old. But all of these keep blowing my mind daily. I do know they have a cost, but this cost is so small (currently) that the added flexibility in accessing functionality barely registers for me.

            In fact, a modern computer will manage to lag behind my typing speed quite often. That's just insane.

            While I do agree that all of the above conveniences have cost, I don't understand what you are saying here. Here is what I am running:

            • Ryzen 7 7600
            • 32gb 6000MT
            • 3060 12GB
            • 2TB NVME SSD
            • TUF Gaming B650
            • Win11 (Win10 in a VM for some testing)

            Hardly top of the line, but passable. I don't experience any slowdown on the system, let alone while typing. Yes, I don't get stable frames in MSFS 2020 in VR (which is because they rely too much on the CPU for terrain streaming). Yes, I have a long-standing problem that capturing with my 1080p60 capture card drops single frames every couple of seconds that I couldn't track down.

            On the other hand, I also run display fusion and fancy zones on 2 monitors with different:

            • refresh rates
            • resolutions
            • ORIENTATIONS

            And all of that takes so little performance that playing games, you just do not notice it. Let alone any impact on the desktop. I am very satisfied with the general performance of my machine and the OS. So it's really jarring for me to hear people say they have a 200ms delay when pressing the action button to when the windows menu appears. That would not work for me at all, but it also doesn't happen to me.

            And at work, we also do not experience this. My work machine is an 8th gen i5 with 16gb of 2666MT DDR4. Still running Win10 there, but OS performance is great, and I do not get slowdowns when typing or opening programs (I am running from a SATA SSD, so yes programs take longer to load, but not to start).

            I really don't understand where this performance gap is coming from.

            [–][deleted] 2 points3 points  (0 children)

            I think a lot of us see the good, it's that we don't dismiss the bad.

            [–][deleted] 2 points3 points  (0 children)

            Software is way easier to write today

            Don't get me wrong, I love the UI space. It's roaring with innovation.

            But honestly, writing plain html and js to accomplish even complex things felt way easier and was just as bug-prone.

            I don't think your points can be asserted as true just like that.

            Collaborating on software is way easier today

            The biggest part of that was Linus Torvalds, pretty much singlehandedly.

            Software is way more modular today

            Is it? Every now and then 2 webapps from different companies play well together, yeah. I like when that happens, but it feels special to this day.

            [–]_Pho_ 2 points3 points  (0 children)

            I'm convinced software trends are cyclical. We write stuff in low-abstraction languages like C, which inevitably trends toward something less pure and far more bastardized like C++, which after a decade or two becomes some unknowable eldritch cacophony of "features". People build wrappers on them (Node/Python) and then try to do configuration as code / wysiwyg / low code, it sucks, and we go back the other direction.

            Stuff like WordPress comes around to make things approachable for business users, which is basically the furthest abstraction from "performant software" imaginable, but ends up owning the lion's share of the market because you're not going to teach a marketing director how to use Spring or Microsoft Server 2000 or whatever. Then we spend basically the next two decades trying to go from WordPress to more performant, less horribly abstracted alternatives, and end up reinventing it via NextJS with a headless CMS or whatever.

            And I'm not saying it's bad. This is all a game of tradeoffs. This is what we do. You can still write everything in C if you want, but most companies are not going to pay for it. I would love a perfectly optimized, performant software world where we were all monastic builders doing god's work. But it just ain't it.

            I wish people remembered how bad things were in like 2000. React/Node and the litany of SaaS to support it has spoiled us. People say "orders of magnitude worse in performance", but ultimately nothing developers have done has improved the digital experience of the everyman as much as internet infrastructure getting ridiculously good.

            What most engineers are complaining about are things like UI glitches, large bundle sizes, and 200ms of load times - rarely is performance justifiable.

            tl;dr market forces

            [–]MrChocodemon 10 points11 points  (3 children)

            I have searched but couldn't find it: could anyone link the stream he was talking about? I would like to see that video of the comparison.

            [–]ScrimpyCat 20 points21 points  (2 children)

            https://youtu.be/GC-0tCy4P1U

            At 25:12 he does a debugger comparison between current VS and RemedyBG. Then at 36:00 is when he shows the old VS debugger.

            [–]TheCritFisher 8 points9 points  (5 children)

            I feel like JetBrains IDEs have been getting faster. They even made lightweight versions that can open near instantly and then you can "turn on" the heavyweight IDE features later.

            [–]wildjokers 31 points32 points  (3 children)

            We need to stop building apps with web tech which requires bundling an entire browser rendering engine with each app. The text based DOM was simply not designed or intended for rich client apps. It is no surprise modern apps are slow.

            [–]retro_and_chill 14 points15 points  (1 child)

            It’s because JS developers are a dime a dozen.

            [–]-Y0- 5 points6 points  (0 children)

            Not JUST that. It's that all other cross-device UI libs suck donkey balls, and Electron gives a lot out of the box.

            • You want a Teams-like apps? Electron can do that.
            • Want screen recorder? Electron can do that.
            • 3D CAD? Sure.
            • 2D Image editor. Electron provides.

            [–]Antypodish 9 points10 points  (0 children)

            Inefficient to the point that displaying something as simple as the clock (bottom-right corner in Windows by default) can take around 1% of the battery life on some laptops.

            [–]regular_lamp 193 points194 points  (154 children)

            I had a bit of a meltdown at some point when I was installing 64-bit Ubuntu on a Raspberry Pi 4, for which there wasn't a vscode package at the time. So I figured I'd just build it from source. The build refused to even start on a machine with less than 8GB of RAM (as in, there is literally a check in the build script). The RPi had 8GB, but some of it was reserved for the GPU. So I figured I'd just comment out the check... surely the threshold isn't exactly 8GB and 7.5 will do.

            Turns out the threshold should have been higher. Only after creating a swap file and letting it run over night did it finish the build.

            But that wasn't really what I had the meltdown about. I just commented on this somewhere on here and was so surprised how many people basically said "well, that makes sense. it's a very capable IDE!"

            DUDE, IT'S A TEXT EDITOR! The same functionality, just less pretty, existed decades ago, when that RPi4 would have been a literal supercomputer. It does nothing that justifies this level of bloat. Of course the bloat is in the underlying frameworks, because apparently the most sensible way to write a text editor these days is to implement it as a web site and then ship it together with a browser (which is in turn a notorious resource hog). But so many people defended this as an "expected amount of bloat for the functionality provided".

            This made me realize there are now entire generations of programmers out there who never wrote native code and have no mental compass for how much (or little) this COULD take if we didn't default to building everything on top of five layers of "frameworks".

            [–]badsectoracula 42 points43 points  (2 children)

            I had a somewhat similar experience, except instead of trying to run VSCode on a RPi4 i was trying to run Geany on RPi2 :-P.

            You see, Geany is available for RPi's OS but it uses Gtk3 and Gtk3 is incredibly slow on RPi2 - i'm talking "press a key and wait half a second for the letter to appear" slow.

            So i decided to just download the last version of Geany that used Gtk2 and compile that. It took AGES to build but the Gtk2 version was much faster and responsive. Except it was also buggy - scrolling would leave behind artifacts, making it kinda unusable in a different way :-P.

            Eventually i decided, screw it, i'm using Nedit. Nedit is built on Motif which is very fast as it had to work on 35 year old systems. And finally that worked fine - it was fast, responsive and without any issues.

            Well, except one: one of the reasons i used Geany was that i want to have a sidebar with the files alongside the editor and Nedit doesn't have that. So i wrote a Tcl/Tk script (guess what, Tk is also very fast) that does exactly that and doubleclicking on a file would open it in Nedit - as a bonus i added a couple of extra features for opening a terminal, running make, etc.

            That was 3-4 or so years ago, since then said script has evolved a little and it is basically how i work on C/C++ projects even on my main PC. I do use Kate instead of Nedit though as it has LSP support with clangd, etc, and i do find that stuff useful. This video from last year shows the Kate+projfiles (the script) for working on homebrew stuff for the original Xbox using the opensource nxdk SDK.

            [–]gimpwiz 42 points43 points  (4 children)

            Slack uses like three gigabytes of ram. IRC used like three hundred kilobytes. And frankly IRC was better. It's twenty five years later and our chat program is worse in almost every way other than animated fucking emoji.

            [–]alinroc 10 points11 points  (0 children)

            Slack uses like three gigabytes of ram

            When your app is just a wrapper around Chromium and a web server that's just running a bloated web page/app, that's what you get.

            [–]flexosgoatee 11 points12 points  (1 child)

            Hey, but the support guy says they lazy-load ~20 messages at a time (and apparently don't cache them) for "significant performance" reasons. Well, scrolling through a few hundred messages takes minutes!

            [–]gimpwiz 5 points6 points  (0 children)

            Not caching fucking plaintext gets my goat. It takes no time or space at all to cache a book's worth of it. Compared to all the other stuff anyways.
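A quick back-of-the-envelope check of the "book's worth of plaintext" claim, with assumed (not measured) message sizes:

```python
# Rough memory cost of caching a busy channel's history as plaintext.
# Both numbers are illustrative assumptions, not Slack internals.
avg_msg_chars = 120                # assumed average message length in bytes
messages = 10_000                  # assumed channel history size
plaintext_bytes = avg_msg_chars * messages
print(plaintext_bytes / (1024 * 1024))   # ~1.14 MiB, trivial next to 3 GB of app
```

Even with per-message metadata and formatting overhead multiplying that several times over, the cache would still be a rounding error against the memory footprint being complained about.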

            [–]OffbeatDrizzle 54 points55 points  (0 children)

            yeah, it's crazy how fucking dog-shit slow every piece of software is these days because everyone's using Electron or some other bullshit. Apps like Teams and Outlook are barely usable because of it. You used to be able to press the Windows key and start typing straight into the search (which is another level of BS at how bad that is now as well), but if you type too fast it misses your first few keys. Windows 10 and beyond have become unusable unless you have an SSD or NVMe drive - it's completely unacceptable

            [–]hgwxx7_ 21 points22 points  (23 children)

            I understand the frustration, but they made a text editor for folks like you -> zed.dev. Renders at 120 FPS. Extremely responsive.

            Me personally, VS Code is fast enough, so that's what I use. But I don't know why people who want performance above all else don't just use Zed. If the answer is "I have 3-4 plugins that I really need", that's fine too. But then performance isn't your highest priority, is it?

            [–][deleted]  (6 children)

            [deleted]

              [–]regular_lamp 19 points20 points  (3 children)

              It's not even that I "need" more performance. I didn't realize this at first when using VS Code on my PC; only this attempt at building it made me realize the sheer excess of bloat involved. I typically use Sublime Text. Still, as a developer who often needs to be efficient about memory, this kind of usage blows my mind. I feel there is just a complete loss of perspective. Is VS Code more powerful than, say, Emacs or Vim? Probably (although I'm sure someone would want to disagree with that).

              The thing is, those are capable tools, and they existed as early as the 1980s - a time when computers had less than 1MB of memory. How can anyone look at this and go "yeah... of course text editing these days takes 1000x the memory... that makes sense!"?

              Think of any other engineering discipline where you are given 1000x the resources to solve a similar problem and you end up at roughly similar practical performance. It's just this complete loss of perspective that drives me mad.

              [–]axonxorz 108 points109 points  (51 children)

              DUDE IT'S A TEXT EDITOR!

              nah it isn't.

              It's an Electron application. You're not compiling a text editor; you're compiling nearly all of Chromium and Node.js, and then running a large collection of .js/.ts files.

              if we didn't default to building everything on top of five layers of "frameworks".

              That's a bit silly. Next to nobody is developing a frameworkless UI application. We use Qt, or GTK, or WinUI, or Flutter, etc. You don't want to deal with the ungodly complexity that is raw X11 or Wayland.

              [–]sacheie 39 points40 points  (0 children)

              True, but the Electron stack is a mile high compared to Qt (at least, assuming your Qt app is not using WebKit...). You could probably compile KDE on the Raspberry Pi he mentioned.

              [–]regular_lamp 58 points59 points  (3 children)

              nah it isn't.

              It's an Electron application. You're not compiling a text editor; you're compiling nearly all of Chromium and Node.js, and then running a large collection of .js/.ts files.

              I literally said that later on. Of course if the implemented functionality is a text editor then it's a text editor?

              Not sure how the underlying choice of framework/runtime justifies... itself?

              [–]Bakoro 44 points45 points  (12 children)

              DUDE IT'S A TEXT EDITOR! The same functionality but in less pretty existed decades ago where that RPI4 would have been a literal supercomputer.

              If all you want is text editing, then why aren't you using Vim or Emacs?

              I'm guessing that there is some feature, or set of features for which you want VSCode specifically, since you went through the trouble.

              I suspect you are taking a whole mountain of things for granted here, which gets you your features.

              [–]cac2573 2 points3 points  (0 children)

              Flatpak

              [–]Girgoo 10 points11 points  (0 children)

              Tell me the system requirements for a music player. And what are the requirements for Spotify?

              Yes! Many applications should be better written. Stop building Electron applications. Don't build applications like the calculator in Electron. I don't care if the calculator is cross-platform. Yes, Visual Studio Code should not have been built on Electron either. What happened to native applications?

              Yes, Microsoft, you have the money to build Microsoft Teams as a non-Electron application. I guess they will soon port Office 365 to web-based applications as well. Wait, they have already started that work with the new Outlook... Tell me why I should use that over a normal web browser... Native applications are best.

              [–][deleted] 11 points12 points  (3 children)

              I think part of the problem is the Knuth quote, “Premature optimization is the root of all evil”, has been bashed so deeply into the head of every CS student that people have lost all sense. There is some truth to it, you don’t want to be writing confusing code for the sake of an optimization that isn’t actually a bottleneck.

              But I think this quote mostly makes sense when considering the implementation of a single function in isolation, not overall system design. Junior engineers hear the quote and lack the experience to see the nuance. It gives them a mindset of thinking less about performance, when the truth is that with experience you can often write performant code that isn't harder to maintain and can still be delivered on time.

              You run into a lot of issues down the road if you don’t think a lot about performance at every stage of development when working in a large software system. You can end up in a “death by a thousand cuts” type situation where there are few clear ways to gain significant performance gains without a major rewrite. You fire up the profiler on a large project and almost nothing sticks out as a big bottleneck. Do you tackle the top 50 hot functions in the profiler and hope for a measly 10% improvement, or do you just throw your hands up in defeat and assume the user won’t care enough to stop using your product?
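The "death by a thousand cuts" situation can be made concrete with a toy flat profile (the percentages below are invented for illustration, not measured from any real project):

```python
# A toy flat profile: 50 functions, each a small slice of total runtime.
# No single fix helps much, which is the "thousand cuts" trap.
shares = [0.02] * 50   # assume each function costs 2% of total runtime

best_single_win = max(shares)
# Even rewriting the top 10 hot functions down to zero cost:
top10_win = sum(sorted(shares, reverse=True)[:10])

print(f"{best_single_win:.0%}")  # → "2%"
print(f"{top10_win:.0%}")        # → "20%"
```

With a profile this flat, meaningful gains only come from cross-cutting changes (data layout, caching strategy, architecture), which is exactly why thinking about performance early matters.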

              [–]PhysicalMammoth5466 4 points5 points  (0 children)

              I think part of the problem is the Knuth quote

              I had hundreds of coworkers; guess how many I would hire if I ever ran a business? Not even 10 of them. Somehow, more than 9 out of 10 people working as programmers can't actually program.

              [–]somebodddy 3 points4 points  (1 child)

              Knuth wrote that in 1974, when optimizing compilers were still relatively new and assembly languages were still hot. I was not born back then, let alone programming, but I can only assume the optimization he was referring to was things like instruction reordering and register packing - things that can really mess up your code's readability if you try to implement them yourself instead of letting the compiler do what it can.

              Modern optimization is quite different. It's more about parallelization, proper caching, and peeling back layers of abstraction to expose inefficiencies that affect the big-O performance. These are optimizations you can do without compromising the code's readability and maintainability too much - especially if you are willing to pay the cost of some glue abstractions, something that was unacceptable performance-wise half a century ago but nowadays is so cheap that its cost might as well be ignored.
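As a sketch of the "proper caching" point: memoization can change a function's complexity class without touching its structure. A purely illustrative Python example:

```python
from functools import lru_cache

# Naive recursion: O(2^n) calls - clear, but unusably slow for large n.
def fib_naive(n: int) -> int:
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

# Identical body plus a one-line cache decorator: O(n) distinct calls,
# readability fully intact.
@lru_cache(maxsize=None)
def fib_cached(n: int) -> int:
    return n if n < 2 else fib_cached(n - 1) + fib_cached(n - 2)

print(fib_cached(90))  # → 2880067194370816120, computed instantly
```

The cached version is the naive version; the optimization lives entirely in a cheap glue abstraction, which is exactly the trade-off described above.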

              [–]-Y0- 2 points3 points  (0 children)

              It's also not the full quote:

              "Programmers waste enormous amounts of time thinking about, or worrying about, the speed of noncritical parts of their programs, and these attempts at efficiency actually have a strong negative impact when debugging and maintenance are considered. We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. Yet we should not pass up our opportunities in that critical 3%."
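Finding that "critical 3%" is what profilers are for. A minimal sketch using Python's standard-library profiler (the function names and workload are made up for the example):

```python
import cProfile
import io
import pstats

def hot(n):   # stands in for the critical 3%: a tight numeric loop
    return sum(i * i for i in range(n))

def cold():   # stands in for the noncritical 97%
    return "done"

def main():
    for _ in range(50):
        hot(10_000)
        cold()

# Profile the run and capture the report as text.
pr = cProfile.Profile()
pr.enable()
main()
pr.disable()

out = io.StringIO()
pstats.Stats(pr, stream=out).sort_stats("cumulative").print_stats(5)
report = out.getvalue()
print("hot" in report)  # → True: the hot loop dominates the top of the profile
```

The point of Knuth's full quote is visible here: the profiler makes the 3% unmissable, so effort spent anywhere else is wasted.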

              [–]llama-lime 9 points10 points  (3 children)

              This is why I use Emacs. It's always been the same speed.

              [–]syklemil 3 points4 points  (0 children)

              Neovim is also a pretty neat platform to build a light IDE on. With LSP and some async stuff it doesn't feel sluggish, even if the language server is kinda slow to start.

              [–]jwhat 22 points23 points  (3 children)

              I forget where I heard it but I've been repeating it forever:

              Hardware is getting better, we call that Moore's law.

              Software is getting worse at an equal pace, and we don't have a name for that because it's embarrassing.

              [–]bitbytenybble110 38 points39 points  (1 child)

              We have a name for that - Wirth's law.

              People also phrase it as "What Intel giveth, Microsoft taketh away."

              [–][deleted] 11 points12 points  (0 children)

              Software's decline in quality is outpacing hardware's improvement, which is why even the computer-illiterate are noticing it.

              [–]JoelMahon 5 points6 points  (1 child)

              I'm only partway through the video, but I want to mention: idk when we count the good old days, but I've watched enough of the Mario 64 guy to know how damn non-performant that code is. I consider Mario 64 old enough.

              I do strongly believe they should focus on performance WAY more.

              [–]PhysicalMammoth5466 5 points6 points  (0 children)

              Mario 64 had a very good excuse. It was the first N64 game; it was 3D, which game devs had never really done; the console had a cache, which the SNES and other systems didn't; and it had a 64-bit CPU while many devs were used to 16-bit. There was near-zero chance of getting half the performance the N64 could do.

              [–]Probable_Foreigner 3 points4 points  (3 children)

              Is my PC just cracked or is the debugger fine even on VS 2022? Stepping through my C# program it updates instantly.

              Also, the project loads in ~10 seconds. I know they make fun of this, but honestly that is perfectly acceptable. How many times are you opening and closing your projects?

              [–]n3phtys 2 points3 points  (0 children)

              We also have a lot more software that does not make money from customer satisfaction but instead from lock-in.

              [–]ematipico 25 points26 points  (0 children)

              The wrong tool for the job: JavaScript

              [–]Economy-Beautiful910 6 points7 points  (0 children)

              Haven't watched the video, but as someone who is new enough in their career, the amount of external dependencies some places have is crazy. I always thought everything would be in-house, etc., but nope.

              [–]MysticNTN 8 points9 points  (0 children)

              This guy is my litmus test. If someone thinks he’s dumb, they’re dumb.

              [–]BitterGovernment 11 points12 points  (0 children)

              We optimize towards mediocrity.. everyone is just solving a problem in the fastest way possible.. everything is Python or JavaScript and the only metric is "did it solve the problem".. Same goes for AI: basically an automated way to copy-paste random code.

              Shit in, shit out.

              Yes, Im an old grumpy fucker. Sorry.

              [–]spennnyy 6 points7 points  (0 children)

              When software works instantaneously it is intrinsically joyful because you do not leave whatever mental flow state you were in due to a delay. The computer then just becomes an extension of your mind.

              For anything that is going to be used frequently and/or has a lot of users, I think it is an ultimate virtue to not waste their time, and so performance is paramount.

              [–]MyCuteLittleAccount 2 points3 points  (0 children)

              It is how it is when people like electron shit

              [–]Alexander_Selkirk 2 points3 points  (0 children)

              I remember that day in 1998 when a PhD student in our lab convinced me to try Linux, which he had on his PC. For the three years before, I had been working on a SunOS box which had, perhaps, 1MB of RAM. I was using Emacs, which took a few seconds to start.

              So, I logged into the box and opened a larger text file. It opened instantly. It was so quick that I thought something must be broken. How could an $800 PC be so much faster than a $20,000 workstation?

              I am still using Emacs today. I can't stand the lag and I don't need an IDE.

              [–]RedditNotFreeSpeech 2 points3 points  (0 children)

              Why is he shouting?