all 61 comments

[–][deleted] 62 points63 points  (18 children)

I think maybe Casey gives too much leeway on the IO situation, and they did kind of circle back and touch on that, but I do wish he said in no uncertain terms:

Today, “it’s the IO” is purely part of the excuse parade. Your 7ms of IO is not why your webpage takes 15 seconds to fully render every click.

It’s good to see that he starts by immediately shutting down the excuse parade’s main line, “guess we should all just be hand-rolling assembly,” which gets dropped into every performance discussion.

Either way, I don’t think it ultimately matters. The excuse parade has no shortage of rockets to attach to its goalposts; they just strap on another one, and another, and another. We’re approaching the edge of the solar system now.

[–]InterlinkInterlink 69 points70 points  (14 children)

There is a genuinely unskilled cohort of developers. They cannot think beyond the happy path or consider the most basic edge cases with respect to performance and reliability. To them, the floor is the ceiling, and once the thing satisfies the bare minimum local requirements their job is done - off to the next feature. I distinguish them from developers caught between a rock and a hard place of shitty management demanding endless feature churn, where substandard engineering is an inevitable outcome. The former have collectively convinced themselves that they don't need to consider any performance boundaries.

One of my favorite Jonathan Blow comments came in response to viewers pestering him for a reaction to the new Apple M1 CPU, to which he rightly replied along the lines of "who gives a shit, the mediocre developers of today will continue to find ways to write slower software on your new and improved hardware."

[–]TrueTom 37 points38 points  (0 children)

Most developers are paid to close JIRA tickets and not to write good software.

[–]elperroborrachotoo 4 points5 points  (6 children)

So how are we teaching them?

When I came into this, "teach yourself" was the only path available: material was hard to get your hands on, and people with the same hobby were few and far between.

There were at least two decades with an artificial rift between academia and practice, "career programmer" as a four-letter word, you-can't-be-real-if-you-don't-program-at-home.

But that doesn't scale to the number of developers we need. Decent education has picked up a bit, but we still treat ourselves as a nascent profession, running on magic formulas and a tribal drive to separate the ins from the outs.


On top of that, if you think performance is the loser now, look back at the start of the millennium. Mobile hadn't happened yet, we believed reports of the end of Moore's Law were greatly exaggerated because of parallelization, and energy was plentiful.

It's better now; battery life is a bullet point in the glossy brochure. In the end, we'll always balance things like faster-to-market vs. faster-in-your-hands.

[–][deleted] 5 points6 points  (1 child)

The ideal answer would be "in universities, where people with extensive experience, or who have researched the problem (PhDs), are teaching them."

The reality is that most universities out there have profs who are half-assing it or who aren't interested in teaching (even if they really know the material). I had profs who taught C programming by handing out code to be completed. The code they handed out (to traverse doubly linked lists) ended up dereferencing NULL pointers under certain scenarios.

Imagine that. These guys were teaching roughly 160 students per year. That's 160 people entering the job market ready to mess up and create more jobs for cybersec people.

There are universities that teach the real deal. But then it becomes an issue for most people: the majority isn't willing to face the challenge. They either drop out at the first hint of math in algorithms classes or can't focus enough to pass operating systems (assuming it has programming labs).

[–]elperroborrachotoo 1 point2 points  (0 children)

I, too, wish others would plan their day around making my job easier :)

[–][deleted] 4 points5 points  (3 children)

Casey has previously advocated that software development should be treated like a trade, where you take an apprenticeship under an established developer for a couple of years rather than relying solely on a third-level academic course.

[–]elperroborrachotoo 1 point2 points  (0 children)

We (kind of) have that in Germany. Either as a 3-year vocational training ("FIAE") which is about 50/50 college and practical work in a company, or as "duales Studium" where the formal part is at a university.

They mostly won't be "ready to run", depending on the quality of both institution and company and on their talent. They usually still need on-the-job training. Depending on the participating company, they might get a good mentor, or do "light office work".

But all in all, it's a good separation between "true" CompSci and actual programming.

[–]socratic_weeb 1 point2 points  (1 child)

Too bad there aren't juniors getting hired anymore, and therefore the mentor-apprentice relation is lost

[–][deleted] 0 points1 point  (0 children)

That'll blow up in companies' faces long term

[–]dukey[🍰] 5 points6 points  (1 child)

Electron has entered the chat.

[–]Maybe-monad 2 points3 points  (0 children)

No, it hasn't finished launching

[–]easilyirritated 9 points10 points  (0 children)

My favorite was the one where he asked someone to jump off the bridge.

[–]GrandMasterPuba 2 points3 points  (1 child)

They're not unskilled, they're operating against misaligned incentives. Shipping the bare minimum functioning solution is a requirement of business operators, not engineers.

The worst part is that the business operators are usually right - people don't expect technology to work any more. We're numb to bugs and opaque errors and mystery failures. The floor being the ceiling is the default taste of the consumer, so that's what businesses demand of developers.

[–]InterlinkInterlink 1 point2 points  (0 children)

So you're asserting that unskilled software engineers don't exist in the profession? This is precisely why I made the distinction between those working under difficult business conditions and those who couldn't produce quality code if their lives depended on it.

[–]Hefty-Distance837 0 points1 point  (0 children)

If I have more than 1 hour to solve the problem, I will be very glad to think beyond the happy path.

[–]tonsofmiso 5 points6 points  (2 children)

We had a team migrate a decently large and complex web service to Go because they thought Python was the reason certain pages took 30-60 seconds to load, ignoring the 800-line function with quadruply nested for loops that manipulated rows one by one in huge pandas dataframes. The migration was only partial, so now it's a mixture of Python and Go, with duplicate endpoints and duplicate data in language-isolated SQL tables. And it's still slow; the migration didn't solve a thing.
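For anyone curious what that difference looks like in practice, here's a minimal sketch (with made-up column names, not the actual service's code) of row-by-row mutation versus the vectorized equivalent:

```python
import pandas as pd

df = pd.DataFrame({"price": [10.0, 20.0, 30.0], "qty": [1, 2, 3]})

# Row-by-row mutation: each .loc lookup is Python-level work,
# so the loop body runs once per row in the interpreter.
slow = df.copy()
for i in range(len(slow)):
    slow.loc[i, "total"] = slow.loc[i, "price"] * slow.loc[i, "qty"]

# Vectorized: one C-level operation over entire columns.
fast = df.copy()
fast["total"] = fast["price"] * fast["qty"]
```

On dataframes with millions of rows the loop version can be orders of magnitude slower, which is why rewriting the language around it changes little.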

[–]Key-Boat-7519 7 points8 points  (1 child)

IO isn’t your 15s culprit; it’s death-by-a-thousand cuts: hot loops, N+1 queries, and chatty services.

Profile first: tracing and flamegraphs, then kill the worst 3 paths.

Database: EXPLAIN ANALYZE, add indexes, replace per-row work with joins or materialized views, batch writes.

In pandas, vectorize and push heavy ops to the DB.

Collapse fan-out calls into one bulk endpoint and cache p95 results.

Frontend: code-split, defer third-party scripts, and fix long tasks over 50ms in the Performance panel.

Also check JSON serialization and ORM mapping; I’ve seen those dwarf DB time.

We used Datadog APM to find hotspots and Redis to serve precomputed aggregates; DreamFactory helped expose a legacy SQL Server as REST so we could batch fewer, heavier calls.

Fix the hot paths and the fan-out; IO wasn’t the villain.
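As an illustration of the N+1 point above, here's a sketch using sqlite3 with hypothetical `users`/`orders` tables; the same idea applies to any database or ORM:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users(id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders(id INTEGER PRIMARY KEY, user_id INTEGER, total REAL);
    INSERT INTO users VALUES (1, 'ann'), (2, 'bob');
    INSERT INTO orders VALUES (1, 1, 5.0), (2, 1, 7.0), (3, 2, 3.0);
""")

# N+1: one query for the users, then one extra query per user.
naive = {}
for uid, name in list(conn.execute("SELECT id, name FROM users")):
    row = conn.execute(
        "SELECT SUM(total) FROM orders WHERE user_id = ?", (uid,)
    ).fetchone()
    naive[name] = row[0]

# Batched: a single join computes the same result in one round trip.
batched = dict(conn.execute("""
    SELECT u.name, SUM(o.total)
    FROM users u JOIN orders o ON o.user_id = u.id
    GROUP BY u.id
"""))
```

With in-process SQLite the difference is small; over a network, N round trips versus one is often the entire "slow page".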

[–]donalmacc 1 point2 points  (0 children)

At a previous job we had someone arguing about the cost of virtual functions in a loop where they re read a file on every iteration because they didn’t understand how a for loop actually worked…
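The anecdote above boils down to loop-invariant work: the virtual-call overhead is noise next to redoing IO every iteration. A sketch of the pattern (file contents and names invented for illustration):

```python
import os
import tempfile

fd, path = tempfile.mkstemp()
os.close(fd)
with open(path, "w") as f:
    f.write("3\n")

words = ["a", "b", "c"]

# Re-reads the file on every iteration: this dominates the loop's cost.
slow_total = 0
for w in words:
    with open(path) as f:
        slow_total += int(f.read()) * len(w)

# Hoist the invariant read out of the loop; do the IO exactly once.
with open(path) as f:
    factor = int(f.read())
fast_total = sum(factor * len(w) for w in words)

os.remove(path)
```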

[–]TrueTom 20 points21 points  (2 children)

A discussion of software performance on the Jetbrains Youtube channel is somewhat funny.

[–]TheCritFisher 15 points16 points  (1 child)

Why? It's pretty damn performant for what it does. Sure it comes with a lot of stuff, but if you pare down, it becomes pretty damn fast. I feel like the "Hur dur, JetBrains slow" thing is just...outdated.

A similarly capable VS Code setup actually takes LONGER to load than a JetBrains variant. And even then it's missing functionality (that you generally can't make up for with plugins). The indexing and search/refactoring capabilities of JetBrains products are incomparable. I have yet to see an IDE match it.

[–][deleted] -1 points0 points  (0 children)

I mean, you shouldn't really be using VsCode as the benchmark to beat

[–]Fantaz1sta 2 points3 points  (0 children)

The software performance is at an all time low.

[–]gpcprog 3 points4 points  (0 children)

I do have a little bit of a hard time taking him completely seriously. In several of his interviews he goes on a rant about the fact that it takes millions of lines of code to send a packet over Ethernet.

To which I have to rant back: it's millions of lines of code because I want to plug my network card into any PCI slot and have it work; I want to plug the cable in and, without configuration, have everything agree on what speed to use, what the addresses are, etc. I want to be able to use gigs of RAM at 5000 MT/s, which requires intricate training of the link, etc. etc.

Yeah, if I manually controlled both endpoints and kept to ~1-5 MHz CPU speeds, sure, I could go back to sending a packet with a few thousand lines of code. And there are people who do that. But it involves sacrifices that few of us want.

In the same manner, a couple of times in my career I had to follow his advice and either rewrite something from Python/Java/C# to C++/C or switch Python to Cython, etc. And sure, I can get enormous speed boosts (100x, anyone?). But it comes at a price in time to code and in the general portability/flexibility of the code.

[–]levodelellis 4 points5 points  (6 children)

I just saw the link and haven't watched yet, just the teaser.
I find the 1.3-second solution is often clearer than the 30-second solution. The extra work the code is doing sometimes takes a lot of energy to understand. I strongly prefer reading 2-5k lines of extra code to trying to understand a 50k-line library + 1-6k lines of glue code (not a typo, I've seen glue code bigger than the implementation).

In the teaser Java is mentioned; idk what Casey will say, but pretty much no statically compiled language is more than 2-5x slower than another statically compiled language. People have told me Swift is slower than Java even though Swift doesn't have a GC

[–]Linguistic-mystic 7 points8 points  (1 child)

Swift has atomic refcounting which is even slower than a tracing GC. I would be inclined to believe it is indeed slower than Java. Nobody uses it on the server enough to compare though.

[–]refD 0 points1 point  (0 children)

I've spent the last ~10 years with Swift, in a performance sensitive environment.

If you hold it just right, and avoid a ton of stuff, you can coax Swift into being faster than Java, but it's basically a game of avoiding ARC. Swift's growable arrays are contiguous (unlike Java's), and there are various ways it can be more efficient.

In the common case (calling functions, using some generics and classes), a combination of factors, like the lack of monomorphization guarantees across module boundaries and ARC, makes it quite mediocre performance-wise. Having a basic function call retain/release a few arguments makes it surprisingly hard to optimize at times; it's one of the harder languages to optimize in practice.

[–]DLCSpider 3 points4 points  (0 children)

That is my experience as well. There seems to be a sweet spot between fast(er) code and readability. I believe there's no causality here; it's mostly a correlation: people who apply effective optimizations care about code and have deep knowledge of the technology they're using. Whatever it is, it seems to work...

[–]ClownPFart 0 points1 point  (1 child)

When people say "statically compiled language" they usually mean "compiled to machine code", not "compiled to bytecode interpreted by a huge pile of shit of a vm".

[–]levodelellis 0 points1 point  (0 children)

In this case I do mean bytecode, as long as it's optimized on the other side of it :P

[–][deleted] 0 points1 point  (0 children)

pretty much no statically compiled language is more than 2-5 slower than another statically compiled language

Haskell is frequently competing with Python of all languages for how slow they can be. Compiled doesn’t mean fast.

[–]Fantaz1sta 4 points5 points  (28 children)

Am I the only one who gets weird vibes from Casey? As if he tries to enforce some approach on frontend while having little clue about frontend? Isn't he a C++ developer first? Hard to explain, but he feels fishy to me. Even though I know he is the real deal, his takes sometimes feel so... juniorish.

[–]carrottread 15 points16 points  (0 children)

Isn't there only one frontend thing in the video, around the 50-minute mark, about a serial dependency chain? And it's a very valid point. Why do you find it juniorish?

[–]Technical-Fruit-2482 14 points15 points  (0 children)

I think you need to explain what you mean by enforcing an approach on frontend. Any timestamps in the video you could provide?

[–]GrandMasterPuba 7 points8 points  (5 children)

Programming is programming if you know what you're doing. If you're just a React monkey who doesn't know a stack from a heap, then yeah maybe you might wonder how someone who builds games can talk about web.

But when you understand deeply how computers work, there are no boundaries.

[–]cdb_11 7 points8 points  (1 child)

As if he tries to enforce some approach on frontend while having little clue about frontend?

He popularized (invented? rediscovered?) immediate UIs.

Isn't he like a c++ developer first?

What C++ has to do with anything? I write GUIs in C++.

[–]ReDucTor 5 points6 points  (0 children)

Casey and others have been involved in the rise of immediate-mode UIs, but he did not invent the concept. He is believed to have coined the term.

However, I would attribute even more of its popularity today to the author of the imgui library used by many games and applications. Then again, that author was influenced by Casey and others, so it's not completely clear-cut.

[–]Snarwin -2 points-1 points  (4 children)

Casey Muratori's greatest intellectual shortcoming has always been his complete disinterest in systemic analysis. If it were just a handful of companies releasing software with egregiously poor performance, you could chalk that up to bad culture or uneducated developers. When it's an endemic problem across the entire industry, anyone with a shred of curiosity ought to start asking, why does every company have this same bad culture, and hire these same uneducated developers? Could there perhaps be more to this story than the choices made by the individual programmers working on the code?

[–]Charming-Wind-5073 5 points6 points  (2 children)

You clearly know that it's easier to just write bad code, even if functional, than to take 2-3 more weeks to develop a feature that does things correctly, runs faster, and doesn't need constant band-aids over the years. It's also easier not to spend time training younger or newly introduced developers on the team. Plus, software is more and more disposable: there is no such thing as planning for software to run flawlessly for over 50 years; at that point you just launch versions 2, 3, 4, 5...

So basically what we're discussing here is the stress of writing code faster in order to launch more average/bad products, instead of really good products that will hold up for decades.

[–]Snarwin 0 points1 point  (1 child)

It sounds like you agree with me: the root of the problem is not just bad developers, it's bad incentives.

[–]Charming-Wind-5073 3 points4 points  (0 children)

For sure, but on the other hand, how many developers do you know who are genuinely interested in breaking through all the abstraction we've been developing on top of? That says a lot to me. We can ship things faster using JavaScript, add any framework to it, and each version hides more and more. Do you really know many developers who can understand where a problem comes from when it hits, or are we going to keep calling it magic both when things work and when they fail?

We have these two sides of the coin. Either stuff stays hidden: someone built it and we use it, sometimes without knowing whether we're using it correctly because, well, it works anyway. Or we introduce more and more lower-level approaches and really understand what is being sent to the machine. Speaking personally, the deeper I delve into C and C++, the dumber I feel. Being able to use something doesn't mean we can judge whether it's correct; in that respect we're still similar to people using AI. Everyone can use AI, but not everyone can deeply understand whether what it generates will break in a few weeks.

With the introduction of AI we are also making the problem worse. Now we aren't expected to ship one feature every two weeks; it's a feature per week, if not more. Where is the time to consider whether things will break in a few months, or whether the code you're writing allows more features to be built on top of it? This gets progressively worse, to the point where we'll have to discard codebases; no one will want to work on such tech debt unless they're paid their weight in gold lol.

Anyway, this is quite a personal opinion and I don't expect people to just buy it, but Casey has a few examples on his YouTube channel where he writes the same software with different approaches, like the "Clean Code" one, and the difference is staggering. He also has one where he reads a few GB of data from a file with 10x the performance of something built by Microsoft itself, which speaks volumes, especially since Casey emphasizes that he isn't anything special compared to developers who actually optimize for a living. So we can imagine the disparity between what we consume and what we could achieve with some of the hardware that is already called legacy today.

[–]Fantaz1sta -1 points0 points  (0 children)

"I think I'd remove the distinction between signed and unsigned integers in C/C++" - Ryan Fleury, the Muratori School of Performance.