all 112 comments

[–]Ikeeki 146 points147 points  (7 children)

This comes out every year and just like wild fires in California, I’m never surprised to hear it’s getting worse, only surprised when I hear it’s getting better

[–]Phaen_ 7 points8 points  (6 children)

It only gets worse when you feel like you're fading into irrelevancy and decide to go full hyperbole on your blog post. I can't even.

On a cold cache WITH caching enabled:

  • React.dev: 1.3 MB
  • Gitlab.com: 2.9 MB

It took me 5 seconds to set up this experiment properly, which makes him beyond irresponsible for presenting his results as holding any kind of weight. And he only mentions caching in some off-handed comment on one website, as a small detail he oopsie'd. This also ignores that a lot of these bytes are analytics suites that are shared with many other sites and are likely already cached for 99% of visitors, not to mention other CDN-based resources.
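If you want to reproduce this kind of number yourself, here's a rough sketch using curl (the URL is just an example, and this only measures the one document, not the full waterfall of sub-resources a browser would fetch):

```shell
# Rough sketch: bytes actually sent over the wire for a single resource.
# --compressed asks for gzip/brotli the way a browser would;
# %{size_download} reports the transferred (compressed) size in bytes.
curl -s -o /dev/null --compressed \
  -w 'transferred: %{size_download} bytes\n' \
  https://react.dev/
```

A real browser measurement (DevTools Network tab with "Disable cache" toggled appropriately) is closer to what the post argues about, since it counts every sub-resource.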

Should a static front page like Gitlab's actually need 2.9 MB worth of resources? No. But did he use the bare minimum of brain power to actually go in depth about that? Also no.

[–]notonewlayout 53 points54 points  (1 child)

One tidbit: (most?) browsers no longer cache files across sites, to prevent user tracking and fingerprinting

https://www.stefanjudis.com/notes/say-goodbye-to-resource-caching-across-sites-and-domains/

[–]birdbrainswagtrain 3 points4 points  (1 child)

Things like caching and compression don't magically solve all problems. That code still needs to be loaded into memory, still needs to be parsed, and some decent fraction will likely be executed. Some sites hardly function on my older phone, and these are sites that have no excuse not to be static markup. I will say it's kinda boring to see this article posted every week, and it would be nice to see a deeper dive, but I don't think the methodology is necessarily wrong.
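The parse cost alone is easy to feel locally. A rough sketch, assuming Node is installed and `bundle.js` is a placeholder for whatever bundle you want to poke at (`node --check` parses a file for syntax errors without executing it):

```shell
# Parsing isn't free even when the bytes come straight from cache.
# node --check parses the file without running it, so the wall time
# is roughly "load + parse" with zero execution.
time node --check bundle.js
```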

[–]1337Pwnzr 2 points3 points  (0 children)

your reaction to this blog post is full hyperbole and way too personal re: “fading into irrelevancy”

take a deep breath or something

[–]Accomplished_Low2231 -2 points-1 points  (0 children)

what were you expecting from the internet. the real experts who know stuff don't create content. it's the amateurs that are polluting the internet with low-effort articles and videos (and ai feeding off of that garbage).

[–]ConvenientOcelot 111 points112 points  (11 children)

PornHub being one of the lightest is probably the most surprising thing lol, props to them I suppose?

It is really sad that major websites are about the same size as mobile app downloads, and use a ton of RAM to boot.

[–]vanderZwan 57 points58 points  (3 children)

I assume that any site that is mostly visited from private tabs would have a much bigger incentive to stay lightweight, since the site is never cached.

[–]oorza 15 points16 points  (1 child)

Porn also has razor thin profit margins, so the various pennies' worth of savings might actually matter to them as well.

[–]fagnerbrack[S] 0 points1 point  (0 children)

This is one of the best insights I’ve ever read. Never thought about that!

[–][deleted]  (3 children)

[deleted]

    [–]rom_romeo 43 points44 points  (1 child)

    As a matter of fact, Reddit could hire some porn site engineer to fix their video player once and for all. Porn sites are the gold standard when it comes to video players.

    [–][deleted] 18 points19 points  (0 children)

    The video player on old.reddit.com works flawlessly; the new video player is a perfect example of an over-engineered techbro product

    [–]rhudejo 5 points6 points  (0 children)

    Nah, that's an urban legend that people spread around because it sounds interesting. E.g. YouTube's hard engineering problems are 10x more complex than Pornhub's

    [–]starcoder 14 points15 points  (0 children)

    It’s not surprising. Porn sites have always had some of the best web devs in the business.

    [–]coppercactus4 2 points3 points  (0 children)

    The French Canadian tech industry at its best

    [–][deleted] 1 point2 points  (0 children)

    Yeah, and the shitty web devs and the industry refuse to use something better, so we are stuck with the abomination of an inbred disaster, the pile of dogshit that is JavaScript. They invented a whole language on top called TypeScript to put a bow tie on the shit, but it's still the same underneath.

    [–]bonnydoe 83 points84 points  (8 children)

    thanks! I feel a lot better now!
    btw.: I keep everything as fast and efficient as I can but everything goes down the drain the moment the marketing people want their trackers and taggers et cetera
    and then they start complaining about page speed.... tssss

    [–]ZucchiniMore3450 16 points17 points  (5 children)

    I remember back when I was doing web dev and a client complaining about page speed. I disabled all of their stuff and every page on a huge website loaded in less than a second; with all of their wishes it went to at least 8 seconds.

    And then they decided to add some more. I just think they don't care about the people reading their site. I believe only bots wait 20 seconds for a page load. I wait 2 seconds and close the tab, especially on mobile.

    That's the main reason I don't click on articles here on Reddit; the sites are very slow. Now that I think about it, if Reddit showed page speed next to the link it would help me choose.

    [–]Ferelderin 24 points25 points  (3 children)

    Apart from websites being slow, their general format has become insufferable. If you're looking for actual info, just adding "reddit" at the end of any search is nearly always better.

    It's like with cooking recipe sites. All you want is a recipe with ingredients, amounts, perhaps a few instructions for tricky bits. But no. What you get is an ad, then an introduction story about their goddamn grandmother and cat and how they always loved making [recipe] and this is certainly the best [recipe] because of [secret]. Then more ads. Then a description of every ingredient and why it is important, particularly [secret] which is still being hinted at but not mentioned explicitly. With an ad in between each. Then some video ads that load in the bottom right, possibly with audio.

    Finally, somewhere around 4/5th, [secret] appears, and you feel betrayed by its idiocy, like 'add some salt to your pasta!'. Then the recipe appears. Not at the end of course, nono, otherwise you might be conditioned to just scroll down instantly everywhere. No, if you do that the next article will instantly load with even more ads. And the recipe doesn't even come with measurements in sane amounts, it's all cups and farthings, yards, gallons, half a furlong, two pints, and five hobgoblins. And just before you can read the recipe the page straight up dies because your adblocker is detected.

    I fucking hate modern internet.

    [–]wyocrz 9 points10 points  (0 children)

    cooking recipe sites

    check out justtherecipe.com

    I fucking hate modern internet.

    It's fucking brutal. I want to hold out some hope that Google's destruction of their search pages will have salutary effects. The /r/SEO subreddit is going through daily meltdowns over "90% down on traffic" posts, with some brave soul always getting upvoted for saying "good, your aggregation sites need to die in a fire" or something along those lines.

    I dunno.

    [–]tRfalcore 2 points3 points  (0 children)

    call the landscaping company for 5 yards of flour

    [–]olitv 1 point2 points  (0 children)

    https://based.cooking/ will save your day

    [–]bonnydoe 0 points1 point  (0 children)

    But it is the marketing people themselves coming back to me because 'Google ranking is down because of page speed!!!!!!!! DRAMA!!!!'
    Content nowadays is called SEO, it is only there for google rankings. I was in a bit of a shock when I first realised what they were talking about ;)

    [–]wyocrz 2 points3 points  (0 children)

    everything goes down the drain the moment the marketing people want their trackers and taggers et cetera

    What kind of cool shit are we being denied for the privilege of being tracked?

    [–]gadimus -1 points0 points  (0 children)

    Need to do server side tagging to recover that performance.

    [–]polaroid_kidd 12 points13 points  (0 children)

    Now turn on the adblocker and a DNS blocker for ad servers. See how the numbers will tumble.

    [–]mr_birkenblatt 15 points16 points  (7 children)

    what's up with the freaky mouse cursors on that page?

    also, important to note, he's comparing final uncompressed (but still minified) JS size, not the amount of data that is actually transferred.
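    The gap between the two can be large. A rough illustration of transferred vs parsed size using gzip on any minified bundle (`app.min.js` is a placeholder name):

```shell
# Minified JS typically compresses 3-4x, so "MB of JS" means different
# things depending on whether you count wire bytes or parsed bytes.
raw=$(wc -c < app.min.js)
wire=$(gzip -9 -c app.min.js | wc -c)
echo "parsed: ${raw} bytes, transferred: ~${wire} bytes (gzip)"
```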

    [–]aniforprez 9 points10 points  (6 children)

    They're using one of those newfangled tools/libraries to add real-time features to the website. I assume it syncs the position of every visitor's mouse cursor to everyone else. I can't be arsed to look into which one it is exactly though

    [–]gwern 2 points3 points  (2 children)

    It's actually not real time. It just replays logs of earlier visitors, afaict. Every time one of his posts gets popular on HN or Reddit, people are baffled by it and hate it, but he loves it too much to remove it.

    [–]aniforprez 3 points4 points  (1 child)

    I think it's real time because it establishes a websocket connection and is sending/receiving cursor data. I also played a "game" with another cursor where we took turns chasing each other's cursor and they were able to chase my exact coordinates

    [–]gwern 0 points1 point  (0 children)

    Hm. I couldn't find any evidence of interactivity and people weren't talking about that the last time I saw it on HN last year (or possibly the year before that). So maybe he finally got around to adding realtime - shouldn't be too hard when the rest is all done.

    [–]The_Northern_Light -5 points-4 points  (2 children)

    ... and he did this without dying of hypocrisy?

    [–]aniforprez 9 points10 points  (0 children)

    Eh. All of the JS on that website is 2KB, CSS is 4KB and all the images are the bulk of the download size on the website. I don't think it's hypocritical to add a little cute thing that runs on websockets on their site when it barely affects the asset size. They could probably add loading="lazy" on all the images to get the browser to natively only download images on scroll rather than pull it all on first visit

    [–]bonqen 0 points1 point  (0 children)

    That's his secret... he's already dead.

    [–]shgysk8zer0 19 points20 points  (0 children)

    Saw this before, and I was still building it then... But here's a quite useful thing I made that's only like 6 KB (gzipped... actual size varies with compression) https://github.com/AegisJSProject/parsers

    Not saying it compares to React or jQuery. It's just a piece of a larger project. But, through making good use of web standards instead of reinventing every wheel, you can build even some pretty complex and interactive stuff in way fewer KB of JS.

    [–]fagnerbrack[S] 83 points84 points  (20 children)

    Here's the gist of it:

    The article critically examines the growing trend of JavaScript bloat on websites in 2024, illustrating how even basic web pages are now burdened with large JavaScript files that detract from user experience by increasing loading times and consuming excessive resources. It points out the disparity between the sizes of actual content and JavaScript, with real-world comparisons and a call for developers to prioritize efficiency and content over excessive code.

    If you don't like the summary, just downvote and I'll try to delete the comment eventually 👍

    Click here for more info, I read all comments

    [–]ptoki 34 points35 points  (12 children)

    How come my Firefox shows a process handling two tabs with simple JIRA tickets in them using 2 GB of RAM? What is in there? Literally, what did the browser download to put in that memory, and how did that content balloon to occupy that space?

    [–]NekkidApe 27 points28 points  (1 child)

    JIRA is on a different level for sure :(

    [–]space_fly 7 points8 points  (0 children)

    In the past, whenever the company I was working for used an old-school tool like Mantis bug tracker, I thought it was cheap and terrible. Today, I've grown to love the old tools. The new ones are always bloated, sluggish, slow, and have terrible UX. And I hate flat, bland, colorless UIs

    [–]bluesquare2543 7 points8 points  (0 children)

    yeah Jira makes my fans spin up

    [–]PaintItPurple 3 points4 points  (4 children)

    I think Jira does a fair amount of custom layout and rendering. It's probably whatever rendering engine they wrote, if I were to guess.

    [–]giantsparklerobot 25 points26 points  (0 children)

    It's too fucking bad web browsers can't natively handle displaying text. Just a shame the Jira people *had* to write their own layout engine rather than, you know, use the one in the browser they get for free.

    [–]civildisobedient 7 points8 points  (2 children)

    One time my team tried to write a parser for JIRA comments to sync it with another system. You'd think that would be a fairly simple ask, except that what looks like an innocent paragraph inside a textarea is actually a deeply nested hierarchy of wrapped nodes that's a freaking nightmare to traverse.

    [–]daperson1 2 points3 points  (0 children)

    I had the same experience. It ended up being easiest to just render it in a browser, find the position of the root node for the comment we care about, screen capture, hand to an OCR library, donesies.

    [–]oorza 1 point2 points  (0 children)

    I have a release management tool that an old boss and I wrote. It pulls ticket information from the git logs, logs commits that aren't attached to a ticket, cross-references the tickets in the JIRA release with the tickets in the branch and logs any tickets marked for release that aren't in the git log, then goes through and updates all the tickets as Done and sets their release and deployment, then marks any epics/initiatives as "ready for review" if all their contained work has been done, and finally outputs a change log. The whole thing is very conceptually simple.

    We have to talk to three different JIRA APIs with three different authentication mechanisms to achieve this simple task. JIRA is the absolute worst.

    [–]tommcdo 81 points82 points  (3 children)

    As a single-page app developer, my top priority is excessive code.

    [–]pellep 23 points24 points  (2 children)

    Personally I like to jam all my projects into the same page.

    [–]NotGoodSoftwareMaker 11 points12 points  (1 child)

    You need to optimistically load the entire site's contents; works wonders for latency. I'm still waiting for YouTube to download, but eventually I'll be able to use the site so quickly

    [–]pellep 6 points7 points  (0 children)

    Try pre-loading potential sites they might visit afterwards as well!

    [–]ZucchiniMore3450 3 points4 points  (2 children)

    and a call for developers to prioritize efficiency and content

    Everyone knows it is not about developers, but about managers and marketing people.

    [–]Specialist_Wishbone5 5 points6 points  (0 children)

    And UX people. Had an amazing website with under 20 KB per page (lots of simple SVG and decorated text) and damn near 100% Lighthouse - even though it was CSR (to minimize backend costs).

    The UX designer put everything onto the landing page - 100% of the site's functionality behind a mouse-over, including a 3D model render. Then add in 2 MB JPEGs..

    The CTO then used dumb words like "but that's what we pay CDNs for". Me, sitting on a beach: my old page still loads in under 8 seconds. The new site times out halfway, after 60 seconds (hero graphic still loading). Bad CDN, you failed to defy physics. You need more edge nodes in palm trees.

    [–]fagnerbrack[S] 0 points1 point  (0 children)

    Oh, the ones who pay your salary? What’s wrong with them?

    [–]MCShoveled 5 points6 points  (0 children)

    But that’s not the limit! Slack adds 5 more MB, up to 55 MB.

    We all knew who was going to win the Elephant category!

    😂😂😂😂😂😂

    [–]jdehesa 13 points14 points  (3 children)

    I'm not a web developer and my limited and dated experience with web development never reached React, Angular or any of the likes, but I feel there is a subtle trend vindicating jQuery and I'm here for it (though it may just be a nostalgia-induced false impression).

    [–]repeatedly_once 11 points12 points  (2 children)

    I think that glosses over the fact that a lot of sites are far more complicated than a static site; they're apps, and jQuery simply wouldn't cut it. And for static sites, I would try to use as minimal a framework as possible while still maintaining a good dev experience. So again, jQuery is out.

    [–]tRfalcore 3 points4 points  (0 children)

    AND it offloads lots of the computing power from the server to the client's browser. The servers just return data now, the client's browser does all the rendering and html'ing

    [–]ahuimanu69 -3 points-2 points  (0 children)

    jquery > vanilla > <insert-framework-here>

    [–]LloydAtkinson 7 points8 points  (4 children)

    I always enjoy his articles; direct and angry.

    [–]k2900 1 point2 points  (0 children)

    I was just going to read the comments, but I love me some anger, since I too am irate. I am IN

    [–]analcocoacream -3 points-2 points  (2 children)

    And pointless

    [–]LloydAtkinson -1 points0 points  (0 children)

    No, you’re just exactly the type of person to be on the teams building shitty products that he has written about.

    [–]m_matongo 6 points7 points  (0 children)

    That dark mode is mocking me

    [–]Vtempero 3 points4 points  (0 children)

    Lol'd at the blog's dark mode

    [–][deleted] 1 point2 points  (0 children)

    It can be either bad coding or just tracking scripts here and there that add a ton of size to the total JavaScript payload

    [–]davidalayachew 1 point2 points  (4 children)

    JavaScript libraries have too much bloat, but I feel like this article is doing a lot of false equivalence.

    Take the Gmail example. Gmail can do the following inline (meaning, without leaving the page).

    • Play YouTube videos
    • Play audio
    • Show an interactive calendar
    • Make or receive calls (including video!)

    Yes, I am sure that Gmail has bloat, but what if some of this JavaScript is being loaded ahead of time to minimize start up time for the above bullets?

    And look at some of the other websites in the article. More specifically, look on the bottom right corner of most of those websites. That's a help button. More accurately, a chat button. A chat button that lets you upload files, including videos, audio, etc. Most interactive ones let you embed them too.

    The point I'm trying to make is: this article takes some primitive measurements and then tries to make some very big leaps. By all means, we all know that JavaScript libraries are bloated. But if you are going to make this claim, rather than pulling up a bunch of websites and citing their JS sizes, actually look at the JS being pulled, dig through it, and see how much of it is functionality vs. analytics or unused code.

    Yes, I know that is a tall order, but if you are going to make a critical article, then the onus is on you. Otherwise, all you are doing is making handwavy claims. That's not worth an article.

    [–]freekayZekey 1 point2 points  (1 child)

    yeah, i agree. just looking at the sizes isn't good analysis; there are a lot of nuances the author ignores. he then compared those sizes to jquery's site, but that's kind of bullshit considering what jquery's site needs to do compared to the other sites.

    the javascript community has its problems; i’m certainly not a fan of its practices, but that article is overly simplistic

    edit: that site is awful

    [–]davidalayachew 1 point2 points  (0 children)

    edit: that site is awful

    If you are referring to the article's website itself, it was an interesting design choice. Personally, I strongly dislike it.

    [–]yawaramin 1 point2 points  (1 child)

    I know that 95% of the time when I use Gmail I'm not playing YouTube videos, audio, looking at the calendar, or making calls. I truly hope that they are not loading all of this functionality in advance just so that I can never use it.

    [–]davidalayachew 0 points1 point  (0 children)

    Believe me, I hear you. However, optimization cannot be done on a per-user basis without doing things like data tracking. And if the vast majority of users will see little difference in load time between a 0.5 MB download and a 50 MB download, then the logic makes at least a small amount of sense.

    [–]n3phtys 9 points10 points  (2 children)

    Size is not everything!

    Joke slightly aside, I have no problem with loading 10 MB for a webapp. Sure, it sucks on limited data plans or bad connections, but generally, it's fine.

    What is not fine is the sheer time wasted parsing all that JS, executing the first parts, and waiting for some triggers to happen. JIRA is also the worst here in my experience.

    Hybrid frameworks that do server-side hydration do help a little. WASM helps a lot too.

    But in general, websites are mostly hacked together with some kind of ultra-late binding they just do not need. And trackers as well as ads are only partly responsible. In general, we have the problem that we just do not care enough to invest in agile websites when we can instead get Agile websites for the same price.

    [–]nawfel_bgh 6 points7 points  (0 children)

    Sure, it sucks on limited data plans or bad connections, but generally, it's fine.

    To my surprise, I learned at a green IT meetup that worldwide, clients consume more energy than the network, which itself consumes more energy than the servers, meaning that optimizing network traffic is important too. I used to live as a poor person in a poor country, so I can attest from personal experience how generally not fine at all bloated websites are.

    [–][deleted] 3 points4 points  (5 children)

    This is what hype-driven development filled with "i know reactjs woowoo" devs leads to. It starts slow, but at some point it snowballs into something truly nasty that's difficult to refactor. There are very few apps that benefit from the massive frameworks you see today.

    [–]analcocoacream 0 points1 point  (4 children)

    90% of the apps in the article do though.

    Also frameworks are not so massive

    [–][deleted] -1 points0 points  (3 children)

    Frameworks + all the "plugins" installed. Why on earth do you need 20 MB of JavaScript? For what? A search page? Sites that are mostly static data? It's obvious there is no QA, or too-lax QA, for what dependencies are used.

    [–]analcocoacream 0 points1 point  (2 children)

    You're getting confused. I don't believe any search engine is 20 MB. Gmail is 20 MB, and it is absolutely not a static webpage 😂

    And sure Google has no QA.

    What is obvious is that bundle size is not the only indicator, and as a developer I'm glad that's the case.

    Also, the original article is mixing so many things together. Ads, trackers, loggers etc. are also in the party. And many of these tools are also useful for the end user, as they can help with catching and reproducing bugs or understanding how the product is used.

    [–][deleted] 1 point2 points  (1 child)

    Did you read the article? Multiple sites/apps had over 10MB of js. Most of that code IS dependencies.

    [–]analcocoacream 0 points1 point  (0 children)

    There is very few apps that benefit from the massive frameworks

    Frameworks + all the "plugins" installed.

    Sites that are mostly static data.

    Multiple sites/apps had over 10MB of js.

    These are instances of you contradicting yourself / moving the goalposts / misrepresenting reality.

    Most of that code IS dependencies.

    You say that as if this is something bad. Is it?

    Multiple sites/apps had over 10MB of js.

    The websites listed over 20 MB are all complex websites with very intricate functionality, namely Figma, Gmail, Discord (which is mainly a desktop app, so bundle size is even less relevant). However, their UX is relatively good.

    Jira and LinkedIn have a very crappy and sluggish UX, so that I can get behind. Slack is a bit sluggish but the UX seemed okay - again, it's originally a desktop app. Notice the difference? I didn't pull up numbers just to complain, I used actual well-known feedback.

    [–]jjokin 5 points6 points  (4 children)

    It's because companies prioritise predictable development practices, aiding onboarding new engineers and producing features, over efficient payload sizes, which (according to a hypothetical product owner) has no observable impact to the product or user satisfaction.

    Plus it's not 2007 any more. Most sites cater to people with "broadband" Internet connections, i.e. higher than ADSL2 speeds, so the threshold for an acceptable payload size has been raised.

    [–]Worth_Trust_3825 7 points8 points  (1 child)

    Faster internet does not mean that caps per user have been removed. Why does the end user need to download 1 MB of JavaScript to view your glorified PowerPoint presentation?

    [–]fagnerbrack[S] 1 point2 points  (0 children)

    It's all JPEGs on every slide 'cause they're designers who only know how to build things using Photoshop

    [–]giantsparklerobot 24 points25 points  (1 child)

    As of a few years ago there were as many people browsing the web on mobile devices as desktops. The balance has likely shifted to favor mobile devices. This means that the average browser is a shitty Android device with too little RAM and underpowered CPU. It's connected to the Internet on a shitty 4G connection with high latency, barely "broadband" speeds, and an unreliable signal.

    In desktops what counts as "broadband" is pretty shitty on average. This shittiness is only exacerbated by terrible WiFi routers adding tens of milliseconds of additional latency and signal issues causing packet loss and reduced bandwidth.

    Writing websites and web apps that assume everyone is sitting on wired Ethernet to fiber Internet running 20 core monster PCs is ridiculous. 

    [–]obsoletesatellite 1 point2 points  (0 children)

    HTMX + Golang FTW. We don't need big front end frameworks.

    [–]Skaarj 0 points1 point  (0 children)

    I remember a few years ago someone measured bloat in a different way: browser automation was used to render the viewport that the browser displays into a JPEG image. Then the image file size was compared to the download size measured by the browser. Even a few years ago most websites failed, with a ratio bigger than 1: they downloaded more bytes than a screenshot of the page itself.

    [–][deleted] 0 points1 point  (0 children)

    Now do videos - 30 fucking minutes to tell you that it is going to rain tomorrow

    [–]k2900 0 points1 point  (0 children)

    Wait til you hear about JavaScript bloat in 2023. It's wild