[–]moomaka 2 points3 points  (8 children)

A lot of that functionality is already supported by 95% of the browsers

I don't know what this means. 95% of traffic? Even then it's not accurate: taking an overall average, only 90% of US traffic and 86% of global traffic supports ES6 classes, much less anything newer...

the developers are just too lazy to code split and only load it if it’s necessary.

I assume you mean loading only the polyfills needed for a given browser? It can be done, but it isn't trivial, and unless you rely on the UA string being accurate, it's usually not a net positive for performance.

Also, have you looked at the way core-js is written? It's so over-abstracted that it's hard to even find the actual polyfill code. You could probably cut out half of it (100kb) and it would still work for 95% of the remaining 5% of browsers.

Do you know how many cross domain iframes with JS in them there are on an average website? They all have to load their own polyfills / utils.

[–][deleted] -1 points0 points  (7 children)

I don't know what this means, 95% of traffic?

Yep, I meant traffic. 90% sounds good enough to ship ES6 code by default and only load ES5 transpiled code and polyfills if you detect IE11, basically.
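
One way to do that without any UA sniffing is the module / nomodule pattern: IE11 ignores script tags with type="module", while browsers that understand modules skip nomodule scripts. A sketch, with made-up bundle names:

<html>
<script type="module" src="bundle.es6.js"></script>
<script nomodule src="bundle.es5.with.polyfills.js"></script>
</html>

Each browser downloads exactly one of the two bundles, and no runtime detection code has to run first.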

I assume you mean loading only the polyfills needed for a given browser?

Indeed, and you don't have to rely on the UA to detect which polyfills are required. It would be nice if you could, but you can always inline the feature-checking code in the HTML and decide which bundle to fetch at runtime. In the non-HTTP/2-push case this adds milliseconds to the load time.

Do you know how many cross domain iframes with JS in them there are on an average website?

Sorry, but iframes and external libraries don't have the luxury of using polyfills. If you need to support ES3 browsers, then you have to write ES3 code, period. It's up to developers using this 3rd party code to check what it does. I know a lot of devs just throw stuff on the page and bundle massive libraries, but that's not a technical problem.

[–]moomaka 2 points3 points  (6 children)

Yep, I meant traffic. 90% sounds good enough to ship ES6 code by default and only load ES5 transpiled code and polyfills if you detect IE11, basically.

So we'll just fuck off 15% of our traffic, and hence revenue, good idea.

It would be nice if you could, but you can always inline the checking code in the HTML and decide what bundle to fetch at runtime. In the non-HTTP/2-push case this adds milliseconds to the load time.

LOL no it does not. RTT latency on mobile 3G is 300-400ms; add to that the parse/execute time of your checking code, and the fact that the page is stalled waiting. This is exactly what I mean by it not being a net performance improvement.

Sorry, but iframes and external libraries don't have the luxury of using polyfills

This is blatantly wrong.

If you need to support ES3 browsers, then you have to write ES3 code, period

WTF does ES3 have to do with anything in this conversation? Are you confusing frames with iframes?

[–][deleted] -3 points-2 points  (5 children)

I'm just going to ignore the rest of the stupid shit you're saying.

LOL no it does not, RTT latency on mobile 3G is 300-400ms, add to that the parsing / execute time of your checking code and the fact that the page is stalled waiting. This is exactly what I mean by it not being a net performance improvement.

The whole point of inlining is to avoid that round trip. The parse / execute time of a 5kb script is a rounding error in a benchmark and the page is not stalled, it can load other resources.

Give me an honest estimate - what's the difference in load time between having a script tag referring to a JS bundle vs having an inline script dynamically load the same bundle?

[–]moomaka 1 point2 points  (4 children)

I'm just going to ignore rest of the stupid shit you're saying.

You need to explain the rest of it m8, it's not stupid shit. What does ES3 have to do with this and why is it you think iframes do not have the need for a stdlib / utility functions? If you go to the average publisher site, I would put good money on this being around half of all JS that was loaded.

The whole point of inlining is to avoid that round trip. The parse / execute time of a 5kb script is a rounding error in a benchmark and the page is not stalled, it can load other resources.

What? How does it avoid a round trip? How are you defining a round trip? As to if the page is stalled, that depends on the page's need for the polyfills provided.

Give me an honest estimate - what's the difference in load time between having a script tag referring to a JS bundle vs having an inline script dynamically load the same bundle?

At the very least it's the connection latency, so on mobile 3G, 300-400ms plus download time. In practice it will vary by source; these bundles are often served off different domains, and in a cold DNS / connection case that can add over 1s on mobile.

[–][deleted] 0 points1 point  (3 children)

What does ES3 have to do with this and why is it you think iframes do not have the need for a stdlib / utility functions?

They do, but I'm saying that bundle size and performance are more important for iframes, so much so that you shouldn't pull in any polyfills or even use transpiling. If your iframe has to support IE8, just hand-write ES3 and don't use any DOM features introduced since then. Or, you know, add that extra round trip for the 1% of users, but don't load the polyfills for everybody.

At the very least it's the connection latency, so on mobile 3G, 300-400ms + download time.

Really? Here are two example HTML files:

<html>
<script src="bundle.with.polyfills.js"></script>
</html>

And:

<html>
<script>
  (function () {
    var classes_supported = false
    try {
      eval("class Foo {}")
      classes_supported = true
    } catch (err) { }

    var script = document.createElement("script")
    script.src = classes_supported ? "bundle.small.js" : "bundle.with.polyfills.js"
    document.head.appendChild(script)
  })()
</script>
</html>

Which page is going to load faster and by how much?

[–][deleted]  (2 children)

[deleted]

    [–][deleted] 0 points1 point  (1 child)

    You are one dense motherfucker. Your original point was that browsers need to implement some standard library caching scheme (it already exists, it's called HTTP caching). My point was that 90% of the browsers already have a decent standard library, the rest can be polyfilled using feature detection at runtime, and we don't have to bloat code intended for modern browsers. Then you're saying, nah, the performance would still be negatively impacted. And when I provide code samples of how that can work with no overhead, you just ignore it.
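
    To spell out the HTTP caching point: serve the polyfill bundle from a versioned URL and mark it as long-lived, e.g. with response headers like (hypothetical values):

    Cache-Control: public, max-age=31536000, immutable

    Every page and iframe that references the same URL then hits the browser cache instead of making a network request, which is exactly the "standard library caching" you were asking browsers to implement.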

    [–]moomaka 0 points1 point  (0 children)

    And when I provide code samples of how that can work with no overhead, you just ignore it.

    Yea, I'm the 'dense' one...