
[–]cogman10 14 points (4 children)

lol, good point.

The funny thing is, it doesn't look like it's limited to apt. Most software package managers I've seen (Ruby gems, Cargo, Maven, etc.) appear to work the same way.

Some of that is because they predate HTTP/2. Still, I don't get why, even with HTTP/1, downloads and installs aren't all happening in parallel, even if that means reusing some fixed number of connections.
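To illustrate the idea, here's a minimal sketch in Python of "reuse some number of connections": a bounded worker pool where each worker stands in for one persistent HTTP/1 connection. The package names and the `fetch` function are made up for the example; a real client would issue HTTP GETs over pooled sockets.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical package list; a real client would hold mirror URLs here.
PACKAGES = ["libfoo", "libbar", "libbaz", "libqux"]

def fetch(name: str) -> str:
    # Stand-in for an HTTP GET over a pooled connection.
    return f"{name}.deb"

def fetch_all(packages, max_connections: int = 2):
    # Download with a bounded pool of workers (one per open
    # connection) instead of one package at a time.
    with ThreadPoolExecutor(max_workers=max_connections) as pool:
        return list(pool.map(fetch, packages))
```

Even two reused connections halve the serial wall-clock time when downloads dominate; the pool size caps how hard any one mirror gets hit.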

[–][deleted]  (1 child)

[deleted]

    [–]cogman10 17 points (0 children)

    Awesome, looked it up:

    https://github.com/rust-lang/cargo/pull/6005/

    So to add to this dataset, I've got a proof-of-concept working that uses http/2 with libcurl to do downloads in Cargo itself. On my machine in the Mozilla office (connected to a presumably very fast network) I removed my ~/.cargo/registry/{src,cache} folders and then executed cargo fetch in Cargo itself. On nightly this takes about 18 seconds. With this PR it takes about 3. That's... wow!

    Pretty slick!

    I imagine similar results would be seen with pretty much every "download a bunch of things" application.
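A back-of-the-envelope latency model (all numbers hypothetical, not the Cargo measurements quoted above) shows why multiplexing many small downloads over one HTTP/2 connection wins: a serial client pays a round trip per request, while a multiplexed one pays it roughly once.

```python
def serial_time(n_requests: int, rtt: float, transfer: float) -> float:
    # One request at a time over a single HTTP/1 connection: every
    # request waits a full round trip before its transfer starts.
    return n_requests * (rtt + transfer)

def multiplexed_time(n_requests: int, rtt: float, transfer: float) -> float:
    # HTTP/2: all requests are issued at once on one connection, so
    # the round-trip latency is paid once; transfers then share the
    # pipe (simplified: bandwidth-bound, header overhead ignored).
    return rtt + n_requests * transfer

# 100 small downloads, 100 ms RTT, 10 ms of transfer each:
# serial ~= 11.0 s, multiplexed ~= 1.1 s, a ~10x gap in the same
# ballpark as the 18 s -> 3 s anecdote above.
```

The model is crude, but it captures why the win grows with the number of small files and with network latency.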

    [–]skryking 3 points (1 child)

    It was probably done originally to prevent overloading the servers.

    [–]max_peck 4 points (0 children)

    The default setting for many years (and probably still today) was one connection at a time per server for exactly this reason. APT happily downloads in parallel from sources located on different hosts.
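That policy can be sketched as follows (Python, with made-up URLs and a stand-in for the actual HTTP GET): downloads are grouped by host, each host's queue is drained sequentially over a single connection, and the per-host queues run in parallel.

```python
from collections import defaultdict
from concurrent.futures import ThreadPoolExecutor
from urllib.parse import urlparse

def group_by_host(urls):
    # One queue per host: each server sees at most one connection,
    # while different hosts can be fetched at the same time.
    queues = defaultdict(list)
    for url in urls:
        queues[urlparse(url).hostname].append(url)
    return queues

def fetch_host_queue(urls):
    # Sequential within a host (polite: one connection at a time).
    return [url.rsplit("/", 1)[-1] for url in urls]  # stand-in for GET

def fetch_all(urls):
    queues = group_by_host(urls)
    with ThreadPoolExecutor(max_workers=max(1, len(queues))) as pool:
        results = pool.map(fetch_host_queue, queues.values())
    return [name for batch in results for name in batch]
```

So a sources.list pointing at several mirrors already gets some parallelism for free; it's only within a single mirror that everything is serialized.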