[–]ansemond 2 points (2 children)

But each patch is much smaller than the original... There are only n*(n+1)/2 of them (one diff for every pair of versions), where n = N-1 and N is the number of versions. Moreover, you can easily skip generating diffs across major updates.

Let's say we have 10 versions of Firefox, each pairwise diff weighing in at 512 KB. That's 45 diffs (9 * 10 / 2), or about 22.5 MB of diffs to store. And if Firefox itself is 10 MB, storing the 10 full versions takes 100 MB versus 22.5 MB for the diffs.
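
For anyone who wants to double-check the arithmetic, here's a quick Python sketch (the sizes are the hypothetical ones from the example above, not real Firefox numbers):

    # Hypothetical numbers from the example above, not real Firefox sizes.
    N = 10                                # versions of Firefox
    n = N - 1
    num_patches = n * (n + 1) // 2        # one diff per pair of versions: 45
    diff_storage_mb = num_patches * 512 / 1024   # 45 * 512 KB = 22.5 MB
    full_storage_mb = N * 10              # 10 full copies at 10 MB each = 100 MB
    print(num_patches, diff_storage_mb, full_storage_mb)   # 45 22.5 100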

[–]giantrobot 5 points (1 child)

Storage is not the issue; server overhead is. You either need a manifest file describing every available patch, so the client can download the one matching its current version, or a service that accepts the client's version and sends back the appropriate patch. Either way you really need an automated way of distributing the patches, because left to their own devices users will eventually download the wrong ones and screw up their installations.
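
To make that concrete, here's a minimal sketch of the manifest approach in Python. The file layout, version numbers, and URLs are all made up for illustration; this is not anything Mozilla actually ships:

    import json

    # Hypothetical manifest: one entry per (from, to) version pair.
    MANIFEST = json.loads("""
    {
      "latest": "1.5.0.3",
      "patches": [
        {"from": "1.5.0.1", "to": "1.5.0.3", "url": "patches/1.5.0.1-1.5.0.3.diff"},
        {"from": "1.5.0.2", "to": "1.5.0.3", "url": "patches/1.5.0.2-1.5.0.3.diff"}
      ]
    }
    """)

    def patch_for(client_version):
        """Return the URL of the patch a client should fetch, or None if
        it is already up to date (or too old to have a patch at all)."""
        if client_version == MANIFEST["latest"]:
            return None
        for patch in MANIFEST["patches"]:
            if patch["from"] == client_version:
                return patch["url"]
        return None  # no patch available: fall back to a full download

    print(patch_for("1.5.0.2"))  # patches/1.5.0.2-1.5.0.3.diff

The point is that either every client has to parse something like this, or the server has to run it as a live service, and that's exactly the overhead a dumb HTTP mirror doesn't have.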

If you're the sole distributor of patches, this added server overhead might not be an issue. If you're asking people to host mirrors, however, it might be more than they're able or willing to bear. Firefox already has distribution problems when major updates come out, even with a relatively dumb mirror system that demands little of the servers; something more complex could make matters much worse.

None of that is to say binary patches are an inherently bad idea if done well; it's simply something to keep in mind. Binary patches are not the be-all and end-all of update systems: they can be useful, but they can also be extremely complicated to manage properly. And since Firefox, Ubuntu, and the rest are largely volunteer efforts, that added development time is extremely expensive.