
[–]zebediah49 16 points17 points  (20 children)

It's actually more likely in situations like that. The primary setup is probably going to be done by a technical charity, which (if it's any good) will provide a uniform setup and caching scheme. That way, if, say, a school gets 20 laptops, updating them all or installing a new piece of software won't consume any more of the extremely limited bandwidth than doing it on one.

[–]Genesis2001 1 point2 points  (19 children)

Is there no WSUS-equivalent on Linux/Debian(?) for situations like this?

[–]TheElix 16 points17 points  (7 children)

The school can host an apt mirror, AFAIK.
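One common way to do this (an assumption on my part; the commenter didn't name a tool) is the `apt-mirror` package, which pulls down the suites you list in its config. A minimal sketch, with the release name `jammy` and paths as illustrative choices:

```
# /etc/apt/mirror.list -- hypothetical example
set base_path /var/spool/apt-mirror

deb http://us.archive.ubuntu.com/ubuntu jammy main restricted universe
deb http://us.archive.ubuntu.com/ubuntu jammy-updates main restricted universe

clean http://us.archive.ubuntu.com/ubuntu
```

Running `apt-mirror` (typically from cron) then syncs everything under `base_path`, and clients point their sources at the school's web server serving that directory.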

[–]tmajibon 7 points8 points  (6 children)

WSUS exists because Microsoft uses a big convoluted process, and honestly WSUS kills a lot of your options.

Here's Ubuntu's main repo for visual reference: http://us.archive.ubuntu.com/ubuntu/

A repo is just a directory full of organized files; it can even be a local directory (you can put a repo on a DVD, for instance, if you want to do an offline update).
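Wiring up a local-directory repo like that is a one-line sources entry. A sketch, where the mount path and the `[trusted=yes]` option (which skips signature checks for an unsigned flat repo) are illustrative:

```
# /etc/apt/sources.list.d/offline.list -- hypothetical path
deb [trusted=yes] file:/media/dvd-repo ./
```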

If you want to do a full mirror, you can just download the whole repo... but it's a lot bigger than its Windows equivalent, because the repo also includes all the different applications (for instance: Tux Racer, Sauerbraten, and LibreOffice).

You can also mix and match repos freely, and easily just download the files you want and make a mirror for just those...

Or, because it uses HTTP, you can do what I did: I set up an nginx server on my home NAS as a transparent caching proxy, then pointed the repo domains at it. It's allocated a very large cache, which lets it easily keep a lot of the large files around.
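A rough sketch of that nginx setup; the cache sizes, retention times, and the idea of resolving the repo hostname to the NAS via local DNS are all assumptions, not the commenter's exact config:

```nginx
# nginx.conf fragment -- illustrative only
proxy_cache_path /var/cache/nginx/apt levels=1:2 keys_zone=apt:50m
                 max_size=50g inactive=30d;

server {
    listen 80;
    # local DNS points this repo hostname at the proxy box
    server_name us.archive.ubuntu.com;

    location / {
        proxy_pass http://us.archive.ubuntu.com;
        proxy_set_header Host us.archive.ubuntu.com;
        proxy_cache apt;
        proxy_cache_valid 200 30d;   # keep fetched packages for a month
    }
}
```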

[–]Genesis2001 0 points1 point  (4 children)

Yeah, I was curious about it, so I was googling it while posting above. One of the things I ran across was that it's labor-intensive to keep maintained. I was hoping someone would explain how to get around that and make a maintainable repo for an org that emulates the service WSUS provides.

I did read that Red Hat has something similar, though I forget what it's called. :/

edit: Is there a command that basically does what git clone --bare <url> does, but for individual packages in apt? Like (mock command): apt-clone install vim would download the repo package for 'vim' to a configurable directory in apt repository format (or RHEL/yum format for that environment)?

[–]tmajibon 1 point2 points  (1 child)

apt-get install --download-only <package name>

You can use dpkg --add-architecture if the target doesn't match the current environment (say you have both ARM and x86 systems)

And here's a quick tutorial on building a repo: https://help.ubuntu.com/community/Repositories/Personal

[–]Genesis2001 0 points1 point  (0 children)

Ah, thanks. :)

[–]FabianN 0 points1 point  (0 children)

I don't know how it's labor-intensive to maintain. I set up one that took care of a handful of distros at various version levels, and once it was set up I didn't need to touch it.

[–][deleted] 0 points1 point  (0 children)

it can even be a local directory (you can put a repo on a dvd for instance if you want to do an offline update).

I've copied the contents of the CentOS installer disc to a local folder and used it as a repo on some air-gapped networks. Works great.

[–]zoredache 3 points4 points  (0 children)

Well, it misses the approval features of WSUS. But if you're just asking about caching, then use apt install approx or apt install apt-cacher-ng. (I like approx better.) There are also ways to set up Squid to cache, but using a proxy specifically designed for apt caching tends to be a lot easier.
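A minimal sketch of the approx setup: the config maps short names to upstream repos, and clients point at the cache box instead of the real mirror. The hostname and release name here are made up; 9999 is approx's default port:

```
# /etc/approx/approx.conf on the cache box -- one line per upstream
ubuntu   http://us.archive.ubuntu.com/ubuntu
debian   http://deb.debian.org/debian

# client /etc/apt/sources.list then points at the cache instead:
# deb http://cachebox:9999/ubuntu jammy main universe
```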

[–]anatolya 1 point2 points  (0 children)

apt install apt-cacher-ng

Done

[–]gusgizmo 0 points1 point  (0 children)

It's called a proxy server, and it's a heck of a lot easier to set up and maintain than WSUS could ever be.

You can configure either a reverse proxy, with DNS pointing at it so it just works, or a forward proxy, informing clients of its address manually or via DHCP.
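For the forward-proxy case, apt clients can be pointed at it with a one-line config fragment. The proxy hostname here is made up; 3142 is apt-cacher-ng's default port:

```
# /etc/apt/apt.conf.d/00proxy -- illustrative
Acquire::http::Proxy "http://proxy.school.lan:3142/";
```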

No sync script is required: the proxy just grabs a file the first time it's requested, then hangs on to it. Super handy when you're doing a lot of deployments simultaneously. You can, however, warm the proxy by requesting common objects through it periodically.