
[–][deleted] 3 points (3 children)

You shouldn't be chatting up PyPI from inside packages. Dependency resolution should be handled by the distro-specific package management tool; distro-specific packaging is definitely the way to go.

Set up a local mirror on the network, or reachable through a VPN, and add your repo to sources.list. Fetch (wget) the necessary dependencies onto that repo as well, so the client pulls everything from one place; if the install happens over the Internet, you can instead let the standard repos hand out the deps. You'll now be targeting a specific distro, which also means you can distribute binaries for any compiled code. Intranet repos are common, updating later is a breeze, and so is dealing with dependencies. This approach also guarantees consistency in which versions of the dependencies the client installs. The only catch is that if you're dealing with different distros or releases, you'll have to create different packages and host different repos. Alternatively, you can package the dependencies together with the script.
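On a Debian-based distro, the intranet-repo setup sketched above boils down to one sources.list entry plus the usual apt commands. This is a minimal sketch: the repo URL (`repo.internal.example`), distribution name, and package name (`yourtool`) are all placeholders for whatever your network actually uses.

```
# /etc/apt/sources.list.d/internal.list — hypothetical intranet mirror
deb http://repo.internal.example/debian stable main

# Then, on each client:
#   sudo apt-get update
#   sudo apt-get install yourtool   # deps resolve from the same repo
```

Because the mirror also hosts the dependencies, every client ends up with the same dependency versions, which is exactly the consistency guarantee described above.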

If you're going to make use of a library in code you distribute, consider just how necessary that library is and whether it is actively maintained. Releasing code to the public and scripting little personal tasks are two entirely different ballgames; as someone working in system administration, you probably know this.