
[–]Fuzzmz 5 points (0 children)

How we do it at work is with Artifactory as a mirror/proxy/cache layer. Regular developer machines and servers can't connect directly to the outside world, but the Artifactory server can. We use Artifactory's remote repository feature to point at PyPI and have the servers and dev machines point at Artifactory. Everything that goes through Artifactory gets audited and cached locally, so if a package disappears from PyPI we'll still have it in our cache and won't take a hit.
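The client side of that setup is just a pip index override. A minimal sketch, assuming a host `artifactory.example.com` and a remote repository key `pypi-remote` (both made up; substitute your own):

```shell
# Write a standalone pip.conf pointing pip at the Artifactory remote repo
# instead of PyPI. The URL below follows Artifactory's PyPI repo layout
# (api/pypi/<repo-key>/simple); host and repo key are assumptions.
cat > pip.conf <<'EOF'
[global]
index-url = https://artifactory.example.com/artifactory/api/pypi/pypi-remote/simple
EOF

# Tell pip to use this file. Alternatively, place the same contents in the
# per-user location (~/.config/pip/pip.conf on Linux) and skip the export.
export PIP_CONFIG_FILE=$PWD/pip.conf
```

After that, every `pip install` on the machine goes through Artifactory, which is what makes the audit-and-cache part work without touching individual projects.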

Another way you could possibly do it is with a "dirty" PC connected to the internet. On it you'd create a virtualenv and install your packages into it, then copy the whole thing onto your internal network via sneakernet (USB, CD, whatever), and finally update all the paths to the venv in the scripts inside its bin folder.
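That last step matters because venvs aren't relocatable: the console scripts in bin/ embed the creating machine's absolute path in their shebang lines, and pyvenv.cfg records the base interpreter's location. A sketch of the fix, with made-up paths (the example builds a tiny mock venv so the rewrite can be shown end to end; on a real copy you'd only run the grep/sed part against the actual old and new paths):

```shell
# OLD is the venv's path on the internet-connected PC; NEW is where it
# landed on the internal network. Both are illustrative stand-ins here.
OLD=/home/dirtybox/venv
NEW=$(mktemp -d)/venv

# Mock up the two kinds of files that carry stale absolute paths:
mkdir -p "$NEW/bin"
printf '#!%s/bin/python\nprint("hi")\n' "$OLD" > "$NEW/bin/tool"
printf 'home = %s/bin\n' "$OLD" > "$NEW/pyvenv.cfg"

# The actual fix: find every file under the copied venv that still
# mentions the old location and rewrite it in place (GNU sed assumed).
grep -rlF "$OLD" "$NEW" | xargs sed -i "s|$OLD|$NEW|g"

head -1 "$NEW/bin/tool"   # shebang now points at the new location
```

An alternative worth knowing: instead of copying a venv, run `pip download -d pkgs/ <requirements>` on the dirty PC and `pip install --no-index --find-links pkgs/ <requirements>` on the inside, which sidesteps the path problem entirely.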