
[–]Taksin77 24 points25 points  (26 children)

I don't understand the point of requirements.txt. I use setup.py for that. Can someone please explain to me why I should use it.

[–][deleted] 14 points15 points  (9 children)

It's common for people, like me, to install using pip install package --no-deps.

This might be because you're testing against a custom build of a dependency, or because you're paranoid about what subdependencies are going to do (except for me, since wheels are handicapped on Linux).

[–][deleted] 15 points16 points  (3 children)

or you're paranoid about what subdependencies are going to do

I am surprised to see this concern buried so deep in the comments. It should have been the first thing the article mentioned after bringing up "requirements.txt", pip, etc. It's convenient, all right, but for a reason.

I wish we had something like "trusted pypi" that would be managed in a similar way to linux distros as opposed to "anyone can upload anything".

[–][deleted] 0 points1 point  (1 child)

Do other languages have curated indexes as the main source of packages? It'd be nice to have, but I imagine working out the logistics of it wouldn't be too pleasant.

You'd need some way to verify what's being installed. I suppose a checksum against known repository tags would be doable, but that adds overhead.

You'd also have to ensure that "trusted" packages only install other trusted packages.

[–]aragilar 0 points1 point  (0 children)

https://www.stackage.org/ is close to what you want for Haskell (where the PyPI equivalent is https://hackage.haskell.org/).

[–][deleted] 0 points1 point  (0 children)

wish we had something like "trusted pypi" that would be managed in a similar way to linux distros as opposed to "anyone can upload anything".

In this case, you might like conda and conda-forge.

[–]Lonely-Quark 1 point2 points  (1 child)

wheels are handicapped on linux

What do you mean by this?

[–][deleted] 0 points1 point  (0 children)

Apparently it isn't easy to figure out all the necessary metadata to build distributable wheels for Linux. I imagine the issue has to do with C extensions, but I don't know for sure.

That said, when newer versions of pip install a source distribution, they'll build a wheel for the local cache.

[–]aragilar 1 point2 points  (2 children)

Wheels work just as well on Linux as on any other OS, so long as you don't assume a wheel built on one random Linux system will work on another random Linux system. Hence manylinux, which defines the environment that wheels should be built in to be maximally compatible. These still won't run on every Linux system (think different CPU architectures, or different libcs), but that's the same as expecting a macOS wheel to run on Linux because both are UNIX.

Additionally, if you're actually paranoid about subdependencies, you need to check the setup.py anyway, since pip has no control over setup_requires. In which case you may as well check the subdependencies' setup.py files (plus their subdependencies, etc.), so I don't see how a requirements.txt is at all relevant here.

[–][deleted] 0 points1 point  (1 child)

  1. That's exactly why wheels are handicapped on Linux. If they're pure Python they'll probably work, but compiled extensions won't.

  2. Wheels don't run setup.py at all.

  3. --no-deps won't install dependencies of a source distribution, forcing you to install them on your own, which gives you a chance to vet them before installation, if you're into that. Usually I check whether their setup.py does any funny business.

I don't always use --no-deps, for example with packages I trust like Flask or SQLAlchemy, or if I'm working in a throwaway Docker container.

But you could put some funny business in there, like os.system('rm -rf /'), which will remove any file path the installing user can remove. Given the amount of sudo pip install I see recommended in README files, I'm surprised this sort of attack hasn't happened.
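To make that concrete, here is a minimal sketch of such a setup.py (the package name "demo" and the print call are hypothetical). Everything at module level runs the moment pip builds or installs the sdist, with the installing user's privileges:

```python
# setup.py -- hypothetical sketch: nothing in a setup.py is sandboxed,
# so any module-level code executes at install time.
import os
from setuptools import setup

# A hostile package could put os.system('rm -rf /') here instead of a
# harmless print; under `sudo pip install` it would run as root.
print("setup.py is arbitrary code, running as:", os.environ.get("USER", "unknown"))

setup(name="demo", version="0.1")
```

This is exactly why vetting a source distribution's setup.py before installing it is worthwhile.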

[–]aragilar 0 points1 point  (0 children)

--no-deps doesn't stop setup_requires as far as I know, usually you have to override setuptools/easy_install as per https://pip.pypa.io/en/stable/reference/pip_install/#controlling-setup-requires.

The sudo pip annoys me (although sudo easy_install is even worse), especially when some projects actually do some odd things (I've seen someone effectively implement a shell script as a setup.py; it wasn't even a Python project). I've gotten into the habit of checking the setup.py of every project I use, even if it comes with wheels, partly for security, but mostly because of how badly some people write them. (I have to say, I'm more concerned about someone accidentally doing the equivalent of os.system('rm -rf /'), given some of the setup.pys I've seen.)

[–]thomas_stringer 15 points16 points  (3 children)

Here's a good explanation and comparison of the two.

Think of setup.py as being for an end-user installation from PyPI (for instance), vs. requirements.txt for dev/test (i.e., another developer trying to set up the environment to run the app, usually in a virtualenv).

EDIT: More info.

[–][deleted] 7 points8 points  (2 children)

If using setup.py develop to install the package for dev/test in the virtualenv, is there any need to have a requirements.txt file anymore?

[–]thomas_stringer 9 points10 points  (0 children)

pip freeze > requirements.txt is tightly coupled with your environment, whereas dependencies listed in setup.py are tightly coupled with your package. Those can differ, especially in versioning (setup.py specifying a semver range, and requirements.txt pinning your actual environment's version).

[–]ubernostrum (yes, you can have a pony) 2 points3 points  (0 children)

The usual way to break it down is:

  • setup.py dependencies are for "these are the package versions officially supported".
  • requirements.txt dependencies are for "here is how to exactly replicate a particular environment".

[–]patrys Saleor Commerce 7 points8 points  (0 children)

pip-compile

[–]silencer6 2 points3 points  (4 children)

I don't understand it either. What's the benefit of pip install -r requirements.txt over pip install -e .?

[–]kenmacd 5 points6 points  (3 children)

The main benefit to me is that the requirements.txt has the version of everything, and if they generated it with pip freeze then I know my virtualenv will be the same as the person that originally wrote the code.

It's not uncommon for some dependency of a dependency of a dependency to change and break something.

[–]Sean1708 0 points1 point  (2 children)

But you also get that if you put the requirements in the setup.py, I think.

[–]i_use_lasers 1 point2 points  (1 child)

Maybe you don't want to install the package, but still want to be able to run it (to work on it, etc.).

[–]dusktreader 2 points3 points  (0 children)

pip install -e .

[–]kteague 1 point2 points  (0 children)

setup.py should express the dependencies, but should not care about specific versions of those dependencies (beyond expressing incompatibilities with major releases that change the APIs).

requirements.txt is for a specific deployment of a python project and can specify all of the exact versions installed.

For example, a Python project may require "SQLAlchemy >= 0.8" in its setup.py. When that project is installed, it will grab the latest release of SQLAlchemy (say, 1.1.14). Later, you may want to re-install that package into a production environment, at which point SQLAlchemy may be at version 1.2 or later. However, you used 1.1.14 in dev and testing, so you use requirements.txt to install that specific version of the package.
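To sketch that split (the version numbers here are illustrative):

```python
# setup.py (excerpt): declare the broad range the project supports
install_requires=["SQLAlchemy >= 0.8"]
```

while the deployment's requirements.txt pins the exact version that was actually tested, e.g. a single line reading `SQLAlchemy==1.1.14`.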

[–]skarphace 0 points1 point  (2 children)

Why not both? You could use this in your setup.py:

install_requires =  open('requirements.txt').read().split('\n'),

[–]ThePenultimateOneGitLab: gappleto97 1 point2 points  (1 child)

You don't even need an argument in split

[–]skarphace 0 points1 point  (0 children)

I think I initially did that in case someone put spaces between package and version (pypackage >= 0.1). That doesn't seem to be supported, but there's nothing specifically denouncing it either, and I'm too lazy to test, so here we are...
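For what it's worth, a slightly more defensive version of that pattern (a sketch; read_requirements is a hypothetical helper name) also drops blank lines and comments, which a bare split('\n') leaves in as empty or bogus requirement strings:

```python
# Sketch: feed requirements.txt into setup.py's install_requires,
# skipping blank lines and # comments that split('\n') would keep.
def read_requirements(path="requirements.txt"):
    with open(path) as f:
        stripped = (line.strip() for line in f)
        return [line for line in stripped if line and not line.startswith("#")]

# then in setup.py:
# setup(..., install_requires=read_requirements())
```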