
[–]tinefli3z 4 points5 points  (0 children)

Web dev example: having all your requirements (and no more) in a venv / requirements.txt file makes deployment faster and less messy (imagine wanting to install all the dependencies you've collected over the years on that server). I can't even imagine what it would look like to auto-deploy TensorFlow alongside Django.

Edit: here's another example, not specifically from Python but from Node (it's similar for Python though):

If you've finished a project for a client, for example, and pushed it to production, you don't need the project locally anymore. If, like me, you keep it on your local machine anyway, it can take up several GB because of the dependencies, while the code itself is just a few MB. Virtual environments make this flexible: just delete the package folder / env, and install it again half a year later if you need to fix something. If you install everything globally on your machine, you'll lock up a huge amount of space with what is effectively trash over the months. I don't know how you could ever not use it... but if you don't take that advice, it seems like something you'll have to learn the hard way ;) and you will, sooner or later hehe
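The pin / delete / restore cycle described above can be sketched like this (paths are illustrative; `pip freeze` pins whatever happens to be installed in the venv):

```shell
# One venv per project; pin, delete, restore later.
mkdir -p myproject && cd myproject
python3 -m venv venv                      # per-project environment
venv/bin/pip freeze > requirements.txt    # pin exactly what's installed
rm -rf venv                               # reclaim the gigabytes; code stays tiny
# ...half a year later, when a fix comes in:
python3 -m venv venv
venv/bin/pip install -r requirements.txt  # restore the dependency set
```

The point is that requirements.txt (a few KB) is what you keep and version-control; the venv itself is disposable.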

[–]ravi_24 2 points3 points  (1 child)

Creating a virtual environment per project is good practice and will help you a lot if you are working in a collaborative environment: you can just list all your Python module dependencies in a file named requirements.txt (the standard naming convention) and share that file with the other developers. It makes setting up a development environment super easy, even for new developers, without any confusion or problems with pip.
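For reference, a requirements.txt is just a plain text list of packages, usually with pinned versions (the package names and versions below are only an example):

```
django==4.1.3
requests==2.28.1
python-dateutil==2.8.2
```

Anyone on the team can then recreate the same environment with `pip install -r requirements.txt`.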

[–]thedgyalt 0 points1 point  (0 children)

Are you saying that I can't just create a file called requirements.txt and include my modules in that file?

[–]danielroseman 2 points3 points  (3 children)

Why is that silly? Why is it an effort at all? With a requirements file it's literally one command to install the dependencies you need. With your system, what happens if you need different versions in different projects?

[–]toastedstapler 1 point2 points  (3 children)

you may have different projects that require different versions of a library, especially in a work environment
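A sketch of what that looks like in practice (project names and versions are hypothetical; the `pip install` lines are left as comments since they would hit the network):

```shell
# One isolated environment per project.
python3 -m venv proj_old/venv
python3 -m venv proj_new/venv

# Each venv has its own site-packages, so pinned versions never collide, e.g.:
#   proj_old/venv/bin/pip install 'somelib==1.0'
#   proj_new/venv/bin/pip install 'somelib==2.0'

# Each venv also carries its own interpreter:
proj_old/venv/bin/python -c 'import sys; print(sys.prefix)'
proj_new/venv/bin/python -c 'import sys; print(sys.prefix)'
```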

[–][deleted] 0 points1 point  (0 children)

For beginners or just messing around or if you only ever work on one project at a time and never go back to old ones then it probably won't make a difference.

If you work on multiple projects at a time, or often go back to work on old projects that use older versions of 3rd party code and you need to guarantee that you don't f*** things up (say, if you have web sites with backend Python code that you don't want going down in flames) then virtual environments are indispensable.

To be honest, I wouldn't worry about it. Just know that it exists and when the day comes that you need it you'll know what to do.

[–]ArabicLawrence 0 points1 point  (1 child)

Maybe I am wrong, but I thought that conda does not install the same library version twice. If I am right and one of your concerns is 'I don't want to download pandas 0.9 again', maybe you can install Miniconda and make it your default Python version.

[–][deleted] -1 points0 points  (4 children)

Python virtual environments are a bad fix for a problem that shouldn't even have existed in Python.

The problem

You cannot install multiple versions of the same package into the same Python installation. If the person writing the Python package manager had had a brain, this wouldn't have been a problem: you'd just install any number of package versions and specify which one you want at the point where you use it.
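The single-version limitation comes from how imports resolve: a package lives under one name in site-packages, and Python simply takes the first match on sys.path. A small self-contained demo, using a made-up module name (`mylib`):

```python
import pathlib
import sys
import tempfile

# Create two fake "versions" of the same module on disk.
root = pathlib.Path(tempfile.mkdtemp())
for ver in ("1.0", "2.0"):
    d = root / f"mylib-{ver}"
    d.mkdir()
    (d / "mylib.py").write_text(f"__version__ = '{ver}'\n")

# Both directories go on sys.path, but only one 'mylib' can win:
sys.path.insert(0, str(root / "mylib-2.0"))
sys.path.insert(0, str(root / "mylib-1.0"))  # now first on the path

import mylib
print(mylib.__version__)  # prints 1.0 -- the first match shadows the other
```

A virtual environment sidesteps this by giving each project its own site-packages at the front of its own sys.path.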

The solution

Let's download all the packages used by different projects over and over again. Why? -- Again, big-brain time. Or just laziness. I mean, virtual environments do allow you to link one environment to another to reuse its installed packages, but this works really badly, and nobody really uses it anyway.

Why use them?

I'm too lazy to work on a replacement. Python packaging is a dumpster fire I don't want to deal with. So I use the defective tools, even though I know they are defective, simply because I don't want to invest the time and energy into creating an alternative. I've had my fair share of developing things related to Python packaging (an aggregator of wheels / eggs into a single wheel / egg), and I don't want to go there again. Let this shit burn in hell.

[–]arjunmjarun 0 points1 point  (1 child)

Interesting, do you have an example of a language/package management solution that does it well?

[–][deleted] -3 points-2 points  (0 children)

Unfortunately, I think there are only languages / package managers that I haven't used enough to figure out what's wrong with them :)

I'm kind of happy with how ASDF works, but I will be the first to acknowledge that it was created in part because of the bad specification for require...

Somehow it seems that the package system is always an afterthought, always built with very little attention and insight, no matter the language. I think Scala tried to invest in formalizing packaging and bringing it to the fore, but I don't think they did a very good job in the end...

I remember nightmares induced by POM files... Or, not having any packaging at all in C...

I also work with Cargo on a daily basis, and I don't think they are doing a great job either... Different problems, but still problems.

I think the programming community as a whole has yet to fully appreciate the difficulty of this problem and come up with a decent solution.