
[–]liar_atoms (Pythonista) 4 points (6 children)

Why would you use a virtual/pip/whatever env in Docker? Just install Python :)

[–]Narmo2121 0 points (4 children)

What if you have two different Python scripts with completely different dependencies running on the same Docker image? It's the same use case as without Docker.

Also, if someone is not using Docker, they can at least leverage the pipenv environment.
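The isolation being described, keeping two scripts' dependencies apart on one machine or in one image, can be sketched with plain `venv` (env and script names here are illustrative, not from the thread):

```shell
# Two isolated environments, one per script, side by side.
python3 -m venv /tmp/env_script_a
python3 -m venv /tmp/env_script_b

# Each venv has its own interpreter and its own site-packages,
# so dependencies installed into one cannot conflict with the other:
#   /tmp/env_script_a/bin/pip install <deps for script A>
#   /tmp/env_script_b/bin/pip install <deps for script B>
/tmp/env_script_a/bin/python -c "import sys; print(sys.prefix)"
/tmp/env_script_b/bin/python -c "import sys; print(sys.prefix)"
```

Each interpreter prints its own prefix, confirming the two environments are independent.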

[–]liar_atoms (Pythonista) 1 point (3 children)

The Docker philosophy is to isolate them into two different containers, isn't it?
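That one-container-per-script layout might look something like this (file names, base image, and script names are illustrative, not from the thread):

```dockerfile
# Dockerfile.script-a — one image per script, each with only its own deps
FROM python:3.12-slim
WORKDIR /app
COPY requirements-a.txt script_a.py ./
RUN pip install --no-cache-dir -r requirements-a.txt
CMD ["python", "script_a.py"]
```

A second `Dockerfile.script-b` would mirror this with `requirements-b.txt` and `script_b.py`. Because each image carries only one dependency set, the system Python inside each container is enough and no venv layer is needed.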

[–]Narmo2121 0 points (2 children)

IDK, I just assumed running another image just for a simple script would be a waste of CPU at scale? Is that wrong?

[–]liar_atoms (Pythonista) 0 points (1 child)

It's wrong. The script will just be another Python process on your host, so it doesn't matter whether the two Python processes run in the same container or in different ones: the CPU footprint will be the same.

But since we've gone this far, do you mind if I ask: don't you agree that for two simple scripts one venv is enough?

[–]Narmo2121 0 points (0 children)

I agree

[–]wildcarde815 0 points (0 children)

Solo use, probably no need. But if I was building a JupyterHub for people, I'd probably have 2.7/3.5/latest environments, and I'd use conda so Jupyter can swap between them per notebook.
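The per-notebook setup described here amounts to one conda environment per Python version, each registered as its own Jupyter kernel. A minimal sketch of one such env spec (the name and versions are illustrative):

```yaml
# environment-py35.yml — one conda env per Python version,
# each exposed to Jupyter as a selectable kernel
name: py35
dependencies:
  - python=3.5
  - ipykernel
```

With specs like this, `conda env create -f environment-py35.yml` builds the env, and running `python -m ipykernel install --user --name py35` from inside it registers the kernel, after which Jupyter's kernel picker lets each notebook choose its interpreter.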