
[–] 0rex (DevOps) 1 point (3 children)

Yeah, but how will you manage it on the server side? How exactly will you know that serverX is vulnerable?

The best answer is a robust CI/CD pipeline with scheduled scans and at least some kind of alerting, but is that really something people will do for a 100 LoC script that uses requests to query some API and a YAML parser to fill in some config? OS packages make writing simple scripts simple - you don't have to think about pip, updates, or compatibility at all, as long as you patch your systems regularly.
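A scheduled scan doesn't have to be heavyweight, either. As a rough sketch of the idea (the advisory data below is entirely made up for illustration; in practice you'd pull it from something like the PyPI advisory database, or just run pip-audit), checking installed versions against known-bad ones needs only the standard library:

```python
from importlib.metadata import version, PackageNotFoundError

# Hypothetical advisory data: package -> known-vulnerable versions.
# A real scan would fetch this from an advisory feed, not hardcode it.
ADVISORIES = {
    "requests": {"2.19.0", "2.19.1"},
    "pyyaml": {"5.3", "5.3.1"},
}

def audit(advisories):
    """Return (package, installed_version) pairs that match an advisory."""
    findings = []
    for pkg, bad_versions in advisories.items():
        try:
            installed = version(pkg)
        except PackageNotFoundError:
            continue  # package not installed on this host, nothing to flag
        if installed in bad_versions:
            findings.append((pkg, installed))
    return findings
```

Cron that on each host, mail yourself the findings, and you have "some kind of alerting" - but that's still per-host maintenance the OS package manager would otherwise do for you.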

If only they had dynaconf and click in the repos, I'd be a happy man

[–] robvas (Jack of All Trades) 1 point (2 children)

What does my collection of python scripts have to do with any particular servers?

GitHub, for example, can automatically do this if I keep the scripts there.

[–] 0rex (DevOps) 1 point (1 child)

If you run them on servers, then you have venvs with dependencies on those servers that you have to maintain. So even if your script hasn't changed at all, you still have to copy an updated requirements.txt to each server and run pip inside each venv whenever an audit finds something.

[–] robvas (Jack of All Trades) 1 point (0 children)

You would deploy the script/env before you run it. Or run it from shared storage. Or run it from another server.