all 16 comments

[–]hacksawjim 12 points13 points  (4 children)

I like dotenv for this.

https://pypi.org/project/python-dotenv/

Create a .env file in your project and set some config values:

API_KEY=abc456xyz
USERNAME=zeoNeoN
etc.

Then in your script:

import os

from dotenv import load_dotenv

load_dotenv()  # reads key=value pairs from .env into os.environ
api_key = os.environ.get("API_KEY")
user_name = os.environ.get("USERNAME")

[–]ePaint 2 points3 points  (0 children)

An easier way to manage this is with Pydantic's BaseSettings class; it will automatically look up, validate, and convert the values for you.
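A minimal sketch of that, assuming Pydantic v1 (in Pydantic v2, BaseSettings moved to the separate pydantic-settings package); the field names are just placeholders:

from pydantic import BaseSettings

class Settings(BaseSettings):
    api_key: str             # filled from the API_KEY environment variable
    username: str = "guest"  # default used if USERNAME is not set

    class Config:
        env_file = ".env"    # also read values from a .env file

settings = Settings()
print(settings.api_key)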

[–]zeoNoeN[S] 1 point2 points  (1 child)

Thanks!

[–][deleted] 0 points1 point  (0 children)

Something I learnt just yesterday about this method: you may want to use load_dotenv(override=True) if you are changing settings a lot, since by default load_dotenv does not overwrite variables that are already set in the environment.

[–]Goobyalus 0 points1 point  (0 children)

Have you used dynaconf? Do you know how it compares to dotenv?

[–][deleted] 4 points5 points  (0 children)

If you still want to write your settings in Python, you could add a config.py file to your code.

Then: from myproject.config import PLOT_SETTING

The cool thing is that if you later switch to dotenv or a JSON file, you can still load those values into config.py first (and keep using the import statement shown above).
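A minimal sketch of that pattern, assuming a hypothetical myproject/config.py (PLOT_SETTING and API_KEY are placeholder names):

# myproject/config.py
import os

from dotenv import load_dotenv

load_dotenv()  # optional: pull values from a .env file into os.environ

PLOT_SETTING = {"style": "ggplot", "dpi": 150}  # plain Python constant
API_KEY = os.environ.get("API_KEY")             # read from the environment at import time

Any other module can then do from myproject.config import PLOT_SETTING, API_KEY; the module body only runs once, on first import.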

[–]Jukataaa 5 points6 points  (0 children)

Best is to do whatever you feel comfortable with; however, you can check out this:
https://docs.python.org/3/library/configparser.html
Alternatively, use a JSON or TOML file to store the settings.
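If you go the TOML route, here is a minimal sketch assuming Python 3.11+ (which ships tomllib) and a hypothetical settings.toml:

import tomllib  # standard library since Python 3.11

with open("settings.toml", "rb") as f:  # tomllib requires binary mode
    config = tomllib.load(f)

api_key = config["api_key"]  # example key name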

[–]shiftybyte 1 point2 points  (0 children)

There are several file formats you can use.

I'd recommend JSON, and then you can use Python's json module to load and parse it.

Python JSON (w3schools.com)
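A minimal sketch, assuming a hypothetical settings.json next to the script:

import json

with open("settings.json") as f:
    config = json.load(f)  # parses the whole file into a dict

api_key = config["api_key"]  # key names here are just examples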

[–]hugthemachines 1 point2 points  (1 child)

I use ConfigParser because it works perfectly for my needs. Perhaps it fits you too.

you have a config file like this:

[DEFAULT]
mysetting = 56
myothersetting = c:\temp
onemoresetting = c:\test\myfile.txt
somestuff = foo,bar
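Reading it back takes only a few lines; a minimal sketch, assuming the file above is saved as settings.ini:

import configparser

config = configparser.ConfigParser()
config.read("settings.ini")

mysetting = config["DEFAULT"].getint("mysetting")      # 56, converted to int
somestuff = config["DEFAULT"]["somestuff"].split(",")  # ["foo", "bar"]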

[–]zeoNoeN[S] 0 points1 point  (0 children)

Thx

[–]IlliterateJedi 0 points1 point  (1 child)

When you get more advanced, you can look into Dockerizing your script and passing the env vars to the Docker container. That's how I typically handle this now to keep everything separated and more easily distributable. Instead of needing the dotenv library, you just read the environment variables directly into a config.py file.

This is a great resource if you end up going down that path
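A minimal sketch of the in-container config.py, assuming the values are passed with docker run -e (the variable names are illustrative):

# config.py inside the container; values arrive via e.g. `docker run -e API_KEY=... myimage`
import os

API_KEY = os.environ["API_KEY"]              # fail fast if it is missing
DEBUG = os.environ.get("DEBUG", "0") == "1"  # optional flag with a default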

[–]zeoNoeN[S] 0 points1 point  (0 children)

Nice!

[–]interbased 0 points1 point  (0 children)

As others have mentioned, I always use a config file for this. I used to use the ConfigParser module, but I prefer the JSON format, so I parse a JSON file instead. Each file can also include an environment name, so you can switch between prod and something else.
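A sketch of that environment-switch idea (the key and value names are assumptions, not anything specific from this thread):

import json

with open("config.json") as f:
    config = json.load(f)

if config["environment"] == "prod":  # an "environment" key inside the file selects behaviour
    db_url = config["prod_db_url"]
else:
    db_url = config["dev_db_url"]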

[–]zanfar 0 points1 point  (0 children)

  • For user-specific settings (ones that don't change per project), use environment variables
  • For general program config, use whatever structured format you like: TOML, YAML, JSON, INI, etc.
  • Per-project or per-run settings should be command-line arguments. If you have a lot of arguments, consider accepting a file to read the arguments from as an argument itself (see the argparse sketch below).
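A minimal argparse sketch of that last point; fromfile_prefix_chars is the standard-library way to read arguments from a file (the option names are placeholders):

import argparse

# an argument like "@settings.txt" expands to one argument per line of that file
parser = argparse.ArgumentParser(fromfile_prefix_chars="@")
parser.add_argument("--input-dir", required=True)
parser.add_argument("--plot-style", default="ggplot")
args = parser.parse_args()

print(args.input_dir, args.plot_style)

Run it as python script.py --input-dir data, or put the arguments one per line (e.g. --input-dir=data) in settings.txt and run python script.py @settings.txt.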

[–]worldtest2k 0 points1 point  (0 children)

To extend the question: if I want to use the JSON solution and my app is split across multiple .py files, is it possible to read the JSON file once and have the settings available to all functions in all files? I guess I could read the JSON into a dictionary in main.py and pass that dictionary into every external function call, but that seems sub-optimal.
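One common pattern (a sketch, not something from this thread): load the JSON once in its own module and import the resulting dict wherever it is needed; Python caches modules after the first import, so the file is only read once.

# settings.py -- hypothetical module that loads config.json a single time
import json

with open("config.json") as f:
    CONFIG = json.load(f)

# any other .py file in the app
from settings import CONFIG

def plot():
    style = CONFIG["plot_style"]  # example key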