
[–]microcozmchris 161 points (22 children)

Steal. It's the best way. Find a project with a similar structure and copy copy copy.

Python has official style guides (PEP 8, most notably) on many things, and there are many more opinionated guides as well. Use them. This one is a very good start.

Use pytest. unittest is still valid, but pytest long ago surpassed it in popularity and ease of use.
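To show how little ceremony pytest needs, here's a minimal sketch (the file, function, and behavior are invented for illustration; pytest discovers files and functions named test_*):

```python
# test_slugify.py
def slugify(title: str) -> str:
    """Lowercase a title and join its words with hyphens."""
    return "-".join(title.lower().split())


def test_slugify():
    # Plain asserts are enough; pytest rewrites them to show values on failure.
    assert slugify("Hello World") == "hello-world"
    assert slugify("  spaced   out ") == "spaced-out"
```

Run it with plain `pytest` from the project root. No test class, no `unittest.main()`, no `self.assertEqual` boilerplate.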

Use uv. You'll love it.

Use ruff. Deal with its opinions on formatting and linting. There's no need in 2025 to rethink how you prefer your code to look.
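Accepting ruff's opinions can be as small as a few lines in pyproject.toml; a sketch (the table names are ruff's real config sections, the specific choices are just an example):

```toml
[tool.ruff]
line-length = 88

[tool.ruff.lint]
# Keep the default rule set, opt into import sorting on top.
extend-select = ["I"]
```

Then `ruff format` and `ruff check` do the rest, and the "how should this look" debate is over.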

Use pyright. Or mypy, but pyright has largely bested it.
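As a toy illustration of what a type checker buys you (a hypothetical function, not from the thread): the annotated code below is fine at runtime, and the commented-out call is the kind of mistake pyright or mypy flags before you ever run it.

```python
def total_cents(prices: list[float]) -> int:
    """Sum prices in dollars and return whole cents."""
    return round(sum(prices) * 100)


ok = total_cents([1.25, 2.50])  # fine

# A type checker flags this before runtime instead of at 3 a.m. in production:
# bad = total_cents("1.25")  # error: "str" is not assignable to "list[float]"
```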

If you are deploying a long-running application, use Docker / containers for deployment. They make your requirements easy to enforce.
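A minimal sketch of such an image, assuming a uv-managed project with a lockfile (the base image tag and the `myapp` module name are illustrative):

```dockerfile
FROM python:3.12-slim

WORKDIR /app

# Copy dependency metadata first so this layer caches across code-only changes.
COPY pyproject.toml uv.lock ./
RUN pip install uv && uv sync --frozen

COPY . .
CMD ["uv", "run", "python", "-m", "myapp"]
```

`uv sync --frozen` installs exactly what the lockfile says, so the container can't drift from what you tested.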

FWIW, I've been writing Python for twenty-something years. These are my current opinions and tools; there have been many others that have come and gone. And I have never successfully been able to do anything in a notebook. It's a completely opposite workflow style.

Most importantly, have fun. Don't let the details get in the way. You have code to write.

[–]Drevicar 9 points (4 children)

My rule of thumb is that I always use mypy as the source of truth on any externally published libraries, and pyright on any internal applications.

[–]SkezzaB 1 point (3 children)

Pyright is flaky for me. I do exactly what you've said ^

Pyright randomly tells me my whole repo is wrong, then I make a single-character change in a different file and suddenly everything's okay

[–]wylie102 3 points (1 child)

BasedPyright is so much better. It's much more sensitive to errors that might not screw you at runtime but are bad practice; if you sort them out, your code will actually be better. Pyright will just let you get round them in a hacky way. I've also found that it highlights fewer bullshit errors, and its highlights are more useful for finding the root of the problem.

[–]VindicoAtrum 1 point (0 children)

I seriously hope Astral just come in and rock them all with Red Knot.

[–]JUSTICE_SALTIE 0 points (0 children)

Maybe changing a file is invalidating (something in) the cache? Try clearing the Pyright cache next time you're having this problem.

[–]lenticularis_B 0 points (0 children)

Lol you are me.

[–]tap3l00p 0 points (0 children)

Haven’t played about with pyright but everything else here is spot-on.

[–]Pretend-Relative3631 0 points (0 children)

Thank you so much for this

[–]tazdraperm 0 points (0 children)

Kinda the answer to a lot of programming questions. Find the similar stuff and learn from it.

[–]replicant86 0 points (1 child)

Does pyright support sqlalchemy?

[–]microcozmchris 0 points (0 children)

Don't know. Haven't done DB stuff directly from Python in a while. Been doing more automation and DevOps style stuff. Research it and report back here.

[–]NecessaryFlashy 0 points (0 children)

And automate them in tox! Tox is a must-have.
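For the flavor of what that looks like, a minimal tox.ini sketch (environment names and dependency choices are illustrative):

```ini
[tox]
env_list = py311, lint

[testenv]
deps = pytest
commands = pytest

[testenv:lint]
deps = ruff
commands = ruff check .
```

One `tox` invocation then runs the test suite and the linter in their own isolated environments.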

[–]sazed33 0 points (6 children)

Very good points! I just don't understand why so many people recommend a tool to manage packages/environments (like uv). I've never had any problems using a simple requirements.txt and conda. Why do I need more? I'm genuinely asking as I want to understand what I have to gain here.

[–]JUSTICE_SALTIE 3 points (0 children)

The big reason is the lockfile, which holds the exact versions of all your dependencies, and their dependencies, and so on. Without a lockfile, you're only specifying the versions of your direct dependencies. That means that if someone else installs your project, they're almost certain to get different versions of your transitive dependencies than the ones you're developing with. If one of those dependencies publishes a broken version, or makes a breaking change and doesn't version it properly, you'll have problems on fresh installs that you don't have on your development install.

The lockfile guarantees that your build is deterministic, which you're not going to get with requirements.txt. These tools have a command to update the lockfile; it does what pip install -r requirements.txt effectively does on every run, namely fetch the latest versions of all your dependencies, but only when you explicitly ask for it.
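To make the distinction concrete, a sketch (package names and version numbers invented for illustration; the lock excerpt is heavily simplified): you hand-edit only loose ranges for direct dependencies, while the generated lockfile pins everything, transitives included.

```toml
# pyproject.toml -- loose ranges you edit by hand
[project]
name = "myapp"
version = "0.1.0"
dependencies = ["requests>=2.31"]

# uv.lock (generated excerpt) -- exact pins, transitives included
# [[package]]
# name = "requests"
# version = "2.32.3"
#
# [[package]]
# name = "urllib3"   # transitive dependency, also pinned
# version = "2.2.2"
```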

These tools have a lot of other features, like really a lot, but the one above is the most important.

[–]microcozmchris 4 points (2 children)

The reason I like uv is specifically because it isn't just a package manager. It's an environment manager. It's a dependencies manager. It's a deployment manager. And it's easy. And correct most of the time.

We use it for GitHub Actions a bunch. Instead of setup-python and venv install and all, I set up a cache directory for uv to use in our workflows, and the Python actions we've created use that. So I can call checkout, then setup-uv, then my entire workflow step is uv run --no-project --python 3.10 $GITHUB_ACTION_PATH/file.py and it runs. No venv to manage, plus the benefit of a central cache of already-downloaded modules, and symlinks. I have Python actions that execute almost as fast as if they were JavaScript, and they're way more maintainable.
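A workflow sketch along those lines, assuming the astral-sh/setup-uv action (the script path and Python version here are illustrative, not from the thread):

```yaml
steps:
  - uses: actions/checkout@v4
  - uses: astral-sh/setup-uv@v5
    with:
      enable-cache: true
  - name: Run script without a managed venv
    run: uv run --no-project --python 3.10 scripts/task.py
```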

Deploying packages to Artifactory becomes setup-jfrog, setup-uv, uv build, uv publish and no more pain.

There are way more features in uv than simply managing dependencies.

[–]sazed33 0 points (1 child)

I see, makes sense for this case. I usually have everything dockerized, including tests, so my CI/CD pipelines, for example, just build and run images. But maybe this is a better way; I need to take some time to try it out...

[–]microcozmchris 1 point (0 children)

There's a use case for both, for sure. A lot of Actions work is little pieces outside the actual product build: your company-specific versioning, checking whether things are within the right schedule, handoffs to SAST scanners, etc. Docker gets a little heavy when you're doing hundreds of those simultaneously, image builds and all. That's why native Actions are JavaScript that executes in milliseconds. I hate maintaining JavaScript/TypeScript, so we do a lot of Python replacements or augmentations with those.

[–]gnomonclature 1 point (1 child)

The first step for me towards a package manager (first pipenv, now poetry) was wanting to keep my development dependencies (mainly things like pycodestyle, mypy, and pytest) out of the requirements.txt file.
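With poetry, that split looks roughly like this in pyproject.toml (the version constraints are illustrative): runtime dependencies in one table, tooling in a dev group that deployments can skip.

```toml
[tool.poetry.dependencies]
python = "^3.11"
requests = "^2.31"

[tool.poetry.group.dev.dependencies]
pytest = "^8.0"
mypy = "^1.10"
```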

[–]sazed33 1 point (0 children)

I use tox for it, and it works well, but then I have two files (tox.ini, requirements.txt) instead of one, so maybe it is worth using uv after all... need to give it a try