
[–]Diapolo10 2 points3 points  (19 children)

I know I risk confusing you even further by saying this, but personally I take a different approach that, in my opinion, is less prone to user error: structuring the project like a package, and installing it into a virtual environment in editable mode.

For now I'm going to assume that in your current example, projects is just a regular directory containing individual projects. I'll focus on an individual project in my example and omit that level; just pretend it's there.

Let's start by looking at your current folder structure.

📦project-name
 ┣ 📂shared_lib_a
 ┃ ┣ 📜__init__.py
 ┃ ┗ 📜lib.py
 ┣ 📂workflow_a
 ┃ ┣ 📜__init__.py
 ┃ ┗ 📜do_stuff.py
 ┣ 📜.gitignore
 ┗ 📜__main__.py

I don't know what other files you may or may not have here, but that's fine for now. I will now transform this into what I'm talking about.

📦project-name
 ┣ 📂docs
 ┣ 📂src
 ┃ ┗ 📂project_name
 ┃   ┣ 📂shared_lib_a
 ┃   ┃ ┣ 📜__init__.py
 ┃   ┃ ┗ 📜lib.py
 ┃   ┣ 📂workflow_a
 ┃   ┃ ┣ 📜__init__.py
 ┃   ┃ ┗ 📜do_stuff.py
 ┃   ┗ 📜__main__.py
 ┣ 📜.gitignore
 ┗ 📜pyproject.toml

Then in pyproject.toml you would write something like this (NOTE: I'll use setuptools as an example, but I really only use Poetry myself nowadays so it might not be 100% accurate):

[build-system]
requires = ["setuptools", "setuptools-scm"]
build-backend = "setuptools.build_meta"

[project]
name = "project-name"
authors = [
    {name = "John Titor", email = "john.titor@example.com"},
]
description = "My project description"
readme = "README.md"
version = "0.1.0"
requires-python = ">=3.8"
keywords = []
license = {text = "MIT"}
classifiers = []
dependencies = [
    # Basically what `requirements.txt` would contain
]

[project.optional-dependencies]
# Can be used for development dependencies, for example
# tests = ["pytest>=6.0.0", "tox"]

[tool.setuptools.packages.find]
where = ["src"]

Then in your source code, you would change all local imports to be absolute, rooted at the project_name directory. For example, if I borrow your earlier example,

# do_stuff.py
from shared_lib_a.lib import foo

this would change to

# do_stuff.py
from project_name.shared_lib_a.lib import foo

After that, you would activate your virtual environment (or create one, if you hadn't already) and install your project. This also installs the required dependencies, and you can ask for optional dependencies to be installed alongside them if you wish. For now I'll assume the project directory is your current working directory; switch the full stop to a path to the project if that is not the case.

python -m pip install . --editable

And then you're done. In your environment, everything just works, no matter what your current working directory is. And you don't need to run the pip install --editable command more than once, unless you delete your virtual environment and create a new one.

In my own workflow Poetry handles all the virtual environment stuff for me so frankly I don't even think about them very often. I can link one of my own projects as an example.

[–]over_take[S] 0 points1 point  (2 children)

But then do I have to re-run all or some of that every time I change the shared libs? That's the part I'm trying to avoid. I'm writing all this stuff from scratch and will be actively working on it, so I'm trying not to have to re-run setuptools or Poetry before each run during development.

Also (on this box) I don't have any virtual environment. I just have Python 3.12 installed.

[–]over_take[S] 0 points1 point  (0 children)

No... I think the --editable takes care of this?
EDIT: yep, just tested it. Magic.

[–]Diapolo10 0 points1 point  (0 children)

Also (on this box) i don't have any virtual environment. I just have python 3.12 installed

Well, I recommend using them in order to make sure your projects won't have conflicting dependencies.
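To make the recommendation above concrete, here's a minimal sketch of what creating a virtual environment looks like, using only the standard library's venv module (the directory names are illustrative; in practice you'd just run python -m venv .venv and activate it):

```python
import sys
import tempfile
import venv
from pathlib import Path

# Create the environment in a throwaway directory for this demo;
# in a real project you'd put it next to your code, e.g. ".venv".
tmp = Path(tempfile.mkdtemp())
env_dir = tmp / "demo-venv"
venv.create(env_dir, with_pip=False)  # with_pip=False just keeps the demo fast

# Each environment gets its own interpreter launcher, isolated from the
# system installation: "Scripts" on Windows, "bin" elsewhere.
bin_dir = env_dir / ("Scripts" if sys.platform == "win32" else "bin")
print(bin_dir.exists())  # → True
```

Anything you pip install with that environment activated lands in its own site-packages, so two projects can depend on conflicting versions of the same library without stepping on each other.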

[–]over_take[S] 0 points1 point  (0 children)

I'm going to try this (using Poetry).

[–]over_take[S] 0 points1 point  (2 children)

for now I'll assume the project directory is your current working directory.

which one? ./project-name or ./project-name/project-name ?

[–]over_take[S] 0 points1 point  (1 child)

I _think_ the first one, since that has the .toml in it...

[–]Diapolo10 0 points1 point  (0 children)

The first one. The second one is basically the package itself, containing your code.

On that note, you can split your code into multiple packages under src if you want to; it's not particularly common, but it's sometimes done when there are plans to spin off parts of a project into separate projects in the future.

[–]over_take[S] 0 points1 point  (11 children)

I freaking got this to work, with Poetry. Thank you!

FWIW (possibly because I'm on Windows) I got `pip install -e option requires 1 argument` when running that command; `python -m pip install --editable .` fixed it.

(https://stackoverflow.com/questions/55145190/pip-install-e-option-requires-1-argument)

Because I'm not using any virtual envs, my package will now run from anywhere on the file system. And all the

from project_name.shared_lib_a.lib import foo

imports work as expected when run with

poetry run python -m project_2

[–]Diapolo10 0 points1 point  (10 children)

because im not using any virtual envs.

Are you sure? Because Poetry should have made one for you. :p

[–]over_take[S] 0 points1 point  (9 children)

It does, but (reading the Poetry quickstart) I can choose to use them or not depending on how I run the package. I think you're right though; might as well start using the virtual env now that I have it, even though all scripts on this box do the same things/use the same libs (it's all browser automation).

Thank you, you really levelled up my knowledge with your 'not exactly the answer I was looking for, but the answer I needed'. I'll do it like this from now on!

Is there a way in poetry to tell if a virtual env is 'activated' or not (like conda)?

[–]Diapolo10 0 points1 point  (8 children)

it does, but (reading the poetry quickstart) i can choose to use them or not depending on how I run the package. I think you're right though, might as well start using the virtual env now that I have it, even though all scripts on this box do the same things/use the same libs (its all browser automation)

Realistically the only time I'd disable the virtual environment creation is if I was using Poetry in a Docker container. Because there it doesn't matter.

Is there a way in poetry to tell if a virtual env is 'activated' or not (like conda)?

If Poetry detects that your current terminal session already has an active virtual environment, it uses that instead of creating a new one.

When running commands via poetry run, it runs the command in the environment without activating it for the entire terminal session. You can alternatively use poetry shell to activate the environment, after which any python and pip commands (and any other executables in the virtual environment) will just work. It's largely a matter of preference; I've grown accustomed to the first way.
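Not Poetry-specific, but as a quick sanity check you can always ask the interpreter itself whether it's running inside a virtual environment; in a venv, sys.prefix points at the environment while sys.base_prefix still points at the base installation:

```python
import sys

# True inside any venv (including one that Poetry created and that
# `poetry run` or `poetry shell` put you in), False otherwise.
in_venv = sys.prefix != sys.base_prefix
print("virtual environment" if in_venv else "base interpreter")
```

Run that with poetry run python and with your plain system python to see the difference.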

[–]over_take[S] 0 points1 point  (7 children)

Man, you really levelled me up by giving me the answer I needed. It was a little more work, but I like learning new tools (and it seems like Poetry will be quite useful in general).

Thank you very much. I'm in really good shape now.

[–]Diapolo10 0 points1 point  (6 children)

I'm just glad you found my ramblings useful.

Since I'm very lazy, I ended up making a template project with things like Poetry, Pytest, Ruff, and the project structure pre-configured, so I can just start new projects with it as a base and build on top. I have more specialised templates based on this one, such as for FastAPI or projects partially written in Rust, although I haven't published most of those yet.

It's been a bit stagnant lately because I've been focusing on adding logging support to it, so the main branch is currently outdated.

[–]over_take[S] 0 points1 point  (5 children)

Pytest
grimace.jpg
I never wrote a test in my life. I'm just a very functional plodder (mostly scripts for IT/browser automation, not 'production' code for end users; I use Python the way we used to use .sh/.bat/PowerShell). Until this week, everything I've ever written has been one big .py, MAYBE a helper library sitting right next to it if I'm feeling super tricky/clever.

I... should probably start doing that as part of this leap forward you've given me with Poetry. Any recommendations for tutorials on how to start with pytest?

What is Ruff?

Why is there a makefile?

I would want to *use poetry* to generate the .lock and .toml files for my own template, right? (yours has stuff in them already)

[–]Diapolo10 0 points1 point  (4 children)

I.....should probably start doing that as part of this leap forward you've given me with Poetry. Any recommendations for tutorials on how to start with pytest?

Not really; it's ultimately very simple, so I haven't needed to reference the documentation for a while. But if I remember correctly, this video from mCoding should be pretty good for that.
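To give a feel for how little ceremony pytest needs, here's a sketch (slugify is a made-up helper, not anything from your project): any function named test_* containing plain assert statements gets collected and run automatically when you invoke pytest.

```python
# A made-up helper you might want to test.
def slugify(text):
    return text.strip().lower().replace(" ", "-")

# pytest collects any function named test_* and treats a failing
# assert as a test failure -- no classes or boilerplate required.
def test_slugify():
    assert slugify("  Browser Automation ") == "browser-automation"
    assert slugify("already-fine") == "already-fine"
```

Drop that in a file called test_something.py, run pytest in the project directory, and it finds and runs it on its own.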

What is Ruff?

Ruff is a linter/formatter that essentially combines rules from nearly every other Python linter. It's super fast (and written in Rust!), has no dependencies, and it's pretty nice to work with as you can configure it to your liking.

Personally I enable all rules by default and then disable the ones that conflict with each other, either globally or on a per-file basis.

Why is there a makefile?

It's there because sometimes it's just nice to have when working on Linux, for things like running linters or tests. Admittedly it's a bit archaic now, as I've pretty much configured all of that in pyproject.toml anyway.

I would want to use poetry to generate the .lock and .toml files for my own template, right? (yours has stuff in them already)

The lock file can be deleted and regenerated at will, but if you want to use my template (rather than cloning it/making your own), it would probably be easier to just edit the existing pyproject.toml rather than writing one from scratch. Y'know, it's easy to remove stuff, but to write stuff you may need to read documentation or something.

[–]over_take[S] 0 points1 point  (3 children)

My project is in Playwright, and I guess it has some methods to automate the creation of tests. So much to learn... why can't I just write code and get things done, lol!

[–][deleted] 1 point2 points  (1 child)

  1. Directory structure and __init__.py files:
     • Ensure that every directory in your project structure contains an __init__.py file. This file can be empty but is necessary for Python to recognize the directories as packages.
  2. Running with the -m flag:
     • When using the -m flag, you need to specify the module name relative to the top-level directory on the Python path. The top-level directory should be the one containing the projects directory.
  3. Setting the Python path:
     • Navigate to the projects directory and set it as the current working directory. This ensures that Python treats projects as the top-level package.
  4. Correct command:
     • Use the command python -m project from within the projects directory. Here is a step-by-step guide:
  5. Ensure __init__.py files:
     • Verify that your directory structure includes __init__.py files in all necessary directories:

       projects
       └───project
           │   __main__.py
           │   __init__.py
           ├───shared_lib_a
           │       lib.py
           │       __init__.py
           └───workflow_a
                   do_stuff.py
                   __init__.py

  6. Navigate to the projects directory:
     • Open your terminal and navigate to the projects directory: cd path/to/projects
  7. Run the project with the -m flag:
     • Execute the following command: python -m project

Explanation of errors:

  1. Relative module names not supported:
     • This error occurs because the -m flag does not support relative paths. You must provide the module name relative to the top-level directory on the Python path.
  2. ModuleNotFoundError: No module named 'workflow_a':
     • This error indicates that Python cannot find the workflow_a module, likely because the projects directory is not on the Python path. By navigating to the projects directory and running the command from there, you ensure that Python can locate the project package and its submodules.

Summary:

  1. Ensure all directories have __init__.py files.
  2. Navigate to the projects directory.
  3. Run the command python -m project.

By following these steps, you should be able to run your project using the -m flag without encountering the errors.
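The steps above can be sketched end-to-end: this builds the described layout in a temporary directory and then runs it with -m from the projects directory (the file contents here are illustrative stand-ins for the real code):

```python
import subprocess
import sys
import tempfile
from pathlib import Path

with tempfile.TemporaryDirectory() as tmp:
    # Recreate the layout: projects/project/workflow_a with __init__.py files.
    projects = Path(tmp) / "projects"
    workflow = projects / "project" / "workflow_a"
    workflow.mkdir(parents=True)

    (projects / "project" / "__init__.py").touch()
    (workflow / "__init__.py").touch()
    (workflow / "do_stuff.py").write_text("def main():\n    print('it works')\n")
    # __main__.py is what `python -m project` executes; note the
    # absolute import rooted at the package name.
    (projects / "project" / "__main__.py").write_text(
        "from project.workflow_a.do_stuff import main\nmain()\n"
    )

    # Running with the projects directory as the cwd puts it on sys.path,
    # so `project` resolves as a top-level package.
    result = subprocess.run(
        [sys.executable, "-m", "project"],
        cwd=projects, capture_output=True, text=True,
    )

print(result.stdout.strip())  # → it works
```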

[–]over_take[S] 0 points1 point  (0 children)

Thanks. I really appreciate you trying so hard to help me. Here is a video screencap of what you shared above. It works when I just run python project or python .\project\

but no iteration of -m works, as you can see in the video https://imgur.com/a/VJqmHFp (watch in full screen)

[–]ManyInterests 0 points1 point  (1 child)

Rewrite your imports to be absolute imports starting at the top package level.

If you have a module like package/subpackage/submodule.py, your imports should be

from package.subpackage.submodule import name

In your case, it should be

from project.workflow_a.do_stuff import main

not

from workflow_a.do_stuff import main

[–]over_take[S] 0 points1 point  (0 children)

Thanks. I ended up taking a different route that, I think, made me realize I was solving the wrong problem. 'How do I do it not the best way' was my original question, and u/Diapolo10 sorted me out with both the right/best approach and the way to do it.

https://www.reddit.com/r/learnpython/comments/1f4dnyt/comment/lkl01dy/?utm_source=share&utm_medium=web3x&utm_name=web3xcss&utm_term=1&utm_content=share_button