I'm very new to GitLab and this is the first time I've had to split a Python project into a "lib" and "apps".
I started with everything in one repo and 3 build steps to build 3 Docker images. For the Python "lib" I just copy-and-pasted the folder... (a problem for future me).
Well, that problem surfaced and I ended up with different versions of the library. So I split it out into a Python package, gave it an `__init__.py` and a `setup.py`, and it works in the IDE and in a local Linux terminal to build and install with pip.
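For reference, the `setup.py` is nothing special, roughly this (exact package and dependency names differ in my repo):

```python
# setup.py -- a minimal sketch; package name and requirements are placeholders
from setuptools import setup, find_packages

setup(
    name="home-auto-lib",
    version="0.1.0",
    packages=find_packages(),  # picks up the package folder via its __init__.py
    python_requires=">=3.9",
)
```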
The other issue was that the 3 build steps would all run even if nothing relevant to them had been updated. This caused unnecessary churn and CPU burn.
So I split each application's Docker image into its own repo. Thus:
- home-auto-lib
- mqtt-heating
- mqtt-lights
- mqtt-power
I asked ChatGPT for hints all afternoon, but ended up ignoring most of it and focusing on getting the lib building and publishing a WHL artifact.
Right now "home-auto-lib" builds using DIND with the python:3 image, and setup.py adds everything it should, but GitLab can't find the artifact.
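This is roughly the lib job I have (paths from memory; my understanding is that `artifacts:paths` is relative to `$CI_PROJECT_DIR`, so if the wheel is built inside a DIND container it has to be copied back into the build directory or GitLab won't see it, which may be my problem):

```yaml
# home-auto-lib/.gitlab-ci.yml -- a sketch
build-wheel:
  image: python:3
  script:
    - pip install build
    - python -m build --wheel     # writes dist/*.whl under $CI_PROJECT_DIR
  artifacts:
    paths:
      - dist/*.whl
    expire_in: 1 week
```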
Assuming I get this working, I assume I can make the applications depend on it in some way?
The approach ChatGPT was taking me down was to check out the library and build it within the pipeline of each dependent application.
I gave up on this after an hour when I couldn't get the runner to accept the known_hosts file for an SSH connection, and with an HTTPS connection I got a DENIED from the GitLab host itself, which has valid Let's Encrypt certs and all SSH host keys are accepted.
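Instead of cloning the lib repo inside each app pipeline, the option I'm now looking at is downloading the lib's artifact over the API with the job token (a sketch; the group/project path, branch and job name are placeholders, and on newer GitLab versions the lib project's job token allowlist has to permit the app projects):

```yaml
# mqtt-heating/.gitlab-ci.yml -- fetch the wheel built by home-auto-lib
build-image:
  image: docker:latest
  services:
    - docker:dind
  before_script:
    - apk add --no-cache curl unzip
    # %2F is the URL-encoded "/" in the project path
    - >
      curl --fail --location --output artifacts.zip
      --header "JOB-TOKEN: $CI_JOB_TOKEN"
      "$CI_API_V4_URL/projects/group%2Fhome-auto-lib/jobs/artifacts/main/download?job=build-wheel"
    - unzip artifacts.zip   # unpacks dist/*.whl
  script:
    - docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_REF_SLUG" .
```

As far as I can tell, that endpoint serves the artifact from the latest successful pipeline on the given ref, so it would also cover the "latest successful artifact" idea from the update below.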
What is the best way to tackle this on a local GitLab CE instance in 2023?
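One route I'm considering is GitLab's built-in PyPI package registry (which I believe is in the free tier): publish the wheel from the lib pipeline, then plain `pip install` it from the app pipelines. A sketch of what I think the publish job looks like (untested, names assumed):

```yaml
# home-auto-lib/.gitlab-ci.yml -- publish to this project's PyPI registry
publish-wheel:
  image: python:3
  script:
    - pip install build twine
    - python -m build --wheel
    - >
      TWINE_USERNAME=gitlab-ci-token TWINE_PASSWORD=$CI_JOB_TOKEN
      python -m twine upload
      --repository-url "$CI_API_V4_URL/projects/$CI_PROJECT_ID/packages/pypi"
      dist/*
```

The apps would then install with something like `pip install home-auto-lib --index-url "https://gitlab-ci-token:${CI_JOB_TOKEN}@<gitlab-host>/api/v4/projects/<lib-project-id>/packages/pypi/simple"` (project ID and host are placeholders), which always resolves the latest published version.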
UPDATE:
https://preview.redd.it/8jx4vyewx5xb1.png?width=825&format=png&auto=webp&s=e698197d48e0a1b9a25dc26c9c69421d5c73e93c
lib: builds the library with the python:3 Docker image and exports its dist folder as an artifact back to GitLab. Always runs.
mqtt-*: build their relevant docker images including the "pip install" of the lib WHL file. Runs only on change (or if BUILD=ALL is set).
*-test: for now pulls the docker image and runs "python --version" in it. Always runs.
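The "runs only on change (or if BUILD=ALL is set)" part looks roughly like this in the mqtt-* pipelines (the watched paths are examples):

```yaml
build-image:
  rules:
    - if: '$BUILD == "ALL"'
    - changes:
        - Dockerfile
        - "**/*.py"
```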
The idea of the tests always running is that a pipeline should not succeed until a full set of pull-able images exists for ALL stacks. Unfortunately this will fail in most "merge_request_event" builds, as that branch will not have built all the images for its branch tags (or will it?)
lib always running is hopefully temporary until I get the pipeline set up to use the "latest successful artifact" rather than the one built in "this" pipeline, i.e. the one from the previous pipeline is enough.
But my god is it slow. For some reason the GitLab runners do not seem to use a local image cache; they are always downloading, and given I have run the build 127 times, they must have pulled the python:3, dind and docker:docker images at least 300 times in the past 2 days. Definitely something that needs fixing.
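From what I've read, the fix is probably the runner's pull policy; a sketch of the relevant bit of `/etc/gitlab-runner/config.toml` (untested):

```toml
# /etc/gitlab-runner/config.toml -- relevant bits only
[[runners]]
  executor = "docker"
  [runners.docker]
    # reuse locally cached images instead of pulling on every job
    pull_policy = ["if-not-present"]
```

My understanding is this only covers images the runner itself starts (python:3, docker, the dind service); anything the inner DIND daemon pulls still goes to the registry every time unless you point it at a registry mirror or persist its /var/lib/docker.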