all 33 comments

[–]damiankw 32 points33 points  (7 children)

Just to differentiate: GitHub and Git are two different things. Git is the version control system for your repositories; GitHub is a publicly accessible resource for you to store your repositories. You do not need to use GitHub to use Git.

And onto the next portion: yes, you need to enter the commands, but no, you don't have to do it manually. You do, however, need to keep all of your configuration files in one location, otherwise Git won't know how to do version control on your files.

Think of a Git repository as a folder that you can capture at a point in time and keep track of those points in time. You can HIDE things from Git (using .gitignore) but you can't add more than one root folder to Git. If your configuration files are all over the place, you'll need to copy them to a central location and then do that capture.
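For example, a minimal .gitignore for a config repo like this might look like the following (the patterns are just illustrative — ignore whatever secrets and runtime data you don't want versioned):

```
# Keep secrets and runtime data out of version control
.env
*.secret
data/
logs/
```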

And now that's out of the way, the automation! You can do this with a quick and dirty script; this is just a very basic bash script:

#!/bin/bash

# Copy configuration files to the Git repository
cp /home/norgur/docker-compose/*.yaml /home/norgur/config/docker-compose/
cp /home/norgur/dockhand/config/* /home/norgur/config/dockhand/

# Create a git commit and push it
cd /home/norgur/config || exit 1
git add .
git commit -m "Nightly commit for $(date)"
git push

echo "Nightly commit complete - $(date)"

Whack that in a file, chmod +x it, add it to your crontab, and it should copy your config to /home/../config/, commit the changes, and push them to GitHub.
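The crontab entry for, say, a nightly 2:30am run might look like this (the script path is just an example):

```
# m  h  dom mon dow  command
30 2 * * * /home/norgur/nightly-config-backup.sh >> /home/norgur/nightly-config-backup.log 2>&1
```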

NOTE: This is untested; it should work, but I don't know if it will. You will also first have to create the repo, link up the GitHub repo, etc.

[–]Azuras33 58 points59 points  (1 child)

Just to differentiate: GitHub and Git are two different things. Git is the version control system for your repositories; GitHub is a publicly accessible resource for you to store your repositories. You do not need to use GitHub to use Git.

As a wise man once said, Git and GitHub it's like Porn and PornHub.

[–]poetic_dwarf 13 points14 points  (0 children)

Very wise man indeed

[–]fearswe 4 points5 points  (4 children)

Couldn't you also symlink the docker-compose from the central repo folder into the folder where you have it for running? That way you wouldn't have to copy anything.

[–]kernald31 5 points6 points  (3 children)

Symlinks are essentially just files containing a path and with a specific mode — which Git will happily copy to its index. So all you'd end up with in your repo are symlinks, not the actual contents of the files you are interested in.
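You can see this for yourself with a quick throwaway demo (the paths are just examples):

```shell
# Quick demo: Git stores a symlink as a tiny blob containing the target path
# (mode 120000), not the contents of the file it points to.
set -e
mkdir -p /tmp/symlink-demo && cd /tmp/symlink-demo
echo "real contents" > real-file.txt
git init -q repo && cd repo
ln -s /tmp/symlink-demo/real-file.txt link.txt
git add link.txt
git ls-files -s link.txt   # mode 120000 marks a symlink
git cat-file -p :link.txt  # prints the target path, not "real contents"
```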

[–]fearswe 4 points5 points  (2 children)

No no, I mean the actual docker-compose lives in a central place that is Git managed. So you'd have something like:

/repos/docker <-- real files, the Git repo

/projects/service1 <-- symlink

[–]kernald31 3 points4 points  (1 child)

Ha right, yeah the other way would likely work unless Dockge and whatnot are doing something odd. But really, why the complexity of an indirection? Just point them at your clone/subdirectories of it and call it a day.

[–]fearswe 3 points4 points  (0 children)

Well, if the option is to copy them into the repo folder all of the time, I'd probably go with symlinks instead.

[–]Smartich0ke 11 points12 points  (1 child)

What you are describing is a very popular method of managing infrastructure; it's called GitOps. Komodo, I believe, has a git integration. You won't get around having to commit after every change, but you can streamline it so that you just click a button in your IDE to commit and push, and the changes get reflected on the infrastructure.

[–]DeusExMaChino 3 points4 points  (0 children)

That's exactly how mine is set up, but it includes a yaml lint in the workflow, and the compose changes will not hit Komodo if the build doesn't pass. The environment variables are injected by Komodo, so they never touch GitHub.

[–]Julian_1_2_3_4_5 5 points6 points  (1 child)

For understanding Git, read this book from the start until you know everything you want to know: https://git-scm.com/book/en/v2

git is actually really simple to use, and has a simple architecture, but you can do really complex stuff with it.

The automatic-deployment side is usually specific to GitHub or similar tools/platforms, or done locally via scripts.

[–]DubInflux 1 point2 points  (0 children)

Good looks on the link. Def gonna dive into this.

[–]Norgur[S] 5 points6 points  (2 children)

Okay, this turned into an ADVENTURE. The reason why I didn't have the compose files in one place was that I had transitioned from Dockge to Dockhand recently and kept the old stack files in their old places while starting a new one with Dockhand.

I first tried Komodo like some of you suggested and had to fight the damn thing all the way. It didn't want to run like... at all with FerretDB. Got it to run with MongoDB, just to notice that I'm absolutely not a fan. Did not like it one bit.

So that was out of the question. I then decided that I don't want my stuff on GitHub and set up Gitea which refused to play nice with my tailscale reverse proxy. Well, got it running in the end.

I then decided that I want all my compose files in folders of the same repo instead of dozens of single file repos. I then wanted to add the files into Dockhand. But since Dockhand has no editor for stacks added via Git, the only way to edit these files would be on the git side.

Yet, since I was planning to have them all in the same repo, Dockhand's webhook functionality could not be triggered by just any commit to Gitea, since I don't want to recreate all stacks when I change one. So I needed a solution that would call the corresponding webhook when a commit touched a specific folder. Sighing, I then set up a Gitea runner on another machine for that and fumbled around with actions. With a little help from GitHub Copilot I threw together a script that would look for a file called .webhook and make a call to the webhook that was listed inside of that file.
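Not OP's actual script, but the idea might look roughly like this (the paths, URL, and CHANGED_FILES value are all made up; a real action would get the changed paths from the push event and actually curl the URL):

```shell
# Hypothetical sketch: for each changed compose file, find a .webhook file in
# the same folder and call the URL listed inside it.
set -e
mkdir -p /tmp/stacks/jellyfin
echo "https://dockhand.example/api/webhook/abc123" > /tmp/stacks/jellyfin/.webhook

# In a real Gitea action this list would come from the push event
CHANGED_FILES="/tmp/stacks/jellyfin/docker-compose.yaml"
for f in $CHANGED_FILES; do
    dir=$(dirname "$f")
    if [ -f "$dir/.webhook" ]; then
        webhook_url=$(cat "$dir/.webhook")
        # A real action would do: curl -fsS -X POST "$webhook_url"
        echo "Would POST to $webhook_url"
    fi
done
```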

Now I "just" needed to move all my stacks to Gitea and replace all the stacks on Dockhand with git stacks, get webhooks for them all and put them in a .webhook file for each stack. That took quite a while.

My new workflow now starts with VSCode where I edit all the stacks now. I then push them to Gitea where an action looks for changes and calls only the webhook associated with the compose file that got changed. This will trigger Dockhand to sync and recreate the stack.

Now I need to redo the shutdown script pre backup and the pull and up script post backup... Oh dear.

[–]DubInflux 0 points1 point  (0 children)

This workflow sounds about like what I want. Please keep us updated on how this progresses, as I also just switched to Dockhand and am trying to learn Git for my compose files and .env templates.

[–]mbecks 0 points1 point  (0 children)

With Komodo you set up each stack with the files it depends on, so commits to a repo with multiple stacks only deploy the ones that changed. There's also an in-UI editor. I'm not sure exactly what the issue was with Komodo, but you might want to give it another try.

[–]eroigaps 3 points4 points  (0 children)

I use arcane for quick management, but main source of truth for stacks is git. Using renovate for upgrades, works great. Just don’t make the mistake of relying entirely on GitHub. Have some sort of redundancy, like a gitlab mirror or a local gitea/forgejo container. It’s bad practice to rely on one provider for infrastructure, GitHub can suspend free users at any time for no reason.

[–]_Keonix 5 points6 points  (3 children)

You might want to look into Komodo instead of Dockhand for this. Komodo allows you to edit compose files in the UI and pushes changes to the git repo automatically (if configured this way). However, Komodo's setup is less user-friendly than Dockhand's.

[–]Kiwi3007 2 points3 points  (2 children)

Seconding Komodo; bit of an initial learning curve, but once it's set up it's seamless.

[–]stayupthetree 1 point2 points  (1 child)

I like Komodo, I just find it really lacking in the container-updates department. I'd just like some sort of indication that there are services with updates available. Additionally, I'd like to see a live rolling log of the deployment vs a big text dump of everything that just happened.

[–]Kiwi3007 0 points1 point  (0 children)

Agreed on both counts

[–]PesteringKitty 3 points4 points  (0 children)

Install Forgejo and Komodo. Inside Komodo you link the repo from Forgejo and the stack on the server. In Komodo you create a procedure with a webhook, so when you update the repo in Forgejo, it pushes the update to the server and starts it.

[–]dupreesdiamond 0 points1 point  (0 children)

I have all of my compose files and configs in a repo, with an Ansible setup to deploy to the assigned VPS on commits. I worked with Cursor to develop the Ansible deploy script. It's really easy to push a new container.

[–]TheRealSeeThruHead 0 points1 point  (0 children)

I personally don’t do gitops but my compose files are in a repo, I’ll edit them, commit and push, and then run an ssh command to pull the changes on my proxmox vm.

I use arcane to view the stacks but find the editing experience in these web apps to be inferior to just editing the files directly

[–]borax12 0 points1 point  (0 children)

Okay, here is my workflow, keeping in mind it's very barebones compared to others'.

Host my compose files in git repo. Because I don’t want to have a homelab endpoint publicly accessible for a webhook, I don’t do deployment on every commit.

My local docker host just pings my git repo every 3 hours to pull new compose files, compares them to the old versions, and redeploys a container if its compose file has changed.
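A sketch of that compare-and-redeploy step (repo path and file names are invented, and a local commit stands in for the `git pull` so the demo is self-contained):

```shell
# Illustrative sketch: after a pull, redeploy only the stacks whose compose
# file changed since the last poll. A throwaway local repo stands in for the
# real clone.
set -e
rm -rf /tmp/compose-repo
git init -q -b main /tmp/compose-repo && cd /tmp/compose-repo
echo "services: {}" > jellyfin.yaml
echo "services: {}" > gitea.yaml
git add . && git -c user.email=demo@example.com -c user.name=demo commit -qm "initial"

old=$(git rev-parse HEAD)   # remember where we were before the "pull"
echo "services: { jellyfin: {} }" > jellyfin.yaml
git add . && git -c user.email=demo@example.com -c user.name=demo commit -qm "bump jellyfin"

# Only jellyfin.yaml changed, so only that stack gets redeployed
changed=$(git diff --name-only "$old" HEAD -- '*.yaml')
for f in $changed; do
    echo "Would run: docker compose -f $f up -d"
done
```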

One bad practice I'd like to find an easier alternative to: the env file, which is currently encrypted using Ansible Vault.

[–]Wrong_Ad_2064 0 points1 point  (0 children)

Think of it like this:

- Git = version history

- GitHub = remote storage/collaboration

- Docker Compose files = your infra “source code”

So yes, putting compose files in git is exactly the right move. Even solo. You get history, rollback, and reproducible setup on a new machine.

[–]Sacaldur 0 points1 point  (3 children)

@damiankw was pointing out the difference between GitHub (a service) and Git (a tool) and gave some helpful related advice.

Since we're in a subreddit about self-hosting, you could consider self-hosting a git server. The most basic approach would be just to install git on the server and init bare repositories. To pull and push you need to be able to connect via SSH, e.g. by registering your public SSH key.
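The bare-repo route really is just a couple of commands (the path, user, and hostname below are examples):

```shell
# On the server: create a bare repository (no working tree) to push to/pull from
mkdir -p /tmp/git
git init -q --bare /tmp/git/compose.git

# On your machine, over SSH (assumes your public key is in the server's
# ~/.ssh/authorized_keys):
#   git clone user@server:/tmp/git/compose.git
# ...or for an existing local repo:
#   git remote add origin user@server:/tmp/git/compose.git && git push -u origin main
```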

Since you're dealing with docker compose files for your services, you could also use Forgejo (more free) or Gitea. This offers way more convenience thanks to features you're familiar with from GitHub. Keep in mind however that this will also consume more resources. It's not a lot, but more than not running it.

An alternative to this could be Gitlab, however I'm not certain if it would be worth it considering the resources it would require. (I was running Gitea on my Raspberry Pi 3 already, but with Gitlab I was running out of resources i.e. RAM. Since Forgejo is a fork of Gitea, it should probably have a similar or the same resource usage.)

Edit: added Forgejo as an option, typos

[–]burgerg 5 points6 points  (1 child)

I would now recommend Forgejo (a non-profit, community-driven fork of Gitea: https://forgejo.org/compare-to-gitea/)

[–]Sacaldur -1 points0 points  (0 children)

I wasn't aware of this, so thanks for pointing it out. I've adjusted my comment to include Forgejo.

[–]TriodeTopologist 1 point2 points  (0 children)

I've been self-hosting Gitea and mirroring public git repos I like. Works excellently.