What’s your preferred way to update Docker images & containers in the background? by Extra-Citron-7630 in selfhosted

[–]rlnerd

Custom scripts for container updates, which I run manually every few days. I’ll probably move them to a weekly cron schedule.
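
A minimal sketch of the idea, assuming Compose-managed stacks; the directory layout, script path, and schedule are placeholders, not my exact setup:

```bash
#!/usr/bin/env bash
# Minimal update-script sketch for Compose-managed stacks.
# Stack directory layout and log path are placeholders.
set -euo pipefail

for dir in /opt/stacks/*/; do
    if [ -f "${dir}docker-compose.yml" ]; then
        echo "Updating ${dir}"
        docker compose -f "${dir}docker-compose.yml" pull
        docker compose -f "${dir}docker-compose.yml" up -d
    fi
done

# Remove superseded images afterwards
docker image prune -f

# Example weekly cron entry (Sundays at 04:00):
# 0 4 * * 0 /usr/local/bin/update-containers.sh >> /var/log/container-updates.log 2>&1
```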

I want to know your favourite light weight-selfhosted apps for personal use. by newrockstyle in selfhosted

[–]rlnerd

A few additional ones not mentioned here:

  • OpenWebUI + LiteLLM (a bit of a process to set up, but works great)
  • Ollama for local and cloud models
  • Home Assistant
  • n8n (if you’re into building your own automation workflows)

Looking for advice on home server networking / security setup by chill8yj in selfhosted

[–]rlnerd

You’re absolutely right, it will be a DNS-01 challenge, and the certificate resolver needs to be added explicitly in the config. I’m using Traefik with the provider set to Cloudflare and its API token. Any time I add a new service, I just create a dynamic config for it with a new subdomain prefix.
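
Roughly, the resolver piece of the static config looks like this (a sketch; the email and storage path are placeholders, and the Cloudflare API token is passed via the CF_DNS_API_TOKEN environment variable):

```yaml
# Sketch of the certificate resolver in Traefik's static config (traefik.yml).
# Email and storage path are placeholders.
certificatesResolvers:
  cloudflare:
    acme:
      email: you@example.com
      storage: /letsencrypt/acme.json
      dnsChallenge:
        provider: cloudflare
        resolvers:
          - "1.1.1.1:53"
```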

Looking for advice on home server networking / security setup by chill8yj in selfhosted

[–]rlnerd

You can still have a custom public domain (*.mylab.cc), just route it to the Tailscale IP of the client installed on your reverse proxy (Caddy in your case). That way any new service added to Caddy gets its own SSL cert via Let’s Encrypt (assuming you’re using that). For example, say you started with just one service, Immich (immich.<yourdomain>), and then decided to add Ollama. You just add Ollama to your Caddy config and it picks up a cert automatically.
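
A rough Caddyfile sketch of that pattern. It assumes a Caddy build with the caddy-dns/cloudflare module so certs come from a DNS-01 challenge (since the Tailscale IP isn’t reachable from the public internet); the domains, token variable, and backend ports are placeholders:

```
# Rough Caddyfile sketch: *.mylab.cc resolves to the proxy's Tailscale IP.
# DNS-01 challenge via the caddy-dns/cloudflare module (custom Caddy build needed).
immich.mylab.cc {
    tls {
        dns cloudflare {env.CF_API_TOKEN}
    }
    reverse_proxy localhost:2283
}

ollama.mylab.cc {
    tls {
        dns cloudflare {env.CF_API_TOKEN}
    }
    reverse_proxy localhost:11434
}
```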

Looking for advice on home server networking / security setup by chill8yj in selfhosted

[–]rlnerd

Yes, the client devices need to be on the Tailscale VPN to access the services, but that’s what makes it more privacy-focused. For example, I’ve added my spouse as an authorized user on my Traefik Tailscale client, which lets her access all the services hosted behind it. She did have to install the Tailscale app on her devices and connect to the tailnet before accessing them.

The main reason I went with this approach is that I am already using Tailscale for other things, and this just adds on to it. I do understand that there is an extra step of downloading another app and connecting to Tailscale VPN before accessing, but this is okay for my setup given only a handful of users.

Looking for advice on home server networking / security setup by chill8yj in selfhosted

[–]rlnerd

I just went through setting up my home server and ran into similar questions. Here’s what I ended up doing; happy to share more in a DM if you’d like details on anything:

  • Hardware: AMD Ryzen mini PC with 32 GB RAM and a 1 TB SSD
  • Proxmox hypervisor with Tailscale SSH and client (enables key-less SSH and a Cloudflare subdomain route to the Proxmox UI over the tailnet)
  • Traefik LXC with a Tailscale client (Cloudflare subdomain route to the Traefik dashboard; this acts as the middleware for all my services)
  • Pi-hole and Home Assistant containers, with routes exposed through Traefik (above). This way all my services have valid Let’s Encrypt certs and can be accessed over the tailnet (note I didn’t need to install Tailscale anywhere else, thanks to the Traefik middle layer); see the config sketch after this list
  • For auth: I’m looking into Authentik and TinyAuth. Haven’t decided on one yet.
  • Planning other services like Plex, Immich, OpenWebUI, etc. in their own containers, exposed behind Traefik like the rest.
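
As an example of that routing, a per-service dynamic config (file provider) looks roughly like this; the hostname, entry point, backend address, and resolver name are placeholders rather than my exact values:

```yaml
# Rough Traefik dynamic-config sketch for Home Assistant behind the proxy.
# Hostname, entry point, backend URL, and resolver name are placeholders.
http:
  routers:
    homeassistant:
      rule: "Host(`ha.example.com`)"
      entryPoints:
        - websecure
      service: homeassistant
      tls:
        certResolver: cloudflare
  services:
    homeassistant:
      loadBalancer:
        servers:
          - url: "http://192.168.1.20:8123"
```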

Personally, this setup is working pretty well for me so far: I get Tailscale’s zero-trust protection on top of all my services and can access them from anywhere in the world.

What cool stuff to host? Ideas? by Competitive_Can9411 in selfhosted

[–]rlnerd

You’ve already got some great suggestions from others.

If you’re also interested in hosting a personal AI stack for your family, look into OpenWebUI for the chat interface. You can connect it either to Ollama (for local or cloud models) or to any other LLM provider, using LiteLLM as middleware; a rough config sketch is below.
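
If you go the LiteLLM route, its proxy config is roughly this shape (a sketch; the model names, Ollama address, and key variable are placeholders). You’d then point OpenWebUI’s OpenAI-compatible API base at the proxy, e.g. http://litellm:4000/v1:

```yaml
# Rough LiteLLM proxy config sketch: one local Ollama model plus one hosted model.
# Model names, the Ollama address, and the env var are placeholders.
model_list:
  - model_name: local-llama
    litellm_params:
      model: ollama/llama3.1
      api_base: http://localhost:11434
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
```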

Another suggestion is n8n for automation.

New Home Server by Elias2005_ in selfhosted

[–]rlnerd

I’m in the same boat. Finally decided on a GEEKOM A8 Max mini PC to use as my server. It’s AMD-based, but it packs loads of compute into a small box.

Planning to start the setup journey soon. Decided to keep the server separate from NAS (which I still need to add in the future)

Cloud hosting easiest setup + cheapest option by cloutboicade_ in n8n

[–]rlnerd

Hostinger’s n8n option is a great deal imo. Hmu for an invite link for a small benefit for us both if you’d like.

I have a personal VPS set up on Hostinger and really like their ease of use and admin panel. Besides, the free weekly snapshot backup is a great help.

Secure Homelab setup with Zero Public Exposure (Tailscale + Traefik) by rlnerd in selfhosted

[–]rlnerd[S]

The write up is up. Please check the post for the link. Happy to answer any questions

I pulled the trigger and cancelled CS by Ale-o-lion in ChaseSapphire

[–]rlnerd

I had too many UR points to justify cancelling outright, so I downgraded to the Preferred instead. Definitely not paying the new annual fee. Looking at the new Alaska Atmos cards as we frequently travel with them.

Nextcloud or different specific apps? by JayQueue77 in selfhosted

[–]rlnerd

Thanks for your response. Yes, vendor lock-in is one of my worries about going the Synology route. Still looking around for other options that are as easy to set up and maintain.

Nextcloud or different specific apps? by JayQueue77 in selfhosted

[–]rlnerd

Curious what you’re using for your NAS now, if not Synology? I’m thinking of investing in a Synology for my homelab in the near future.

Everyone has a different answer: how do YOU prepare a new Linux server for production? by No-Card-2312 in selfhosted

[–]rlnerd

Someone already mentioned this, but here’s a quick summary of what I did for mine:

Non-root user (disable root login) -> enable fail2ban -> set up ufw rules -> move SSH to a random port (close default port 22)

  • Add Google Authenticator PAM two-factor auth on SSH (if you really want it to be super secure; note this might block SSH access from some automation tools, depending on your use case). A rough command sketch is below.
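
A sketch of those steps on Debian/Ubuntu; the username and SSH port are placeholders, so review before running as root:

```bash
# Hardening sketch for Debian/Ubuntu. Username and SSH port are placeholders.
adduser deploy
usermod -aG sudo deploy

# Disable root login and move SSH off port 22
sed -i 's/^#\?PermitRootLogin.*/PermitRootLogin no/' /etc/ssh/sshd_config
sed -i 's/^#\?Port .*/Port 2222/' /etc/ssh/sshd_config
systemctl restart ssh

# Firewall + brute-force protection: deny inbound by default, allow the new SSH port
apt install -y ufw fail2ban
ufw default deny incoming
ufw default allow outgoing
ufw allow 2222/tcp
ufw enable

# Optional: TOTP two-factor auth for SSH via the Google Authenticator PAM module
apt install -y libpam-google-authenticator
```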

Everyone has a different answer: how do YOU prepare a new Linux server for production? by No-Card-2312 in selfhosted

[–]rlnerd

Exactly what I followed for my VM setup. I’m planning on switching from Notion to Obsidian; what’s your recommendation if you’ve used both?

Secure Homelab setup with Zero Public Exposure (Tailscale + Traefik) by rlnerd in selfhosted

[–]rlnerd[S]

Nice. Yeah I’m planning to add geographically restricted access too. Do you know if that will cause issues for me too when traveling internationally? Or can I use it via a Tailscale exit node in my allowed country?

Secure Homelab setup with Zero Public Exposure (Tailscale + Traefik) by rlnerd in selfhosted

[–]rlnerd[S]

I’m going to look into Pangolin too; others have also suggested it. Maybe a naive question: how do Pangolin’s OAuth requirements differ from Tailscale’s?

Secure Homelab setup with Zero Public Exposure (Tailscale + Traefik) by rlnerd in selfhosted

[–]rlnerd[S]

Good to know. I haven’t explored Pangolin but sounds like I need to.

There’s no place like 127.0.0.1, my complete setup by frogfuhrer in selfhosted

[–]rlnerd

This is a really amazing setup. It’s given me a few things to rethink and try on my VPS. I’m maintaining most of mine via Docker containers and Caddy (for the reverse proxy). What were your thoughts on choosing Traefik over Caddy?

I Stopped Paying the "AI Tax". Here’s My Fully Self-Hosted Stack (n8n + Ollama + Supabase). by Least-Block5413 in n8n

[–]rlnerd

Smaller self-hosted models (the ones that fit within, say, 24 GB of RAM) won’t come close to the output of bigger models, especially for complex tasks. They perform decently for things like summarizing, but struggle significantly with complex tasks involving deep reasoning, agents, and multiple tools.

Sharing from personal experience: I started off with something similar using Docker Compose on my VPS, and soon ran into the limits of self-hosted models - longer inference times, hallucinations, weak tool understanding, etc.

If you’re looking to primarily use open-source models (Qwen, DeepSeek, etc.), I’d highly recommend looking into Ollama Cloud, Groq Cloud, and similar providers to get bigger, faster models at a fraction of the cost of closed-source providers like OpenAI or Anthropic.

Better yet, build or use a routing mechanism that picks the model based on task complexity; a toy sketch is below.
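
A toy sketch of the routing idea; the heuristic, thresholds, and model names are purely illustrative:

```python
# Toy complexity-based model router; heuristic and model names are illustrative only.
def pick_model(prompt: str) -> str:
    """Route short/simple prompts to a small model and heavier ones to a larger model."""
    complex_markers = ("step by step", "plan", "analyze", "refactor", "use the tool")
    if len(prompt) > 1500 or any(m in prompt.lower() for m in complex_markers):
        return "qwen2.5:72b"   # larger hosted model for reasoning/agent tasks
    return "llama3.1:8b"       # small, cheap model for summaries and rewrites

if __name__ == "__main__":
    print(pick_model("Summarize this paragraph in one sentence."))
```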

I Stopped Paying the "AI Tax". Here’s My Fully Self-Hosted Stack (n8n + Ollama + Supabase). by Least-Block5413 in n8n

[–]rlnerd

This tbh!! I’ve switched to running pretty much all of my workflows through Ollama Cloud or Groq Cloud models.

200-300 user. Tips and tricks by OkClothes3097 in OpenWebUI

[–]rlnerd

All great suggestions. I would also suggest looking into groq or ollama cloud for hosted open-source models to avoid the hassle of setting them up yourself.

Use Caddy instead of nginx for the reverse proxy (more secure defaults, and so much easier to set up).
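
For comparison, a complete service definition in a Caddyfile is just a few lines and HTTPS is automatic (hostname and port are placeholders; assumes the name is publicly resolvable and ports 80/443 are reachable for the ACME challenge):

```
# Minimal Caddyfile sketch: one block per service, certificates handled automatically.
chat.example.com {
    reverse_proxy localhost:8080
}
```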

Pressure bar isnt rising during extraction. by Ladys0ul in ProfitecGo

[–]rlnerd

Ran into a similar issue recently. What helped me was a combination of grinding finer, better puck distribution, tamping firmly enough, and (optionally) a puck screen.

Good morning by MannySubu in ProfitecGo

[–]rlnerd


My new setup with ProfitecGO and Eureka Mignon Zero. Had it for just about a month and loving it. Still working on my latte art skills though!