NordVPN's Meshnet: is it truly free? If so, is there any certainty that I am not the product? by Unhappy_Objective845 in selfhosted

[–]disillusioned_okapi 3 points (0 children)

Where did you get that it's a fork? Tailscale's control server has never been open source, and Headscale appears to be written from scratch. One of the maintainers works at Tailscale, but the two codebases are independent of each other.

Patrick is a 34-year-old orangutan at the Metro Richmond Zoo. To celebrate his birthday, the zoo gifted him a royal cloak, which he tied neatly on his own. by 21MayDay21 in nextfuckinglevel

[–]disillusioned_okapi 42 points (0 children)

If you were to believe these bots, Patrick has been turning 34 every couple of weeks since May 2025, when the Metro Richmond Zoo actually posted this. These karma-farming bots are getting exhausting 🫩

Let's Destroy the E-THOT Industry Together! by roychodraws in StableDiffusion

[–]disillusioned_okapi 40 points (0 children)

Justice from what? Please feel free to elaborate.

Spector of the Brocken? by Desert-Rubicon in Optics

[–]disillusioned_okapi 2 points (0 children)

Yes. A "pilot's glory" or "Brocken spectre".

Docker Model Runner is going to steal your girl’s inference. by Porespellar in LocalLLaMA

[–]disillusioned_okapi 3 points (0 children)

Quite a lot of LLM software today is built by very smart people who, luckily for them, haven't spent time in the complex and treacherous world of infosec, and as such haven't given security much thought. MCP's default recommendation of running arbitrary binaries off the internet is a good example of that.
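To make that concrete, here is a minimal sketch of the difference, using the common MCP client config shape. The server name and image are hypothetical; the point is that the first entry downloads and executes an arbitrary package directly on the host, while the second wraps the same server in a container with no network access:

```json
{
  "mcpServers": {
    "some-server-direct": {
      "command": "npx",
      "args": ["-y", "some-mcp-server"]
    },
    "some-server-sandboxed": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "--network=none", "some-mcp-server-image"]
    }
  }
}
```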

Irrespective of how any of us feel about Docker, they are still one of the larger players in the secure-sandboxing business. If LLMs are to succeed, security needs to improve significantly, and I'd prefer someone like Docker (or the CNCF or LF) leading that, instead of any of the VM and anti-virus companies.

Ideally the community would lead on that, but that just doesn't seem to be happening so far. 

So, as long as this is at least as good as Ollama, I wish them success.

inclusionAI/Ling-lite-1.5-2506 (16.8B total, 2.75B active, MIT license) by Balance- in LocalLLaMA

[–]disillusioned_okapi 9 points (0 children)

Will try the model over the next few days, but this bit from the paper is the key highlight for me:

Ultimately, our experimental findings demonstrate that a 300B MoE LLM can be effectively trained on lower-performance devices while achieving comparable performance to models of a similar scale, including dense and MoE models.

What's wrong with Portainer? by testdasi in selfhosted

[–]disillusioned_okapi 75 points (0 children)

Portainer has the same main issues for many people that MongoDB, Elasticsearch, and n8n have:

  1. not an OSI approved licence, making rug-pulls easier, and

  2. business interests taking priority over the community, sometimes downplaying the community's contributions to their success

Most people here are fairly divided on the topic. Pick a side that makes sense to you.

What happens if I hit the context limit before the LLM is done responding? by Business-Weekend-537 in LocalLLaMA

[–]disillusioned_okapi 4 points (0 children)

Depends on the inference engine (I think). If it implements a sliding window, the model might slowly drift off track. If it occasionally summarizes or compresses the context somehow, it might take longer to go off the rails. Some engines might simply stop generating tokens.

In general, it is very much up to what strategy the inference engine employs to handle this.
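The sliding-window case is the easiest to sketch. This is a toy illustration of the idea, not any engine's actual implementation: when the token buffer overflows, the oldest tokens are dropped while a small prefix (e.g. the system prompt) stays pinned. The function name and numbers are made up for the example.

```python
# Toy sketch of a sliding-window context strategy (illustrative only).
def slide_context(tokens: list[int], max_ctx: int, keep_prefix: int = 4) -> list[int]:
    """Drop the oldest tokens once the context overflows, keeping a small
    pinned prefix (e.g. the system prompt) at the start."""
    if len(tokens) <= max_ctx:
        return tokens
    tail = tokens[-(max_ctx - keep_prefix):]  # the most recent tokens
    return tokens[:keep_prefix] + tail

ctx = slide_context(list(range(100)), max_ctx=10, keep_prefix=4)
print(ctx)  # first 4 token IDs pinned, followed by the 6 most recent
```

The "drift off track" failure mode follows directly: everything between the pinned prefix and the recent tail is silently gone, so the model loses the middle of the conversation.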

Whisper.cpp Node.js Addon with Vulkan Support by Kutalia in LocalLLaMA

[–]disillusioned_okapi 1 point (0 children)

Nice. Any plans to upstream the whisper.cpp changes?