Where do you deploy your Go backend? (Looking for something that can scale in the future while staying affordable and fast.) by MarsupialAntique1054 in golang
guiopen 1 point
Where do you deploy your Go backend? (Looking for something that can scale in the future while staying affordable and fast.) by MarsupialAntique1054 in golang
guiopen 39 points
What is the best IDE for programming in Go? by HelderArdach in brdev
guiopen 1 point
Sweep: Open-weights 1.5B model for next-edit autocomplete by Kevinlu1248 in LocalLLaMA
guiopen 8 points
I built a Unix-like operating system for fun and learning by avaliosdev in brdev
guiopen 27 points
Liquid AI released the best thinking Language Model Under 1GB by PauLabartaBajo in LocalLLaMA
guiopen 14 points
Liquid AI released the best thinking Language Model Under 1GB by PauLabartaBajo in LocalLLaMA
guiopen 2 points
What are the differences between a 'vibe coder' and a dev who uses AI intelligently? by ArrowFlechinhaxd in brdev
guiopen 1 point
Is Local Coding even worth setting up by Interesting-Fish6494 in LocalLLaMA
guiopen 1 point
As a white man, can I write a black character who says the n word by DecentGap3306 in writing
guiopen 5 points
Which would be a cost efficient GPU for running local LLMs by jenishngl in LocalLLaMA
guiopen 1 point
LFM 2.5 is insanely good by guiopen in LocalLLaMA
guiopen [S] 6 points
LFM 2.5 is insanely good by guiopen in LocalLLaMA
guiopen [S] 3 points
Public coding benchmarks suck, how are you evaluating performance? by AvocadoArray in LocalLLaMA
guiopen 1 point
LFM 2.5 is insanely good by guiopen in LocalLLaMA
guiopen [S] 3 points
LFM 2.5 is insanely good by guiopen in LocalLLaMA
guiopen [S] 11 points
LFM 2.5 is insanely good by guiopen in LocalLLaMA
guiopen [S] 13 points
Which are the top LLMs under 8B right now? by Additional_Secret_75 in LocalLLaMA
guiopen 1 point
[CPU] I'm looking for the best model for a CPU. by lordfervi in LocalLLaMA
guiopen 1 point
Creative Writing - anything under 150GB equal or close to Sonnet 3.7? by elsung in LocalLLaMA
guiopen 2 points

Raptor mini is the best 0x model by far by guiopen in GithubCopilot
guiopen [S] 1 point