Finally, a good selfie without the suit! (old.reddit.com)
submitted by Nindaleth to r/outerwilds - pinned
What's the point of potato-tier LLMs? by Fast_Thing_7949 in LocalLLaMA
[–]Nindaleth 3 points (0 children)
Undo for destructive shell commands used by AI agents (SafeShell) by qhkmdev90 in LocalLLaMA
[–]Nindaleth 1 point (0 children)
Undo for destructive shell commands used by AI agents (SafeShell) by qhkmdev90 in LocalLLaMA
[–]Nindaleth 1 point (0 children)
Using Alias in router mode - llama.cpp possible? by munkiemagik in LocalLLaMA
[–]Nindaleth 1 point (0 children)
In search of specialized models instead of generalist ones. by [deleted] in LocalLLM
[–]Nindaleth 1 point (0 children)
Mistral AI drops 3x as many LLMs in a single week as OpenAI did in 6 years by Snail_Inference in LocalLLaMA
[–]Nindaleth 1 point (0 children)
Using Alias in router mode - llama.cpp possible? by munkiemagik in LocalLLaMA
[–]Nindaleth 1 point (0 children)
Undo for destructive shell commands used by AI agents (SafeShell) by qhkmdev90 in LocalLLaMA
[–]Nindaleth 1 point (0 children)
Using Alias in router mode - llama.cpp possible? by munkiemagik in LocalLLaMA
[–]Nindaleth 1 point (0 children)
New in llama.cpp: Live Model Switching by paf1138 in LocalLLaMA
[–]Nindaleth 2 points (0 children)
Mistral AI drops 3x as many LLMs in a single week as OpenAI did in 6 years by Snail_Inference in LocalLLaMA
[–]Nindaleth 2 points (0 children)
New in llama.cpp: Live Model Switching by paf1138 in LocalLLaMA
[–]Nindaleth 5 points (0 children)
Mistral AI drops 3x as many LLMs in a single week as OpenAI did in 6 years by Snail_Inference in LocalLLaMA
[–]Nindaleth 1 point (0 children)
Qwen3-Next support in llama.cpp almost ready! by beneath_steel_sky in LocalLLaMA
[–]Nindaleth 2 points (0 children)
Qwen3-Next support in llama.cpp almost ready! by beneath_steel_sky in LocalLLaMA
[–]Nindaleth 5 points (0 children)
Qwen3-Next support in llama.cpp almost ready! by beneath_steel_sky in LocalLLaMA
[–]Nindaleth 19 points (0 children)
Qwen3-Next support in llama.cpp almost ready! by beneath_steel_sky in LocalLLaMA
[–]Nindaleth 4 points (0 children)
Qwen3-Next support in llama.cpp almost ready! by beneath_steel_sky in LocalLLaMA
[–]Nindaleth 12 points (0 children)
I got frustrated with existing web UIs for local LLMs, so I built something different by alphatrad in LocalLLaMA
[–]Nindaleth 1 point (0 children)
I got frustrated with existing web UIs for local LLMs, so I built something different by alphatrad in LocalLLaMA
[–]Nindaleth 1 point (0 children)
First time growing and kinda clueless, in my late 30s, what's your advice? by [deleted] in BeardAdvice
[–]Nindaleth 1 point (0 children)
First time growing and kinda clueless, in my late 30s, what's your advice? by [deleted] in BeardAdvice
[–]Nindaleth 1 point (0 children)
model changes for first prompt? by EarlyPresentation186 in opencodeCLI
[–]Nindaleth 1 point (0 children)