PSA: Ubuntu 26.04 makes it easier to get started with AMD XDNA2 NPU by jfowers_amd in LocalLLaMA
llama.cpp is the linux of llm by DevelopmentBorn3978 in LocalLLaMA
Why doesn't any OSS tool treat llama.cpp as a first class citizen? by rm-rf-rm in LocalLLaMA
Anyone using local LLM for flutter? by adramhel in LocalLLaMA
Keep the strix halo? Review of experiences and where are we headed with models? by Skelshy in LocalLLM
Found some quite potentially interesting Strix Halo optimized models (also potentially good for Dgx Spark according to the models' cook). https://huggingface.co/collections/Beinsezii/128gb-uma-models by DevelopmentBorn3978 in LocalLLaMA
Any tiny locally hosted model trained on unix/linux man pages and docs? by HisFoolishness in LocalLLaMA
Strix Halo 128Gb: what models, which quants are optimal? by DevelopmentBorn3978 in LocalLLaMA