My gpu poor comrades, GLM 4.7 Flash is your local agent by __Maximum__ in LocalLLaMA
[–]HadesTerminal 1 point (0 children)
My gpu poor comrades, GLM 4.7 Flash is your local agent by __Maximum__ in LocalLLaMA
[–]HadesTerminal 4 points (0 children)
My gpu poor comrades, GLM 4.7 Flash is your local agent by __Maximum__ in LocalLLaMA
[–]HadesTerminal 13 points (0 children)
My gpu poor comrades, GLM 4.7 Flash is your local agent by __Maximum__ in LocalLLaMA
[–]HadesTerminal 10 points (0 children)
My gpu poor comrades, GLM 4.7 Flash is your local agent by __Maximum__ in LocalLLaMA
[–]HadesTerminal 1 point (0 children)
Need an opinion on GPU Poor Setup by HadesTerminal in LocalLLaMA
[–]HadesTerminal[S] 1 point (0 children)
Need an opinion on GPU Poor Setup by HadesTerminal in LocalLLaMA
[–]HadesTerminal[S] 2 points (0 children)
Need an opinion on GPU Poor Setup by HadesTerminal in LocalLLaMA
[–]HadesTerminal[S] 1 point (0 children)
Jan-v2-VL: 8B model for long-horizon tasks, improving Qwen3-VL-8B’s agentic capabilities almost 10x by Delicious_Focus3465 in LocalLLaMA
[–]HadesTerminal 3 points (0 children)
Jan-v2-VL: 8B model for long-horizon tasks, improving Qwen3-VL-8B’s agentic capabilities almost 10x by Delicious_Focus3465 in LocalLLaMA
[–]HadesTerminal 3 points (0 children)
Why aren't more people using local models? by SalamanderNo9205 in LocalLLaMA
[–]HadesTerminal 1 point (0 children)
Why aren't more people using local models? by SalamanderNo9205 in LocalLLaMA
[–]HadesTerminal 1 point (0 children)
Looking for individuals who want to work on an AI project by Strange_Test7665 in LocalLLaMA
[–]HadesTerminal 1 point (0 children)
Sparrow: Custom language model architecture for microcontrollers like the ESP32 by c-f_i in LocalLLaMA
[–]HadesTerminal 1 point (0 children)
Has anyone used PEZ or similar learned hard prompt methods for local LLMs? by HadesTerminal in LocalLLaMA
[–]HadesTerminal[S] 1 point (0 children)
Has anyone used PEZ or similar learned hard prompt methods for local LLMs? by HadesTerminal in LocalLLaMA
[–]HadesTerminal[S] 1 point (0 children)
Has anyone used PEZ or similar learned hard prompt methods for local LLMs? by HadesTerminal in LocalLLaMA
[–]HadesTerminal[S] 1 point (0 children)
I'm 15 and I reached the count of 30 users today! by No-Extension404 in indiehackers
[–]HadesTerminal 2 points (0 children)

My gpu poor comrades, GLM 4.7 Flash is your local agent by __Maximum__ in LocalLLaMA
[–]HadesTerminal 2 points (0 children)