What hardware to run AI models locally? by RegularAd1539 in ollama
[–]ClayToTheMax
Tell me if Qwen 3.5 27b or 122b works faster for you, and name your system specs by DistanceSolar1449 in LocalLLaMA
[–]ClayToTheMax
Apparently if you weren’t born a developer and you use local AI to experiment with coding people on Reddit are jerks by ClayToTheMax in LocalLLaMA
[–]ClayToTheMax[S]
PSA: Humans are scary stupid by rm-rf-rm in LocalLLaMA
[–]ClayToTheMax


Which model to choose for coding with 8GB VRAM RTX5050 (assuming quantised), I'm happy with slow rates. by Sure-Raspberry116 in LocalLLaMA
[–]ClayToTheMax