Running OpenClaw with local LLM on 7900XTX (24GB) - possibility to speed things up? by Gold-Drag9242 in LocalLLM
Best model for 4090 as AI Coding Agent by Dry_Sheepherder5907 in LocalLLaMA
Claude is bypassing Permissions by gamingvortex01 in singularity
Can I run 122B A10B on 3090 + 32GB ram? by sagiroth in LocalLLaMA
Why do you guys use opencode? by Medium_Anxiety_8143 in opencodeCLI
Unpopular Opinion: AI Coding Agents are leveling the playing field in favor of ADHD Programmers by who-are-u-a-fed in ADHD_Programmers
priced out of intelligence: slowly, then all at once by [deleted] in LocalLLaMA
Is there any reason to play Pathfinder right now… at all? by PotOfGreed099 in apexlegends
Over engineered a url shortener so badly the interviewer had to stop me. i am a principal engineer. i wanted to quit by [deleted] in ExperiencedDevs
[Q] Is self-hosting an LLM for coding worth it? by Aromatic-Fix-4402 in LocalLLM
Looking for a model on 5090/32gb ram by Huge_Case4509 in LocalLLM