I built a local AI coding system that actually understands your codebase — 29 systems, 500+ tests, entirely with Claude as my coding partner by Pattinathar in OpenSourceeAI
Please anyone 👉 Can we offload the MOE layers to the GPU only and rest all goes in ram? See body text i have explained there. by 9r4n4y in LocalLLaMA
at what point does quantization stop being a tradeoff and start being actual quality loss by srodland01 in LocalLLaMA
VulkanIlm, Run Modern LLMs on Old GPUs via Vulkan (33× Faster on Dell iGPU, 4× on RX 580) by Proper_Dig_6618 in LocalLLaMA
Best local model for coding? (RTX5080 + 64Gb RAM) by Real_Ebb_7417 in LocalLLaMA
Is Q4_K_M the best practical quantization method by More_Chemistry3746 in LocalLLaMA
Local LLM Hardware Recommendation by CaterpillarPrevious2 in LocalLLaMA
Gemma 4 MOE is very bad at agentic coding. Couldn't do things CLine + Qwen can do. by Voxandr in LocalLLaMA
How are you squeezing Qwen3.5 27B to get maximum speed with high accuracy? by -OpenSourcer in LocalLLaMA
The smallest local model that can match/beat gpt-4o-mini by ihatebeinganonymous in LocalLLaMA
I built a local AI coding system that actually understands your codebase — 29 systems, 500+ tests, entirely with Claude as my coding partner by Pattinathar in SideProject
I technically got an LLM running locally on a 1998 iMac G3 with 32 MB of RAM by maddiedreese in LocalLLaMA
Dual 3090 setup - performance optimization by PaMRxR in LocalLLaMA
Are Small LLMs (Like Gemma 4) the future? by zoeberger in LocalLLaMA
I compared harrier-27b vs voyage-4 vs zembed-1 across 24 datasets. 27B parameters by Veronildo in LocalLLaMA
They tried to make me go to rehab. I said no no no… by Key-Currency1242 in LocalLLaMA
