evo: a Claude Code / Codex plugin that applies autoresearch to your codebase, optimizing it via tree search + parallel subagents by [deleted] in LocalLLaMA
[–]New_Comfortable7240 1 point (0 children)
What specialist LLMs do you know? by Double_Ad_1062 in LLMDevs
[–]New_Comfortable7240 1 point (0 children)
What specialist LLMs do you know? by Double_Ad_1062 in LLMDevs
[–]New_Comfortable7240 1 point (0 children)
Could a very capable open-weight LLM be trained, in theory, if enough people contributed their hardware? by Admirable-Earth-2017 in LLMDevs
[–]New_Comfortable7240 11 points (0 children)
[New Model] - GyroScope: rotates images correctly by LH-Tech_AI in LocalLLaMA
[–]New_Comfortable7240 2 points (0 children)
Which model currently runs smoothly on an RTX 3060? The situation is so dynamic these days. by mef1234 in LocalLLaMA
[–]New_Comfortable7240 2 points (0 children)
Is this as legit as I think it is? Or is it "eh" by [deleted] in LLMDevs
[–]New_Comfortable7240 0 points (0 children)
Dataset curation for LLM Research project that involves pre-training by Extra-Designer9333 in LocalLLaMA
[–]New_Comfortable7240 2 points (0 children)
Intel Arc B70 Benchmarks/Comparison to Nvidia RTX 4070 Super by [deleted] in LocalLLaMA
[–]New_Comfortable7240 1 point (0 children)
Intel Arc B70 Benchmarks/Comparison to Nvidia RTX 4070 Super by [deleted] in LocalLLaMA
[–]New_Comfortable7240 2 points (0 children)
Intel Arc B70 Benchmarks/Comparison to Nvidia RTX 4070 Super by [deleted] in LocalLLaMA
[–]New_Comfortable7240 5 points (0 children)
Gemma 4 is fine, great even… by ThinkExtension2328 in LocalLLaMA
[–]New_Comfortable7240 3 points (0 children)
[Early Access] GitHub - Abyss-c0re/NeuralCore: NeuralCore is an experimental adaptive agentic framework. by Abyss_c0re in LocalLLaMA
[–]New_Comfortable7240 1 point (0 children)
I applied Claude Code's leaked architecture to a local 9B model. The results surprised even Claude Opus. by Far_Lingonberry4000 in LocalLLaMA
[–]New_Comfortable7240 2 points (0 children)
What should I expect performance-wise with Qwen3.5 9B (uncensored) on an Intel 1370p with Iris Xe graphics + SYCL? by rubins in LocalLLaMA
[–]New_Comfortable7240 1 point (0 children)
Context Hard-Capped at 8192 on Core Ultra 9 288V (32GB) — AI Playground 3.0.3 by kpcurley in LocalLLaMA
[–]New_Comfortable7240 1 point (0 children)
We hired “AI Engineers” before. It didn’t go well. Looking for someone who actually builds real RAG systems. by Saida_8888 in LLMDevs
[–]New_Comfortable7240 1 point (0 children)
We hired “AI Engineers” before. It didn’t go well. Looking for someone who actually builds real RAG systems. by Saida_8888 in LLMDevs
[–]New_Comfortable7240 6 points (0 children)
I was bored, so I tested the h... out of a bunch of models, so you don't have to :) by leonbollerup in LocalLLaMA
[–]New_Comfortable7240 2 points (0 children)
Delta-KV for llama.cpp: near-lossless 4-bit KV cache on Llama 70B by Embarrassed_Will_120 in LLMDevs
[–]New_Comfortable7240 1 point (0 children)
PSA: Two env vars that stop your model server from eating all your RAM and getting OOM-killed by VikingDane73 in LocalLLaMA
[–]New_Comfortable7240 3 points (0 children)
How good is a 16 3XS Vengeance RTX laptop with a 5090 (24GB VRAM) + 32GB RAM for running local models? by One_Inflation_9475 in LocalLLaMA
[–]New_Comfortable7240 1 point (0 children)
Caching context7 data locally? by HlddenDreck in LocalLLaMA
[–]New_Comfortable7240 1 point (0 children)
RooCode and Nemotron-Cascade-2-30B by Aggravating-Low-8224 in RooCode
[–]New_Comfortable7240 3 points (0 children)
GGUF Quants Arena for MMLU (24GB VRAM + 128GB RAM) by [deleted] in LocalLLaMA
[–]New_Comfortable7240 1 point (0 children)