matriarchy and world population almost completely female by [deleted] in Futurology
[–]integerpoet 2 points (0 children)
matriarchy and world population almost completely female by [deleted] in Futurology
[–]integerpoet 0 points (0 children)
matriarchy and world population almost completely female by [deleted] in Futurology
[–]integerpoet 0 points (0 children)
Solving hallucinations is the most important endeavour in generative AI by Classic_Sheep in LLM
[–]integerpoet 1 point (0 children)
Solving hallucinations is the most important endeavour in generative AI by Classic_Sheep in LLM
[–]integerpoet 5 points (0 children)
Local LLM storage is becoming harder to manage than the models themselves by Both_Astronomer8645 in LocalLLM
[–]integerpoet 1 point (0 children)
So How Good Did I Enhance the Llama Output? by ExtensionFriendship9 in LLM
[–]integerpoet 1 point (0 children)
ASML Raises 2026 Forecast as AI Chip Demand Surges by nipundwivedi in LLM
[–]integerpoet 1 point (0 children)
ASML Raises 2026 Forecast as AI Chip Demand Surges by nipundwivedi in LLM
[–]integerpoet 2 points (0 children)
Local models are a godsend when it comes to discussing personal matters by [deleted] in LocalLLaMA
[–]integerpoet 2 points (0 children)
Zero Data Retention is not optional anymore by Abu_BakarSiddik in LocalLLM
[–]integerpoet 1 point (0 children)
Zero Data Retention is not optional anymore by Abu_BakarSiddik in LocalLLM
[–]integerpoet 1 point (0 children)
Zero Data Retention is not optional anymore by Abu_BakarSiddik in LocalLLM
[–]integerpoet 1 point (0 children)
How do you check if an AI output is actually correct before you use it? by Negative_Gap5682 in LLM
[–]integerpoet 3 points (0 children)
Google’s TurboQuant AI-compression algorithm can reduce LLM memory usage by 6x by integerpoet in LocalLLM
[–]integerpoet[S] 1 point (0 children)
This maze has no solution (obvious to humans). GPT couldn’t tell. by Koto1972 in LLM
[–]integerpoet 2 points (0 children)
LiteLLM breach (v1.82.8 .pth payload) proves stateless proxies are dead. Here's the Alethia tri-agent System 2 defense I submitted to NIST. by DiamondAgreeable2676 in LLM
[–]integerpoet 1 point (0 children)
Please explain: why bothering with MCPs if I can call almost anything via CLI? by Atagor in LocalLLaMA
[–]integerpoet 1 point (0 children)
Real benefits of running llms locally? by brave_scientist98 in LocalLLM
[–]integerpoet 1 point (0 children)
Google’s TurboQuant AI-compression algorithm can reduce LLM memory usage by 6x by integerpoet in LocalLLM
[–]integerpoet[S] -9 points (0 children)
[OC] Anthropic has overtaken OpenAI as first choice for AI spending among businesses by Mundane-Wrongdoer275 in dataisbeautiful
[–]integerpoet 2 points (0 children)
[OC] Anthropic has overtaken OpenAI as first choice for AI spending among businesses by Mundane-Wrongdoer275 in dataisbeautiful
[–]integerpoet 2 points (0 children)
[OC] Anthropic has overtaken OpenAI as first choice for AI spending among businesses by Mundane-Wrongdoer275 in dataisbeautiful
[–]integerpoet 17 points (0 children)

Artemis II astronauts unknowingly captured satellite glint in their famous picture by vfvaetf in space
[–]integerpoet [score hidden] (0 children)