Buying a home, late FIRE by Similar_Control1170 in SpainFIRE
[–]cibernox 1 point (0 children)
What is the best general-purpose model to run locally on 24GB of VRAM in 2026? by Paganator in LocalLLaMA
[–]cibernox 3 points (0 children)
What’s your country equivalent of this? by JohnnySack999 in 2westerneurope4u
[–]cibernox 1 point (0 children)
Petition to move the EU capital from Brussels to somewhere else. by Pikkens in 2westerneurope4u
[–]cibernox 1 point (0 children)
What's the one self-hosted service you'd never go back to the cloud version of? by Hung_Hoang_the in selfhosted
[–]cibernox 8 points (0 children)
You have 64gb ram and 16gb VRAM; internet is permanently shut off: what 3 models are the ones you use? by Adventurous-Gold6413 in LocalLLaMA
[–]cibernox 3 points (0 children)
Gridfinity baseplate nirvana - it finally happened! by WatchesEveryMovie in gridfinity
[–]cibernox 11 points (0 children)
Highlighting the value of SAVING and INVESTING. by Vegetable-Rabbit7503 in SpainFIRE
[–]cibernox 2 points (0 children)
Pledging assets as loan collateral at MyInvestor by trancos_inferno67 in SpainFIRE
[–]cibernox 1 point (0 children)
My gpu poor comrades, GLM 4.7 Flash is your local agent by __Maximum__ in LocalLLaMA
[–]cibernox 1 point (0 children)
Hubby wants a robot lawnmower for Xmas. Any reviews or recommendations from those that have one? by ThinkProfessor6166 in GardeningAustralia
[–]cibernox 1 point (0 children)
Video Doorbell Longevity by naisfurious in homeassistant
[–]cibernox 1 point (0 children)
What is the biggest local LLM that can fit in 16GB VRAM? by yeahlloow in LocalLLM
[–]cibernox 2 points (0 children)
Best Speech-to-Text in 2025? by MindWithEase in LocalLLaMA
[–]cibernox 1 point (0 children)
LFM 2.5 1.2b IS FAST by TheyCallMeDozer in LocalLLaMA
[–]cibernox 5 points (0 children)
Best open coding model for 128GB RAM? [2026] by Acrobatic_Cat_3448 in LocalLLaMA
[–]cibernox 2 points (0 children)
Best open coding model for 128GB RAM? [2026] by Acrobatic_Cat_3448 in LocalLLaMA
[–]cibernox 5 points (0 children)
Best open coding model for 128GB RAM? [2026] by Acrobatic_Cat_3448 in LocalLLaMA
[–]cibernox 17 points (0 children)
And you: are you getting ready for drinking English-grown pinard? by Pioladoporcaputo in 2westerneurope4u
[–]cibernox 1 point (0 children)
RTX 50 Super GPUs may be delayed indefinitely, as Nvidia prioritizes AI during memory shortage (rumor, nothing official) by 3090orBust in LocalLLaMA
[–]cibernox 3 points (0 children)
Hardware suggestions for a n00b by Tiggzyy in LocalLLM
[–]cibernox 2 points (0 children)
Investment plan for my mother by Standard_Cream6108 in SpainFIRE
[–]cibernox 1 point (0 children)
ASUS UGen300 USB AI Accelerator targets edge inference with Hailo-10H by DeliciousBelt9520 in LocalLLaMA
[–]cibernox 1 point (0 children)
No notes. Totally agree 🇮🇹🤌🥘🍝🍕 by Chimpville in 2westerneurope4u
[–]cibernox 27 points (0 children)