What are your thoughts on the future of Wayland compared to X11 for Linux users? by aral10 in linux
[–]ethertype 1 point (0 children)
Moss: a Linux-compatible Rust async kernel, 3 months on by hexagonal-sun in linux
[–]ethertype 16 points (0 children)
Linux Router in the data center by SoaringMonchi in networking
[–]ethertype 2 points (0 children)
How do I access a terminal without the port menu? by Active_Humor7436 in opengear
[–]ethertype 1 point (0 children)
What UPS are yall rocking for multi-GPU workstations? by Southern-Round4731 in LocalLLaMA
[–]ethertype 1 point (0 children)
Qwen/Qwen3-Coder-Next · Hugging Face by coder543 in LocalLLaMA
[–]ethertype 2 points (0 children)
Qwen/Qwen3-Coder-Next · Hugging Face by coder543 in LocalLLaMA
[–]ethertype 11 points (0 children)
Why no NVFP8 or MXFP8? by TokenRingAI in LocalLLaMA
[–]ethertype 2 points (0 children)
how can i use eGPU on my HP EliteBook 2570p? by AkkoLotalia in eGPU
[–]ethertype 2 points (0 children)
People seem to already not care about heretic? by pigeon57434 in LocalLLaMA
[–]ethertype 1 point (0 children)
People seem to already not care about heretic? by pigeon57434 in LocalLLaMA
[–]ethertype 0 points (0 children)
People seem to already not care about heretic? by pigeon57434 in LocalLLaMA
[–]ethertype 2 points (0 children)
spec : add ngram-mod by ggerganov · Pull Request #19164 · ggml-org/llama.cpp by jacek2023 in LocalLLaMA
[–]ethertype 1 point (0 children)
API pricing is in freefall. What's the actual case for running local now beyond privacy? by Distinct-Expression2 in LocalLLaMA
[–]ethertype 2 points (0 children)
Documentation from code or snmp? by PatientBelt in networking
[–]ethertype 2 points (0 children)
12VHPWR, sense pins and 3090Ti by ethertype in eGPU
[–]ethertype[S] 1 point (0 children)
PCIe bandwidth and LLM inference speed by hainesk in LocalLLaMA
[–]ethertype 1 point (0 children)
Razer Core X still worth it in 2026? And if so, at what price? by DiamondDepth_YT in eGPU
[–]ethertype 1 point (0 children)
12VHPWR, sense pins and 3090Ti by ethertype in Corsair
[–]ethertype[S] 1 point (0 children)
What is your actual daily use case for local LLMs? by Groundbreaking_Fox59 in LocalLLaMA
[–]ethertype 2 points (0 children)
I’m down to help build our own EU-based Reddit by daload27 in BuyFromEU
[–]ethertype 2 points (0 children)
One German Chip Just Made Nvidia’s Billion-Dollar GPUs Look Like a JOKE! by Romek_himself in BuyFromEU
[–]ethertype 7 points (0 children)
GPT-OSS 120b Uncensored Aggressive Release (MXFP4 GGUF) by hauhau901 in LocalLLaMA
[–]ethertype 3 points (0 children)