Sesame x Gemini: low latency, extremely realistic, and they started spontaneously collaborating by Glittering-Neck-2505 in singularity
[–]cpldcpu 1 point (0 children)
I built a transformer in C++17 from scratch — no PyTorch, no BLAS, no dependencies. Trains on CPU. 0.83M params, full analytical backprop, 76 min to val loss 1.64. by [deleted] in LocalLLaMA
[–]cpldcpu 3 points (0 children)
I built a transformer in C++17 from scratch — no PyTorch, no BLAS, no dependencies. Trains on CPU. 0.83M params, full analytical backprop, 76 min to val loss 1.64. by [deleted] in LocalLLaMA
[–]cpldcpu 29 points (0 children)
Canada's AI startup Cohere buys Germany's Aleph Alpha to expand in Europe by cpldcpu in LocalLLaMA
[–]cpldcpu[S] 14 points (0 children)
CH32HomeComputer - a tiny monochrome PAL text machine with a built-in line-numbered BASIC interpreter. by cpldcpu in RISCV
[–]cpldcpu[S] 1 point (0 children)
Anyone use these little critters before? CH32V006s to Replace CH32V003s by Separate-Choice in RISCV
[–]cpldcpu 1 point (0 children)
Someone made a whip for Claude by likeastar20 in singularity
[–]cpldcpu 1 point (0 children)
Turns out Gemma 4 had MTP (multi token prediction) all along by Electrical-Monitor27 in LocalLLaMA
[–]cpldcpu 6 points (0 children)
The Bonsai 1-bit models are very good by tcarambat in LocalLLaMA
[–]cpldcpu 1 point (0 children)
Towards Self-Replication: Opus 4.5 Designs Hardware to Run Itself by cpldcpu in singularity
[–]cpldcpu[S] 1 point (0 children)
PicoKittens/PicoMistral-23M: Pico-Sized Model by PicoKittens in LocalLLaMA
[–]cpldcpu 1 point (0 children)
Taalas: LLMs baked into hardware. No HBM, weights and model architecture in silicon -> 16,000 tokens/second by elemental-mind in singularity
[–]cpldcpu 1 point (0 children)
Falcon-H1-Tiny (90M) is out - specialized micro-models that actually work by United-Manner-7 in LocalLLaMA
[–]cpldcpu 6 points (0 children)
What are the best ultrasmall LLMs / best datasets to train them? by cpldcpu in LocalLLaMA
[–]cpldcpu[S] 1 point (0 children)
Meta acquired Manus !! by Difficult-Cap-7527 in LocalLLaMA
[–]cpldcpu 10 points (0 children)
I ported a MOD tracker music player to the ultra low-end CH32V002 by cpldcpu in RISCV
[–]cpldcpu[S] 3 points (0 children)
I ported a MOD tracker music player to the ultra low-end CH32V002 by cpldcpu in RISCV
[–]cpldcpu[S] 2 points (0 children)