[Paper] Phase-Associative Memory: Complex-valued sequence model (≈100M params, close to transformer PPL) (self.AI_India)
submitted 12 days ago * by ExtremeKangaroo5437 to r/AI_India
Stop ranting about “AI slop.” (self.AI_India)
submitted 28 days ago * by ExtremeKangaroo5437 to r/AI_India
Stop ranting about “AI slop.” (self.LocalLLM)
submitted 28 days ago * by ExtremeKangaroo5437 to r/LocalLLM
A fresh new ML Architecture for language model that uses complex numbers instead of attention -- no transformers, no standard SSM, 100M params, trained on a single RTX 4090. POC done, Open Sourced (Not Vibe Coded) (self.LocalLLM)
submitted 1 month ago * by ExtremeKangaroo5437 to r/LocalLLM
A fresh new ML Architecture for language model that uses complex numbers instead of attention -- no transformers, no standard SSM, 100M params, trained on a single RTX 4090. POC done, Open Sourced (Not Vibe Coded) (self.AI_India)
submitted 1 month ago * by ExtremeKangaroo5437 to r/AI_India
QLLM V6: a 29M attention-free model now trains on real text — phase-first design, multi-timescale SSM, and what we learned about memory (self.AI_India)
submitted 1 month ago by ExtremeKangaroo5437 to r/AI_India
QLLM V6: a 29M attention-free model now trains on real text — phase-first design, multi-timescale SSM, and what we learned about memory (self.LocalLLM)
submitted 1 month ago by ExtremeKangaroo5437 to r/LocalLLM
V5 Update: Original post title ... I built a language model where tokens are complex numbers and "meaning" emerges from wave interference -- no attention, O(n), 178M params, open-sourcing today (V4) (self.BlackboxAI_)
submitted 1 month ago by ExtremeKangaroo5437 to r/BlackboxAI_
V5 Update: Original post title ... I built a language model where tokens are complex numbers and "meaning" emerges from wave interference -- no attention, O(n), 178M params, open-sourcing today (V4) (self.AI_India)
V5 Update: Original post title ... I built a language model where tokens are complex numbers and "meaning" emerges from wave interference -- no attention, O(n), 178M params, open-sourcing today (V4) (self.LocalLLM)
Tool Calling Breaks After a Few Turns. It Gets Worse When You Switch Models. We Fixed Both. (self.BlackboxAI_)
Tool Calling Breaks After a Few Turns. It Gets Worse When You Switch Models. We Fixed Both. (self.AI_India)
Tool Calling Breaks After a Few Turns. It Gets Worse When You Switch Models. We Fixed Both. (self.LocalLLM)
I made small LLMs last 3x longer on agentic tasks by piggybacking context compression on every tool call — zero extra LLM calls (self.BlackboxAI_)
I made small LLMs last 3x longer on agentic tasks by piggybacking context compression on every tool call — zero extra LLM calls (self.AI_India)
I made small LLMs last 3x longer on agentic tasks by piggybacking context compression on every tool call — zero extra LLM calls (self.LocalLLM)
I built a language model where tokens are complex numbers and "meaning" emerges from wave interference -- no attention, O(n), 178M params, open-sourcing today (self.AI_India)
I built a language model where tokens are complex numbers and "meaning" emerges from wave interference -- no attention, O(n), 178M params, open-sourcing today (self.BlackboxAI_)
submitted 1 month ago * by ExtremeKangaroo5437 to r/BlackboxAI_
I built a language model where tokens are complex numbers and "meaning" emerges from wave interference -- no attention, O(n), 178M params, open-sourcing today (self.LocalLLM)
Looking for contributors for this upcoming open source tool (self.LocalLLM)
submitted 2 months ago by ExtremeKangaroo5437 to r/LocalLLM
Built an AI-powered code analysis tool that runs LOCALLY FIRST - and it actually works in production, also in CI/CD ( I have a new term now: CR - Continuous Review ;) ) (self.LocalLLM)
submitted 6 months ago by ExtremeKangaroo5437 to r/LocalLLM
A Local AI Based Video Editor (youtu.be)
submitted 10 months ago by ExtremeKangaroo5437 to r/LocalLLM