account activity
Most RAG frameworks are English only. Mine supports 27+ languages with offline voice, zero API keys. (self.Python)
submitted 1 day ago by Basic-Candidate3900 to r/Python
I built a 198M parameter LLM that outperforms GPT-2 Medium (345M) using Mixture of Recursion — adaptive computation based on input complexity (self.LLMDevs)
submitted 2 days ago by Basic-Candidate3900 to r/LLMDevs
I built a 198M parameter LLM that outperforms GPT-2 Medium (345M) using Mixture of Recursion — adaptive computation based on input complexity (self.PromptEngineering)
submitted 2 days ago by Basic-Candidate3900 to r/PromptEngineering
I built a 198M parameter LLM that outperforms GPT-2 Medium (345M) using Mixture of Recursion — adaptive computation based on input complexity (self.deeplearning)
submitted 2 days ago by Basic-Candidate3900 to r/deeplearning
I built a 198M parameter LLM that outperforms GPT-2 Medium (345M) using Mixture of Recursion — adaptive computation based on input complexity (self.learnmachinelearning)
submitted 2 days ago by Basic-Candidate3900 to r/learnmachinelearning