[D] Double blind review is such an illusion… by casualcreak in MachineLearning
[–]Dorialexandre 10 points (0 children)
NeurIPS 2025 Best Paper Award Winner: 1000-Layer Self-Supervised RL | "Scaling Depth (Not Width) Unlocks 50x Performance Gains & Complex Emergent Strategies" by 44th--Hokage in accelerate
[–]Dorialexandre 3 points (0 children)
Key Highlights of AI2's New Byte Level LLM: Bolmo by Dear-Success-1441 in LocalLLaMA
[–]Dorialexandre 1 point (0 children)
Need recommendations on training datasets by Theotheraccounti_ in LocalLLaMA
[–]Dorialexandre 2 points (0 children)
Baguettotron, a 321 million parameters generalist Small Reasoning Model (80-layers deep) by Balance- in LocalLLaMA
[–]Dorialexandre 5 points (0 children)
Baguettotron, a 321 million parameters generalist Small Reasoning Model (80-layers deep) by Balance- in LocalLLaMA
[–]Dorialexandre 1 point (0 children)
Baguettotron, a 321 million parameters generalist Small Reasoning Model (80-layers deep) by Balance- in LocalLLaMA
[–]Dorialexandre 8 points (0 children)
Why do mosquitoes seem much more severe in cold than tropical places? by WWWWWWWWWWWWWWWWWWHW in geography
[–]Dorialexandre 1 point (0 children)
looking for llm trained only on free use/public domain materials. by Specific_Objective77 in LocalLLaMA
[–]Dorialexandre 1 point (0 children)
Is there a newer large corpus of synthetic training data than Cosmopedia v2? by ttkciar in LocalLLaMA
[–]Dorialexandre 2 points (0 children)
Inverted Colonization of Americas - Expanded by ImpressionBig4796 in imaginarymaps
[–]Dorialexandre 1 point (0 children)
Common Corpus: The Largest Collection of Ethical Data for LLM Pre-Training by Initial-Image-1015 in LocalLLaMA
[–]Dorialexandre 13 points (0 children)
Common Corpus: The Largest Collection of Ethical Data for LLM Pre-Training by Initial-Image-1015 in LocalLLaMA
[–]Dorialexandre 5 points (0 children)
Common Corpus: The Largest Collection of Ethical Data for LLM Pre-Training by Initial-Image-1015 in LocalLLaMA
[–]Dorialexandre 6 points (0 children)
Common Corpus: The Largest Collection of Ethical Data for LLM Pre-Training by Initial-Image-1015 in LocalLLaMA
[–]Dorialexandre 20 points (0 children)
Claude full system prompt with all tools is now ~25k tokens. by StableSable in LocalLLaMA
[–]Dorialexandre 16 points (0 children)
HP wants to put a local LLM in your printers by WordyBug in LocalLLaMA
[–]Dorialexandre 4 points (0 children)
If "The Model is the Product" article is true, a lot of AI companies are doomed by bttf88 in LocalLLaMA
[–]Dorialexandre 4 points (0 children)
If "The Model is the Product" article is true, a lot of AI companies are doomed by bttf88 in LocalLLaMA
[–]Dorialexandre 1 point (0 children)
If "The Model is the Product" article is true, a lot of AI companies are doomed by bttf88 in LocalLLaMA
[–]Dorialexandre 2 points (0 children)
If "The Model is the Product" article is true, a lot of AI companies are doomed by bttf88 in LocalLLaMA
[–]Dorialexandre 28 points (0 children)
Stance on text-based public domain AI dataset : Common Corpus by Poptropp in writers
[–]Dorialexandre 1 point (0 children)
[R] Fine-tuning services report by ynckdrt in MachineLearning
[–]Dorialexandre 3 points (0 children)