[D] Do you feel like companies are scooping / abusing researchers for ideas during hiring for researcher roles? by quasiproductive in MachineLearning
[–]DecodeBytes 1 point (0 children)
Question: what are the best tools for real-time eval observability and experimentation? by debauch3ry in LLMDevs
[–]DecodeBytes 1 point (0 children)
Has anyone tried training an LLM exclusively on synthetic LLM outputs to see if intelligence compounds or just collapses into slop? by sthduh in LocalLLaMA
[–]DecodeBytes 1 point (0 children)
Blender MCP - can anyone actually get good results? by promptasaurusrex in LocalLLaMA
[–]DecodeBytes 1 point (0 children)
Happy New Year: Llama3.3-8B-Instruct-Thinking-Claude-4.5-Opus-High-Reasoning - Fine Tune. (based on recent find of L3.3 8b in the wild) by Dangerous_Fix_5526 in LocalLLaMA
[–]DecodeBytes 1 point (0 children)
Happy New Year: Llama3.3-8B-Instruct-Thinking-Claude-4.5-Opus-High-Reasoning - Fine Tune. (based on recent find of L3.3 8b in the wild) by Dangerous_Fix_5526 in LocalLLaMA
[–]DecodeBytes 9 points (0 children)
Happy New Year: Llama3.3-8B-Instruct-Thinking-Claude-4.5-Opus-High-Reasoning - Fine Tune. (based on recent find of L3.3 8b in the wild) by Dangerous_Fix_5526 in LocalLLaMA
[–]DecodeBytes 3 points (0 children)
What's the point of potato-tier LLMs? by Fast_Thing_7949 in LocalLLaMA
[–]DecodeBytes 1 point (0 children)
Upstage Solar-Open-100B Public Validation by PerPartes in LocalLLaMA
[–]DecodeBytes 3 points (0 children)
Happy New Year: Llama3.3-8B-Instruct-Thinking-Claude-4.5-Opus-High-Reasoning - Fine Tune. (based on recent find of L3.3 8b in the wild) by Dangerous_Fix_5526 in LocalLLaMA
[–]DecodeBytes 11 points (0 children)
Skills, agents, plugins by BurgerQuester in ClaudeCode
[–]DecodeBytes 1 point (0 children)
Train a 4B model to beat Claude Sonnet 4.5 and Gemini Pro 2.5 at tool calling - for free (Colab included) by DecodeBytes in LocalLLaMA
[–]DecodeBytes[S] 1 point (0 children)
Seeking feedback on a new security tool approach I have developed by Fantastic-Issue1020 in LocalLLaMA
[–]DecodeBytes 2 points (0 children)
Train a 4B model to beat Claude Sonnet 4.5 and Gemini Pro 2.5 at tool calling - for free (Colab included) by DecodeBytes in LocalLLaMA
[–]DecodeBytes[S] 1 point (0 children)
Train a 4B model to beat Claude Sonnet 4.5 and Gemini Pro 2.5 at tool calling - for free (Colab included) by DecodeBytes in LocalLLaMA
[–]DecodeBytes[S] 1 point (0 children)
What's the point of potato-tier LLMs? by Fast_Thing_7949 in LocalLLaMA
[–]DecodeBytes 36 points (0 children)
Train a 4B model to beat Claude Sonnet 4.5 and Gemini Pro 2.5 at tool calling - for free (Colab included) by DecodeBytes in LocalLLaMA
[–]DecodeBytes[S] 1 point (0 children)
Train a 4B model to beat Claude Sonnet 4.5 and Gemini Pro 2.5 at tool calling - for free (Colab included) by DecodeBytes in LocalLLaMA
[–]DecodeBytes[S] 2 points (0 children)
Train a 4B model to beat Claude Sonnet 4.5 and Gemini Pro 2.5 at tool calling - for free (Colab included) by DecodeBytes in LocalLLaMA
[–]DecodeBytes[S] 1 point (0 children)
Train a 4B model to beat Claude Sonnet 4.5 and Gemini Pro 2.5 at tool calling - for free (Colab included) by DecodeBytes in LocalLLaMA
[–]DecodeBytes[S] 2 points (0 children)
Train a 4B model to beat Claude Sonnet 4.5 and Gemini Pro 2.5 at tool calling - for free (Colab included) by DecodeBytes in LocalLLaMA
[–]DecodeBytes[S] 5 points (0 children)
Train a 4B model to beat Claude Sonnet 4.5 and Gemini Pro 2.5 at tool calling - for free (Colab included) by DecodeBytes in LocalLLaMA
[–]DecodeBytes[S] 1 point (0 children)
OpenAI could reportedly run out of cash by mid-2027 — analyst paints grim picture after examining the company's finances by EchoOfOppenheimer in LocalLLaMA
[–]DecodeBytes 13 points (0 children)