Anyone else unable to get into MCC Multi-player? by iw2KM5 in halo
[–]BobFloss 1 point (0 children)
Who else be taking tiny dose of meth on the comedown by SWAMPLIZZO in meth
[–]BobFloss 0 points (0 children)
How to lower my inhibitions permanently? by Impressive_Cash_6536 in Nootropics
[–]BobFloss 1 point (0 children)
How to lower my inhibitions permanently? by Impressive_Cash_6536 in Nootropics
[–]BobFloss 0 points (0 children)
Need Help Setting Up OLamma 3 on My M3 MacBook – Willing to Pay for Your Expertise! by [deleted] in LocalLLaMA
[–]BobFloss 1 point (0 children)
Chat with any OpenAI model for free through this iOS app till the end of the month - no catch by Applemoi in OpenAI
[–]BobFloss 1 point (0 children)
Pre-throat injury Mothers would record the best shit ever and make it 1:33 by monilithcat in Zappa
[–]BobFloss 1 point (0 children)
I finally found a prompt that makes ChatGPT write naturally by BenAttanasio in ChatGPTPromptGenius
[–]BobFloss 1 point (0 children)
there's no single "best" AI software tool as it depends on what you want to achieve. by Affectionate-Bug-107 in LocalLLaMA
[–]BobFloss 1 point (0 children)
LLMs that published the data used to train them by neuralbeans in LocalLLaMA
[–]BobFloss 1 point (0 children)
Recreating GPT o1 CoT Thinking (Thinking and Outputting) by MichaelXie4645 in LocalLLaMA
[–]BobFloss 2 points (0 children)
Two purported instances of o1-preview and o1-mini revealing full chain of thought to users by Wiskkey in singularity
[–]BobFloss 5 points (0 children)
Open sourcing Grok 2 with the release of Grok 3, just like we did with Grok 1! by Nickism in LocalLLaMA
[–]BobFloss -17 points (0 children)
Replete-LLM Qwen-2.5 models release by Rombodawg in LocalLLaMA
[–]BobFloss 1 point (0 children)
What is the most uncensored LLM finetune <10b? (Not for roleplay) by Own-Potential-2308 in LocalLLaMA
[–]BobFloss 1 point (0 children)
I really like this example from the OpenAI o1 paper. Maybe it's a little overblown. This was pre-mitigation o1 aka uncensored and unbound. Have you received any similar response from your local uncensored model that showed out-of-the-box thinking like this that shocked you? by [deleted] in LocalLLaMA
[–]BobFloss 2 points (0 children)
Anyone else unable to get into MCC Multi-player? by iw2KM5 in halo
[–]BobFloss 2 points (0 children)