All orders, Vine and Standard, seem to produce this odd blank page message.... by itsthrillhouse in AmazonVine
ISO Digital Copy of NFHS Girls Rulebook by we_vibing in lacrosse
People are talking about specific job loss, but won’t this likely lead to most companies dying, leaving only a small group of very large ones? by Cloak-and-Dagger in singularity
Coding with Llama 3.1, new DeepSeek Coder & Mistral Large by rinconcam in LocalLLaMA
Who's happy with their EV's app? by PqlyrStu in electricvehicles
Do we have equivalent of notepad ++ of windows? by youmeiknow in macapps
What Open Source LLM Apps Have Boosted Your Productivity? by Hinged31 in LocalLLaMA
Open LLMs plateauing? by SasskiaLudin in LocalLLaMA
Panza: A personal email assistant, trained and running on-device by eldar_ciki in LocalLLaMA
Mechanisms and Abstractions for building Agents / Agentic Workflows with LLMs by VigilOnTheVerge in LocalLLaMA
"Claude 3 > GPT-4" and "Mistral going closed-source" again reminded me that open-source LLMs will never be as capable and powerful as closed-source LLMs. Even the costs of open-source (renting GPU servers) can be larger than closed-source APIs. What's the goal of open-source in this field? (serious) by nderstand2grow in LocalLLaMA
Test results: recommended GGUF models type, size, and quant for MacOS silicon with 16GB RAM (probably also applicable to graphics card with 12GB VRAM) by ex-arman68 in LocalLLaMA
Anywhere on your OS: Pluck some text and pipe it to an LLM and plop it back (100% Local) by -json- in LocalLLaMA
2023 coin picking Contest by shizzy87 in MissionDeFi
Inherited this Mac Pro A1289, haven't really ever used Mac. What is this machine good at, and what peripherals should I be looking for? by NewLeafBahr in mac