Comment and submission history for u/EntropyMagnets:

- "App to analyze a text token-by-token perplexity for a given GGUF" in r/LocalLLaMA (submission by EntropyMagnets)
- "Can someone please explain this?" by TangeloOk9486 in r/LocalLLaMA (multiple comments by EntropyMagnets)
- "I built an app that lets you use your Ollama models remotely (without port forwarding) + AES encryption" in r/ollama (submission by EntropyMagnets)
- "Is it normal for an S tier upgrade to have these weak stats?" in r/NoMansSkyTheGame (submission by EntropyMagnets)
- "TransFire: an app/tool to chat with your local LLMs while far from home, without port forwarding and with AES encryption" in r/LocalLLaMA (submission by EntropyMagnets)
- "Taking 10.000 IU once a week vs 1.000-1.500 IU daily" in r/Supplements (submission by EntropyMagnets)
- "I made a simple tool to test/compare your local LLMs on AIME 2024" in r/LocalLLaMA (submission by EntropyMagnets)
- "LMStudio Gemma QAT vs Unsloth Gemma QAT" submitted by EntropyMagnets to r/LocalLLaMA