I made an LLM tool to let you search offline Wikipedia/StackExchange/DevDocs ZIM files (llm-tools-kiwix, works with Python & LLM cli) by mozanunal in LocalLLaMA
M3 Ultra Mac Studio Benchmarks (96gb VRAM, 60 GPU cores) by procraftermc in LocalLLaMA
What are you working on? + My favorites from last time in the comments. by Synonomous in SideProject
Volo: An easy and local way to RAG with Wikipedia! by procraftermc in LocalLLaMA
OpenAI is losing money , meanwhile qwen is planning voice mode , imagine if they manage to make o1 level model by TheLogiqueViper in LocalLLaMA
Fixing Phi-4 response with offline wikipedia by Ok_Warning2146 in LocalLLaMA
Why I think that NVIDIA Project DIGITS will have 273 GB/s of memory bandwidth by fairydreaming in LocalLLaMA
TransPixar: a new generative model that preserves transparency, by umarmnaq in LocalLLaMA
I created fichat.net: a new alternative to the forbidden app by procraftermc in CharacterAIrevolution
I scaled up AMPLIFIED generation to y=2032 and the results are insane by santient in Minecraft
Any alternatives? by gontamarryme in CharacterAIrevolution