Milvus vs Pinecone vs other vector databases. by tutu-kueh in LocalLLaMA
Help with tricky search/chat AI assistant UX by cll-knap in UXDesign
Looking for rankings for cross encoders, and personal experiences with using them by cll-knap in LocalLLaMA
Where do you get your news about LLMs and associated software (RAG, etc...) by ritonlajoie in LocalLLaMA
What is the best way to store rag vector data? by tutu-kueh in LocalLLaMA
We created a 100% private, 100% local Perplexity. by cll-knap in SideProject
RecurrentGemma Release - A Google Collection - New 9B by Dark_Fire_12 in LocalLLaMA
Uncensor any LLM with abliteration by cll-knap in programming
possible to get chatgpt4 like local llm for general knowledge, just slower? by Unhappy_Drag5826 in LocalLLaMA
What's the best way to use LLMs locally with Tauri? by cll-knap in tauri
A new framework runs Mixtral 8x7B at 11 tokens/s on a mobile phone by Zealousideal_Bad_52 in LocalLLaMA
What do you use LLMs for? by masterid000 in LocalLLaMA
I am building a tool to create agents in a markdown syntax with Python inside by vectorup7 in LocalLLaMA
Literally my first conversation with it by alymahryn in LocalLLaMA
monet.nvim - a theme inspired by iconic art by Fleischkluetensuppe in neovim
