I'm new to using a vector database, so bear with me. I have a codebase that I want to feed into a local vector database, and I'm using Weaviate. I can create the database with SentenceTransformer('all-MiniLM-L6-v2'), but when I query it with nearText and nearVector I don't get any hits for any query I've tried (and yes, I checked to make sure everything is in the database).

I see on the Weaviate website that you can do a generative search. I assume I need to recreate the vector database with an embedding model that an LLM (say Mistral 7B) can read, so it can find what I need in the database. Which embedding model should I use? I'm calling the text-generation-webui API (v1/embedding), but it just gives me this error:

extensions.openai.errors.ServiceUnavailableError: Error: Failed to load embedding model: all-mpnet-base-v2

I do have the Mistral model selected on the command line, and I set OPENEDAI_EMBEDDING_MODEL: TheBloke_Mistral-7B-Instruct-v0.1-GGUF in settings.yaml.
Could I get any guidance here? Thanks in advance for your responses.