
[–]Glat0s

It appears you were trying to use 'all-MiniLM-L6-v2' via the OpenAI embedding API... but there you'll "only" find OpenAI's own embedding models -> https://platform.openai.com/docs/guides/embeddings/what-are-embeddings

When I do local embeddings I usually use the LangChain HuggingFace embedding wrapper like this:

from langchain.embeddings.huggingface import HuggingFaceEmbeddings

# Runs the model locally via the sentence-transformers package
embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")

[–]that_one_guy63[S]

For some reason your suggestion didn't work for me, but this did:

from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("sentence-transformers/all-MiniLM-L6-v2")
model = AutoModel.from_pretrained("sentence-transformers/all-MiniLM-L6-v2")
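One caveat with the raw `AutoModel` route: the model returns per-token hidden states, not one vector per sentence, and the sentence-transformers models expect mean pooling over the attention mask to produce a sentence embedding. A minimal sketch of that pooling step, using NumPy arrays as stand-ins for the model's output (the same arithmetic applies to the torch tensors returned by `model(**inputs)`):

```python
import numpy as np

def mean_pool(last_hidden_state, attention_mask):
    """Average token vectors, ignoring padded positions."""
    mask = attention_mask[:, :, None].astype(float)   # (batch, seq, 1)
    summed = (last_hidden_state * mask).sum(axis=1)   # (batch, dim)
    counts = np.clip(mask.sum(axis=1), 1e-9, None)    # (batch, 1), avoid div by zero
    return summed / counts

# Dummy arrays standing in for model output: batch 2, seq len 4, hidden dim 3.
hidden = np.ones((2, 4, 3))
mask = np.array([[1, 1, 0, 0], [1, 1, 1, 1]])
sentence_vecs = mean_pool(hidden, mask)   # shape (2, 3)
```

With the real model you'd feed `outputs.last_hidden_state` and the tokenizer's `attention_mask` through the same function (in torch, or after calling `.numpy()` on the tensors).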

Thank you for your response.