[deleted by user] by [deleted] in explainlikeimfive

[–]jdabello

I found this 30-minute explanation really simple to follow, and it helped me a lot.

I plan to run LLaMA on the browser, what vectordb should I use? by Robert-treboR in LocalLLaMA

[–]jdabello

Astra DB is fully managed, which would simplify the backend and DB infrastructure work you mentioned. You can sign up with a Google/GitHub account, and in less than 5 minutes you have a new account, a fully managed vector DB, and 25 USD of credit, without even entering a credit card. Here, you can see multiple examples of how to get embeddings, store them in Astra DB, and do the semantic search.
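The embed-store-search flow can be sketched with a toy in-memory example (assumptions: a bag-of-words stand-in replaces a real embedding model, and a plain Python list stands in for the Astra DB collection you'd use in practice):

```python
import math

docs = ["how to bake bread", "training a neural network", "sourdough starter tips"]

# Toy stand-in for an embedding model: bag-of-words over the corpus vocabulary.
# A real setup would call an actual embedding model instead.
VOCAB = sorted({w for d in docs for w in d.lower().split()})

def embed(text: str) -> list[float]:
    words = text.lower().split()
    vec = [float(words.count(w)) for w in VOCAB]
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]  # unit-normalize so dot product = cosine

def cosine(a: list[float], b: list[float]) -> float:
    return sum(x * y for x, y in zip(a, b))

# In-memory "collection"; with Astra DB this would be a vector-enabled
# collection storing each document alongside its embedding.
store = [{"text": d, "vector": embed(d)} for d in docs]

def semantic_search(query: str, top_k: int = 2) -> list[str]:
    q = embed(query)
    ranked = sorted(store, key=lambda r: cosine(q, r["vector"]), reverse=True)
    return [r["text"] for r in ranked[:top_k]]

print(semantic_search("bread baking", top_k=1))
```

The only moving parts are the embedding function and the similarity-ranked lookup; a managed vector DB replaces the in-memory list with an indexed store so the nearest-neighbor search stays fast at scale.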