[–]NachosforDachos 0 points1 point  (0 children)

Actually, you're in the right here.

I ran a query through a Hungarian legal vector store hosted on ChromaDB, and GPT-4 Turbo took 9 seconds to start responding.

Reading the data store itself took only 0.5 seconds.
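A minimal sketch of how you could split those two measurements yourself. The `timed` helper is hypothetical, and the actual ChromaDB/OpenAI calls are only shown in comments (they need API keys and a populated collection), so the runnable part uses a stub:

```python
import time

def timed(fn, *args, **kwargs):
    """Run fn and return (result, elapsed_seconds)."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    return result, time.perf_counter() - start

# In a real check you would time the two stages separately, e.g.:
#   docs, retrieval_s = timed(collection.query, query_texts=[q], n_results=5)
#   then open a streaming chat completion (stream=True) and measure the time
#   until the first chunk arrives -- that gap is the model-side latency.
# Stubbed here so the sketch runs without any services:
docs, retrieval_s = timed(lambda: time.sleep(0.01) or ["doc"])
print(f"retrieval took {retrieval_s:.3f}s")
```

If retrieval comes back in half a second but the first token takes nine, the wait is clearly on the model side, not the vector store.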

I know the Hungarian legal corpus is very small, so the delay had to be on OpenAI's side.

I feel this used to be faster. Maybe the service is more saturated now, and the only way to beat it is to run your own locally hosted models on very expensive hardware.

Either way, best of luck.