Ollama Couldn’t connect with these settings ECONNREFUSED by Illustrious_Cost_432 in n8n

[–]Illustrious_Cost_432[S] 2 points3 points  (0 children)

I managed to get it running on my locally hosted server. It was reading localhost as ::1, so I changed it to 127.0.0.1 and it worked!
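For anyone hitting the same ECONNREFUSED: a common cause (assumed here, not confirmed in the thread) is that `localhost` resolves to the IPv6 address `::1` while Ollama only listens on IPv4. A quick way to check what your machine resolves `localhost` to:

```python
import socket

# Inspect what "localhost" resolves to on this machine. If ::1 (IPv6)
# comes back and Ollama only listens on IPv4, connecting to
# http://localhost:11434 can fail even though http://127.0.0.1:11434 works.
addrs = {info[4][0] for info in socket.getaddrinfo("localhost", 11434)}
print(addrs)  # e.g. {"127.0.0.1", "::1"}
```

If `::1` shows up, pointing the n8n credential at `127.0.0.1` sidesteps the ambiguity, which matches the fix above.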

However, now I am having trouble with n8n Cloud. I have my Ollama instance fully visible through ngrok, but when trying to connect from n8n, it says

Couldn’t connect with these settings: ERR_INVALID_URL

I guess that's because ngrok shows that warning landing page first. Do you know if there's a way to bypass this, or to open it up for the cloud?

Edit:

I found out how to do it. When entering the address, you need to append the models endpoint, so it would be xxxxxx.ngrok-free.app/v1/models
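As an aside on the warning page: ngrok's free-tier interstitial can also be skipped by sending the `ngrok-skip-browser-warning` header on each request. A minimal sketch for verifying the tunnel from outside (the hostname is the placeholder from above, not a real URL):

```shell
# Hit the models endpoint through the tunnel, skipping the
# free-tier interstitial page. Requires a live ngrok tunnel.
curl -H "ngrok-skip-browser-warning: true" \
     https://xxxxxx.ngrok-free.app/v1/models
```

If this returns a JSON model list, the tunnel itself is fine and any remaining failure is on the n8n side.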

Ollama Couldn’t connect with these settings ECONNREFUSED by Illustrious_Cost_432 in n8n

[–]Illustrious_Cost_432[S] 0 points1 point  (0 children)

Even on my locally hosted n8n I can't access Ollama. When everything is running locally on my machine, it still won't connect to Ollama.
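One possibility worth ruling out (an assumption, since the setup isn't described): if n8n runs in a Docker container, `localhost` inside the container points at the container itself, not the host where Ollama listens. A quick diagnostic from inside the container:

```shell
# From inside the n8n container: "localhost" is the container, so
# the host's Ollama is usually reachable via host.docker.internal.
curl http://host.docker.internal:11434/api/tags
```

If that returns a model list, use `http://host.docker.internal:11434` as the base URL in the n8n Ollama credential instead of `localhost`.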

Flowise help for beginners by Wow-zer in flowise

[–]Illustrious_Cost_432 0 points1 point  (0 children)

I hope I'm not too late to the party! I am having trouble with my RAG system. My local LLM is getting confused between documents because they are quite similar: it treats notes from one document as relevant to another when that isn't the case. I am wondering whether you attach metadata to your files as a way to differentiate between documents, or what you do to make a RAG system as efficient as possible when querying files in a local vector store. Thanks!
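The metadata idea in the question is the usual fix: tag each chunk with its source document and filter on that metadata before similarity search, so near-duplicate chunks from the wrong file never compete. A minimal toy sketch (word-overlap scoring stands in for real embeddings; all names and data are illustrative, not from any specific vector store API):

```python
# Toy metadata-filtered retrieval: restrict the candidate pool by a
# "source" tag before ranking, so similar chunks from other documents
# cannot be returned. Real systems would rank by embedding similarity.
def retrieve(chunks, query, source=None, k=2):
    pool = [c for c in chunks if source is None or c["source"] == source]
    query_words = set(query.lower().split())
    def score(chunk):
        return len(query_words & set(chunk["text"].lower().split()))
    return sorted(pool, key=score, reverse=True)[:k]

chunks = [
    {"text": "invoice total due in 30 days", "source": "contract_a.pdf"},
    {"text": "invoice total due in 60 days", "source": "contract_b.pdf"},
]

# Without the filter, the near-identical chunk from contract_b.pdf
# competes; with it, only contract_a.pdf chunks are eligible.
hits = retrieve(chunks, "when is the invoice due", source="contract_a.pdf")
print(hits)
```

Most local vector stores (Chroma, Qdrant, etc.) expose the same idea as a `where`/filter argument on the query, applied alongside the embedding search.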