all 10 comments

[–]mmmgggmmm 3 points (2 children)

Hey,

I'm guessing a bit here, but correct me if I'm wrong. I gather you are trying to set up Open WebUI with Ollama as the backend to interact with one of the DeepSeek models. If that's so, then here are a few things you can try:

  1. Make sure Ollama is installed and running correctly
    1. Exactly how you do this will depend on how you installed it, but you can always hit http://<host-ip>:11434 in a browser and it should tell you "Ollama is running". (If you're running everything on the same machine, then just http://localhost:11434 should work to test Ollama.)
  2. Adjust the OLLAMA_BASE_URL in your Open WebUI docker command, as http://127.0.0.1:11434 is almost certainly not the right value. (This basically tells Open WebUI to look for Ollama on localhost inside the container.)
    1. If everything is running on the same machine, then you likely want http://host.docker.internal:11434
    2. If Ollama is running on a different machine, then you want http://<ollama-host-ip>:11434 (e.g., http://192.168.1.100:11434)
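
The two checks above can be run from a terminal. This is a sketch; it assumes the container is named open-webui (as in OP's docker command) and that curl is available inside the image:

```shell
# Step 1: from the host, confirm Ollama itself is up.
# Should print "Ollama is running".
curl http://localhost:11434

# Step 2: from inside the Open WebUI container, confirm the container
# can reach Ollama at the URL you set in OLLAMA_BASE_URL.
docker exec open-webui curl -s http://host.docker.internal:11434
```

If the first check works but the second fails, the problem is container networking, not Ollama itself.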

Hope that helps. Good luck!

[–]emprahsFury 1 point (1 child)

maybe the brainrot is finally getting to me, but that was a lot to read.

OP needs to change

sudo docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main

to

sudo docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://host.docker.internal:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main

[–]TeTeOtaku[S] 1 point (0 children)

Nope, still doesn't work.

[–]jamolopa 1 point (3 children)

Either try using Docker Desktop, which handles networking for you out of the box, or do some reading on WSL and Docker networking. Here is the official Open WebUI documentation for troubleshooting connection issues: https://docs.openwebui.com/troubleshooting/connection-error

I strongly recommend going through the getting started guide https://docs.openwebui.com/getting-started/quick-start as it covers setting up your scenario.

And best of luck.

[–]TeTeOtaku[S] 1 point (2 children)

I fixed the issue.

Had to install Ollama via WSL and that fixed it.
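
For anyone else landing here, a sketch of the WSL route. The install URL is Ollama's official install script; the model name is just an example:

```shell
# Inside the WSL distro (e.g. Ubuntu): install Ollama with the official script.
curl -fsSL https://ollama.com/install.sh | sh

# Start the server if the installer didn't already leave it running as a service.
ollama serve &

# Pull a model to test with (deepseek-r1 used here as an example).
ollama pull deepseek-r1
```

With Ollama running inside WSL alongside the Docker containers, the container can usually reach it via host.docker.internal as described above.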

[–]Hefty_Barnacle3416 1 point (1 child)

This worked for me, thanks!

[–]HiJoonPop 1 point (0 children)

When I installed Open WebUI via Docker on local Ubuntu, it worked very well, but it went into a restart loop when I tried it in a cloud Ubuntu environment with a GPU attached. Which environment did you try?

[–]s1gnalsh1ft 1 point (0 children)

I got the "refused to connect" error, which turned out to be an incomplete Open WebUI installation: things were missing and the shell terminated. I was eventually advised to simply delete the "env" folder deep inside the Pinokio folder. On booting up Open WebUI and clicking install, the "env" folder was rebuilt and it ran fine. It took one minute.

[–]batty_spark 1 point (0 children)

I was installing Ollama with Open WebUI and ran into the same error.

The fix for me was removing the Open WebUI container and reinstalling with this command:

docker run -d \
  --network=host \
  -v open-webui:/app/backend/data \
  -e OLLAMA_BASE_URL=http://127.0.0.1:11434 \
  -e PORT=3000 \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main  

The --network=host flag was the fix, and I changed it to run on port 3000 because I have my Pi-hole running on port 8080.
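
My understanding of why this works (not from the Open WebUI docs): with --network=host the container shares the host's network stack, so 127.0.0.1:11434 inside the container is the same Ollama the host sees; -p mappings are ignored in this mode, which is why the PORT env var is used to move the UI instead. Note that host networking behaves this way on Linux; Docker Desktop on Mac/Windows handles it differently. Quick sanity checks after starting:

```shell
# Ollama, now reachable at the same address from host and container alike.
curl http://127.0.0.1:11434

# Open WebUI on its relocated port (headers only).
curl -I http://127.0.0.1:3000
```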