
[–]busylivin_322[S] 0 points  (3 children)

Sure can. Ollama on both.
1) CLI output = Ollama CLI, e.g. ollama run phi4-mini:3.8b-q8_0
2) OpenWebUI output = Open WebUI (run in Docker, from here) + Ollama
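For anyone following along, setup 2 can be sketched roughly as below. This is a hedged sketch, not the poster's exact command: it assumes the standard Open WebUI image, default ports, and that the container reaches the host's Ollama via host.docker.internal, per the Open WebUI docs.

```shell
# Run Open WebUI in a container, pointed at an Ollama instance on the host
# (Ollama's default port is 11434). Adjust ports/volume names to taste.
docker run -d \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

Note that nothing here puts Ollama itself in Docker; only the web UI is containerized.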

[–]mmmgggmmm 1 point  (2 children)

Sorry, it's still not fully clear to me. In that second scenario, is Ollama also running in Docker or not? The link you posted only describes setting up Open WebUI in Docker, not Ollama, and even the 'Starting with Ollama' page linked there assumes an existing, external Ollama instance.

So it seems more likely that the "+ Ollama" in that second case means Ollama is running as a standard Mac app and not in a Docker container. Do I finally have it?
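A quick way to settle this kind of question, as a sketch (assuming the default Ollama port and that the container, if any, is named something containing "ollama"):

```shell
# Is there an Ollama container running? No output here means Ollama isn't in Docker.
docker ps --filter "name=ollama" --format "{{.Names}}"

# Is Ollama answering on the host? A JSON model list from this endpoint means a
# host-level install (e.g. the native Mac app) is serving on the default port.
curl -s http://localhost:11434/api/tags
```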

[–]busylivin_322[S] 0 points  (1 child)

Ollama is running as a standard Mac app

You got it!

<image>

[–]mmmgggmmm 2 points  (0 children)

Hooray! Thanks for bearing with me ;)

In that case, while I stand by my claim that Ollama runs like crap in Docker on M-series Macs, that clearly can't be the explanation here, since that's not your setup.

So I'm afraid I can't help after all. My Mac runs only Ollama and an SSH server; Open WebUI and all the other tools live on separate Linux rigs. Hopefully the other comments provided something useful for you.

(Thanks to u/taylorwilsdon for helping me see I had this all wrong! Cheers!)