Run Open-WebUI for Ollama in a docker container


These notes are based on:

  • Running ollama directly on the host
  • Running a docker container with open-webui
GitHub repo: open-webui/open-webui (User-friendly WebUI for LLMs, formerly Ollama WebUI)

Steps

  1. Make sure ollama is running by opening a terminal and running:
sudo systemctl start ollama

Check that it's running at http://localhost:11434/ - the page should respond with "Ollama is running".
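The same check can be scripted. This is a small sketch that probes the default Ollama port and reports the result (the "up"/"down" labels are just illustrative):

```shell
# Probe the Ollama API on its default port (11434). When the service is up,
# the root endpoint replies with the plain text "Ollama is running".
if curl -fsS --max-time 5 http://localhost:11434/ >/dev/null 2>&1; then
    status="up"
else
    status="down"   # if down, try: sudo systemctl start ollama
fi
echo "ollama is $status"
```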

2. Open a terminal and start open-webui with the following command. Note that --network=host makes the container share the host's network stack, so a -p port mapping would be ignored; the UI listens directly on port 8080:

docker run -d --network=host --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
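For a more repeatable setup, roughly the same docker run command above can be expressed as a compose file. This is a sketch assuming the same image tag, named volume, and host networking:

```
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    network_mode: host            # share the host network; UI serves on port 8080
    extra_hosts:
      - "host.docker.internal:host-gateway"
    volumes:
      - open-webui:/app/backend/data
    restart: always

volumes:
  open-webui:
```

Start it with docker compose up -d from the directory containing the file.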

3. Log in at http://localhost:8080.

4. In open-webui's settings, make sure it can connect to ollama. If the connection fails, change the Ollama URL from the default to:

Ollama URL = http://localhost:11434/
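To find out which Ollama URL will actually work, the usual candidates can be probed in turn. This sketch uses the /api/tags endpoint (part of the Ollama API, it lists locally pulled models); with --network=host, localhost should answer, while host.docker.internal is the fallback for bridge-networked containers:

```shell
# Try the two usual candidates for the Ollama URL and report the first one
# that answers. An empty result means neither endpoint was reachable.
reachable=""
for url in http://localhost:11434 http://host.docker.internal:11434; do
    if curl -fsS --max-time 5 "$url/api/tags" >/dev/null 2>&1; then
        reachable="$url"
        break
    fi
done
echo "working Ollama URL: ${reachable:-none found}"
```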


References:

https://medium.com/@quicky316/ollama-with-ollama-webui-with-a-fix-334180915ef4