
podman and Open WebUI


There are numerous tutorials, both on YouTube and on blog platforms, that show how to get Open WebUI up and running on Docker.

Not many mention Podman, so here goes.

$ podman create -p 127.0.0.1:3000:8080 --network=pasta:-T,11434 \
--add-host=localhost:127.0.0.1 \
--env 'OLLAMA_BASE_URL=http://localhost:11434' \
--env 'ANONYMIZED_TELEMETRY=False' \
-v open-webui:/app/backend/data \
--label io.containers.autoupdate=registry \
--name open-webui ghcr.io/open-webui/open-webui:main

Once the container is started, you can access Open WebUI on http://localhost:3000.

Remember to pull the image ghcr.io/open-webui/open-webui first.
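Since podman create only creates the container, it still has to be started. A minimal sketch of the remaining steps, using the same image tag and container name as the command above:

```shell
# Pull the image referenced by the create command above
podman pull ghcr.io/open-webui/open-webui:main

# Start the container created earlier
podman start open-webui

# Confirm it is running and the 3000->8080 port mapping is in place
podman ps --filter name=open-webui
```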

‘till next time.

G

What do you think about writing an article about installing open-webui using cockpit on fedora server which uses the gpu?

R

Work has gotten in the way of life lately :) I haven't posted in ages. Can't promise anything but I'll try.

G

thanks for your time, a beer left ;) Rune Hansén Steinnes-Westum

G

Sorry for the noob question, but to add the GPU to the container, is the correct flag --gpus=all, below --add-host?

R

Yes, that should (in theory) do the trick.
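A hedged sketch of the alternative: on recent Podman versions, NVIDIA GPU access is commonly wired up through CDI rather than --gpus. This assumes the nvidia-container-toolkit package is installed on the host; exact flags may vary with your Podman version:

```shell
# One-time: generate the CDI spec for the GPU (needs nvidia-container-toolkit)
sudo nvidia-ctk cdi generate --output=/etc/cdi/nvidia.yaml

# Recreate the container with GPU access via the CDI device
podman create -p 127.0.0.1:3000:8080 --network=pasta:-T,11434 \
  --device nvidia.com/gpu=all \
  --add-host=localhost:127.0.0.1 \
  --env 'OLLAMA_BASE_URL=http://localhost:11434' \
  -v open-webui:/app/backend/data \
  --name open-webui ghcr.io/open-webui/open-webui:main
```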

H
Haizea Gomez · 11mo ago

..

S

What was the command to pick up my jaw from the office chair...

M

Thanks. However, I'm pulling my hair out since I can't get Open WebUI to load the models already available on my host Windows machine. Any help will be greatly appreciated.

R

Sorry, didn't notice your question. I have little to no experience with Windows myself. This might be a Docker problem and not a Windows/Ollama problem... not much help, I'm afraid.