Open-WebUI
Open WebUI is a user-friendly AI interface that supports OpenAI-compatible APIs and serves as the default chatbot for llmaz.
Prerequisites
- Make sure EnvoyGateway and Envoy AI Gateway are installed; both are installed by default in llmaz. See AI Gateway for more details. A quick check is sketched below.
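One way to confirm the gateway components are up is to look for their pods and services. The namespace and the name pattern below are assumptions based on a default llmaz install, so adjust them if your setup differs.

```bash
# Look for Envoy Gateway / Envoy AI Gateway pods and services in llmaz-system.
# Namespace and "envoy" name pattern are assumptions for a default llmaz install.
kubectl get pods,svc -n llmaz-system | grep -i envoy
```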
How to use
Enable Open WebUI
Open-WebUI is enabled by default in values.global.yaml and will be deployed in the llmaz-system namespace.

```yaml
open-webui:
  enabled: true
```
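If you toggle this value on an existing installation, re-apply the Helm release so the change takes effect. A minimal sketch, assuming llmaz was installed via Helm; the release and chart names are placeholders for whatever you used at install time.

```bash
# <release-name> and <llmaz-chart> are placeholders; use the names from your install.
helm upgrade <release-name> <llmaz-chart> -n llmaz-system -f values.global.yaml
```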
Set the Service Address
Run the following command to list the services; the output looks like below, and the LoadBalancer service name will be used later.

```bash
kubectl get svc -n llmaz-system
```

```
NAME                                              TYPE           CLUSTER-IP      EXTERNAL-IP   PORT(S)                                   AGE
envoy-default-default-envoy-ai-gateway-dbec795a   LoadBalancer   10.96.145.150   <pending>     80:30548/TCP                              132m
envoy-gateway                                     ClusterIP      10.96.52.76     <none>        18000/TCP,18001/TCP,18002/TCP,19001/TCP   172m
```
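Since the LoadBalancer service name is needed again in the Connections step, one way to capture it is with a jsonpath filter. This is a sketch that assumes the AI gateway is the only LoadBalancer service in llmaz-system.

```bash
# Capture the (assumed single) LoadBalancer service name in llmaz-system for later use.
GATEWAY_SVC=$(kubectl get svc -n llmaz-system \
  -o jsonpath='{.items[?(@.spec.type=="LoadBalancer")].metadata.name}')
echo "$GATEWAY_SVC"
```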
Port-forward the Open-WebUI service and visit http://localhost:8080.

```bash
kubectl port-forward svc/open-webui 8080:80 -n llmaz-system
```
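With the port-forward running, a quick sanity check that the UI is reachable (plain curl, nothing llmaz-specific):

```bash
# Expect an HTTP 200 (or a redirect to the login page) if Open WebUI is up.
curl -I http://localhost:8080
```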
Click Settings -> Admin Settings -> Connections, set the URL to http://envoy-default-default-envoy-ai-gateway-dbec795a.llmaz-system.svc.cluster.local/v1 and save. (You can also set the openaiBaseApiUrl in values.global.yaml.)
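If you prefer configuring the connection up front instead of through the UI, the values.global.yaml override would look roughly like the sketch below; the exact nesting of openaiBaseApiUrl under the open-webui block is an assumption, so verify it against the chart's values file.

```yaml
# Sketch only: key placement is an assumption, verify against values.global.yaml.
open-webui:
  openaiBaseApiUrl: http://envoy-default-default-envoy-ai-gateway-dbec795a.llmaz-system.svc.cluster.local/v1
```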
Start to chat now.
Persistence
Set persistence=true in values.global.yaml to enable persistence.
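A minimal sketch of what that might look like in values.global.yaml; whether the setting is a plain boolean or a nested persistence block depends on the bundled Open WebUI chart, so treat the keys below as an assumption.

```yaml
# Assumed structure: verify the exact key against the bundled open-webui chart values.
open-webui:
  persistence:
    enabled: true
```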