
How to Set Up Open WebUI for Local AI Chat

By Admin · Mar 1, 2026 · Updated Apr 23, 2026 · 1 min read

Open WebUI provides a polished chat interface for interacting with local LLMs. It connects to Ollama or any OpenAI-compatible API running on your Breeze.

Prerequisites

  • A Breeze with Ollama installed and running
  • Docker and Docker Compose installed

Install with Docker

Start Open WebUI with a single command:

docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main
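Since the prerequisites call for Docker Compose, the same container can also be described declaratively. A minimal sketch, equivalent to the docker run command above:

```yaml
# docker-compose.yml -- mirrors the docker run flags above
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    ports:
      - "3000:8080"                              # host port 3000 -> container port 8080
    extra_hosts:
      - "host.docker.internal:host-gateway"      # lets the container reach Ollama on the host
    volumes:
      - open-webui:/app/backend/data             # persist chats, users, and settings
    restart: always

volumes:
  open-webui:
```

Save this as docker-compose.yml and bring it up with `docker compose up -d`.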

Access the Interface

Open your browser and navigate to http://your-breeze-ip:3000. Create an admin account on first launch. The interface auto-detects Ollama models available on the host.
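If no models appear, the container may not be reaching Ollama on the host. One option is to point Open WebUI at Ollama explicitly via its OLLAMA_BASE_URL environment variable; the sketch below assumes Ollama is listening on its default port 11434:

```shell
# Same as the install command above, plus an explicit Ollama URL.
# host.docker.internal resolves to the host thanks to the --add-host flag.
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main
```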

Key Features

  • Multi-model chat with conversation history
  • Document upload and RAG (Retrieval-Augmented Generation)
  • User management and role-based access
  • Customizable system prompts and model parameters

Secure with a Reverse Proxy

For production use, place Nginx in front with SSL:

sudo apt install nginx certbot python3-certbot-nginx
sudo certbot --nginx -d chat.yourdomain.com

Then, in your Nginx server block, proxy_pass requests to http://localhost:3000.
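A minimal server block sketch, assuming certbot has already installed a certificate for chat.yourdomain.com (the hostname from the command above) and Open WebUI is listening on port 3000:

```nginx
server {
    listen 443 ssl;
    server_name chat.yourdomain.com;

    # Certificate paths as written by certbot --nginx
    ssl_certificate     /etc/letsencrypt/live/chat.yourdomain.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/chat.yourdomain.com/privkey.pem;

    location / {
        proxy_pass http://localhost:3000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-Proto $scheme;

        # Forward WebSocket upgrades so real-time features keep working
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```

Reload Nginx after editing (`sudo nginx -t && sudo systemctl reload nginx`) and verify that https://chat.yourdomain.com serves the login page.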
