# Chat UI
Open-source chat interface with support for tools, multimodal inputs, and intelligent routing across models. The app is built with SvelteKit and uses MongoDB behind the scenes. Try the live version, HuggingChat, at hf.co/chat, or set up your own instance.
Chat UI connects to any OpenAI-compatible API endpoint, making it work with:
- Hugging Face Inference Providers
- Ollama
- llama.cpp
- OpenRouter
- Any other OpenAI-compatible service
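As a concrete illustration of what "OpenAI-compatible" means, the sketch below sends a standard chat completions request to the Hugging Face router. The token variable and the model name are only placeholders; any model exposed by your provider works the same way.

```bash
# Illustrative request to an OpenAI-compatible endpoint (here: the Hugging Face router).
# $HF_TOKEN and the model name are placeholders; substitute your own.
curl https://router.huggingface.co/v1/chat/completions \
  -H "Authorization: Bearer $HF_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "meta-llama/Llama-3.3-70B-Instruct",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```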
Chat UI also supports:

- **MCP Tools**: Function calling via Model Context Protocol (MCP) servers
- **LLM Router**: Intelligent routing to select the best model for each request
- **Multimodal**: Image uploads on models that support vision
- **OpenID**: Optional user authentication via OpenID Connect
## Quickstart
Step 1 - Create `.env.local`:

```env
OPENAI_BASE_URL=https://router.huggingface.co/v1
OPENAI_API_KEY=hf_************************
```

You can use any OpenAI-compatible endpoint:
| Provider | `OPENAI_BASE_URL` | `OPENAI_API_KEY` |
|---|---|---|
| Hugging Face | `https://router.huggingface.co/v1` | `hf_xxx` |
| Ollama | `http://127.0.0.1:11434/v1` | `ollama` |
| llama.cpp | `http://127.0.0.1:8080/v1` | `sk-local` |
| OpenRouter | `https://openrouter.ai/api/v1` | `sk-or-v1-xxx` |
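For example, to try Chat UI against a local Ollama endpoint before wiring up a hosted provider, a minimal sketch (assuming Ollama is installed; the model name is illustrative) looks like this:

```bash
# Start the Ollama server if it is not already running; it exposes an
# OpenAI-compatible API at http://127.0.0.1:11434/v1
ollama serve &
# Pull a model to chat with (illustrative model name)
ollama pull llama3.2
```

Then set `OPENAI_BASE_URL=http://127.0.0.1:11434/v1` and `OPENAI_API_KEY=ollama` in `.env.local`, as shown in the table above.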
Step 2 - Install and run:
```bash
git clone https://github.com/huggingface/chat-ui
cd chat-ui
npm install
npm run dev -- --open
```

That’s it! Chat UI will automatically discover available models from your endpoint.
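If model discovery does not behave as expected, a quick sanity check is to query the standard `/models` route of your endpoint. This sketch assumes the same `OPENAI_BASE_URL` and `OPENAI_API_KEY` values from `.env.local` are exported in your shell:

```bash
# List the models your OpenAI-compatible endpoint advertises;
# these are typically the models Chat UI discovers on startup.
curl -H "Authorization: Bearer $OPENAI_API_KEY" "$OPENAI_BASE_URL/models"
```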
MongoDB is optional for development. When `MONGODB_URL` is not set, Chat UI uses an embedded database that persists to `./db`.
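If you would rather use an external MongoDB during development (for example, the same database you plan to use in production), a minimal sketch is to add a connection string to `.env.local`; the URL below is illustrative:

```bash
# Point Chat UI at an external MongoDB instead of the embedded ./db store
# (connection string is illustrative)
echo 'MONGODB_URL=mongodb://localhost:27017' >> .env.local
```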
For production deployments, see the installation guides.