Getting Started
Install, connect Ollama, start chatting.
Vashti is a small self-hosted chat UI for Ollama. It runs as one Rust binary with the frontend embedded, so setup is mostly installing the service and opening it in a browser.
1. Install Ollama
Vashti talks to Ollama backends, so install and start Ollama first if it is not already running.
Get Ollama from ollama.com/download.
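Once Ollama is installed, you can confirm the server is reachable by querying its local API (it listens on port 11434 by default):
curl http://127.0.0.1:11434/api/version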
2. Pull a model
Pull at least one model in Ollama before testing chat. For example:
ollama pull gemma3n:e2b
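To confirm the pull succeeded, you can list the models Ollama has downloaded:
ollama list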
3. Install Vashti
Run the install command on the Linux machine that should host Vashti:
curl -fsSL https://vashti.chat/install.sh | sh
The installer downloads the latest release, verifies the checksum, installs the binary, creates a systemd service, and prints the browser URLs to try.
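If you want to confirm the service came up, you can check it with systemctl. The unit name below is an assumption based on the project name; use whatever the installer output reports if it differs:
systemctl status vashti    # unit name assumed; check the installer output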
4. Open Vashti
After install, open one of the URLs printed by the installer. Use the localhost URL on the server itself, or the network URL from another device on the same LAN or VPN.
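For example, if the installer chose port 8080 (a placeholder here, not a confirmed default), the printed URLs would look like http://localhost:8080 on the server itself and http://<server-lan-ip>:8080 from other devices.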
5. Create the first account
The first account created becomes the admin account. After an admin exists, new accounts start disabled until an admin approves them.
6. Connect models
Open Settings, check the Backends and Models sections, and confirm your Ollama models are visible. If localhost detection does not find Ollama, add the backend URL manually, for example the default local address:
http://127.0.0.1:11434
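To confirm that address responds and exposes your models, you can ask Ollama for its model list directly:
curl http://127.0.0.1:11434/api/tags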
7. Update later
To update Vashti, run the same install command again. The installer replaces the binary and restarts the service without deleting your data.