Open Source  ·  Self-Hostable  ·  v0.0.2

OpenChat

A self-hostable, multi-provider AI chat interface for fact-grounded research.
Every answer is sourced. Every session persists. You own the data.


Architecture overview

Next.js App Router serves the UI, API routes stream from your LLM provider, and your database persists every session.

Architecture — request flow
Browser
┌─────────────────────────────────────────────────────────┐
│ Next.js App Router (React, TypeScript)                  │
│                                                         │
│ /          Chat UI · Sidebar · SourcesBanner            │
│ /hub       Landing & docs                               │
│ /about     Feature overview                             │
│ /releases  Changelog                                    │
│ /settings  Runtime configuration UI                     │
└──────────────────┬──────────────────────────────────────┘
                   │ HTTP / Server-Sent Events (streaming)
Server             ▼
┌─────────────────────────────────────────────────────────┐
│ API Routes (Node.js runtime)                            │
│                                                         │
│ POST /api/chat                  ──► LLM stream          │
│ GET  /api/sessions              ──► list sessions       │
│ POST /api/sessions              ──► create session      │
│ GET  /api/sessions/:id/messages ──► load history        │
│ GET  /api/settings              ──► read config         │
│ POST /api/settings              ──► save config         │
│ POST /api/db-test               ──► test connection     │
└────────┬────────────────────────┬───────────────────────┘
         │                        │
  ┌──────▼──────┐        ┌────────▼──────────┐
  │  Database   │        │  LLM Provider     │
  │             │        │                   │
  │  PostgreSQL │        │  OpenAI           │
  │  MySQL      │        │  Anthropic        │
  │  SQLite     │        │  Google Gemini    │
  └─────────────┘        │  Ollama (local)   │
                         └───────────────────┘
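The POST /api/chat → SSE path above can be sketched in a few lines. This is a hypothetical illustration, not the project's actual code: `sseFrame` and `streamTokens` are names invented here, and the token source stands in for the real LLM provider stream.

```typescript
// Hypothetical sketch of the POST /api/chat → SSE path shown above.
// `sseFrame` formats one Server-Sent Events message; `streamTokens`
// turns a token source into a streamable body.

export function sseFrame(data: string): string {
  // One SSE message: a `data:` line terminated by a blank line.
  return `data: ${data}\n\n`;
}

export function streamTokens(tokens: Iterable<string>): ReadableStream<Uint8Array> {
  const encoder = new TextEncoder();
  return new ReadableStream({
    start(controller) {
      for (const token of tokens) {
        controller.enqueue(encoder.encode(sseFrame(token)));
      }
      controller.close();
    },
  });
}

// A Next.js route handler would wrap this stream in a Response:
//   return new Response(streamTokens(llmTokens), {
//     headers: { "Content-Type": "text/event-stream" },
//   });
```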

Everything you need for AI research

Streaming, persistence, and citations as first-class citizens. Zero vendor lock-in.

💬
Real-time streaming
Responses stream token-by-token from your chosen model with a live typing cursor.
💾
Persistent sessions
Every conversation is stored in your database. Reload or restart — history is always there.
🔗
Cited sources
Every response ends with a collapsible sources banner. No unverified claims.
🔌
Multi-provider LLM
OpenAI, Anthropic, Gemini, Ollama. Switch from the Settings UI — no restart needed.
⚙️
Zero-restart config
Change provider, model, database, or system prompt at runtime from the Settings UI.
🚀
Self-hostable
Local, Docker, Cloud Foundry, GCP, AWS, Azure — you own your data and your infra.

Up and running in minutes

An in-memory SQLite store means you don't need to provision a database server to try it out.

1 Clone and install
bash
git clone https://github.com/SentorLabs/openchat
cd openchat
npm install
2 Configure environment
.env.local
cp .env.example .env.local

# Minimum required
DATABASE_URL=:memory:      # SQLite in-memory — zero setup
LLM_PROVIDER=openai
LLM_MODEL=gpt-4o
LLM_API_KEY=sk-...
3 Start the server
bash
npm run dev       # development — http://localhost:3000
npm run build && npm start   # production
The database schema is created automatically on first run. Open http://localhost:3000/settings to configure your LLM provider from the browser — no env var needed.
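On the client side, the chat stream can be consumed with `fetch` and a `ReadableStream` reader. The helper below is a hedged sketch: it assumes the server emits plain `data:` SSE frames per token (the exact wire format is an assumption), and `extractSSEData` is a name invented here.

```typescript
// Hypothetical sketch of consuming the chat stream in the browser.
// Assumes the server emits one SSE `data:` frame per token.

export function extractSSEData(buffer: string): { tokens: string[]; rest: string } {
  const tokens: string[] = [];
  let rest = buffer;
  let end: number;
  // Complete SSE messages end with a blank line ("\n\n");
  // anything after the last blank line is an incomplete frame.
  while ((end = rest.indexOf("\n\n")) !== -1) {
    const frame = rest.slice(0, end);
    rest = rest.slice(end + 2);
    for (const line of frame.split("\n")) {
      if (line.startsWith("data: ")) tokens.push(line.slice(6));
    }
  }
  return { tokens, rest };
}

// Usage with fetch (sketch):
//   const res = await fetch("/api/chat", { method: "POST", body: ... });
//   const reader = res.body!.pipeThrough(new TextDecoderStream()).getReader();
//   let buf = "";
//   for (;;) {
//     const { value, done } = await reader.read();
//     if (done) break;
//     const { tokens, rest } = extractSSEData(buf + value);
//     buf = rest;
//     tokens.forEach(renderToken);  // append to the live message
//   }
```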

Bring your own model

Switch providers from the Settings UI at any time — no restart, no code changes.

OpenAI
gpt-4o  ·  gpt-4o-mini  ·  o3-mini
LLM_PROVIDER=openai
Anthropic
claude-opus-4-6  ·  claude-sonnet-4-6  ·  claude-haiku-4-5
LLM_PROVIDER=anthropic
Google Gemini
gemini-2.0-flash  ·  gemini-1.5-pro
LLM_PROVIDER=gemini
Ollama (local)
llama3.2  ·  mistral  ·  phi4  ·  gemma3  ·  any local model
LLM_PROVIDER=ollama
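Provider selection, as described above, reduces to validating `LLM_PROVIDER` and filling in defaults. The sketch below is illustrative only: `ProviderConfig` and `resolveProvider` are names invented here, not the project's actual API; the Ollama default URL is taken from the environment-variable table below.

```typescript
// Hypothetical sketch of provider selection, mirroring the list above.

export type Provider = "openai" | "anthropic" | "gemini" | "ollama";

export interface ProviderConfig {
  provider: Provider;
  model: string;
  apiKey?: string;
  baseUrl?: string;
}

const PROVIDERS: Provider[] = ["openai", "anthropic", "gemini", "ollama"];

export function resolveProvider(env: Record<string, string | undefined>): ProviderConfig {
  const provider = env.LLM_PROVIDER as Provider;
  if (!PROVIDERS.includes(provider)) {
    throw new Error(`Unknown LLM_PROVIDER: ${env.LLM_PROVIDER}`);
  }
  // Ollama runs locally and needs no API key; every other provider does.
  if (provider !== "ollama" && !env.LLM_API_KEY) {
    throw new Error(`${provider} requires LLM_API_KEY`);
  }
  return {
    provider,
    model: env.LLM_MODEL ?? "",
    apiKey: env.LLM_API_KEY,
    baseUrl:
      provider === "ollama"
        ? env.OLLAMA_BASE_URL ?? "http://localhost:11434"
        : undefined,
  };
}
```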

Your data, your database

Auto-detected from the connection string. Schema is created on first run — no migrations needed.

PostgreSQL — Recommended for production. Full UUID + JSONB support.
DATABASE_URL=postgresql://user:pass@host:5432/openchat
MySQL — Fully supported. SQL placeholders are translated automatically.
DATABASE_URL=mysql://user:pass@host:3306/openchat
SQLite — Perfect for local dev. Zero setup.
DATABASE_URL=/absolute/path/to/openchat.db
DATABASE_URL=:memory:    # in-memory, lost on restart
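The auto-detection described above amounts to inspecting the connection string's scheme. This is a hedged sketch of that logic under the conventions shown in the examples; `detectDialect` is a name invented here and the real implementation may differ.

```typescript
// Hypothetical sketch of connection-string auto-detection.

export type Dialect = "postgres" | "mysql" | "sqlite";

export function detectDialect(url: string): Dialect {
  if (url.startsWith("postgresql://") || url.startsWith("postgres://")) {
    return "postgres";
  }
  if (url.startsWith("mysql://")) {
    return "mysql";
  }
  // Anything else — a file path or ":memory:" — is treated as SQLite.
  return "sqlite";
}
```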

Environment variables

All settings can also be configured at runtime from the Settings UI — no server restart needed.

Variable Required Description
DATABASE_URL required PostgreSQL, MySQL, or SQLite connection string (or :memory:)
LLM_PROVIDER required openai | anthropic | gemini | ollama
LLM_MODEL required Model name for your chosen provider (e.g. gpt-4o)
LLM_API_KEY optional API key — required for all providers except Ollama
LLM_SYSTEM_PROMPT optional Override the default system prompt to set AI persona or domain focus
OLLAMA_BASE_URL optional Ollama base URL — default: http://localhost:11434

Deploy anywhere

Ready-to-run scripts for every major platform are in the deployment/ folder.

Local / npm
Fastest start
1 Install & configure
bash
npm install
cp .env.example .env.local   # fill in your values
2 Run
bash
npm run dev                          # dev mode
npm run build && npm start           # production
Docker Compose
Batteries included
1 docker-compose.yml
yaml
services:
  app:
    build: .
    ports: ["3000:3000"]
    environment:
      DATABASE_URL: postgresql://postgres:secret@db:5432/openchat
      LLM_PROVIDER: openai
      LLM_MODEL: gpt-4o
      LLM_API_KEY: ${LLM_API_KEY}
    depends_on: [db]
  db:
    image: postgres:16-alpine
    environment:
      POSTGRES_PASSWORD: secret
      POSTGRES_DB: openchat
    volumes: [pgdata:/var/lib/postgresql/data]
volumes:
  pgdata:
2 Start
bash
LLM_API_KEY=sk-... docker compose up -d
Cloud Foundry / SAP BTP
Enterprise
1 Create PostgreSQL service & set env vars
bash
cf create-service postgresql-db trial openchat-db
cf set-env openchat LLM_PROVIDER openai
cf set-env openchat LLM_MODEL gpt-4o
cf set-env openchat LLM_API_KEY sk-...
2 Push (or use the script)
bash
cf push             # uses manifest.yml
# or automated:
bash deployment/btp.sh
GCP Cloud Run
Serverless
1 One command deploy
bash
bash deployment/gcp.sh
2 What it provisions
bash
# Cloud SQL (PostgreSQL) instance
# DATABASE_URL stored in Secret Manager
# Image built via Cloud Build
# Cloud Run service with Cloud SQL socket
AWS App Runner
Managed
1 One command deploy
bash
bash deployment/aws.sh
2 What it provisions
bash
# ECR repo + Docker image push
# RDS PostgreSQL instance
# Secrets Manager entry
# IAM role + App Runner service
Azure Container Apps
Managed
1 One command deploy
bash
bash deployment/azure.sh
2 What it provisions
bash
# Azure Container Registry + image build
# Azure DB for PostgreSQL Flexible Server
# Key Vault + managed identity
# Container App with ACR pull access

Built on solid foundations

Modern, minimal, no magic — just TypeScript, Next.js, and your database of choice.

Next.js App Router
React server components + streaming API routes
🔷
TypeScript 5
Strict mode — no untyped corners
🎨
Tailwind CSS v4
Dark-mode design system, zero CSS bloat
🗄️
Multi-dialect DB
pg · mysql2 · better-sqlite3
🤖
Pluggable LLMs
OpenAI · Anthropic · Gemini · Ollama
🟢
Node.js ≥ 20
Native streaming, no extra runtime