
Deploy

No need to clone the full repository. Download three small files and you’re ready to go.

1. Download the Compose Files

Create a directory and download the compose file, Caddyfile, and environment template:

Terminal window
mkdir memghost && cd memghost
# Docker Compose file
curl -fsSLO https://code.usshaws.com/rob/memghost/raw/branch/main/deploy/docker-compose.yml
# Caddy reverse proxy config
curl -fsSLO https://code.usshaws.com/rob/memghost/raw/branch/main/deploy/Caddyfile
# Environment template
curl -fsSL https://code.usshaws.com/rob/memghost/raw/branch/main/deploy/.env.example -o .env

Or copy the files manually from the deploy/ directory in the repository.
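Before moving on, it can be worth confirming that all three downloads actually landed. A minimal sanity check (the filenames are the ones fetched above):

```shell
# Verify the three files exist and are non-empty before continuing
for f in docker-compose.yml Caddyfile .env; do
  if [ -s "$f" ]; then echo "ok: $f"; else echo "missing: $f"; fi
done
```

If any line reads "missing", re-run the corresponding curl command.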

2. Configure

Edit the .env file with your settings:

Terminal window
# Required — change these before starting
DB_PASSWORD=your-secure-password
# Generate with: openssl rand -hex 32
# (Compose reads .env literally, so paste the generated value here —
# a $() expression left in the file would NOT be expanded)
JWT_SECRET_KEY=your-generated-hex-secret
# Your domain (Caddy auto-provisions HTTPS for public domains)
SITE_ADDRESS=home.example.com
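Since Compose does not run command substitution inside .env, one way to get a real random secret into the file is to expand it in your shell first and write it in place. A sketch, assuming .env already contains a JWT_SECRET_KEY= line:

```shell
# Expand the secret in the shell first; Compose will not run $(...)
# that it finds inside .env, so the value must be stored literally.
secret="$(openssl rand -hex 32)"          # 64 hex characters
if [ -f .env ]; then
  sed -i "s|^JWT_SECRET_KEY=.*|JWT_SECRET_KEY=${secret}|" .env
  echo "wrote a ${#secret}-character JWT secret into .env"
fi
```

The `|` delimiter in the sed expression avoids clashes if the value ever contains `/`.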

3. Start the Stack

Core only

The base stack includes the database, API, frontend, and Caddy reverse proxy:

Terminal window
docker compose --profile standalone up -d

With AI

Add the ai profile to include Ollama for AI chat agents and semantic search:

Terminal window
# Enable AI in your .env
sed -i 's/AI_ENABLED=false/AI_ENABLED=true/' .env
docker compose --profile standalone --profile ai up -d

After the first start, pull the AI models (a one-time download, ~5 GB total):

Terminal window
docker compose exec ollama ollama pull nomic-embed-text
docker compose exec ollama ollama pull qwen3:8b

Everything

All features: AI agents, text-to-speech, and speech-to-text:

Terminal window
# Enable all features in your .env
sed -i 's/AI_ENABLED=false/AI_ENABLED=true/' .env
sed -i 's/AI_TTS_ENABLED=false/AI_TTS_ENABLED=true/' .env
sed -i 's/AI_STT_ENABLED=false/AI_STT_ENABLED=true/' .env
docker compose --profile standalone --profile ai --profile voice up -d

Then pull the AI models:

Terminal window
docker compose exec ollama ollama pull nomic-embed-text
docker compose exec ollama ollama pull qwen3:8b

4. Wait for Readiness

The migrate service runs database migrations automatically on first start. Watch the logs until the API is ready:

Terminal window
docker compose logs -f api

Look for Starting application... in the output. The first start takes a moment while the migrate service runs.
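If you would rather script the wait than watch logs, a bounded polling loop against the frontend works too. A sketch only: the URL and the five-attempt limit are assumptions — substitute your SITE_ADDRESS and raise the limit for slow first starts:

```shell
# Poll until Caddy answers over HTTP, giving up after 5 attempts (~25 s)
URL="${URL:-http://127.0.0.1/}"
ready=no
for attempt in 1 2 3 4 5; do
  if curl -fsS -o /dev/null --max-time 2 "$URL"; then
    ready=yes
    break
  fi
  echo "attempt $attempt: not ready yet"
  sleep 3
done
echo "ready=$ready"
```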

Stopping the Stack

Terminal window
docker compose --profile standalone down

To also remove the database volume (wipes all data):

Terminal window
docker compose --profile standalone down -v

Updating

Pull the latest images and restart:

Terminal window
docker compose pull
docker compose --profile standalone up -d

Migrations run automatically on startup, so schema changes are applied when the API container restarts.

Using Your Own Reverse Proxy

If you already run Traefik, nginx, or another reverse proxy, skip the standalone profile and route traffic directly to the API and web containers. See the comments at the bottom of docker-compose.yml for the routing rules:

Path          Target
/api/*        api:8080
/events/*     api:8080
/tts/*        api:8080
/stt/*        api:8080
/mcp/*        api:8080
/themes/*     api:8080
/*            web:3000
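As a concrete illustration of that table, here is a minimal nginx sketch. This is not the project's shipped config: the upstream hostnames api and web assume the containers share a Docker network with nginx, and TLS, timeouts, and headers are left to your existing setup.

```nginx
# Hypothetical nginx location blocks implementing the routing table above
location ~ ^/(api|events|tts|stt|mcp|themes)/ {
    proxy_pass http://api:8080;
    proxy_set_header Host $host;
    proxy_http_version 1.1;
    proxy_buffering off;   # /events is long-lived (server-sent events)
}
location / {
    proxy_pass http://web:3000;
    proxy_set_header Host $host;
}
```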