
Quick Start

MemGhost is designed to run on modest hardware with Docker Compose. No need to clone a repository — just download the compose file and go.
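The download-and-run flow can be sketched as two commands. The compose file URL below is an assumption for illustration; use the one published on the MemGhost release page.

```shell
# Download the compose file (URL is an assumption -- check the release page)
curl -fsSLO https://memghost.com/docker-compose.yml

# Start the core stack in the background
docker compose up -d
```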

Overview

The core deployment consists of four containers:

| Service | Image | Purpose |
| --- | --- | --- |
| db | pgvector/pgvector:pg15 | Event store, read models, and vector search |
| api | memghost:latest | Go backend API server |
| web | memghost-web:latest | Next.js web interface |
| caddy | caddy:2-alpine | Reverse proxy with automatic HTTPS |
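A minimal sketch of how these four services might be wired together in `docker-compose.yml`. The service names and images come from the table above; the ports, volumes, and dependency ordering are illustrative assumptions, not the shipped file.

```yaml
services:
  db:
    image: pgvector/pgvector:pg15
    volumes:
      - db-data:/var/lib/postgresql/data   # persist the event store
  api:
    image: memghost:latest
    depends_on:
      - db
  web:
    image: memghost-web:latest
    depends_on:
      - api
  caddy:
    image: caddy:2-alpine
    ports:
      - "80:80"
      - "443:443"   # Caddy provisions HTTPS automatically
    depends_on:
      - web

volumes:
  db-data:
```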

Optional services extend the platform with AI and voice capabilities:

| Service | Profile | Purpose |
| --- | --- | --- |
| ollama | ai | LLM chat and semantic embeddings |
| kokoro | voice | Text-to-speech (67+ voices) |
| whisper | voice | Speech-to-text (voice input) |
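Since the optional services are gated behind Compose profiles, they are opted into at startup with the standard `--profile` flag:

```shell
# Core services only
docker compose up -d

# Core plus local AI (Ollama)
docker compose --profile ai up -d

# Everything, including voice input and output
docker compose --profile ai --profile voice up -d
```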

Steps

  1. Prerequisites — install Docker and Docker Compose.
  2. Deploy — download the compose file and start the stack.
  3. First Run — migrations, AI model setup, and seed data.
  4. Verify Installation — health checks and accessing the UI.
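The verification step above typically reduces to checking container status and probing the stack through the reverse proxy. The `/healthz` path here is an assumption; substitute the health endpoint documented for your MemGhost version.

```shell
# Confirm all containers are up and healthy
docker compose ps

# Probe the API via Caddy (endpoint path is an assumption)
curl -fsS http://localhost/healthz && echo "API is up"
```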

Minimum Requirements

| Resource | Core Only | With AI | With AI + Voice |
| --- | --- | --- | --- |
| CPU | 1 core | 2 cores | 4 cores |
| RAM | 512 MB | 4 GB | 6 GB |
| Disk | 1 GB | 10 GB | 15 GB |
| OS | Any Linux with Docker | Ubuntu 22.04+ / Debian 12+ | Ubuntu 22.04+ / Debian 12+ |

AI features use Ollama for local inference. A GPU is not required but significantly speeds up chat responses and embedding generation.
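With the `ai` profile enabled, Ollama still needs models pulled into its container before chat and embeddings work. The model names below are illustrative assumptions; pick whichever chat and embedding models your hardware supports.

```shell
# Pull a chat model into the running ollama container (model name is an example)
docker compose exec ollama ollama pull llama3

# Pull an embedding model for semantic search (model name is an example)
docker compose exec ollama ollama pull nomic-embed-text
```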