Coming soon: MemGhost is in private beta.

Your server. Your data. Free forever.

Unlimited captures, sessions, webhooks, and storage. No limits, no cost, no compromise.

Quick start (5 minutes)

1. Install

curl -fsSL https://get.memghost.app | sh

Auto-installs Docker if needed, generates secrets, and prompts for your domain and API key.
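As with any pipe-to-shell installer, you can download the script and review what it will do before running it (the filename `install.sh` here is just a local choice):

```shell
# Fetch the installer to a file instead of piping it straight to sh
curl -fsSL https://get.memghost.app -o install.sh

# Review it, then run it once you're satisfied
less install.sh
sh install.sh
```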

2. Start

docker compose up -d

Starts PostgreSQL with pgvector, the Go API, and the Next.js frontend. Runs database migrations automatically.
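The three-service stack described above looks roughly like this as a Compose file. This is a minimal sketch for orientation only: the service names, image tags, internal ports, and environment variables are illustrative assumptions, not MemGhost's actual generated configuration.

```yaml
services:
  db:
    image: pgvector/pgvector:pg16      # PostgreSQL with the pgvector extension
    environment:
      POSTGRES_PASSWORD: ${DB_PASSWORD}
    volumes:
      - db-data:/var/lib/postgresql/data
  api:
    image: memghost/api:latest         # hypothetical image name; Go API, runs migrations on startup
    environment:
      DATABASE_URL: postgres://postgres:${DB_PASSWORD}@db:5432/postgres
    depends_on:
      - db
  web:
    image: memghost/web:latest         # hypothetical image name; Next.js frontend
    ports:
      - "3300:3000"                    # assumes the frontend listens on 3000 internally
    depends_on:
      - api
volumes:
  db-data:
```

After `docker compose up -d`, `docker compose ps` shows the state of all three services and `docker compose logs -f api` follows the API logs, including migration output.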

3. Use

Navigate to http://your-server:3300 and create your admin account.

Full guide: Docker Compose Deploy

What stays on your server

Everything stays local

  • Vault items and metadata
  • Hub pages and structure
  • Space conversations and artifacts
  • Webhook data
  • Rules and schemas
  • Embeddings and vectors

Only AI prompts leave your server (when using cloud AI)

  • Classification prompts
  • Hub materialization prompts
  • Space chat messages

Use local Ollama for fully offline AI, or run without AI entirely: rules-only classification still works.

AI options

Cloud AI

Recommended

Paste your Anthropic API key. No extra containers. Best quality.

Local AI

Fully offline

Ollama with local models. Requires GPU for reasonable speed. Complete privacy.

No AI

Rules only

Classification via rules engine. Hubs are manual. Spaces disabled. Still useful.
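The three modes above could map onto environment configuration roughly as follows. The variable names here are hypothetical, invented for illustration; check the `.env` file generated by the installer for the actual keys. The Ollama URL is that tool's default local endpoint.

```
# Cloud AI (recommended): one API key, no extra containers
AI_PROVIDER=anthropic
ANTHROPIC_API_KEY=your-key-here

# Local AI (fully offline): point at a running Ollama instance
# AI_PROVIDER=ollama
# OLLAMA_URL=http://localhost:11434

# No AI: rules-only classification
# AI_PROVIDER=none
```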

Requirements

Minimum

  • 2 CPU cores
  • 2 GB RAM
  • 10 GB storage
  • Docker 24.0+

Recommended

  • 4 CPU cores
  • 4 GB RAM
  • 50 GB+ storage
  • GPU (for local AI)

Runs on: Linux VPS, home server, Raspberry Pi 4+, Synology NAS, macOS, Windows (WSL2).