
Verify Installation

Health Checks

The backend exposes two health endpoints:

Liveness

Returns 200 OK when the process is running:

Terminal window
curl https://your-domain/api/v1/health/live

Readiness

Returns 200 OK when the database connection is established and migrations have completed:

Terminal window
curl https://your-domain/api/v1/health/ready

If readiness fails, check that the database container is healthy:

Terminal window
docker compose ps

Access the Web UI

Open your browser to your configured domain (or http://localhost if using the default SITE_ADDRESS).

You should see the MemGhost setup wizard on first visit. If the frontend is not yet ready, give it a moment — the web container starts after the API is available.

Test the API

List notes (should return an empty array on a fresh install):

Terminal window
curl -s https://your-domain/api/v1/notes | head -c 200

Verify AI Features

If you enabled the ai profile, check that Ollama is running and models are loaded:

Terminal window
# Check Ollama is responding
docker compose exec ollama ollama list
# Test the health endpoint
curl -s https://your-domain/api/v1/health/ready

If models haven’t been pulled yet, see First Run for the model download commands.

Verify Voice Features

If you enabled the voice profile:

Terminal window
# Check TTS service
docker compose logs kokoro | tail -5
# Check STT service
docker compose logs whisper | tail -5

The voice toggle buttons appear in the chat UI header when TTS and STT services are available.

Container Status

View all running containers and their health:

Terminal window
docker compose ps

Expected output for a full install (all profiles):

Service   Status         Notes
db        Up (healthy)   PostgreSQL with pgvector
api       Up             Go backend
web       Up             Next.js frontend
caddy     Up             Reverse proxy
ollama    Up             LLM inference (ai profile)
kokoro    Up             TTS (voice profile)
whisper   Up             STT (voice profile)
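To spot non-running services at a glance, the state column can be filtered (a sketch; `not_running` is an illustrative helper, and Go-template output for `docker compose ps --format` requires a recent Compose v2):

```shell
# not_running: print the service name for any "service state" line not in 'running'.
not_running() {
  awk '$2 != "running" { print $1 }'
}

# Empty output means every service is running.
docker compose ps --all --format '{{.Service}} {{.State}}' | not_running
```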

View Logs

Terminal window
# All services
docker compose logs -f
# Single service
docker compose logs -f api

Troubleshooting

API container won’t start

Check that the migrate service completed successfully:

Terminal window
docker compose logs migrate

Common causes:

  • Database not yet ready — the API waits for the migrate service, which waits for db to be healthy.
  • Migration conflict — run docker compose down -v to reset the database and try again.

Frontend shows blank page or errors

The frontend proxies API requests through Caddy. If using the standalone profile, make sure Caddy is running:

Terminal window
docker compose logs caddy

Ollama out of memory

The default chat model (qwen3:8b) needs ~5 GB RAM. On memory-constrained systems, try a smaller model:

Terminal window
docker compose exec ollama ollama pull qwen3:4b

Then update AI_LLM_MODEL=qwen3:4b in your .env and restart the API.
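The .env update can be scripted; a minimal sketch, assuming GNU sed (`set_env_var` is an illustrative helper, not part of MemGhost):

```shell
# set_env_var FILE KEY VALUE: replace KEY=VALUE in an env file, or append if absent.
# Assumes GNU sed; on macOS/BSD sed, use `sed -i ''` instead.
set_env_var() {
  file=$1; key=$2; value=$3
  if grep -q "^${key}=" "$file" 2>/dev/null; then
    sed -i "s|^${key}=.*|${key}=${value}|" "$file"
  else
    echo "${key}=${value}" >> "$file"
  fi
}

set_env_var .env AI_LLM_MODEL qwen3:4b
```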

Port conflicts

If ports 80 or 443 are already in use, change them in .env:

PORT=8080
HTTPS_PORT=8443
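To find out which process currently holds a port before changing it, the listener table from ss can be filtered (Linux only; `port_line` is an illustrative helper):

```shell
# port_line PORT: filter `ss -ltn` output for listeners bound to PORT.
port_line() {
  grep -E "[:.]$1[[:space:]]"
}

ss -ltn | port_line 80 || echo "port 80 is free"
```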