# Local Development

A complete guide to running CyberMem on your local machine (macOS/Linux).
## Overview

Local deployment uses:

- SQLite for storage (no external database needed)
- Ollama for embeddings (or synthetic fallback)
- Docker Compose for orchestration
- Keyless localhost access (no API key required; see the sketch after this list)
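As a quick illustration of the keyless model, the running stack can be probed from the host with no credentials. This is a sketch only: the exact route Traefik exposes for OpenMemory is an assumption, so adjust the path to whatever your deployment reports.

```bash
# No API key or auth header is needed from localhost.
# The root path is an assumption; OpenMemory may be routed under a subpath.
curl -i http://localhost:8626/
```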
## Prerequisites

- Docker Desktop or Docker Engine
- Node.js 18+
- 4 GB RAM minimum
- 5 GB disk space
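Before running the installer, the prerequisites can be verified with standard tooling:

```bash
docker --version   # Docker Desktop or Docker Engine
node --version     # should print v18.x or newer
df -h ~            # confirm at least 5 GB free for images and data
```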
## Quick Setup

```bash
npx @cybermem/mcp
```

That's it! The CLI handles everything automatically.
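For reference, the one-liner is roughly equivalent to the manual flow below. This is a sketch of the assumed behavior (the CLI ships its own templates for `.env` and `docker-compose.yml`), not its exact implementation:

```bash
# Assumed flow behind `npx @cybermem/mcp`:
mkdir -p ~/.cybermem          # config and data directory
cd ~/.cybermem
# ...the CLI writes .env and docker-compose.yml here...
docker-compose up -d          # start Traefik, OpenMemory, Dashboard, Prometheus, Ollama
```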
## Architecture

```
┌─────────────────────────────────────────────────────┐
│                      localhost                      │
├─────────────────────────────────────────────────────┤
│  :3000 Dashboard   :8626 Traefik   :9092 Prometheus │
│         ↓                ↓                ↓         │
│      Next.js        OpenMemory         Metrics      │
│         └────────────────┴────────────────┘         │
│                          ↓                          │
│                   SQLite + Ollama                   │
└─────────────────────────────────────────────────────┘
```
## Services

| Service | Port | Description |
|---|---|---|
| Traefik | 8626 | Reverse proxy, MCP routing |
| OpenMemory | internal | Memory API (via Traefik) |
| Dashboard | 3000 | Monitoring UI |
| Prometheus | 9092 | Metrics collection |
| Ollama | 11434 | Local embeddings |
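After startup, each exposed service can be probed from the host. The Prometheus and Ollama endpoints below are those tools' standard APIs; that the Dashboard answers with a 200 on `/` is an assumption:

```bash
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:3000   # Dashboard (assumed 200)
curl -s http://localhost:9092/-/healthy                          # Prometheus health endpoint
curl -s http://localhost:11434/api/tags                          # Ollama: lists pulled models
```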
## Configuration

Configuration is stored in `~/.cybermem/.env`:

```bash
# Embedding provider (ollama for local)
EMBEDDING_PROVIDER=ollama
OLLAMA_URL=http://ollama:11434

# Optional: use OpenAI instead of Ollama
# OPENAI_API_KEY=sk-...
# EMBEDDING_PROVIDER=openai
```
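Edits to `.env` take effect only once the containers are recreated; a standard docker-compose cycle handles that:

```bash
cd ~/.cybermem
docker-compose up -d --force-recreate   # recreate containers with the new environment
```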
## Commands

```bash
# Start services
npx @cybermem/mcp

# Stop services
cd ~/.cybermem && docker-compose down

# View logs
cd ~/.cybermem && docker-compose logs -f

# Reset database
rm -rf ~/.cybermem/data && npx @cybermem/mcp
```
## Troubleshooting

### Ollama Not Starting

```bash
# Check if Ollama is running
docker logs cybermem-ollama

# Pull embedding model manually
docker exec cybermem-ollama ollama pull nomic-embed-text
```
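To confirm the model actually serves vectors, Ollama's standard embeddings API can be called directly from the host:

```bash
# A JSON response with an "embedding" array means Ollama is healthy
curl -s http://localhost:11434/api/embeddings \
  -d '{"model": "nomic-embed-text", "prompt": "hello"}'
```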
### Port Conflicts

If ports are already in use, modify `~/.cybermem/docker-compose.yml`:

```yaml
services:
  traefik:
    ports:
      - "8627:80"  # Change from 8626
```
### SQLite Permissions

```bash
# Fix permissions if the container crashes
docker run --rm -v cybermem-openmemory-data:/data alpine \
  sh -c 'chown -R 1001:1001 /data && chmod 777 /data'
```
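Ownership can be verified afterwards by listing the same named volume with numeric IDs:

```bash
# Files should now be owned by UID/GID 1001
docker run --rm -v cybermem-openmemory-data:/data alpine ls -ln /data
```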
## Next Steps

- MCP Integration - Connect AI clients
- Raspberry Pi - Deploy to the edge