A high-performance Model Context Protocol (MCP) server that provides intelligent memory and context management for ANY AI application or client. Built with Go, it features advanced vector storage, pattern recognition, and contextual learning capabilities using Chroma vector database. Works seamlessly with Claude, VS Code Copilot, Continue, Cursor, and any MCP-compatible client.
The MCP Go implementation has been moved to a separate open-source project:
- GoMCP SDK: github.com/fredcamaral/gomcp-sdk
This allows the community to use the MCP SDK independently for building any MCP-compatible application.
- Persistent Conversation Memory: Store and retrieve conversation history across sessions
- Vector Similarity Search: Find semantically similar conversations and contexts
- Hierarchical Memory Organization: Project-based memory isolation and organization
- Intelligent Context Suggestions: Proactive recommendations based on conversation patterns
- Web UI & GraphQL API: Modern web interface for browsing memories with GraphQL API
- Memory Tracing: Trace sessions and find related memories with visual timelines
- Pattern Recognition: Automatically detect conversation patterns and user preferences
- Knowledge Graph Construction: Build semantic relationships between entities and concepts
- Learning & Adaptation: Continuously improve suggestions based on user feedback
- Multi-Repository Intelligence: Cross-project pattern detection and insights
- Multi-Level Caching: LRU/LFU/FIFO caching strategies for optimal performance
- Data Backup & Restore: Automated backup with tar.gz compression and encryption
- Security & Access Control: Repository-level permissions and AES-GCM encryption
- Health Monitoring: Comprehensive health checks with Prometheus metrics
- Docker Containerization: Production-ready containerization with multi-stage builds
- Go 1.21 or higher
- Chroma vector database (required)
- PostgreSQL 13+ (optional, for metadata storage)
- Docker & Docker Compose (for containerized deployment)
- Redis (optional, for distributed caching)
- OpenAI API key (for embeddings generation)
- Clone the repository

  ```bash
  git clone https://github.com/fredcamaral/mcp-memory.git
  cd mcp-memory
  ```

- Set up environment

  ```bash
  cp .env.example .env
  # Edit .env to add your OPENAI_API_KEY and other configurations
  ```

- Install dependencies

  ```bash
  go mod download
  ```

- Start Chroma database

  ```bash
  docker run -p 9000:8000 chromadb/chroma:latest run --path /data --host 0.0.0.0
  ```

- Run the MCP server (if using MCP tools)

  ```bash
  go run cmd/server/main.go
  ```

- Run the GraphQL server and Web UI

  ```bash
  go run cmd/graphql/main.go
  # Or use the binary:
  # ./graphql
  ```

- Access the Web UI
  - Open http://localhost:8082/ in your browser
  - GraphQL playground: http://localhost:8082/graphql
  - Health check: `curl http://localhost:8081/health`
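To use the memory server from an MCP client such as Claude Desktop, register it in the client's MCP configuration. The snippet below is a sketch only: the config file location and exact key names depend on your client, the binary path is a placeholder, and the Chroma URL assumes the port mapping from the quickstart above.

```json
{
  "mcpServers": {
    "mcp-memory": {
      "command": "/path/to/mcp-memory/bin/mcp-memory",
      "env": {
        "OPENAI_API_KEY": "sk-your-key-here",
        "CHROMA_URL": "http://localhost:9000"
      }
    }
  }
}
```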
- Using Docker Compose (Recommended)

  ```bash
  cp .env.example .env
  # Edit .env to configure your environment
  docker-compose up -d
  ```

- Using Docker directly

  ```bash
  docker build -t mcp-memory .
  docker run -p 8080:8080 -p 8081:8081 -p 8082:8082 \
    -e OPENAI_API_KEY=your-api-key \
    -e CHROMA_URL=http://chroma:8000 \
    mcp-memory
  ```

- Check deployment

  ```bash
  curl http://localhost:8081/health
  curl http://localhost:8082/metrics
  ```
All documentation is organized in the docs/ directory. See the Documentation Index for a complete overview.
- Development Setup - Hot reload development environment
- Deployment Guide - Production deployment instructions
- API Reference - Complete API documentation
- Development Roadmap - Current priorities and future plans
- Monitoring Setup - Observability and metrics configuration
See .env.example for a complete list of configuration options. Key variables include:
| Variable | Default | Description |
|---|---|---|
| `OPENAI_API_KEY` | (required) | OpenAI API key for embeddings |
| `CHROMA_URL` | `http://localhost:8000` | Chroma database URL |
| `MCP_MEMORY_DATA_DIR` | `./data` | Data storage directory |
| `MCP_MEMORY_LOG_LEVEL` | `info` | Logging level (debug, info, warn, error) |
| `MCP_MEMORY_HTTP_PORT` | `8080` | Main MCP API port |
| `MCP_MEMORY_HEALTH_PORT` | `8081` | Health check port |
| `MCP_MEMORY_GRAPHQL_PORT` | `8082` | GraphQL API & Web UI port |
| `MCP_MEMORY_METRICS_PORT` | `9090` | Prometheus metrics port |
| `MCP_MEMORY_VECTOR_DIM` | `1536` | Vector dimension (OpenAI ada-002) |
| `MCP_MEMORY_ENCRYPTION_ENABLED` | `false` | Enable data encryption |
| `MCP_MEMORY_ACCESS_CONTROL_ENABLED` | `false` | Enable access control |
| `MCP_MEMORY_CACHE_ENABLED` | `true` | Enable performance caching |
- Development: `configs/dev/config.yaml`
- Staging: `configs/staging/config.yaml`
- Production: `configs/production/config.yaml`
- Docker: `configs/docker/config.yaml`
- Endpoint: `http://localhost:8081/health`
- Liveness Probe: Kubernetes-compatible health check
- Readiness Probe: Service availability check
- Endpoint: `http://localhost:8082/metrics`
- Format: Prometheus format
- Dashboards: Pre-configured Grafana dashboards included
- Structured Logging: JSON format for production
- Log Levels: Debug, Info, Warn, Error
- Correlation IDs: Request tracing support
- Algorithm: AES-GCM 256-bit encryption
- Key Derivation: PBKDF2 with 100,000 iterations
- Scope: Sensitive fields (API keys, passwords, tokens)
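The following self-contained Go sketch illustrates the scheme described above: a PBKDF2-style key derivation (single HMAC-SHA256 block, which suffices for a 32-byte key) feeding AES-256-GCM. It demonstrates the technique only; the function names are not the server's actual API, and production code would typically use `golang.org/x/crypto/pbkdf2`.

```go
package main

import (
	"crypto/aes"
	"crypto/cipher"
	"crypto/hmac"
	"crypto/rand"
	"crypto/sha256"
	"encoding/binary"
	"fmt"
)

// deriveKey is a minimal single-block PBKDF2-HMAC-SHA256.
func deriveKey(password, salt []byte, iterations int) []byte {
	block := make([]byte, len(salt)+4)
	copy(block, salt)
	binary.BigEndian.PutUint32(block[len(salt):], 1) // block index INT(1)

	mac := hmac.New(sha256.New, password)
	mac.Write(block)
	u := mac.Sum(nil) // U1

	out := make([]byte, len(u))
	copy(out, u)
	for i := 1; i < iterations; i++ {
		mac.Reset()
		mac.Write(u)
		u = mac.Sum(nil) // U(i+1) = HMAC(P, U(i))
		for j := range out {
			out[j] ^= u[j]
		}
	}
	return out // 32 bytes: an AES-256 key
}

// Encrypt seals plaintext with AES-256-GCM; the random nonce is prepended.
func Encrypt(plaintext, key []byte) ([]byte, error) {
	blk, err := aes.NewCipher(key)
	if err != nil {
		return nil, err
	}
	gcm, err := cipher.NewGCM(blk)
	if err != nil {
		return nil, err
	}
	nonce := make([]byte, gcm.NonceSize())
	if _, err := rand.Read(nonce); err != nil {
		return nil, err
	}
	return gcm.Seal(nonce, nonce, plaintext, nil), nil
}

// Decrypt reverses Encrypt, authenticating the ciphertext in the process.
func Decrypt(ciphertext, key []byte) ([]byte, error) {
	blk, err := aes.NewCipher(key)
	if err != nil {
		return nil, err
	}
	gcm, err := cipher.NewGCM(blk)
	if err != nil {
		return nil, err
	}
	nonce, ct := ciphertext[:gcm.NonceSize()], ciphertext[gcm.NonceSize():]
	return gcm.Open(nil, nonce, ct, nil)
}

func main() {
	key := deriveKey([]byte("passphrase"), []byte("per-repo-salt"), 100000)
	ct, _ := Encrypt([]byte("sk-secret-api-key"), key)
	pt, _ := Decrypt(ct, key)
	fmt.Println(string(pt)) // prints sk-secret-api-key
}
```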
- Repository-Level: Isolated access per repository
- User Authentication: Token-based authentication
- Permission System: Read/Write/Admin permissions
- Default: 60 requests per minute per user
- Burst: 10 requests burst capacity
- Distributed: Redis-backed rate limiting
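A minimal in-memory token bucket illustrates the 60-per-minute/burst-10 policy; treat this as a sketch, since the real server backs rate limiting with Redis for distributed enforcement.

```go
package main

import (
	"fmt"
	"sync"
	"time"
)

// TokenBucket refills at a steady rate up to a burst capacity.
type TokenBucket struct {
	mu       sync.Mutex
	tokens   float64
	capacity float64
	refill   float64 // tokens added per second
	last     time.Time
}

func NewTokenBucket(perMinute, burst float64) *TokenBucket {
	return &TokenBucket{
		tokens:   burst,
		capacity: burst,
		refill:   perMinute / 60,
		last:     time.Now(),
	}
}

// Allow reports whether one request may proceed, consuming a token.
func (b *TokenBucket) Allow() bool {
	b.mu.Lock()
	defer b.mu.Unlock()
	now := time.Now()
	b.tokens += now.Sub(b.last).Seconds() * b.refill
	if b.tokens > b.capacity {
		b.tokens = b.capacity
	}
	b.last = now
	if b.tokens >= 1 {
		b.tokens--
		return true
	}
	return false
}

func main() {
	limiter := NewTokenBucket(60, 10) // 60 req/min, burst of 10
	allowed := 0
	for i := 0; i < 12; i++ {
		if limiter.Allow() {
			allowed++
		}
	}
	fmt.Println(allowed) // prints 10: burst consumed, remainder throttled
}
```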
Access the modern web UI at http://localhost:8082/ to:
- Browse and search memories
- View memory details and metadata
- Trace sessions with timeline visualization
- Explore related memories with relationship graphs
- Filter by repository, type, and time period
The GraphQL endpoint is available at http://localhost:8082/graphql with a built-in GraphiQL playground.
```graphql
# Search memories
query SearchMemories($input: MemoryQueryInput!) {
  search(input: $input) {
    chunks {
      chunk { id content summary type timestamp }
      score
    }
  }
}

# Trace a session
query TraceSession($sessionId: String!) {
  traceSession(sessionId: $sessionId) {
    id content type timestamp
  }
}

# Find related memories
query TraceRelated($chunkId: String!, $depth: Int) {
  traceRelated(chunkId: $chunkId, depth: $depth) {
    id content type timestamp
  }
}

# Store a memory
mutation StoreChunk($input: StoreChunkInput!) {
  storeChunk(input: $input) {
    id summary
  }
}
```

Note: The MCP tools are still available, but the GraphQL API is now the recommended interface for most use cases.
The server implements the following MCP tools with the standardized naming convention:
Store a conversation or context in memory.

```json
{
  "content": "User asked about implementing authentication",
  "metadata": {
    "type": "conversation",
    "tags": ["auth", "security"],
    "project": "my-app"
  }
}
```

Search for similar conversations or contexts using vector similarity.

```json
{
  "query": "authentication implementation",
  "limit": 10,
  "threshold": 0.7,
  "project": "my-app"
}
```

List all stored memories with optional filtering.

```json
{
  "project": "my-app",
  "limit": 20,
  "offset": 0
}
```

Delete specific memories by ID.

```json
{
  "id": "memory-id-123"
}
```

Get AI-powered context suggestions based on current context.

```json
{
  "current_context": "implementing user login",
  "project": "my-app"
}
```

Analyze conversation patterns and trends.

```json
{
  "project": "my-app",
  "time_range": "7d"
}
```

Export all memory for a project.

```json
{
  "project": "my-app",
  "format": "json",
  "include_vectors": false
}
```

Import conversation context from external sources.

```json
{
  "source": "file",
  "data": "...",
  "project": "my-app"
}
```

Get memory usage statistics.

```json
{
  "project": "my-app"
}
```

Update metadata for existing memories.

```json
{
  "id": "memory-id-123",
  "metadata": {
    "tags": ["updated", "important"]
  }
}
```

```bash
# Install dependencies
go mod download

# Run tests
go test ./...

# Run tests with coverage
go test -cover ./...

# Run linting
golangci-lint run

# Build binary
go build -o bin/mcp-memory cmd/server/main.go

# Run with race detector
go run -race cmd/server/main.go
```

```bash
# Format code
go fmt ./...

# Vet code
go vet ./...

# Generate mocks (if using mockgen)
go generate ./...

# Run specific tests
go test -run TestFunctionName ./internal/...

# Benchmark tests
go test -bench=. ./...
```

- Builder Stage: Go compilation with optimizations
- Runtime Stage: Alpine Linux minimal image
- Security: Non-root user, minimal attack surface
- Size: <50MB final image
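A multi-stage Dockerfile following that pattern might look like the sketch below; the repository's actual Dockerfile may differ in details such as the entrypoint path and user setup.

```dockerfile
# Builder stage: compile a static binary with stripped debug info
FROM golang:1.21-alpine AS builder
WORKDIR /src
COPY go.mod go.sum ./
RUN go mod download
COPY . .
RUN CGO_ENABLED=0 go build -ldflags="-s -w" -o /mcp-memory ./cmd/server

# Runtime stage: minimal Alpine image, non-root user
FROM alpine:3.19
RUN adduser -D -u 10001 app
COPY --from=builder /mcp-memory /usr/local/bin/mcp-memory
USER app
EXPOSE 8080 8081 8082
ENTRYPOINT ["/usr/local/bin/mcp-memory"]
```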
- mcp-memory: Main MCP server application
- chroma: Vector database for embeddings storage
- postgres: Metadata database (optional)
- redis: Distributed cache (optional)
- prometheus: Metrics collection
- grafana: Metrics visualization with pre-built dashboards
- traefik: Reverse proxy with automatic SSL
- Memory Operations: >10,000 ops/sec
- Vector Search: <100ms p95 latency
- Concurrent Users: 1,000+ simultaneous connections
- Memory Usage: <500MB typical workload
- Multi-Level Caching: Memory, Query, and Vector caches
- Connection Pooling: Database connection management
- Batch Processing: Efficient bulk operations
- Graceful Degradation: Fallback strategies
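As an illustration of the LRU strategy listed above, here is a minimal (non-concurrent) Go cache: a doubly linked list orders entries by recency while a map gives O(1) lookup. This is a sketch, not the server's implementation.

```go
package main

import (
	"container/list"
	"fmt"
)

type entry struct {
	key   string
	value any
}

// lruCache evicts the least recently used entry once capacity is reached.
type lruCache struct {
	capacity int
	order    *list.List               // front = most recently used
	items    map[string]*list.Element // key -> element in order
}

func newLRU(capacity int) *lruCache {
	return &lruCache{
		capacity: capacity,
		order:    list.New(),
		items:    make(map[string]*list.Element),
	}
}

// Get returns the cached value and marks the entry as recently used.
func (c *lruCache) Get(key string) (any, bool) {
	el, ok := c.items[key]
	if !ok {
		return nil, false
	}
	c.order.MoveToFront(el)
	return el.Value.(*entry).value, true
}

// Put inserts or updates a value, evicting the oldest entry if full.
func (c *lruCache) Put(key string, value any) {
	if el, ok := c.items[key]; ok {
		el.Value.(*entry).value = value
		c.order.MoveToFront(el)
		return
	}
	if c.order.Len() >= c.capacity {
		oldest := c.order.Back()
		c.order.Remove(oldest)
		delete(c.items, oldest.Value.(*entry).key)
	}
	c.items[key] = c.order.PushFront(&entry{key, value})
}

func main() {
	cache := newLRU(2)
	cache.Put("q1", "result-1")
	cache.Put("q2", "result-2")
	cache.Get("q1")             // touch q1 so q2 becomes oldest
	cache.Put("q3", "result-3") // evicts q2
	_, ok := cache.Get("q2")
	fmt.Println(ok) // prints false: q2 was evicted
}
```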
- Schedule: Configurable interval (default: 24h)
- Retention: Configurable retention period (default: 30 days)
- Compression: gzip compression to reduce storage
- Encryption: Optional backup encryption
```bash
# Create backup
curl -X POST http://localhost:8080/api/backup

# List backups
curl http://localhost:8080/api/backups

# Restore backup
curl -X POST http://localhost:8080/api/restore \
  -H "Content-Type: application/json" \
  -d '{"backup_id": "backup-20241201-120000"}'
```

- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'feat: add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request

- `feat:` New features
- `fix:` Bug fixes
- `docs:` Documentation changes
- `style:` Code style changes
- `refactor:` Code refactoring
- `test:` Test changes
- `build:` Build system changes
This project is licensed under the MIT License - see the LICENSE file for details.
- Anthropic for the Model Context Protocol specification
- Chroma for the high-performance vector database
- OpenAI for embedding model APIs
- Prometheus & Grafana communities for monitoring tools
- Go community for excellent libraries and tooling
- GitHub Issues: Report bugs and request features
- Documentation: Full documentation
- Discord: Community support
- MCP Specification - Official Model Context Protocol documentation
- chroma-go - Go client for Chroma vector database
- Claude Desktop - Desktop application with MCP support
Made with ❤️ for the MCP ecosystem