Smart memory for AI assistants - A Model Context Protocol (MCP) server that remembers your conversations, learns from patterns, and provides intelligent context suggestions.
Perfect for Claude Desktop, VS Code, Continue, Cursor, and any MCP-compatible AI client.
Clone and start everything:

```bash
git clone https://github.com/fredcamaral/mcp-memory.git
cd mcp-memory
cp .env.example .env
# Edit .env and add your OPENAI_API_KEY

# Local development (builds from source)
docker-compose up -d

# Production with auto-updates (uses registry + Watchtower)
docker-compose -f docker-compose.yml -f docker-compose.prod.yml --profile auto-update up -d
```
Configure your AI client (Claude Desktop, Claude Code, Windsurf, Cursor, etc.):
**Note:** The server communicates through a simple stdio-to-HTTP proxy written in JavaScript, configured below. An SSE endpoint is also available at `:9080/sse`.
```json
{
  "mcpServers": {
    "memory": {
      "type": "stdio",
      "command": "docker",
      "args": ["exec", "-i", "mcp-memory-server", "node", "/app/mcp-proxy.js"]
    }
  }
}
```
Test it! 🎉
- Open your AI client (Claude Desktop, etc.)
- Ask it to store a memory: "Please remember that I prefer TypeScript over JavaScript"
- Later ask: "What do you remember about my coding preferences?"
MCP Memory transforms your AI assistant into a smart companion that:
- 📚 Remembers Everything: Stores all your conversations and contexts across sessions
- 🔍 Smart Search: Finds relevant past conversations using AI-powered similarity search
- 🧠 Pattern Learning: Recognizes your preferences, coding patterns, and decision-making
- 💡 Proactive Suggestions: Automatically suggests relevant context from your history
- 🔄 Cross-Project Intelligence: Learns patterns across all your repositories and projects
Configuration:
Add to your Continue configuration:
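As a sketch, assuming Continue's experimental MCP support (the exact schema varies between Continue versions, so check Continue's own documentation), the entry in `config.json` might look like this:

```json
{
  "experimental": {
    "modelContextProtocolServers": [
      {
        "transport": {
          "type": "stdio",
          "command": "docker",
          "args": ["exec", "-i", "mcp-memory-server", "node", "/app/mcp-proxy.js"]
        }
      }
    ]
  }
}
```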
Once configured, your AI assistant automatically gets these powerful memory abilities:
- Store important moments: `memory_store_chunk` - save conversations, decisions, and solutions
- Smart search: `memory_search` - find similar past conversations and contexts
- Get context: `memory_get_context` - retrieve a project overview and recent activity
- Find patterns: `memory_get_patterns` - identify recurring themes and solutions
- Health monitoring: `memory_health_dashboard` - track memory system effectiveness
- Intelligent decay: `memory_decay_management` - automatically summarize and archive old memories
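Under the hood, each of these tools is invoked with a standard MCP `tools/call` JSON-RPC 2.0 message sent over the proxy's stdio transport. A minimal sketch of what such a request looks like (the argument field names below are illustrative assumptions, not the server's actual schema):

```javascript
// Build an MCP tools/call request (JSON-RPC 2.0).
// NOTE: the argument fields used in the example are hypothetical;
// consult the server's tool schema for the real field names.
function buildToolCall(id, name, args) {
  return JSON.stringify({
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name, arguments: args },
  });
}

// Example: ask the memory server to store a chunk.
const request = buildToolCall(1, "memory_store_chunk", {
  content: "User prefers TypeScript over JavaScript",
});
console.log(request);
```

In practice your AI client generates these messages for you; you only ever interact with the tools through natural language.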
- 🧠 Conversation Flow Detection: Recognizes when you're debugging, implementing, or planning
- 🔗 Relationship Mapping: Automatically links related memories and contexts
- 📊 Pattern Recognition: Learns your coding patterns, preferences, and decision-making
- 💡 Smart Suggestions: Proactively suggests relevant memories based on current context
- 🗂️ Multi-Repository Support: Works across all your projects with intelligent cross-referencing
🔴 "Connection refused" or "Server not responding"
- Make sure the containers are running: `docker ps` should list `mcp-memory-server`
- Check the server logs: `docker logs mcp-memory-server`
- Verify the health endpoint responds: `curl http://localhost:8081/health`
🔴 "OpenAI API errors"
- Check that `OPENAI_API_KEY` is set correctly in your `.env` file
- Verify you have credits in your OpenAI account
- Check network connectivity
🔴 "Memory not persisting"
- Avoid `docker-compose down -v`, which deletes the data volumes; use `docker-compose down` (without `-v`) to keep stored memories
- Check that the Docker volumes are still present: `docker volume ls`
Test the server directly:

```bash
curl http://localhost:8081/health
```
Browse the web interface:
- Open http://localhost:8082 in your browser
- You should see the memory management dashboard
Test with your AI client:
- Ask it to remember something: "Please store that I work on the mcp-memory project"
- Ask it to recall: "What do you remember about my current projects?"
For production use, see the detailed configurations:
- 📖 Full Documentation - Complete guides and API reference
- 🌐 Web Interface - Browse and manage memories
- 📊 GraphQL API - Playground for advanced queries
- 🔍 Health Monitoring - System status and metrics
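If you want to script against the GraphQL API instead of using the playground, a request is just an HTTP POST with a JSON body containing a `query` string. A minimal sketch (the query's field names are assumptions; use the playground's schema explorer to discover the real types):

```javascript
// Build the JSON body for a GraphQL request.
// NOTE: the query below uses hypothetical field names; check the
// playground's schema explorer for the server's actual schema.
function buildGraphQLBody(query, variables) {
  return JSON.stringify({ query, variables });
}

const body = buildGraphQLBody(
  `query Search($q: String!) { search(query: $q) { content } }`,
  { q: "typescript preferences" }
);
console.log(body);
// POST it with: curl -X POST -H "Content-Type: application/json" -d "$BODY" <graphql-endpoint>
```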
We welcome contributions! See Contributing Guide for details.
MIT License - see LICENSE file for details.
🚀 Ready to give your AI assistant a perfect memory? Follow the Quick Start above and you'll be up and running in minutes!
Questions? Open an issue or check our documentation.