# T6 Mem0 v2 - Memory System for LLM Applications

A comprehensive memory system based on mem0.ai, featuring MCP server integration, a REST API, a hybrid storage architecture, and AI-powered memory management.

## Features

- **MCP Server**: Model Context Protocol integration for Claude Code and other AI tools
- **REST API**: Full HTTP API for memory operations (CRUD)
- **Hybrid Storage**: Supabase (pgvector) + Neo4j (graph relationships)
- **AI-Powered**: OpenAI embeddings and LLM processing
- **Multi-Agent Support**: User- and agent-specific memory isolation
- **Graph Visualization**: Neo4j Browser for relationship exploration
- **Docker-Native**: Fully containerized with Docker Compose

## Architecture

```
Clients (Claude, N8N, Apps)
        ↓
MCP Server (8765) + REST API (8080)
        ↓
Mem0 Core Library
        ↓
Supabase (Vector) + Neo4j (Graph) + OpenAI (LLM)
```

## Quick Start

### Prerequisites

- Docker and Docker Compose
- Existing Supabase instance (PostgreSQL with pgvector)
- OpenAI API key
- Python 3.11+ (for development)

### Installation

```bash
# Clone repository
git clone https://git.colsys.tech/klas/t6_mem0_v2
cd t6_mem0_v2

# Configure environment
cp .env.example .env
# Edit .env with your credentials

# Start services
docker compose up -d

# Verify health
curl http://localhost:8080/v1/health
```

### Configuration

Create a `.env` file:

```bash
# OpenAI
OPENAI_API_KEY=sk-...

# Supabase
SUPABASE_CONNECTION_STRING=postgresql://user:pass@172.21.0.12:5432/postgres

# Neo4j
NEO4J_URI=neo4j://neo4j:7687
NEO4J_USER=neo4j
NEO4J_PASSWORD=your-password

# API
API_KEY=your-secure-api-key
```

## Usage

### REST API

```bash
# Add memory
curl -X POST http://localhost:8080/v1/memories/ \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"messages":[{"role":"user","content":"I love pizza"}],"user_id":"alice"}'

# Search memories
curl -X GET "http://localhost:8080/v1/memories/search?query=food&user_id=alice" \
  -H "Authorization: Bearer YOUR_API_KEY"
```

### MCP Server (Claude Code)

Add to the Claude Code configuration:

```json
{
  "mcpServers": {
    "t6-mem0": {
      "url": "http://localhost:8765/mcp/claude/sse/user-123"
    }
  }
}
```

## Documentation

Full documentation is available in `docs/` (Mintlify):

- [Architecture](ARCHITECTURE.md)
- [Project Requirements](PROJECT_REQUIREMENTS.md)
- [API Reference](docs/api/)
- [Deployment Guide](docs/deployment/)

## Project Structure

```
t6_mem0_v2/
├── api/                 # REST API (FastAPI)
├── mcp-server/          # MCP server implementation
├── migrations/          # Database migrations
├── docker/              # Docker configurations
├── docs/                # Mintlify documentation
├── tests/               # Test suites
└── docker-compose.yml
```

## Technology Stack

- **Core**: mem0ai library
- **Vector DB**: Supabase with pgvector
- **Graph DB**: Neo4j 5.x
- **LLM**: OpenAI API (Phase 1), Ollama (Phase 2)
- **REST API**: FastAPI
- **MCP**: Python MCP SDK
- **Container**: Docker & Docker Compose

## Roadmap

### Phase 1: Foundation (Current)

- ✅ Architecture design
- ⏳ REST API implementation
- ⏳ MCP server implementation
- ⏳ Supabase integration
- ⏳ Neo4j integration
- ⏳ Documentation site

### Phase 2: Local LLM

- Local Ollama integration
- Model switching capabilities
- Performance optimization

### Phase 3: Advanced Features

- Memory versioning
- Advanced graph queries
- Multi-modal memory support
- Analytics dashboard

## Development

```bash
# Install dependencies
pip install -r requirements.txt

# Run tests
pytest tests/

# Format code
black .
ruff check .

# Run locally (development)
python -m api.main
```
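For scripted access during development, the same endpoints can be called from Python. The sketch below mirrors the curl examples above: the endpoint paths, request payload, and Bearer-token auth are taken from this README, while the helper names and the use of the `requests` library are illustrative assumptions, not part of the project code.

```python
"""Minimal client sketch for the T6 Mem0 v2 REST API.

Assumes the API is reachable at the default address from the Quick Start
(http://localhost:8080) and that API_KEY matches the value in your .env.
"""
import os

import requests

BASE_URL = "http://localhost:8080/v1"
API_KEY = os.environ.get("API_KEY", "your-secure-api-key")
HEADERS = {"Authorization": f"Bearer {API_KEY}"}


def add_memory(messages: list[dict], user_id: str) -> dict:
    """Store conversation messages as memories for a given user."""
    resp = requests.post(
        f"{BASE_URL}/memories/",
        headers=HEADERS,
        json={"messages": messages, "user_id": user_id},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()


def search_memories(query: str, user_id: str) -> dict:
    """Search a user's memories with a free-text query."""
    resp = requests.get(
        f"{BASE_URL}/memories/search",
        headers=HEADERS,
        params={"query": query, "user_id": user_id},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    # Same data as the curl examples: remember a preference, then search for it.
    add_memory([{"role": "user", "content": "I love pizza"}], user_id="alice")
    print(search_memories("food", user_id="alice"))
```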
## Contributing

This is a private project. For issues or suggestions, contact the maintainer.

## License

Proprietary - All rights reserved

## Support

- Repository: https://git.colsys.tech/klas/t6_mem0_v2
- Documentation: See `docs/` directory
- Issues: Contact maintainer

---

**Status**: In Development
**Version**: 0.1.0
**Last Updated**: 2025-10-13