Implementation Summary:
- REST API with FastAPI (complete CRUD operations)
- MCP Server with Python MCP SDK (7 tools)
- Supabase migrations (pgvector setup)
- Docker Compose orchestration
- Mintlify documentation site
- Environment configuration
- Shared config module
REST API Features:
- POST /v1/memories/ - Add memory
- GET /v1/memories/search - Semantic search
- GET /v1/memories/{id} - Get memory
- GET /v1/memories/user/{user_id} - User memories
- PATCH /v1/memories/{id} - Update memory
- DELETE /v1/memories/{id} - Delete memory
- GET /v1/health - Health check
- GET /v1/stats - Statistics
- Bearer token authentication
- OpenAPI documentation
MCP Server Tools:
- add_memory - Add from messages
- search_memories - Semantic search
- get_memory - Retrieve by ID
- get_all_memories - List all
- update_memory - Update content
- delete_memory - Delete by ID
- delete_all_memories - Bulk delete
Infrastructure:
- Neo4j 5.26 with APOC/GDS
- Supabase pgvector integration
- Docker network: localai
- Health checks and monitoring
- Structured logging
Documentation:
- Introduction page
- Quickstart guide
- Architecture deep dive
- Mintlify configuration
T6 Mem0 v2 - Memory System for LLM Applications
A comprehensive memory system built on mem0.ai, featuring MCP server integration, a REST API, a hybrid storage architecture, and AI-powered memory management.
Features
- MCP Server: Model Context Protocol integration for Claude Code and other AI tools
- REST API: Full HTTP API for memory operations (CRUD)
- Hybrid Storage: Supabase (pgvector) + Neo4j (graph relationships)
- AI-Powered: OpenAI embeddings and LLM processing
- Multi-Agent Support: User and agent-specific memory isolation
- Graph Visualization: Neo4j Browser for relationship exploration
- Docker-Native: Fully containerized with Docker Compose
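At its core, the semantic search feature reduces to comparing embedding vectors. A minimal illustrative sketch of cosine similarity in pure Python (in production, pgvector computes this in-database via its distance operators, and the embeddings come from OpenAI):

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Vectors pointing the same way score 1.0; orthogonal vectors score 0.0.
print(cosine_similarity([1.0, 0.0], [2.0, 0.0]))  # → 1.0
```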
Architecture
Clients (Claude, N8N, Apps)
↓
MCP Server (8765) + REST API (8080)
↓
Mem0 Core Library
↓
Supabase (Vector) + Neo4j (Graph) + OpenAI (LLM)
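A hedged sketch of the write path the diagram implies: the core library embeds the text, writes the vector to the vector store, and records entity relationships in the graph store. All names below are illustrative stand-ins, not the actual mem0 internals:

```python
from dataclasses import dataclass, field

@dataclass
class HybridStore:
    """In-memory stand-in for Supabase (vectors) + Neo4j (graph relationships)."""
    vectors: dict = field(default_factory=dict)  # memory_id -> embedding
    edges: list = field(default_factory=list)    # (subject, relation, object) triples

    def add(self, memory_id: str, embedding: list, relations: list) -> None:
        self.vectors[memory_id] = embedding  # vector-store write (Supabase/pgvector)
        self.edges.extend(relations)         # graph-store write (Neo4j)

store = HybridStore()
store.add("m1", [0.1, 0.2], [("alice", "LIKES", "pizza")])
```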
Quick Start
Prerequisites
- Docker and Docker Compose
- Existing Supabase instance (PostgreSQL with pgvector)
- OpenAI API key
- Python 3.11+ (for development)
Installation
# Clone repository
git clone https://git.colsys.tech/klas/t6_mem0_v2
cd t6_mem0_v2
# Configure environment
cp .env.example .env
# Edit .env with your credentials
# Start services
docker compose up -d
# Verify health
curl http://localhost:8080/v1/health
Configuration
Create .env file:
# OpenAI
OPENAI_API_KEY=sk-...
# Supabase
SUPABASE_CONNECTION_STRING=postgresql://user:pass@172.21.0.12:5432/postgres
# Neo4j
NEO4J_URI=neo4j://neo4j:7687
NEO4J_USER=neo4j
NEO4J_PASSWORD=your-password
# API
API_KEY=your-secure-api-key
Usage
REST API
# Add memory
curl -X POST http://localhost:8080/v1/memories/ \
-H "Authorization: Bearer YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{"messages":[{"role":"user","content":"I love pizza"}],"user_id":"alice"}'
# Search memories
curl -X GET "http://localhost:8080/v1/memories/search?query=food&user_id=alice" \
-H "Authorization: Bearer YOUR_API_KEY"
MCP Server (Claude Code)
Add to Claude Code configuration:
{
"mcpServers": {
"t6-mem0": {
"url": "http://localhost:8765/mcp/claude/sse/user-123"
}
}
}
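Note that the SSE URL embeds a user id (`user-123` above), giving each user an isolated memory scope. A small helper for generating per-user config entries (the helper itself is illustrative, not part of the project):

```python
import json

def mcp_config(user_id: str, host: str = "http://localhost:8765") -> dict:
    """Build a Claude Code mcpServers entry pointing at this server's SSE endpoint."""
    return {"mcpServers": {"t6-mem0": {"url": f"{host}/mcp/claude/sse/{user_id}"}}}

print(json.dumps(mcp_config("user-123"), indent=2))
```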
Documentation
Full documentation is available in docs/ (Mintlify).
Project Structure
t6_mem0_v2/
├── api/ # REST API (FastAPI)
├── mcp-server/ # MCP server implementation
├── migrations/ # Database migrations
├── docker/ # Docker configurations
├── docs/ # Mintlify documentation
├── tests/ # Test suites
└── docker-compose.yml
Technology Stack
- Core: mem0ai library
- Vector DB: Supabase with pgvector
- Graph DB: Neo4j 5.x
- LLM: OpenAI API (Phase 1), Ollama (Phase 2)
- REST API: FastAPI
- MCP: Python MCP SDK
- Container: Docker & Docker Compose
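Because mem0 selects LLM, vector-store, and graph-store backends from a configuration dict, the planned Phase 1 → Phase 2 switch from OpenAI to Ollama is mostly a config change. The sketch below follows mem0's documented provider-config shape, but the exact keys and model names are assumptions to verify against the installed mem0ai version:

```python
def mem0_config(provider: str = "openai") -> dict:
    """Illustrative mem0 config; intended for Memory.from_config(...)."""
    if provider == "openai":
        llm = {"provider": "openai", "config": {"model": "gpt-4o-mini"}}   # Phase 1
    else:
        llm = {"provider": "ollama", "config": {"model": "llama3"}}        # Phase 2
    return {
        "llm": llm,
        "vector_store": {"provider": "supabase", "config": {}},  # pgvector-backed
        "graph_store": {"provider": "neo4j", "config": {}},
    }
```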
Roadmap
Phase 1: Foundation (Current)
- ✅ Architecture design
- ⏳ REST API implementation
- ⏳ MCP server implementation
- ⏳ Supabase integration
- ⏳ Neo4j integration
- ⏳ Documentation site
Phase 2: Local LLM
- Local Ollama integration
- Model switching capabilities
- Performance optimization
Phase 3: Advanced Features
- Memory versioning
- Advanced graph queries
- Multi-modal memory support
- Analytics dashboard
Development
# Install dependencies
pip install -r requirements.txt
# Run tests
pytest tests/
# Format code
black .
ruff check .
# Run locally (development)
python -m api.main
Contributing
This is a private project. For issues or suggestions, contact the maintainer.
License
Proprietary - All rights reserved
Support
- Repository: https://git.colsys.tech/klas/t6_mem0_v2
- Documentation: See docs/ directory
- Issues: Contact maintainer
Status: In Development · Version: 0.1.0 · Last Updated: 2025-10-13