Initial commit: Project foundation and architecture

- Add project requirements document
- Add comprehensive architecture design
- Add README with quick start guide
- Add .gitignore for Python/Docker/Node

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

PROJECT_REQUIREMENTS.md (new file, 66 lines)
# T6 Mem0 v2 - Project Requirements

## Original User Request

**Date**: 2025-10-13

### Core Objectives

Set up a comprehensive memory system with the following capabilities:
- **MCP Server Integration**: Serve as an MCP server for Claude and other LLM-based systems
- **REST API Access**: Enable memory storage and retrieval via REST API
- **Data Storage**: Use locally running Supabase for primary data storage
- **Graph Visualization**: Use Neo4j for storing and visualizing memory relationships
- **LLM Integration**: Initial phase with OpenAI, future phase with local Ollama instance
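
As a rough illustration of the storage and retrieval flow behind these objectives, the sketch below uses mem0's Python API (`Memory.add` / `Memory.search`). It assumes a default `Memory` instance; the full Supabase + Neo4j + OpenAI wiring is sketched under Technology Stack below, and return shapes should be checked against the mem0 version in use.

```python
# Minimal sketch of the intended add/search flow using mem0's Python API.
# Assumes a Memory instance; a real deployment would use Memory.from_config(...)
# with the Supabase + Neo4j + OpenAI configuration sketched further below.
from mem0 import Memory

memory = Memory()  # default config, requires OPENAI_API_KEY in the environment

# Store a memory for a given user
memory.add("Prefers Neo4j Browser for inspecting graph data", user_id="klas")

# Retrieve memories relevant to a query
results = memory.search("What tools does the user prefer?", user_id="klas")
# Return shape varies across mem0 versions (plain list vs {"results": [...]}).
hits = results["results"] if isinstance(results, dict) else results
for hit in hits:
    print(hit["memory"])
```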

### Technology Stack

**Phase 1 (Initial Implementation)**:
- mem0.ai as the core memory framework
- Supabase (local instance) for vector and structured storage
- Neo4j for graph-based memory relationships
- OpenAI API for embeddings and LLM capabilities
- MCP (Model Context Protocol) server for AI agent integration
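
A configuration sketch for wiring this Phase 1 stack together through mem0's config dict is shown below. The provider names, config keys, ports, and connection strings are assumptions based on mem0's documented config pattern and must be verified against the mem0 release and the local Supabase/Neo4j setup.

```python
# Sketch of a mem0 configuration targeting the Phase 1 stack.
# Provider names, config keys, ports, and credentials are assumptions to be
# confirmed against the mem0 docs and the local services.
import os
from mem0 import Memory

config = {
    "llm": {
        "provider": "openai",
        "config": {"model": "gpt-4o-mini", "api_key": os.environ["OPENAI_API_KEY"]},
    },
    "embedder": {
        "provider": "openai",
        "config": {"model": "text-embedding-3-small"},
    },
    "vector_store": {
        "provider": "supabase",
        "config": {
            # Local Supabase exposes Postgres; this connection string is illustrative.
            "connection_string": "postgresql://postgres:postgres@localhost:54322/postgres",
            "collection_name": "memories",
        },
    },
    "graph_store": {
        "provider": "neo4j",
        "config": {
            "url": "bolt://localhost:7687",
            "username": "neo4j",
            "password": os.environ.get("NEO4J_PASSWORD", "neo4j"),
        },
    },
}

memory = Memory.from_config(config)
```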

**Phase 2 (Future)**:
- Local Ollama integration for LLM independence
- Additional local model support

### Key Requirements

1. **MCP Server**: Must function as an MCP server that can be used by Claude Code and other LLM systems
2. **REST API**: Full REST API for CRUD operations on memories
3. **Local Infrastructure**: All data storage must be local (Supabase, Neo4j)
4. **Visualization**: Neo4j integration for memory graph visualization
5. **Documentation**: Mintlify-based documentation site
6. **Version Control**: Git repository at https://git.colsys.tech/klas/t6_mem0_v2
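
The REST requirement could be met by a thin HTTP layer over mem0. The sketch below assumes FastAPI (a framework choice, not something mandated above) and a shared `Memory` instance; endpoint paths and payload shapes are placeholders for the eventual API design.

```python
# Illustrative REST layer over mem0 using FastAPI (framework is an assumption).
# Paths and payload shapes are placeholders, not the final API contract.
from fastapi import FastAPI
from pydantic import BaseModel
from mem0 import Memory

app = FastAPI(title="t6_mem0_v2 API")
memory = Memory()  # real deployment would use Memory.from_config(...)


class MemoryIn(BaseModel):
    text: str
    user_id: str


@app.post("/memories")
def create_memory(item: MemoryIn):
    # Create: store a new memory for the given user
    return memory.add(item.text, user_id=item.user_id)


@app.get("/memories/search")
def search_memories(q: str, user_id: str):
    # Read: semantic search over the user's memories
    return memory.search(q, user_id=user_id)


@app.delete("/memories/{memory_id}")
def delete_memory(memory_id: str):
    # Delete: remove a single memory by id
    return memory.delete(memory_id=memory_id)
```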

### Repository Information

- **Git Remote**: https://git.colsys.tech/klas/t6_mem0_v2
- **Username**: klas
- **Password**: csjXgew3In

### Project Phases

#### Phase 1: Foundation
- Research and validate mem0.ai capabilities
- Design architecture with Supabase + Neo4j + OpenAI
- Implement core memory storage and retrieval
- Build MCP server interface
- Create REST API endpoints
- Set up Mintlify documentation
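
For the "Build MCP server interface" item, a minimal sketch using the FastMCP helper from the official MCP Python SDK is shown below. The server name, tool names, and return formats are illustrative; the real server would wrap the fully configured mem0 instance.

```python
# Minimal MCP server sketch using FastMCP from the official MCP Python SDK.
# Tool names and return formats are illustrative placeholders.
from mcp.server.fastmcp import FastMCP
from mem0 import Memory

mcp = FastMCP("t6-mem0")
memory = Memory()  # would be Memory.from_config(...) in the real server


@mcp.tool()
def add_memory(text: str, user_id: str) -> str:
    """Store a memory for the given user."""
    memory.add(text, user_id=user_id)
    return "stored"


@mcp.tool()
def search_memory(query: str, user_id: str) -> str:
    """Search stored memories and return matches as plain text."""
    results = memory.search(query, user_id=user_id)
    return str(results)


if __name__ == "__main__":
    mcp.run()  # stdio transport by default, suitable for Claude Code
```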

#### Phase 2: Local LLM Integration
- Integrate Ollama for local model support
- Add model switching capabilities
- Performance optimization for local models
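
In principle the Phase 2 switch could be a configuration-level change. The sketch below assumes mem0 exposes an "ollama" LLM/embedder provider (to be confirmed against the mem0 release in use); model names and the base URL are illustrative defaults for a local Ollama install.

```python
# Sketch of swapping the LLM layer from OpenAI to a local Ollama instance.
# Assumes mem0 ships an "ollama" provider; model names and base URL are
# illustrative defaults, not verified values.
from mem0 import Memory

local_config = {
    "llm": {
        "provider": "ollama",
        "config": {"model": "llama3.1", "ollama_base_url": "http://localhost:11434"},
    },
    "embedder": {
        "provider": "ollama",
        "config": {"model": "nomic-embed-text"},
    },
    # vector_store / graph_store sections stay identical to the Phase 1 config.
}

memory = Memory.from_config(local_config)
```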

### Success Criteria

- Functional MCP server that Claude Code can use
- Working REST API for memory operations
- Memories persisted in local Supabase
- Graph relationships visible in Neo4j
- Complete documentation in Mintlify
- All code versioned in git repository
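
For the Neo4j criterion, graph relationships could be spot-checked with a short Cypher query through the official Python driver, as sketched below. The connection details and the generic match pattern are assumptions; mem0 defines its own graph schema, so the actual labels and relationship types should be inspected in Neo4j Browser.

```python
# Quick verification sketch: list a few relationships written to Neo4j.
# Connection details and the generic MATCH pattern are assumptions.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

with driver.session() as session:
    records = session.run("MATCH (a)-[r]->(b) RETURN a, type(r) AS rel, b LIMIT 10")
    for record in records:
        print(record["a"], record["rel"], record["b"])

driver.close()
```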