- Configure mem0 to use self-hosted Supabase instead of Qdrant for vector storage
- Update docker-compose to connect containers to the localai network
- Install the vecs library for Supabase pgvector integration
- Create a comprehensive test suite for the Supabase + mem0 integration
- Update documentation to reflect the Supabase configuration
- All containers are now connected to the shared localai network
- Vector storage and retrieval tests completed successfully

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
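The first items above cover pointing mem0's vector store at a self-hosted Supabase (pgvector via the vecs library) instead of Qdrant. The sketch below shows roughly what such a configuration could look like; the `supabase` provider id, the option keys, and the `SUPABASE_DB_CONNECTION_STRING` variable are assumptions for illustration (vecs talks to Postgres directly, so it needs a connection string rather than the anon key listed in the environment template that follows), not code taken from this change.

```python
# Hypothetical sketch: pointing mem0 at a self-hosted Supabase (pgvector) instance.
# The provider id and option keys are assumptions -- check the mem0 docs for the
# exact vector-store configuration your version supports.
import os

from mem0 import Memory

config = {
    "vector_store": {
        "provider": "supabase",  # assumed id for the pgvector/vecs-backed store
        "config": {
            # Direct Postgres connection string for the self-hosted Supabase DB
            # (placeholder variable name; not part of the template below)
            "connection_string": os.environ["SUPABASE_DB_CONNECTION_STRING"],
            "collection_name": "mem0_memories",
        },
    },
    "llm": {
        "provider": "openai",
        "config": {"api_key": os.environ["OPENAI_API_KEY"]},
    },
}

memory = Memory.from_config(config)

# Quick smoke test: store a memory, then retrieve it via vector search.
memory.add("The local AI stack stores vectors in Supabase.", user_id="smoke-test")
print(memory.search("Where are vectors stored?", user_id="smoke-test"))
```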
14 lines · 371 B · Plaintext
```
# OpenAI Configuration (for initial testing)
OPENAI_API_KEY=your_openai_api_key_here

# Supabase Configuration
SUPABASE_URL=your_supabase_url_here
SUPABASE_ANON_KEY=your_supabase_anon_key_here

# Neo4j Configuration
NEO4J_URI=bolt://localhost:7687
NEO4J_USERNAME=neo4j
NEO4J_PASSWORD=your_neo4j_password_here

# Ollama Configuration
OLLAMA_BASE_URL=http://localhost:11434
```
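The commit also notes that vector storage and retrieval tests were run against Supabase. The snippet below is an illustrative round-trip check using the vecs library directly, not the test suite referenced in the commit; the collection name, dimension, and connection-string variable are placeholders.

```python
# Illustrative round-trip check (not the referenced test suite): upsert a vector
# into a Supabase/pgvector collection via vecs and query it back. Names, the
# dimension, and the connection-string variable are placeholder assumptions.
import os

import vecs

DB_CONNECTION = os.environ["SUPABASE_DB_CONNECTION_STRING"]  # e.g. postgresql://user:pass@host:5432/postgres


def test_supabase_vector_roundtrip():
    client = vecs.create_client(DB_CONNECTION)
    docs = client.get_or_create_collection(name="mem0_smoke_test", dimension=3)

    # upsert takes (id, vector, metadata) tuples
    docs.upsert(records=[("vec1", [0.1, 0.2, 0.3], {"source": "smoke-test"})])
    docs.create_index()  # build an index so queries go through pgvector indexing

    # Querying with the stored vector should return its id
    results = docs.query(data=[0.1, 0.2, 0.3], limit=1)
    assert "vec1" in results

    client.disconnect()
```

In the actual change, the same kind of check would presumably go through mem0's add/search API instead, so that embedding, storage in pgvector, and retrieval are exercised end to end.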