Add Docker networking support for N8N and container integration

- Added docker-compose.api-localai.yml for Docker network integration
- Updated config.py to support dynamic Supabase connection strings via environment variables
- Enhanced documentation with Docker network deployment instructions
- Added specific N8N workflow integration guidance
- Resolved Docker networking issues that blocked container-to-container communication (a deployment sketch follows this list)
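
A minimal sketch of how these pieces fit together when deploying onto the shared network; the environment variable name and the `supabase-db` host are assumptions, since this excerpt does not show which keys config.py actually reads or whether the compose file passes them through:

```bash
# Point the API at the Supabase instance on the shared network.
# Variable name and host are illustrative; config.py's actual keys are not
# shown in this commit, and the compose file must forward the variable.
export SUPABASE_CONNECTION_STRING="postgresql://postgres:postgres@supabase-db:5432/postgres"

# Bring the API up on the localai network via the new compose file
docker-compose -f docker-compose.api-localai.yml up -d

# Confirm the container joined the network
docker network inspect localai --format '{{range .Containers}}{{.Name}} {{end}}'
```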

Key improvements:
* Container-to-container API access for N8N workflows (see the reachability check after this list)
* Automatic service dependency resolution (Ollama, Supabase)
* Comprehensive deployment options for different use cases
* Production-ready Docker network configuration
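
A quick way to verify the container-to-container path from the N8N side might look like the following; the `n8n` container name and the presence of a BusyBox `wget` in that image are assumptions:

```bash
# Resolve the API container's address on the shared network
API_IP=$(docker inspect mem0-api-localai \
  --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}')

# Fetch the interactive docs from inside the N8N container to confirm
# Docker-network reachability (no host port mapping involved)
docker exec n8n wget -qO- "http://${API_IP}:8080/docs" | head -c 200
```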

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>

@@ -75,9 +75,28 @@ Our Phase 2 implementation provides a production-ready REST API with two deploym
The Docker deployment automatically configures the API to accept external connections on `0.0.0.0:8080`.
</Note>
</Tab>
<Tab title="Docker Network Integration ✅ For N8N/Containers">
For integration with N8N or other Docker containers on custom networks:
```bash
# Deploy to localai network (or your custom network)
docker-compose -f docker-compose.api-localai.yml up -d
# Find container IP for connections
docker inspect mem0-api-localai --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}'
```
**Access:** http://CONTAINER_IP:8080 (from within Docker network)
**Example:** http://172.21.0.17:8080
<Note>
Perfect for N8N workflows and Docker-to-Docker communication. Automatically handles service dependencies like Ollama and Supabase connections.
</Note>
</Tab>
</Tabs>
-Both options provide:
+All deployment options provide:
- Interactive documentation at `/docs`
- Full authentication and rate limiting
- Comprehensive error handling
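
To round this out, here is roughly what an N8N HTTP Request node (or any other container on the `localai` network) would send; the `/memories` route and bearer-token header are assumptions, so the interactive `/docs` page remains the authoritative reference for the actual endpoints and auth scheme:

```bash
# Cheap liveness check against the documented example address
curl -s -o /dev/null -w "%{http_code}\n" http://172.21.0.17:8080/docs

# Hypothetical authenticated call; adjust the route and header to match /docs
curl -s http://172.21.0.17:8080/memories \
  -H "Authorization: Bearer ${MEM0_API_KEY}" \
  -H "Content-Type: application/json"
```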