Add Docker support and fix external access issues

🐳 Docker Configuration:
- Created Dockerfile for containerized API deployment
- Added docker-compose.api.yml for complete stack
- Added requirements.txt for Docker builds
- Added .dockerignore for optimized builds
- Configured external access on 0.0.0.0:8080

📚 Documentation Updates:
- Updated quickstart to reflect that Neo4j is already running
- Added Docker deployment tabs with external access info
- Updated REST API docs with Docker deployment options
- Clarified local vs external access deployment methods

🔧 Configuration:
- API_HOST=0.0.0.0 for external access in Docker
- Health checks and restart policies
- Proper networking and volume configuration
- Environment variable configuration

Addresses user issues:
- REST API now accessible from outside the machine via Docker (see the quick external check sketched below)
- Documentation reflects actual infrastructure state
- Clear deployment options for different use cases
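A minimal way to confirm external reachability once the stack is running (a sketch; it assumes the host firewall allows inbound traffic on port 8080 and uses only the `/health` and `/docs` endpoints referenced elsewhere in this commit):

```bash
# Run from a machine other than the Docker host; substitute YOUR_SERVER_IP.
# /health is the same endpoint the container health checks use.
curl -f http://YOUR_SERVER_IP:8080/health

# The interactive documentation at /docs should also be reachable externally.
curl -I http://YOUR_SERVER_IP:8080/docs
```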

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
This commit is contained in:
Docker Config Backup
2025-07-31 17:43:45 +02:00
parent 801ae75069
commit e2899a2bd0
6 changed files with 252 additions and 15 deletions

.dockerignore (new file)

@@ -0,0 +1,63 @@
# Git
.git
.gitignore
# Python
__pycache__
*.pyc
*.pyo
*.pyd
.Python
env
pip-log.txt
pip-delete-this-directory.txt
.tox
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.log
.git
.mypy_cache
.pytest_cache
.hypothesis
# Virtual environments
venv/
.venv/
ENV/
env/
# IDE
.vscode/
.idea/
*.swp
*.swo
*~
# OS
.DS_Store
.DS_Store?
._*
.Spotlight-V100
.Trashes
ehthumbs.db
Thumbs.db
# Project specific
logs/
*.log
test_*.py
docs/
examples/
tests/
backup/
*.md
!README.md
# Docker
Dockerfile*
docker-compose*.yml
.dockerignore

Dockerfile (new file)

@@ -0,0 +1,49 @@
FROM python:3.10-slim
# Set working directory
WORKDIR /app
# Install system dependencies (curl is needed for the HEALTHCHECK below)
RUN apt-get update && apt-get install -y \
    gcc \
    g++ \
    curl \
    && rm -rf /var/lib/apt/lists/*
# Copy the dependency manifest and install Python dependencies from it
COPY requirements.txt* ./
RUN pip install --no-cache-dir -r requirements.txt
# Copy application code
COPY . .
# Create non-root user
RUN useradd -m -u 1000 appuser && chown -R appuser:appuser /app
USER appuser
# Expose port
EXPOSE 8080
# Environment variables with defaults
ENV API_HOST=0.0.0.0
ENV API_PORT=8080
ENV API_KEYS=mem0_dev_key_123456789,mem0_docker_key_987654321
ENV ADMIN_API_KEYS=mem0_admin_key_111222333
ENV RATE_LIMIT_REQUESTS=100
ENV RATE_LIMIT_WINDOW_MINUTES=1
# Health check
HEALTHCHECK --interval=30s --timeout=30s --start-period=5s --retries=3 \
CMD curl -f http://localhost:8080/health || exit 1
# Start the API server
CMD ["python", "start_api.py"]

docker-compose.api.yml (new file)

@@ -0,0 +1,53 @@
version: '3.8'

services:
  mem0-api:
    build: .
    container_name: mem0-api-server
    ports:
      - "8080:8080"
    environment:
      - API_HOST=0.0.0.0
      - API_PORT=8080
      - API_KEYS=mem0_dev_key_123456789,mem0_docker_key_987654321
      - ADMIN_API_KEYS=mem0_admin_key_111222333
      - RATE_LIMIT_REQUESTS=100
      - RATE_LIMIT_WINDOW_MINUTES=1
    networks:
      - mem0-network
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8080/health"]
      interval: 30s
      timeout: 10s
      retries: 3
    depends_on:
      - neo4j
    volumes:
      - ./logs:/app/logs:rw

  neo4j:
    image: neo4j:5.23
    container_name: mem0-neo4j-api
    ports:
      - "7474:7474"
      - "7687:7687"
    environment:
      - NEO4J_AUTH=neo4j/password123
      - NEO4J_PLUGINS=["apoc"]
      - NEO4J_dbms_security_procedures_unrestricted=apoc.*
    volumes:
      - neo4j_data:/data
      - neo4j_logs:/logs
    networks:
      - mem0-network
    restart: unless-stopped

networks:
  mem0-network:
    driver: bridge
    external: false

volumes:
  neo4j_data:
  neo4j_logs:
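For reference, a minimal bring-up and status check for this stack (assuming the file is saved as `docker-compose.api.yml` in the project root, as in the docs below):

```bash
# Build the API image and start both services in the background.
docker-compose -f docker-compose.api.yml up -d

# Confirm both containers are up and the API health check is passing.
docker-compose -f docker-compose.api.yml ps

# Neo4j HTTP (7474) and the API health endpoint (8080) are published on the host.
curl http://localhost:7474
curl -f http://localhost:8080/health
```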

REST API documentation (modified)

@@ -68,14 +68,34 @@ Mem0 provides a comprehensive REST API server built with FastAPI. The implementa
```
</Tab>
-<Tab title="With Docker (Future)">
-Docker support is planned for Phase 3.
+<Tab title="With Docker ✅ Recommended for External Access">
+For external access and production deployment:
```bash
-# Coming soon in Phase 3
-docker build -t mem0-api-server .
-docker run -p 8080:8080 mem0-api-server
+# Using Docker Compose (recommended)
+docker-compose -f docker-compose.api.yml up -d
```
+Or build and run manually:
+```bash
+# Build the image
+docker build -t mem0-api-server .
+# Run with external access
+docker run -d \
+  --name mem0-api \
+  -p 8080:8080 \
+  -e API_HOST=0.0.0.0 \
+  -e API_PORT=8080 \
+  mem0-api-server
+```
+**Access:** http://YOUR_SERVER_IP:8080 (accessible from external networks)
+<Note>
+The Docker deployment automatically configures external access on `0.0.0.0:8080`.
+</Note>
</Tab>
</Tabs>

Quickstart documentation (modified)

@@ -16,16 +16,28 @@ description: 'Get your Mem0 Memory System running in under 5 minutes'
## Installation
-### Step 1: Start Database Services
+### Step 1: Verify Database Services
-```bash
-docker compose up -d neo4j
-```
+Both required database services are already running:
<Note>
-Supabase is already running as part of your existing infrastructure on the localai network.
+**Neo4j** is already running in Docker container `mem0-neo4j` on ports 7474 (HTTP) and 7687 (Bolt).
+**Supabase** is already running as part of your existing infrastructure on the localai network.
</Note>
+You can verify the services are running:
+```bash
+# Check running containers
+docker ps | grep -E "(neo4j|supabase)"
+# Test Neo4j connection
+curl http://localhost:7474
+# Test Supabase connection
+curl http://localhost:8000/health
+```
### Step 2: Test Your Installation
```bash
@@ -36,13 +48,36 @@ You should see all systems passing.
### Step 3: Start the REST API Server ✅
-Our Phase 2 implementation provides a production-ready REST API:
+Our Phase 2 implementation provides a production-ready REST API with two deployment options:
-```bash
-python start_api.py
-```
+<Tabs>
+<Tab title="Direct Python (Local Only)">
+For local development and testing:
-The server will start on **http://localhost:8080** with:
+```bash
+python start_api.py
+```
+**Access:** http://localhost:8080 (localhost only)
+</Tab>
+<Tab title="Docker (External Access) ✅ Recommended">
+For external access and production deployment:
+```bash
+# Build and start the API server
+docker-compose -f docker-compose.api.yml up -d
+```
+**Access:** http://YOUR_SERVER_IP:8080 (accessible from outside)
+<Note>
+The Docker deployment automatically configures the API to accept external connections on `0.0.0.0:8080`.
+</Note>
+</Tab>
+</Tabs>
+Both options provide:
- Interactive documentation at `/docs`
- Full authentication and rate limiting
- Comprehensive error handling

requirements.txt (new file)

@@ -0,0 +1,17 @@
# Core API dependencies
fastapi>=0.104.0
uvicorn[standard]>=0.24.0
pydantic>=2.5.0
# Mem0 dependencies
mem0ai>=0.1.115
posthog>=3.5.0
qdrant-client>=1.9.1
sqlalchemy>=2.0.31
vecs>=0.4.0
ollama>=0.1.0
# Additional utilities
requests
httpx
python-dateutil
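For the non-Docker path, these pins also support a plain local install; a minimal sketch (assuming a virtual environment and the `start_api.py` entry point referenced by the Dockerfile):

```bash
# Local (non-Docker) setup using the same pinned dependencies.
python -m venv .venv && source .venv/bin/activate
pip install -r requirements.txt

# Serves localhost:8080 by default; set API_HOST=0.0.0.0 to allow external access.
python start_api.py
```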