---
title: Introduction
description: 'Welcome to the Mem0 Memory System - A comprehensive memory layer for AI agents'
---

<img
  className="block dark:hidden"
  src="/images/hero-light.svg"
  alt="Hero Light"
/>
<img
  className="hidden dark:block"
  src="/images/hero-dark.svg"
  alt="Hero Dark"
/>

## What is Mem0 Memory System?

The Mem0 Memory System is a comprehensive, self-hosted memory layer designed for AI agents and applications. Built on top of the open-source mem0 framework, it provides persistent, intelligent memory capabilities that enhance AI interactions through contextual understanding and knowledge retention.
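
In practice, memories are written and recalled through the mem0 Python SDK. The snippet below is a minimal sketch assuming the upstream `mem0ai` package with its default configuration; see the configuration sketch under Key Features for pointing it at the local stack.

```python
from mem0 import Memory

# Minimal sketch: with no explicit config, mem0 falls back to its default
# providers rather than the local stack described on this page.
memory = Memory()

# Store a memory scoped to a user
memory.add("Alice prefers concise answers and works in UTC+2.", user_id="alice")

# Recall semantically related memories later
related = memory.search("How should I format replies for Alice?", user_id="alice")
print(related)
```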

<CardGroup cols={2}>
  <Card
    title="Local-First Architecture"
    icon="server"
    href="/essentials/architecture"
  >
    Complete local deployment with Ollama, Neo4j, and Supabase for maximum privacy and control
  </Card>
  <Card
    title="Multi-Provider Support"
    icon="plug"
    href="/llm/configuration"
  >
    Seamlessly switch between OpenAI, Ollama, and other LLM providers
  </Card>
  <Card
    title="Graph Memory"
    icon="project-diagram"
    href="/database/neo4j"
  >
    Advanced relationship mapping with Neo4j for contextual memory connections
  </Card>
  <Card
    title="REST API Server ✅"
    icon="code"
    href="/open-source/features/rest-api"
  >
    Production-ready FastAPI server with authentication, rate limiting, and comprehensive testing
  </Card>
</CardGroup>

## Key Features

<AccordionGroup>
  <Accordion title="Vector Memory Storage">
    High-performance vector search using Supabase with pgvector for semantic memory retrieval and similarity matching (see the configuration sketch below).
  </Accordion>

  <Accordion title="Graph Relationships">
    Neo4j-powered knowledge graph for complex entity relationships and contextual memory connections.
  </Accordion>

  <Accordion title="Local LLM Support">
    Full Ollama integration with 20+ local models, including Llama, Qwen, and specialized embedding models.
  </Accordion>

  <Accordion title="REST API Complete ✅">
    Production-ready FastAPI server with comprehensive memory operations, authentication, rate limiting, and testing suites.
  </Accordion>

  <Accordion title="Self-Hosted Privacy">
    Complete local deployment ensuring your data never leaves your infrastructure.
  </Accordion>
</AccordionGroup>
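
The features above are wired together through a single mem0 configuration object. The sketch below shows what a fully local setup might look like; it assumes the upstream mem0 configuration schema, and the connection string, credentials, and model names are placeholders to replace with your own values.

```python
from mem0 import Memory

# Sketch of a fully local configuration (placeholder credentials and models).
# Note: the Neo4j bolt port (7687) differs from the browser port (7474).
config = {
    "vector_store": {
        "provider": "supabase",
        "config": {
            "connection_string": "postgresql://postgres:<password>@localhost:5432/postgres",
            "collection_name": "memories",
        },
    },
    "graph_store": {
        "provider": "neo4j",
        "config": {
            "url": "bolt://localhost:7687",
            "username": "neo4j",
            "password": "<password>",
        },
    },
    "llm": {
        "provider": "ollama",
        "config": {
            "model": "llama3.1:8b",
            "ollama_base_url": "http://localhost:11434",
        },
    },
    "embedder": {
        "provider": "ollama",
        "config": {
            "model": "nomic-embed-text",
            "ollama_base_url": "http://localhost:11434",
        },
    },
}

memory = Memory.from_config(config)
```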

## Architecture Overview

The system consists of several key components working together:

```mermaid
graph TB
    A[AI Applications] --> B[MCP Server]
    B --> C[Memory API]
    C --> D[Mem0 Core]
    D --> E[Vector Store - Supabase]
    D --> F[Graph Store - Neo4j]
    D --> G[LLM Provider]
    G --> H[Ollama Local]
    G --> I[OpenAI/Remote]
```

## Current Status: Phase 2 Complete ✅

<Note>
  **REST API Ready**: Complete FastAPI implementation with authentication, testing, and documentation.
</Note>
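
Services that cannot embed the SDK directly can use the HTTP interface instead. The sketch below assumes the server follows the upstream mem0 REST endpoints (`POST /memories`, `POST /search`); the base URL, port, and auth header are placeholders for your deployment.

```python
import requests

# Placeholders: adjust the base URL, port, and auth scheme for your deployment.
BASE_URL = "http://localhost:8888"
HEADERS = {"Authorization": "Bearer <your-api-key>"}

# Create a memory
created = requests.post(
    f"{BASE_URL}/memories",
    json={
        "messages": [{"role": "user", "content": "Alice prefers concise answers."}],
        "user_id": "alice",
    },
    headers=HEADERS,
)
created.raise_for_status()

# Search memories
found = requests.post(
    f"{BASE_URL}/search",
    json={"query": "How should I reply to Alice?", "user_id": "alice"},
    headers=HEADERS,
)
print(found.json())
```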

| Component | Status | Description |
|-----------|--------|-------------|
| **Neo4j** | ✅ Ready | Graph database running on localhost:7474 |
| **Supabase** | ✅ Ready | Self-hosted database with pgvector on localhost:8000 |
| **Ollama** | ✅ Ready | 21+ local models available on localhost:11434 |
| **Mem0 Core** | ✅ Ready | Memory management system v0.1.115 |
| **REST API** | ✅ Ready | FastAPI server with full CRUD, auth, testing, and Docker networking support |
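
Before wiring an agent in, you can confirm each component is reachable on the ports listed above. A small standard-library sketch; the REST API port is a placeholder since it depends on how the server is exposed.

```python
import urllib.error
import urllib.request

# Ports taken from the status table above; the REST API port is a placeholder.
services = {
    "Neo4j browser": "http://localhost:7474",
    "Supabase": "http://localhost:8000",
    "Ollama": "http://localhost:11434/api/tags",
    "Memory REST API docs": "http://localhost:8888/docs",
}

for name, url in services.items():
    try:
        with urllib.request.urlopen(url, timeout=3) as resp:
            print(f"{name}: up (HTTP {resp.status})")
    except urllib.error.HTTPError as err:
        # An HTTP error status still means the service answered (e.g. auth required).
        print(f"{name}: up (HTTP {err.code})")
    except OSError as err:
        print(f"{name}: not reachable ({err})")
```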

## Getting Started

<CardGroup cols={1}>
  <Card
    title="Quick Start Guide"
    icon="rocket"
    href="/quickstart"
  >
    Get your memory system running in under 5 minutes
  </Card>
</CardGroup>

Ready to enhance your AI applications with persistent, intelligent memory? Let's get started!