Integrate self-hosted Supabase with mem0 system
- Configure mem0 to use self-hosted Supabase instead of Qdrant for vector storage
- Update docker-compose to connect containers to localai network
- Install vecs library for Supabase pgvector integration
- Create comprehensive test suite for Supabase + mem0 integration
- Update documentation to reflect Supabase configuration
- All containers now connected to shared localai network
- Successful vector storage and retrieval tests completed

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
docs/introduction.mdx (new file, 117 additions)
---
title: Introduction
description: 'Welcome to the Mem0 Memory System - A comprehensive memory layer for AI agents'
---

<img
  className="block dark:hidden"
  src="/images/hero-light.svg"
  alt="Hero Light"
/>
<img
  className="hidden dark:block"
  src="/images/hero-dark.svg"
  alt="Hero Dark"
/>
## What is Mem0 Memory System?

The Mem0 Memory System is a comprehensive, self-hosted memory layer designed for AI agents and applications. Built on top of the open-source mem0 framework, it provides persistent, intelligent memory capabilities that enhance AI interactions through contextual understanding and knowledge retention.

<CardGroup cols={2}>
  <Card
    title="Local-First Architecture"
    icon="server"
    href="/essentials/architecture"
  >
    Complete local deployment with Ollama, Neo4j, and Supabase for maximum privacy and control
  </Card>
  <Card
    title="Multi-Provider Support"
    icon="plug"
    href="/llm/configuration"
  >
    Seamlessly switch between OpenAI, Ollama, and other LLM providers
  </Card>
  <Card
    title="Graph Memory"
    icon="project-diagram"
    href="/database/neo4j"
  >
    Advanced relationship mapping with Neo4j for contextual memory connections
  </Card>
  <Card
    title="MCP Integration"
    icon="link"
    href="/guides/mcp-integration"
  >
    Model Context Protocol server for Claude Code and other AI tools
  </Card>
</CardGroup>

## Key Features

<AccordionGroup>
  <Accordion title="Vector Memory Storage">
    High-performance vector search using Supabase with pgvector for semantic memory retrieval and similarity matching.
  </Accordion>

  <Accordion title="Graph Relationships">
    Neo4j-powered knowledge graph for complex entity relationships and contextual memory connections.
  </Accordion>

  <Accordion title="Local LLM Support">
    Full Ollama integration with 20+ local models including Llama, Qwen, and specialized embedding models.
  </Accordion>

  <Accordion title="API-First Design">
    RESTful API with comprehensive memory operations, authentication, and rate limiting.
  </Accordion>

  <Accordion title="Self-Hosted Privacy">
    Complete local deployment ensuring your data never leaves your infrastructure.
  </Accordion>
</AccordionGroup>

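The semantic retrieval described above boils down to ranking stored embeddings by their similarity to a query embedding. A minimal sketch in pure Python — the 4-dimensional vectors and memory texts are invented for illustration; in the real system the embeddings come from an embedding model and the ranking is done inside pgvector:

```python
from math import sqrt

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical 4-dimensional embeddings for stored memories.
memories = {
    "User prefers dark mode": [0.9, 0.1, 0.0, 0.2],
    "User lives in Berlin": [0.1, 0.8, 0.3, 0.0],
    "User dislikes light themes": [0.8, 0.2, 0.1, 0.3],
}

# Hypothetical embedding of the query "what UI theme does the user like?"
query = [0.85, 0.15, 0.05, 0.25]

# Rank memories by similarity to the query, best match first.
ranked = sorted(
    memories,
    key=lambda m: cosine_similarity(memories[m], query),
    reverse=True,
)
```

The theme-related memories land at the top of the ranking while the unrelated location fact falls to the bottom, which is exactly the behavior the vector store provides at scale.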
## Architecture Overview

The system consists of several key components working together:

```mermaid
graph TB
    A[AI Applications] --> B[MCP Server]
    B --> C[Memory API]
    C --> D[Mem0 Core]
    D --> E[Vector Store - Supabase]
    D --> F[Graph Store - Neo4j]
    D --> G[LLM Provider]
    G --> H[Ollama Local]
    G --> I[OpenAI/Remote]
```

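In mem0, each of the components above is selected through a configuration dictionary. A hedged sketch of what such a configuration might look like for this stack — the provider names follow mem0's conventions, but the connection strings, credentials, and model names below are placeholders, and the exact config schema should be checked against the mem0 version in use:

```python
# Hypothetical mem0 configuration mapping each architecture component
# to a provider. Hosts and credentials are placeholders, not real values.
config = {
    "vector_store": {
        "provider": "supabase",
        "config": {
            "connection_string": "postgresql://postgres:postgres@localhost:8000/postgres",
            "collection_name": "memories",
        },
    },
    "graph_store": {
        "provider": "neo4j",
        "config": {
            "url": "bolt://localhost:7687",
            "username": "neo4j",
            "password": "password",
        },
    },
    "llm": {
        "provider": "ollama",
        "config": {
            "model": "llama3",
            "ollama_base_url": "http://localhost:11434",
        },
    },
}

# With mem0 installed, this would typically be handed to the client, e.g.:
# from mem0 import Memory
# memory = Memory.from_config(config)
```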
## Current Status: Phase 1 Complete ✅

<Note>
**Foundation Ready**: All core infrastructure components are operational and tested.
</Note>

| Component | Status | Description |
|-----------|--------|-------------|
| **Neo4j** | ✅ Ready | Graph database running on localhost:7474 |
| **Supabase** | ✅ Ready | Self-hosted database with pgvector on localhost:8000 |
| **Ollama** | ✅ Ready | 21+ local models available on localhost:11434 |
| **Mem0 Core** | ✅ Ready | Memory management system v0.1.115 |

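The commit that added this page also attaches every container to a shared `localai` Docker network so the services in the table can reach each other by name. A sketch of the relevant docker-compose fragment — the service names and image tags here are illustrative assumptions, not the project's actual compose file:

```yaml
# Illustrative fragment: attach services to a shared, pre-existing network.
networks:
  localai:
    external: true

services:
  neo4j:
    image: neo4j:5
    ports:
      - "7474:7474"
      - "7687:7687"
    networks:
      - localai

  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    networks:
      - localai
```

With `external: true`, the network is created once (e.g. `docker network create localai`) and each compose stack joins it rather than creating its own isolated network.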
## Getting Started

<CardGroup cols={1}>
  <Card
    title="Quick Start Guide"
    icon="rocket"
    href="/quickstart"
  >
    Get your memory system running in under 5 minutes
  </Card>
</CardGroup>

Ready to enhance your AI applications with persistent, intelligent memory? Let's get started!