---
title: Overview
description: 'Enhance your memory system with graph-based knowledge representation and retrieval'
icon: "info"
iconType: "solid"
---

Mem0 now supports **Graph Memory**. With Graph Memory, you can create and utilize complex relationships between pieces of information, allowing for more nuanced and context-aware responses. This integration combines the strengths of vector-based and graph-based approaches, resulting in more accurate and comprehensive information retrieval and generation.

The NodeSDK now supports Graph Memory. 🎉

## Installation

To use Mem0 with Graph Memory support, install it with the graph extras:

```bash Python
pip install "mem0ai[graph]"
```

```bash TypeScript
npm install mem0ai
```

This command installs Mem0 along with the necessary dependencies for graph functionality.

Try Graph Memory on Google Colab.

## Initialize Graph Memory

To initialize Graph Memory, you'll need to set up your configuration with a graph store provider. Currently, we support [Neo4j](#initialize-neo4j) and [Memgraph](#initialize-memgraph) as graph store providers.

### Initialize Neo4j

You can set up [Neo4j](https://neo4j.com/) locally or use the hosted [Neo4j AuraDB](https://neo4j.com/product/auradb/). If you are running Neo4j locally, you also need to install the [APOC plugins](https://neo4j.com/labs/apoc/4.1/installation/).

You can also customize the LLM used for Graph Memory from the [Supported LLM list](https://docs.mem0.ai/components/llms/overview) with three levels of configuration:

1. **Main Configuration**: If `llm` is set in the main config, it is used for all graph operations.
2. **Graph Store Configuration**: If `llm` is set in the `graph_store` config, it overrides the main config `llm` and is used specifically for graph operations.
3. **Default Configuration**: If no custom LLM is set, the default LLM (`gpt-4o-2024-08-06`) is used for all graph operations.

Here's how you can do it:

```python Python
from mem0 import Memory

config = {
    "graph_store": {
        "provider": "neo4j",
        "config": {
            "url": "neo4j+s://xxx",
            "username": "neo4j",
            "password": "xxx"
        }
    }
}

m = Memory.from_config(config_dict=config)
```

```typescript TypeScript
import { Memory } from "mem0ai/oss";

const config = {
  enableGraph: true,
  graphStore: {
    provider: "neo4j",
    config: {
      url: "neo4j+s://xxx",
      username: "neo4j",
      password: "xxx",
    },
  },
};

const memory = new Memory(config);
```

```python Python (Advanced)
config = {
    "llm": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o",
            "temperature": 0.2,
            "max_tokens": 2000,
        }
    },
    "graph_store": {
        "provider": "neo4j",
        "config": {
            "url": "neo4j+s://xxx",
            "username": "neo4j",
            "password": "xxx"
        },
        "llm": {
            "provider": "openai",
            "config": {
                "model": "gpt-4o-mini",
                "temperature": 0.0,
            }
        }
    }
}

m = Memory.from_config(config_dict=config)
```

```typescript TypeScript (Advanced)
const config = {
  llm: {
    provider: "openai",
    config: {
      model: "gpt-4o",
      temperature: 0.2,
      max_tokens: 2000,
    },
  },
  enableGraph: true,
  graphStore: {
    provider: "neo4j",
    config: {
      url: "neo4j+s://xxx",
      username: "neo4j",
      password: "xxx",
    },
    llm: {
      provider: "openai",
      config: {
        model: "gpt-4o-mini",
        temperature: 0.0,
      },
    },
  },
};

const memory = new Memory(config);
```

If you are using the NodeSDK, you need to pass `enableGraph` as `true` in the `config` object.

### Initialize Memgraph

Run Memgraph with Docker:

```bash
docker run -p 7687:7687 memgraph/memgraph-mage:latest --schema-info-enabled=True
```

The `--schema-info-enabled` flag is set to `True` for more performant schema generation.
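Before wiring Memgraph into Mem0, it can help to confirm that the container is reachable over Bolt. Below is a minimal sketch using the `neo4j` Python driver; the driver choice and credentials are our own placeholders rather than a Mem0 requirement, and a default Memgraph container may accept empty credentials.

```python Python
# Optional sanity check: confirm Memgraph is reachable over the Bolt protocol
# before configuring Mem0. Requires `pip install neo4j`; any Bolt-compatible
# client works. Credentials below are placeholders.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("memgraph", "xxx"))
with driver.session() as session:
    session.run("RETURN 1").consume()  # raises if the instance is unreachable
driver.close()
print("Memgraph is up")
```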
Additional information can be found in the [Memgraph documentation](https://memgraph.com/docs).

As with Neo4j, you can customize the LLM used for Graph Memory from the [Supported LLM list](https://docs.mem0.ai/components/llms/overview), following the same three levels of configuration described above.

Here's how you can do it:

```python Python
from mem0 import Memory

config = {
    "graph_store": {
        "provider": "memgraph",
        "config": {
            "url": "bolt://localhost:7687",
            "username": "memgraph",
            "password": "xxx",
        },
    },
}

m = Memory.from_config(config_dict=config)
```

```python Python (Advanced)
config = {
    "embedder": {
        "provider": "openai",
        "config": {"model": "text-embedding-3-large", "embedding_dims": 1536},
    },
    "graph_store": {
        "provider": "memgraph",
        "config": {
            "url": "bolt://localhost:7687",
            "username": "memgraph",
            "password": "xxx"
        }
    }
}

m = Memory.from_config(config_dict=config)
```

## Graph Operations

Mem0's graph supports the following operations:

### Add Memories

Mem0 with Graph Memory supports both `user_id` and `agent_id` parameters. You can use either or both to organize your memories. Use `userId` and `agentId` in the NodeSDK.

```python Python
# Using only user_id
m.add("I like pizza", user_id="alice")

# Using both user_id and agent_id
m.add("I like pizza", user_id="alice", agent_id="food-assistant")
```

```typescript TypeScript
// Using only userId
memory.add("I like pizza", { userId: "alice" });

// Using both userId and agentId
memory.add("I like pizza", { userId: "alice", agentId: "food-assistant" });
```

```json Output
{'message': 'ok'}
```

### Get all memories

```python Python
# Get all memories for a user
m.get_all(user_id="alice")

# Get all memories for a specific agent belonging to a user
m.get_all(user_id="alice", agent_id="food-assistant")
```

```typescript TypeScript
// Get all memories for a user
memory.getAll({ userId: "alice" });

// Get all memories for a specific agent belonging to a user
memory.getAll({ userId: "alice", agentId: "food-assistant" });
```

```json Output
{
  'memories': [
    {
      'id': 'de69f426-0350-4101-9d0e-5055e34976a5',
      'memory': 'Likes pizza',
      'hash': '92128989705eef03ce31c462e198b47d',
      'metadata': None,
      'created_at': '2024-08-20T14:09:27.588719-07:00',
      'updated_at': None,
      'user_id': 'alice',
      'agent_id': 'food-assistant'
    }
  ],
  'entities': [
    {
      'source': 'alice',
      'relationship': 'likes',
      'target': 'pizza'
    }
  ]
}
```

### Search Memories

```python Python
# Search memories for a user
m.search("tell me my name.", user_id="alice")

# Search memories for a specific agent belonging to a user
m.search("tell me my name.", user_id="alice", agent_id="food-assistant")
```

```typescript TypeScript
// Search memories for a user
memory.search("tell me my name.", { userId: "alice" });

// Search memories for a specific agent belonging to a user
memory.search("tell me my name.", { userId: "alice", agentId: "food-assistant" });
```

```json Output
{
  'memories': [
    {
      'id': 'de69f426-0350-4101-9d0e-5055e34976a5',
      'memory': 'Likes pizza',
      'hash': '92128989705eef03ce31c462e198b47d',
      'metadata': None,
      'created_at': '2024-08-20T14:09:27.588719-07:00',
      'updated_at': None,
      'user_id': 'alice',
      'agent_id': 'food-assistant'
    }
  ],
  'entities': [
    {
      'source': 'alice',
      'relationship': 'likes',
      'target': 'pizza'
    }
  ]
}
```
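A search response combines vector-store memories with graph relations, so you typically read both lists. Below is a minimal sketch based on the output shape shown above; note that the relation field names can vary (`relationship`/`target` here, `relation`/`destination` in the example further down), so treat the exact keys as an assumption for your version.

```python Python
# Consume a Graph Memory search result: plain memories plus graph relations.
# Field names follow the example output above and may differ in your version.
results = m.search("tell me my name.", user_id="alice")

for memory_item in results.get("memories", []):
    print("memory:", memory_item["memory"])

for relation in results.get("entities", []):
    print(f'{relation["source"]} -[{relation["relationship"]}]-> {relation["target"]}')
```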
### Delete all Memories

```python Python
# Delete all memories for a user
m.delete_all(user_id="alice")

# Delete all memories for a specific agent belonging to a user
m.delete_all(user_id="alice", agent_id="food-assistant")
```

```typescript TypeScript
// Delete all memories for a user
memory.deleteAll({ userId: "alice" });

// Delete all memories for a specific agent belonging to a user
memory.deleteAll({ userId: "alice", agentId: "food-assistant" });
```

## Example Usage

Here's an example of how to use Mem0's graph operations:

1. First, we'll add some memories for a user named Alice.
2. Then, we'll visualize how the graph evolves as we add more memories.
3. You'll see how entities and relationships are automatically extracted and connected in the graph.

### Add Memories

Below are the steps to add memories and visualize the graph:

```python Python
m.add("I like going to hikes", user_id="alice123")
```

```typescript TypeScript
memory.add("I like going to hikes", { userId: "alice123" });
```

![Graph Memory Visualization](/images/graph_memory/graph_example1.png)

```python Python
m.add("I love to play badminton", user_id="alice123")
```

```typescript TypeScript
memory.add("I love to play badminton", { userId: "alice123" });
```

![Graph Memory Visualization](/images/graph_memory/graph_example2.png)

```python Python
m.add("I hate playing badminton", user_id="alice123")
```

```typescript TypeScript
memory.add("I hate playing badminton", { userId: "alice123" });
```

![Graph Memory Visualization](/images/graph_memory/graph_example3.png)

```python Python
m.add("My friend name is john and john has a dog named tommy", user_id="alice123")
```

```typescript TypeScript
memory.add("My friend name is john and john has a dog named tommy", { userId: "alice123" });
```

![Graph Memory Visualization](/images/graph_memory/graph_example4.png)

```python Python
m.add("My name is Alice", user_id="alice123")
```

```typescript TypeScript
memory.add("My name is Alice", { userId: "alice123" });
```

![Graph Memory Visualization](/images/graph_memory/graph_example5.png)

```python Python
m.add("John loves to hike and Harry loves to hike as well", user_id="alice123")
```

```typescript TypeScript
memory.add("John loves to hike and Harry loves to hike as well", { userId: "alice123" });
```

![Graph Memory Visualization](/images/graph_memory/graph_example6.png)

```python Python
m.add("My friend peter is the spiderman", user_id="alice123")
```

```typescript TypeScript
memory.add("My friend peter is the spiderman", { userId: "alice123" });
```

![Graph Memory Visualization](/images/graph_memory/graph_example7.png)

### Search Memories

```python Python
m.search("What is my name?", user_id="alice123")
```

```typescript TypeScript
memory.search("What is my name?", { userId: "alice123" });
```

```json Output
{
  'memories': [...],
  'entities': [
    {'source': 'alice123', 'relation': 'dislikes_playing', 'destination': 'badminton'},
    {'source': 'alice123', 'relation': 'friend', 'destination': 'peter'},
    {'source': 'alice123', 'relation': 'friend', 'destination': 'john'},
    {'source': 'alice123', 'relation': 'has_name', 'destination': 'alice'},
    {'source': 'alice123', 'relation': 'likes', 'destination': 'hiking'}
  ]
}
```

The graph visualization below shows which nodes and relationships are fetched from the graph for the provided query.

![Graph Memory Visualization](/images/graph_memory/graph_example8.png)

```python Python
m.search("Who is spiderman?", user_id="alice123")
```

```typescript TypeScript
memory.search("Who is spiderman?", { userId: "alice123" });
```

```json Output
{
  'memories': [...],
  'entities': [
    {'source': 'peter', 'relation': 'identity', 'destination': 'spiderman'}
  ]
}
```

![Graph Memory Visualization](/images/graph_memory/graph_example9.png)

> **Note:** The Graph Memory implementation is not standalone. You will be adding memories to, and retrieving them from, both the vector store and the graph store simultaneously.
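To make that explicit, the sketch below configures a vector store alongside the graph store in a single `Memory` instance. The Qdrant settings are placeholders we chose for illustration; substitute whichever supported vector store and graph credentials you actually use.

```python Python
from mem0 import Memory

# One Memory instance writes to and reads from both stores: the vector store
# holds the memory text, the graph store holds extracted relations.
# Provider choices and credentials below are illustrative placeholders.
config = {
    "vector_store": {
        "provider": "qdrant",
        "config": {"host": "localhost", "port": 6333},
    },
    "graph_store": {
        "provider": "neo4j",
        "config": {
            "url": "neo4j+s://xxx",
            "username": "neo4j",
            "password": "xxx",
        },
    },
}

m = Memory.from_config(config_dict=config)
m.add("I like pizza", user_id="alice")  # stored in both the vector and graph stores
```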
## Using Multiple Agents with Graph Memory

When working with multiple agents, you can use the `agent_id` parameter to organize memories by both user and agent. This allows you to:

1. Create agent-specific knowledge graphs
2. Share common knowledge between agents
3. Isolate sensitive or specialized information to specific agents

### Example: Multi-Agent Setup

```python Python
# Add memories for different agents
m.add("I prefer Italian cuisine", user_id="bob", agent_id="food-assistant")
m.add("I'm allergic to peanuts", user_id="bob", agent_id="health-assistant")
m.add("I live in Seattle", user_id="bob")  # Shared across all agents

# Search within specific agent context
food_preferences = m.search("What food do I like?", user_id="bob", agent_id="food-assistant")
health_info = m.search("What are my allergies?", user_id="bob", agent_id="health-assistant")
location = m.search("Where do I live?", user_id="bob")  # Searches across all agents
```

```typescript TypeScript
// Add memories for different agents
memory.add("I prefer Italian cuisine", { userId: "bob", agentId: "food-assistant" });
memory.add("I'm allergic to peanuts", { userId: "bob", agentId: "health-assistant" });
memory.add("I live in Seattle", { userId: "bob" }); // Shared across all agents

// Search within specific agent context
const foodPreferences = memory.search("What food do I like?", { userId: "bob", agentId: "food-assistant" });
const healthInfo = memory.search("What are my allergies?", { userId: "bob", agentId: "health-assistant" });
const location = memory.search("Where do I live?", { userId: "bob" }); // Searches across all agents
```

If you want to use a managed version of Mem0, please check out [Mem0](https://mem0.dev/pd). If you have any questions, please feel free to reach out to us.