diff --git a/README.md b/README.md
index 2de88672..ac2bba6d 100644
--- a/README.md
+++ b/README.md
@@ -1,24 +1,20 @@

-Mem0 - The Memory Layer for Personalized AI
+Mem0 - The Memory Layer for Personalized AI
-mem0ai%2Fmem0 | Trendshift
-Launch YC: Mem0 - Open Source Memory Layer for AI Apps
+mem0ai%2Fmem0 | Trendshift
-Learn more · Join Discord · Demo
+Learn more · Join Discord · Demo

@@ -26,55 +22,71 @@ Mem0 Discord
-Mem0 PyPI - Downloads
+Mem0 PyPI - Downloads
 GitHub commit activity
-Package version
-Npm package
+Package version
+Npm package
 Y Combinator S24

+

+ 📄 Building Production-Ready AI Agents with Scalable Long-Term Memory → +

+

+ ⚡ +26% Accuracy vs. OpenAI Memory • 🚀 91% Faster • 💰 90% Fewer Tokens +

+
+## 🔥 Research Highlights
+- **+26% Accuracy** over OpenAI Memory on the LOCOMO benchmark
+- **91% Faster Responses** than full-context, ensuring low latency at scale
+- **90% Lower Token Usage** than full-context, cutting costs without compromise
+- [Read the full paper](https://mem0.ai/research)
 
 # Introduction
 
-[Mem0](https://mem0.ai) (pronounced as "mem-zero") enhances AI assistants and agents with an intelligent memory layer, enabling personalized AI interactions. Mem0 remembers user preferences, adapts to individual needs, and continuously improves over time, making it ideal for customer support chatbots, AI assistants, and autonomous systems.
+[Mem0](https://mem0.ai) ("mem-zero") enhances AI assistants and agents with an intelligent memory layer, enabling personalized AI interactions. It remembers user preferences, adapts to individual needs, and continuously learns over time, making it ideal for customer support chatbots, AI assistants, and autonomous systems.
 
-### Features & Use Cases
+### Key Features & Use Cases
 
-Core Capabilities:
-- **Multi-Level Memory**: User, Session, and AI Agent memory retention with adaptive personalization
-- **Developer-Friendly**: Simple API integration, cross-platform consistency, and hassle-free managed service
+**Core Capabilities:**
+- **Multi-Level Memory**: Seamlessly retains User, Session, and Agent state with adaptive personalization
+- **Developer-Friendly**: Intuitive API, cross-platform SDKs, and a fully managed service option
 
-Applications:
-- **AI Assistants**: Seamless conversations with context and personalization
-- **Learning & Support**: Tailored content recommendations and context-aware customer assistance
-- **Healthcare & Companions**: Patient history tracking and deeper relationship building
-- **Productivity & Gaming**: Streamlined workflows and adaptive environments based on user behavior
+**Applications:**
+- **AI Assistants**: Consistent, context-rich conversations
+- **Customer Support**: Recall past tickets and user history for tailored help
+- **Healthcare**: Track patient preferences and history for personalized care
+- **Productivity & Gaming**: Adaptive workflows and environments based on user behavior
 
-## Get Started
+## 🚀 Quickstart Guide
 
-Get started quickly with [Mem0 Platform](https://app.mem0.ai) - our fully managed solution that provides automatic updates, advanced analytics, enterprise security, and dedicated support. [Create a free account](https://app.mem0.ai) to begin.
+Choose between our hosted platform or self-hosted package:
 
-For complete control, you can self-host Mem0 using our open-source package. See the [Quickstart guide](#quickstart) below to set up your own instance.
+### Hosted Platform
 
-## Quickstart Guide
+Get up and running in minutes with automatic updates, analytics, and enterprise security.
 
-Install the Mem0 package via pip:
+1. Sign up on [Mem0 Platform](https://app.mem0.ai)
+2. Embed the memory layer via SDK or API keys
+
+### Self-Hosted (Open Source)
+
+Install the SDK via pip:
 
 ```bash
 pip install mem0ai
 ```
 
-Install the Mem0 package via npm:
-
+Install the SDK via npm:
 ```bash
 npm install mem0ai
 ```
@@ -96,7 +108,7 @@ def chat_with_memories(message: str, user_id: str = "default_user") -> str:
     # Retrieve relevant memories
     relevant_memories = memory.search(query=message, user_id=user_id, limit=3)
     memories_str = "\n".join(f"- {entry['memory']}" for entry in relevant_memories["results"])
-
+
     # Generate Assistant response
     system_prompt = f"You are a helpful AI. Answer the question based on query and memories.\nUser Memories:\n{memories_str}"
     messages = [{"role": "system", "content": system_prompt}, {"role": "user", "content": message}]
@@ -122,68 +134,21 @@ if __name__ == "__main__":
     main()
 ```
 
-See the example for [Node.js](https://docs.mem0.ai/examples/ai_companion_js).
+For detailed integration steps, see the [Quickstart](https://docs.mem0.ai/quickstart) and [API Reference](https://docs.mem0.ai).
 
-For more advanced usage and API documentation, visit our [documentation](https://docs.mem0.ai).
+## 🔗 Integrations & Demos
 
-> [!TIP]
-> For a hassle-free experience, try our [hosted platform](https://app.mem0.ai) with automatic updates and enterprise features.
+- **ChatGPT with Memory**: Personalized chat powered by Mem0 ([Live Demo](https://mem0.dev/demo))
+- **Browser Extension**: Store memories across ChatGPT, Perplexity, and Claude ([Chrome Extension](https://chromewebstore.google.com/detail/mem0))
+- **Langgraph Support**: Build a customer bot with Langgraph + Mem0 ([Guide](https://docs.mem0.ai/integrations/langgraph))
+- **CrewAI Integration**: Tailor CrewAI outputs with Mem0 ([Example](https://docs.mem0.ai/integrations/crewai))
 
-## Demos
+## 📚 Documentation & Support
 
-- Mem0 - ChatGPT with Memory: A personalized AI chat app powered by Mem0 that remembers your preferences, facts, and memories.
+- Full docs: https://docs.mem0.ai
+- Community: [Discord](https://mem0.dev/DiG) · [Twitter](https://x.com/mem0ai)
+- Contact: founders@mem0.ai
 
-[Mem0 - ChatGPT with Memory](https://github.com/user-attachments/assets/cebc4f8e-bdb9-4837-868d-13c5ab7bb433)
+## ⚖️ License
 
-Try live [demo](https://mem0.dev/demo/)
-
-

-
-- AI Companion: Experience personalized conversations with an AI that remembers your preferences and past interactions
-
-[AI Companion Demo](https://github.com/user-attachments/assets/3fc72023-a72c-4593-8be0-3cee3ba744da)
-
-

-
-- Enhance your AI interactions by storing memories across ChatGPT, Perplexity, and Claude using our browser extension. Get [chrome extension](https://chromewebstore.google.com/detail/mem0/onihkkbipkfeijkadecaafbgagkhglop?hl=en).
-
-
-[Chrome Extension Demo](https://github.com/user-attachments/assets/ca92e40b-c453-4ff6-b25e-739fb18a8650)
-
-

-
-- Customer support bot using Langgraph and Mem0. Get the complete code from [here](https://docs.mem0.ai/integrations/langgraph)
-
-
-[Langgraph: Customer Bot](https://github.com/user-attachments/assets/ca6b482e-7f46-42c8-aa08-f88d1d93a5f4)
-
-

- -- Use Mem0 with CrewAI to get personalized results. Full example [here](https://docs.mem0.ai/integrations/crewai) - -[CrewAI Demo](https://github.com/user-attachments/assets/69172a79-ccb9-4340-91f1-caa7d2dd4213) - - - -## Documentation - -For detailed usage instructions and API reference, visit our [documentation](https://docs.mem0.ai). You'll find: -- Complete API reference -- Integration guides -- Advanced configuration options -- Best practices and examples -- More details about: - - Open-source version - - [Hosted Mem0 Platform](https://app.mem0.ai) - -## Support - -Join our community for support and discussions. If you have any questions, feel free to reach out to us using one of the following methods: - -- [Join our Discord](https://mem0.dev/DiG) -- [Follow us on Twitter](https://x.com/mem0ai) -- [Email founders](mailto:founders@mem0.ai) - -## License - -This project is licensed under the Apache 2.0 License - see the [LICENSE](LICENSE) file for details. +Apache 2.0 — see the [LICENSE](LICENSE) file for details. \ No newline at end of file diff --git a/docs/_snippets/paper-release.mdx b/docs/_snippets/paper-release.mdx new file mode 100644 index 00000000..ba9229bd --- /dev/null +++ b/docs/_snippets/paper-release.mdx @@ -0,0 +1,3 @@ + + 📢 Announcing our research paper: Mem0 achieves 26% higher accuracy than OpenAI Memory, 91% lower latency, and 90% token savings! [Read the paper](https://mem0.ai/research) to learn how we're revolutionizing AI agent memory. + \ No newline at end of file diff --git a/docs/api-reference.mdx b/docs/api-reference.mdx index c994ec71..ef01d747 100644 --- a/docs/api-reference.mdx +++ b/docs/api-reference.mdx @@ -4,6 +4,8 @@ icon: "info" iconType: "solid" --- + + Mem0 provides a powerful set of APIs that allow you to integrate advanced memory management capabilities into your applications. Our APIs are designed to be intuitive, efficient, and scalable, enabling you to create, retrieve, update, and delete memories across various entities such as users, agents, apps, and runs. ## Key Features diff --git a/docs/changelog.mdx b/docs/changelog.mdx index b27bb623..209ac2a1 100644 --- a/docs/changelog.mdx +++ b/docs/changelog.mdx @@ -3,6 +3,8 @@ title: "Product Updates" mode: "wide" --- + + diff --git a/docs/components/embedders/config.mdx b/docs/components/embedders/config.mdx index dc84c497..a8884c32 100644 --- a/docs/components/embedders/config.mdx +++ b/docs/components/embedders/config.mdx @@ -4,6 +4,8 @@ icon: "gear" iconType: "solid" --- + + Config in mem0 is a dictionary that specifies the settings for your embedding models. It allows you to customize the behavior and connection details of your chosen embedder. ## How to define configurations? diff --git a/docs/components/embedders/overview.mdx b/docs/components/embedders/overview.mdx index f98f2db4..3ae4d592 100644 --- a/docs/components/embedders/overview.mdx +++ b/docs/components/embedders/overview.mdx @@ -4,6 +4,8 @@ icon: "info" iconType: "solid" --- + + Mem0 offers support for various embedding models, allowing users to choose the one that best suits their needs. ## Supported Embedders diff --git a/docs/components/llms/config.mdx b/docs/components/llms/config.mdx index 57c8f70c..bb13cbcb 100644 --- a/docs/components/llms/config.mdx +++ b/docs/components/llms/config.mdx @@ -4,6 +4,8 @@ icon: "gear" iconType: "solid" --- + + ## How to define configurations? 
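The LLM provider pages touched below all share the same configuration pattern: a `config` dict that names the provider and its model settings, passed to `Memory.from_config`. A minimal sketch, assuming the OpenAI provider with an `OPENAI_API_KEY` in the environment (the model name and settings here are illustrative, not taken from this diff):

```python
from mem0 import Memory

# Illustrative config; swap "openai" for "anthropic", "groq", "ollama", etc.
# and export the matching API key environment variable.
config = {
    "llm": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o-mini",
            "temperature": 0.1,
            "max_tokens": 2000,
        },
    }
}

memory = Memory.from_config(config)
memory.add("I'm allergic to peanuts.", user_id="alice")
print(memory.search("any food restrictions?", user_id="alice"))
```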
diff --git a/docs/components/llms/models/anthropic.mdx b/docs/components/llms/models/anthropic.mdx index 84ed1011..e30988cd 100644 --- a/docs/components/llms/models/anthropic.mdx +++ b/docs/components/llms/models/anthropic.mdx @@ -2,6 +2,8 @@ title: Anthropic --- + + To use anthropic's models, please set the `ANTHROPIC_API_KEY` which you find on their [Account Settings Page](https://console.anthropic.com/account/keys). ## Usage diff --git a/docs/components/llms/models/aws_bedrock.mdx b/docs/components/llms/models/aws_bedrock.mdx index bd04d99b..5561e698 100644 --- a/docs/components/llms/models/aws_bedrock.mdx +++ b/docs/components/llms/models/aws_bedrock.mdx @@ -2,6 +2,8 @@ title: AWS Bedrock --- + + ### Setup - Before using the AWS Bedrock LLM, make sure you have the appropriate model access from [Bedrock Console](https://us-east-1.console.aws.amazon.com/bedrock/home?region=us-east-1#/modelaccess). - You will also need to authenticate the `boto3` client by using a method in the [AWS documentation](https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html#configuring-credentials) diff --git a/docs/components/llms/models/azure_openai.mdx b/docs/components/llms/models/azure_openai.mdx index 4b333590..e1b6ddb8 100644 --- a/docs/components/llms/models/azure_openai.mdx +++ b/docs/components/llms/models/azure_openai.mdx @@ -2,6 +2,8 @@ title: Azure OpenAI --- + + Mem0 Now Supports Azure OpenAI Models in TypeScript SDK To use Azure OpenAI models, you have to set the `LLM_AZURE_OPENAI_API_KEY`, `LLM_AZURE_ENDPOINT`, `LLM_AZURE_DEPLOYMENT` and `LLM_AZURE_API_VERSION` environment variables. You can obtain the Azure API key from the [Azure](https://azure.microsoft.com/). diff --git a/docs/components/llms/models/deepseek.mdx b/docs/components/llms/models/deepseek.mdx index af1783a1..56fc7f42 100644 --- a/docs/components/llms/models/deepseek.mdx +++ b/docs/components/llms/models/deepseek.mdx @@ -2,6 +2,8 @@ title: DeepSeek --- + + To use DeepSeek LLM models, you have to set the `DEEPSEEK_API_KEY` environment variable. You can also optionally set `DEEPSEEK_API_BASE` if you need to use a different API endpoint (defaults to "https://api.deepseek.com"). ## Usage diff --git a/docs/components/llms/models/gemini.mdx b/docs/components/llms/models/gemini.mdx index 4c166ea4..7a502ad5 100644 --- a/docs/components/llms/models/gemini.mdx +++ b/docs/components/llms/models/gemini.mdx @@ -2,6 +2,8 @@ title: Gemini --- + + To use Gemini model, you have to set the `GEMINI_API_KEY` environment variable. You can obtain the Gemini API key from the [Google AI Studio](https://aistudio.google.com/app/apikey) ## Usage diff --git a/docs/components/llms/models/google_AI.mdx b/docs/components/llms/models/google_AI.mdx index 4e08ea74..a3ed2393 100644 --- a/docs/components/llms/models/google_AI.mdx +++ b/docs/components/llms/models/google_AI.mdx @@ -2,6 +2,8 @@ title: Google AI --- + + To use Google AI model, you have to set the `GOOGLE_API_KEY` environment variable. You can obtain the Google API key from the [Google Maker Suite](https://makersuite.google.com/app/apikey) ## Usage diff --git a/docs/components/llms/models/groq.mdx b/docs/components/llms/models/groq.mdx index d8f0727c..556ac763 100644 --- a/docs/components/llms/models/groq.mdx +++ b/docs/components/llms/models/groq.mdx @@ -2,6 +2,8 @@ title: Groq --- + + [Groq](https://groq.com/) is the creator of the world's first Language Processing Unit (LPU), providing exceptional speed performance for AI workloads running on their LPU Inference Engine. 
In order to use LLMs from Groq, go to their [platform](https://console.groq.com/keys) and get the API key. Set the API key as `GROQ_API_KEY` environment variable to use the model as given below in the example. diff --git a/docs/components/llms/models/langchain.mdx b/docs/components/llms/models/langchain.mdx index 4113bc05..8f5744ac 100644 --- a/docs/components/llms/models/langchain.mdx +++ b/docs/components/llms/models/langchain.mdx @@ -2,6 +2,8 @@ title: LangChain --- + + Mem0 supports LangChain as a provider to access a wide range of LLM models. LangChain is a framework for developing applications powered by language models, making it easy to integrate various LLM providers through a consistent interface. For a complete list of available chat models supported by LangChain, refer to the [LangChain Chat Models documentation](https://python.langchain.com/docs/integrations/chat). diff --git a/docs/components/llms/models/litellm.mdx b/docs/components/llms/models/litellm.mdx index d66669f8..4d6caa6f 100644 --- a/docs/components/llms/models/litellm.mdx +++ b/docs/components/llms/models/litellm.mdx @@ -1,3 +1,5 @@ + + [Litellm](https://litellm.vercel.app/docs/) is compatible with over 100 large language models (LLMs), all using a standardized input/output format. You can explore the [available models](https://litellm.vercel.app/docs/providers) to use with Litellm. Ensure you set the `API_KEY` for the model you choose to use. ## Usage diff --git a/docs/components/llms/models/lmstudio.mdx b/docs/components/llms/models/lmstudio.mdx index f88490db..9c519561 100644 --- a/docs/components/llms/models/lmstudio.mdx +++ b/docs/components/llms/models/lmstudio.mdx @@ -2,6 +2,8 @@ title: LM Studio --- + + To use LM Studio with Mem0, you'll need to have LM Studio running locally with its server enabled. LM Studio provides a way to run local LLMs with an OpenAI-compatible API. ## Usage diff --git a/docs/components/llms/models/mistral_AI.mdx b/docs/components/llms/models/mistral_AI.mdx index 632d4877..855ccaef 100644 --- a/docs/components/llms/models/mistral_AI.mdx +++ b/docs/components/llms/models/mistral_AI.mdx @@ -2,6 +2,8 @@ title: Mistral AI --- + + To use mistral's models, please obtain the Mistral AI api key from their [console](https://console.mistral.ai/). Set the `MISTRAL_API_KEY` environment variable to use the model as given below in the example. ## Usage diff --git a/docs/components/llms/models/ollama.mdx b/docs/components/llms/models/ollama.mdx index 757fd2cc..8eb14e3f 100644 --- a/docs/components/llms/models/ollama.mdx +++ b/docs/components/llms/models/ollama.mdx @@ -1,3 +1,5 @@ + + You can use LLMs from Ollama to run Mem0 locally. These [models](https://ollama.com/search?c=tools) support tool support. ## Usage diff --git a/docs/components/llms/models/openai.mdx b/docs/components/llms/models/openai.mdx index 0c7ecfd5..44e4e3d3 100644 --- a/docs/components/llms/models/openai.mdx +++ b/docs/components/llms/models/openai.mdx @@ -2,6 +2,8 @@ title: OpenAI --- + + To use OpenAI LLM models, you have to set the `OPENAI_API_KEY` environment variable. You can obtain the OpenAI API key from the [OpenAI Platform](https://platform.openai.com/account/api-keys). ## Usage diff --git a/docs/components/llms/models/together.mdx b/docs/components/llms/models/together.mdx index 63182918..2cbca70d 100644 --- a/docs/components/llms/models/together.mdx +++ b/docs/components/llms/models/together.mdx @@ -1,3 +1,5 @@ + + To use TogetherAI LLM models, you have to set the `TOGETHER_API_KEY` environment variable. 
You can obtain the TogetherAI API key from their [Account settings page](https://api.together.xyz/settings/api-keys). ## Usage diff --git a/docs/components/llms/models/xAI.mdx b/docs/components/llms/models/xAI.mdx index 39b159ca..6b267e25 100644 --- a/docs/components/llms/models/xAI.mdx +++ b/docs/components/llms/models/xAI.mdx @@ -2,6 +2,8 @@ title: xAI --- + + [xAI](https://x.ai/) is a new AI company founded by Elon Musk that develops large language models, including Grok. Grok is trained on real-time data from X (formerly Twitter) and aims to provide accurate, up-to-date responses with a touch of wit and humor. In order to use LLMs from xAI, go to their [platform](https://console.x.ai) and get the API key. Set the API key as `XAI_API_KEY` environment variable to use the model as given below in the example. diff --git a/docs/components/llms/overview.mdx b/docs/components/llms/overview.mdx index a719bc72..e3114bb7 100644 --- a/docs/components/llms/overview.mdx +++ b/docs/components/llms/overview.mdx @@ -4,6 +4,8 @@ icon: "info" iconType: "solid" --- + + Mem0 includes built-in support for various popular large language models. Memory can utilize the LLM provided by the user, ensuring efficient use for specific needs. ## Usage diff --git a/docs/components/vectordbs/config.mdx b/docs/components/vectordbs/config.mdx index a36e5579..abe9647e 100644 --- a/docs/components/vectordbs/config.mdx +++ b/docs/components/vectordbs/config.mdx @@ -4,6 +4,8 @@ icon: "gear" iconType: "solid" --- + + ## How to define configurations? The `config` is defined as an object with two main keys: diff --git a/docs/components/vectordbs/overview.mdx b/docs/components/vectordbs/overview.mdx index 5d406eb6..bf034882 100644 --- a/docs/components/vectordbs/overview.mdx +++ b/docs/components/vectordbs/overview.mdx @@ -4,6 +4,8 @@ icon: "info" iconType: "solid" --- + + Mem0 includes built-in support for various popular databases. Memory can utilize the database provided by the user, ensuring efficient use for specific needs. ## Supported Vector Databases diff --git a/docs/contributing/development.mdx b/docs/contributing/development.mdx index e65b2070..be8aaa62 100644 --- a/docs/contributing/development.mdx +++ b/docs/contributing/development.mdx @@ -3,6 +3,8 @@ title: Development icon: "code" --- + + # Development Contributions We strive to make contributions **easy, collaborative, and enjoyable**. Follow the steps below to ensure a smooth contribution process. diff --git a/docs/contributing/documentation.mdx b/docs/contributing/documentation.mdx index 33b445de..e3dcb19d 100644 --- a/docs/contributing/documentation.mdx +++ b/docs/contributing/documentation.mdx @@ -3,6 +3,8 @@ title: Documentation icon: "book" --- + + # Documentation Contributions ## 📌 Prerequisites diff --git a/docs/core-concepts/memory-operations.mdx b/docs/core-concepts/memory-operations.mdx index 5fdffd21..804c3b99 100644 --- a/docs/core-concepts/memory-operations.mdx +++ b/docs/core-concepts/memory-operations.mdx @@ -5,6 +5,8 @@ icon: "gear" iconType: "solid" --- + + Mem0 provides two core operations for managing memories in AI applications: adding new memories and searching existing ones. This guide covers how these operations work and how to use them effectively in your application. 
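The add and search operations referenced in `memory-operations.mdx` are the same calls used in the README quickstart earlier in this diff. A minimal sketch using the open-source `Memory` class (default config assumed, with `OPENAI_API_KEY` set; the hosted platform's client exposes the same `add`/`search` calls):

```python
from mem0 import Memory

memory = Memory()  # default config; assumes OPENAI_API_KEY is set

# Add: extract and store memories from a conversation
messages = [
    {"role": "user", "content": "I'm planning a trip to Japan in April."},
    {"role": "assistant", "content": "Great! April is cherry blossom season."},
]
memory.add(messages, user_id="alice")

# Search: retrieve the memories most relevant to a query
results = memory.search(query="Where is Alice travelling?", user_id="alice", limit=3)
for entry in results["results"]:
    print(entry["memory"])
```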
diff --git a/docs/core-concepts/memory-types.mdx b/docs/core-concepts/memory-types.mdx index 73dfa860..fa63a0e5 100644 --- a/docs/core-concepts/memory-types.mdx +++ b/docs/core-concepts/memory-types.mdx @@ -4,6 +4,9 @@ description: Understanding different types of memory in AI Applications icon: "memory" iconType: "solid" --- + + + To build useful AI applications, we need to understand how different memory systems work together. This guide explores the fundamental types of memory in AI systems and shows how Mem0 implements these concepts. ## Why Memory Matters diff --git a/docs/examples.mdx b/docs/examples.mdx index ce279b08..ad26577a 100644 --- a/docs/examples.mdx +++ b/docs/examples.mdx @@ -3,6 +3,8 @@ title: Overview description: How to use mem0 in your existing applications? --- + + With Mem0, you can create stateful LLM-based applications such as chatbots, virtual assistants, or AI agents. Mem0 enhances your applications by providing a memory layer that makes responses: diff --git a/docs/examples/ai_companion.mdx b/docs/examples/ai_companion.mdx index 896985e0..55aecc7b 100644 --- a/docs/examples/ai_companion.mdx +++ b/docs/examples/ai_companion.mdx @@ -2,6 +2,8 @@ title: AI Companion --- + + You can create a personalised AI Companion using Mem0. This guide will walk you through the necessary steps and provide the complete code to get you started. ## Overview diff --git a/docs/examples/ai_companion_js.mdx b/docs/examples/ai_companion_js.mdx index d170d12b..59924697 100644 --- a/docs/examples/ai_companion_js.mdx +++ b/docs/examples/ai_companion_js.mdx @@ -2,6 +2,8 @@ title: AI Companion in Node.js --- + + You can create a personalised AI Companion using Mem0. This guide will walk you through the necessary steps and provide the complete code to get you started. ## Overview diff --git a/docs/examples/chrome-extension.mdx b/docs/examples/chrome-extension.mdx index a9ed8e3d..3177b465 100644 --- a/docs/examples/chrome-extension.mdx +++ b/docs/examples/chrome-extension.mdx @@ -1,5 +1,7 @@ # Mem0 Chrome Extension + + Enhance your AI interactions with **Mem0**, a Chrome extension that introduces a universal memory layer across platforms like `ChatGPT`, `Claude`, and `Perplexity`. Mem0 ensures seamless context sharing, making your AI experiences more personalized and efficient. diff --git a/docs/examples/customer-support-agent.mdx b/docs/examples/customer-support-agent.mdx index 171ddded..1d234c3d 100644 --- a/docs/examples/customer-support-agent.mdx +++ b/docs/examples/customer-support-agent.mdx @@ -2,6 +2,8 @@ title: Customer Support AI Agent --- + + You can create a personalized Customer Support AI Agent using Mem0. This guide will walk you through the necessary steps and provide the complete code to get you started. ## Overview diff --git a/docs/examples/document-writing.mdx b/docs/examples/document-writing.mdx index 0bd53b25..81361459 100644 --- a/docs/examples/document-writing.mdx +++ b/docs/examples/document-writing.mdx @@ -1,6 +1,7 @@ --- title: Document Editing with Mem0 --- + This guide demonstrates how to leverage **Mem0** to edit documents efficiently, ensuring they align with your unique writing style and preferences. diff --git a/docs/examples/email_processing.mdx b/docs/examples/email_processing.mdx index 572d1832..c3afa2ab 100644 --- a/docs/examples/email_processing.mdx +++ b/docs/examples/email_processing.mdx @@ -2,6 +2,8 @@ title: Email Processing with Mem0 --- + + This guide demonstrates how to build an intelligent email processing system using Mem0's memory capabilities. 
You'll learn how to store, categorize, retrieve, and analyze emails to create a smart email management solution. ## Overview diff --git a/docs/examples/llama-index-mem0.mdx b/docs/examples/llama-index-mem0.mdx index 6c68abf9..73743292 100644 --- a/docs/examples/llama-index-mem0.mdx +++ b/docs/examples/llama-index-mem0.mdx @@ -1,6 +1,8 @@ --- title: LlamaIndex ReAct Agent --- + + Create a ReAct Agent with LlamaIndex which uses Mem0 as the memory store. ### Overview diff --git a/docs/examples/mem0-agentic-tool.mdx b/docs/examples/mem0-agentic-tool.mdx index d146d2ec..c6dafd9a 100644 --- a/docs/examples/mem0-agentic-tool.mdx +++ b/docs/examples/mem0-agentic-tool.mdx @@ -2,6 +2,8 @@ title: Mem0 as an Agentic Tool --- + + Integrate Mem0's memory capabilities with OpenAI's Agents SDK to create AI agents with persistent memory. You can create agents that remember past conversations and use that context to provide better responses. diff --git a/docs/examples/mem0-demo.mdx b/docs/examples/mem0-demo.mdx index e5d2a379..5c0987ca 100644 --- a/docs/examples/mem0-demo.mdx +++ b/docs/examples/mem0-demo.mdx @@ -2,6 +2,9 @@ title: Mem0 Demo --- + + + You can create a personalized AI Companion using Mem0. This guide will walk you through the necessary steps and provide the complete setup instructions to get you started.