Docs SOC2 and HIPAA update (#3075)
docs/_snippets/security-compliance.mdx (new file)
@@ -0,0 +1,3 @@
<Note type="info">
🔐 Mem0 is now <strong>SOC 2</strong> and <strong>HIPAA</strong> compliant! We're committed to the highest standards of data security and privacy, enabling secure memory for enterprises, healthcare, and beyond.
</Note>
@@ -4,7 +4,7 @@ icon: "info"
iconType: "solid"
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

Mem0 provides a powerful set of APIs that allow you to integrate advanced memory management capabilities into your applications. Our APIs are designed to be intuitive, efficient, and scalable, enabling you to create, retrieve, update, and delete memories across various entities such as users, agents, apps, and runs.

@@ -3,7 +3,7 @@ title: "Product Updates"
mode: "wide"
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

<Tabs>
<Tab title="Python">
@@ -4,7 +4,7 @@ icon: "gear"
iconType: "solid"
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

Config in mem0 is a dictionary that specifies the settings for your embedding models. It allows you to customize the behavior and connection details of your chosen embedder.
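For context, a minimal sketch of what such a config dictionary can look like when passed to the Python SDK; the provider and model values below are illustrative, not a recommendation:

```python
from mem0 import Memory

# Illustrative embedder configuration; swap in the provider/model you actually use.
config = {
    "embedder": {
        "provider": "openai",
        "config": {"model": "text-embedding-3-small"},
    }
}

m = Memory.from_config(config)
m.add("I prefer vegetarian food.", user_id="alice")
```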
@@ -4,7 +4,7 @@ icon: "info"
iconType: "solid"
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

Mem0 offers support for various embedding models, allowing users to choose the one that best suits their needs.

@@ -4,7 +4,7 @@ icon: "gear"
iconType: "solid"
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

## How to define configurations?
@@ -2,9 +2,9 @@
title: Anthropic
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

To use anthropic's models, please set the `ANTHROPIC_API_KEY` which you find on their [Account Settings Page](https://console.anthropic.com/account/keys).
To use Anthropic's models, please set the `ANTHROPIC_API_KEY` which you find on their [Account Settings Page](https://console.anthropic.com/account/keys).

## Usage
@@ -2,7 +2,7 @@
title: AWS Bedrock
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

### Setup
- Before using the AWS Bedrock LLM, make sure you have the appropriate model access from [Bedrock Console](https://us-east-1.console.aws.amazon.com/bedrock/home?region=us-east-1#/modelaccess).

@@ -2,7 +2,7 @@
title: Azure OpenAI
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

<Note> Mem0 Now Supports Azure OpenAI Models in TypeScript SDK </Note>

@@ -2,7 +2,7 @@
title: DeepSeek
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

To use DeepSeek LLM models, you have to set the `DEEPSEEK_API_KEY` environment variable. You can also optionally set `DEEPSEEK_API_BASE` if you need to use a different API endpoint (defaults to "https://api.deepseek.com").
@@ -2,7 +2,7 @@
title: Gemini
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

To use the Gemini model, set the `GEMINI_API_KEY` environment variable. You can obtain the Gemini API key from [Google AI Studio](https://aistudio.google.com/app/apikey).

@@ -2,7 +2,7 @@
title: Google AI
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

To use the Google AI model, you have to set the `GOOGLE_API_KEY` environment variable. You can obtain the Google API key from the [Google Maker Suite](https://makersuite.google.com/app/apikey).

@@ -2,7 +2,7 @@
title: Groq
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

[Groq](https://groq.com/) is the creator of the world's first Language Processing Unit (LPU), providing exceptional speed for AI workloads running on its LPU Inference Engine.
@@ -2,7 +2,7 @@
title: LangChain
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

Mem0 supports LangChain as a provider to access a wide range of LLM models. LangChain is a framework for developing applications powered by language models, making it easy to integrate various LLM providers through a consistent interface.

@@ -1,4 +1,4 @@
<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

[Litellm](https://litellm.vercel.app/docs/) is compatible with over 100 large language models (LLMs), all using a standardized input/output format. You can explore the [available models](https://litellm.vercel.app/docs/providers) to use with Litellm. Ensure you set the `API_KEY` for the model you choose to use.

@@ -2,7 +2,7 @@
title: LM Studio
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

To use LM Studio with Mem0, you'll need to have LM Studio running locally with its server enabled. LM Studio provides a way to run local LLMs with an OpenAI-compatible API.
@@ -2,7 +2,7 @@
title: Mistral AI
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

To use Mistral's models, please obtain the Mistral AI API key from their [console](https://console.mistral.ai/). Set the `MISTRAL_API_KEY` environment variable to use the model as shown in the example below.

@@ -1,4 +1,4 @@
<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

You can use LLMs from Ollama to run Mem0 locally. These [models](https://ollama.com/search?c=tools) support tool calling.
@@ -2,7 +2,7 @@
title: OpenAI
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

To use OpenAI LLM models, you have to set the `OPENAI_API_KEY` environment variable. You can obtain the OpenAI API key from the [OpenAI Platform](https://platform.openai.com/account/api-keys).
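As a quick illustration, a minimal sketch assuming the Python SDK's default OpenAI-backed setup; the key and example content are placeholders:

```python
import os

from mem0 import Memory

# Placeholder key for illustration; set your real key in the environment instead.
os.environ["OPENAI_API_KEY"] = "your-api-key"

m = Memory()  # defaults to an OpenAI-backed LLM and embedder
m.add("Alice is allergic to peanuts.", user_id="alice")
print(m.search("What should Alice avoid eating?", user_id="alice"))
```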
@@ -2,7 +2,7 @@
title: Sarvam AI
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

**Sarvam AI** is an Indian AI company developing language models with a focus on Indian languages and cultural context. Their latest model **Sarvam-M** is designed to understand and generate content in multiple Indian languages while maintaining high performance in English.

@@ -1,4 +1,4 @@
<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

To use TogetherAI LLM models, you have to set the `TOGETHER_API_KEY` environment variable. You can obtain the TogetherAI API key from their [Account settings page](https://api.together.xyz/settings/api-keys).

@@ -2,7 +2,7 @@
title: vLLM
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

[vLLM](https://docs.vllm.ai/) is a high-performance inference engine for large language models that provides significant performance improvements for local inference. It's designed to maximize throughput and memory efficiency for serving LLMs.

@@ -2,7 +2,7 @@
title: xAI
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

[xAI](https://x.ai/) is a new AI company founded by Elon Musk that develops large language models, including Grok. Grok is trained on real-time data from X (formerly Twitter) and aims to provide accurate, up-to-date responses with a touch of wit and humor.
@@ -4,7 +4,7 @@ icon: "info"
iconType: "solid"
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

Mem0 includes built-in support for various popular large language models. Memory can utilize the LLM provided by the user, ensuring efficient use for specific needs.

@@ -4,7 +4,7 @@ icon: "gear"
iconType: "solid"
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

## How to define configurations?

@@ -4,7 +4,7 @@ icon: "info"
iconType: "solid"
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

Mem0 includes built-in support for various popular databases. Memory can utilize the database provided by the user, ensuring efficient use for specific needs.

@@ -3,7 +3,7 @@ title: Development
icon: "code"
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

# Development Contributions

@@ -3,7 +3,7 @@ title: Documentation
icon: "book"
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

# Documentation Contributions

@@ -5,7 +5,7 @@ icon: "gear"
iconType: "solid"
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

Mem0 provides two core operations for managing memories in AI applications: adding new memories and searching existing ones. This guide covers how these operations work and how to use them effectively in your application.
@@ -5,6 +5,8 @@ icon: "plus"
iconType: "solid"
---

<Snippet file="security-compliance.mdx" />

## Overview

The `add` operation is how you store memory into Mem0. Whether you're working with a chatbot, a voice assistant, or a multi-agent system, this is the entry point to create long-term memory.
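For reference, a minimal sketch of an `add` call with the open-source Python SDK, assuming a default `Memory()` setup; the messages, user ID, and metadata are illustrative:

```python
from mem0 import Memory

m = Memory()

# Conversation turns to distill into long-term memory (illustrative content).
messages = [
    {"role": "user", "content": "I'm planning a trip to Japan next spring."},
    {"role": "assistant", "content": "Noted! I'll keep that in mind for future suggestions."},
]

result = m.add(messages, user_id="alice", metadata={"category": "travel"})
print(result)
```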
@@ -5,6 +5,8 @@ icon: "trash"
iconType: "solid"
---

<Snippet file="security-compliance.mdx" />

## Overview

Memories can become outdated or irrelevant, or may need to be removed for privacy or compliance reasons. Mem0 offers flexible ways to delete memories:
@@ -5,6 +5,8 @@ icon: "magnifying-glass"
iconType: "solid"
---

<Snippet file="security-compliance.mdx" />

## Overview

The `search` operation allows you to retrieve relevant memories based on a natural language query and optional filters like user ID, agent ID, categories, and more. This is the foundation of giving your agents memory-aware behavior.
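For reference, a minimal sketch of a `search` call with the Python SDK; the query, user ID, and limit are illustrative:

```python
from mem0 import Memory

m = Memory()

# Retrieve the memories most relevant to the query for this user (illustrative values).
results = m.search("What are Alice's travel plans?", user_id="alice", limit=5)
print(results)
```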
@@ -5,6 +5,8 @@ icon: "pencil"
iconType: "solid"
---

<Snippet file="security-compliance.mdx" />

## Overview

User preferences, interests, and behaviors often evolve over time. The `update` operation lets you revise a stored memory, whether it's updating facts and memories, rephrasing a message, or enriching metadata.

@@ -5,7 +5,7 @@ icon: "memory"
iconType: "solid"
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

To build useful AI applications, we need to understand how different memory systems work together. This guide explores the fundamental types of memory in AI systems and shows how Mem0 implements these concepts.

@@ -3,7 +3,7 @@ title: Overview
description: How to use mem0 in your existing applications?
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

With Mem0, you can create stateful LLM-based applications such as chatbots, virtual assistants, or AI agents. Mem0 enhances your applications by providing a memory layer that makes responses:
@@ -2,7 +2,7 @@
title: AI Companion
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

You can create a personalised AI Companion using Mem0. This guide will walk you through the necessary steps and provide the complete code to get you started.

@@ -2,7 +2,7 @@
title: AI Companion in Node.js
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

You can create a personalised AI Companion using Mem0. This guide will walk you through the necessary steps and provide the complete code to get you started.

@@ -2,7 +2,7 @@
title: AWS Bedrock and AOSS
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

This example demonstrates how to configure and use the `mem0ai` SDK with **AWS Bedrock** and **OpenSearch Service (AOSS)** for persistent memory capabilities in Python.

@@ -1,6 +1,6 @@
# Mem0 Chrome Extension

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

Enhance your AI interactions with **Mem0**, a Chrome extension that introduces a universal memory layer across platforms like `ChatGPT`, `Claude`, and `Perplexity`. Mem0 ensures seamless context sharing, making your AI experiences more personalized and efficient.
@@ -2,7 +2,7 @@
title: Multi-User Collaboration with Mem0
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

## Overview

@@ -2,7 +2,7 @@
title: Customer Support AI Agent
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

You can create a personalized Customer Support AI Agent using Mem0. This guide will walk you through the necessary steps and provide the complete code to get you started.

@@ -1,7 +1,7 @@
---
title: Document Editing with Mem0
---
<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

This guide demonstrates how to leverage **Mem0** to edit documents efficiently, ensuring they align with your unique writing style and preferences.

@@ -2,7 +2,7 @@
title: Eliza OS Character
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

You can create a personalised Eliza OS Character using Mem0. This guide will walk you through the necessary steps and provide the complete code to get you started.

@@ -2,7 +2,7 @@
title: Email Processing with Mem0
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

This guide demonstrates how to build an intelligent email processing system using Mem0's memory capabilities. You'll learn how to store, categorize, retrieve, and analyze emails to create a smart email management solution.

@@ -1,7 +1,7 @@
---
title: LlamaIndex ReAct Agent
---
<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

Create a ReAct Agent with LlamaIndex which uses Mem0 as the memory store.
@@ -2,7 +2,7 @@
title: Mem0 as an Agentic Tool
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

Integrate Mem0's memory capabilities with OpenAI's Agents SDK to create AI agents with persistent memory.
You can create agents that remember past conversations and use that context to provide better responses.

@@ -3,7 +3,7 @@ title: Mem0 Demo
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

You can create a personalized AI Companion using Mem0. This guide will walk you through the necessary steps and provide the complete setup instructions to get you started.

@@ -3,7 +3,7 @@ title: 'Healthcare Assistant with Mem0 and Google ADK'
description: 'Build a personalized healthcare agent that remembers patient information across conversations using Mem0 and Google ADK'
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

# Healthcare Assistant with Memory

@@ -2,7 +2,7 @@
title: Mem0 with Mastra
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

In this example you'll learn how to use Mem0 to add long-term memory capabilities to [Mastra's agent](https://mastra.ai/) via tool-use.
This memory integration can work alongside Mastra's [agent memory features](https://mastra.ai/docs/agents/01-agent-memory).

@@ -3,7 +3,7 @@ title: 'Mem0 with OpenAI Agents SDK for Voice'
description: 'Integrate memory capabilities into your voice agents using Mem0 and OpenAI Agents SDK'
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

# Building Voice Agents with Memory using Mem0 and OpenAI Agents SDK

@@ -2,7 +2,7 @@
title: Mem0 with Ollama
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

## Running Mem0 Locally with Ollama
@@ -2,7 +2,7 @@
title: Multimodal Demo with Mem0
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

Enhance your AI interactions with **Mem0**'s multimodal capabilities. Mem0 now supports image understanding, allowing for richer context and more natural interactions across supported AI platforms.

@@ -2,7 +2,7 @@
title: OpenAI Inbuilt Tools
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

Integrate Mem0's memory capabilities with OpenAI's Inbuilt Tools to create AI agents with persistent memory.

@@ -2,7 +2,7 @@
title: Personalized AI Tutor
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

You can create a personalized AI Tutor using Mem0. This guide will walk you through the necessary steps and provide the complete code to get you started.

@@ -2,7 +2,7 @@
title: Personal AI Travel Assistant
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

Create a personalized AI Travel Assistant using Mem0. This guide provides step-by-step instructions and the complete code to get you started.

@@ -2,7 +2,7 @@
title: Personalized Deep Research
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

Deep Research is an intelligent agent that synthesizes large amounts of online data and completes complex research tasks, customized to your unique preferences and insights. Built on Mem0's technology, it enhances AI-driven online exploration with personalized memories.

@@ -2,7 +2,7 @@
title: YouTube Assistant Extension
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

Enhance your YouTube experience with Mem0's **YouTube Assistant**, a Chrome extension that brings AI-powered chat directly to your YouTube videos. Get instant, personalized answers about video content while leveraging your own knowledge and memories - all without leaving the page.
@@ -4,7 +4,7 @@ icon: "question"
iconType: "solid"
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

<AccordionGroup>
<Accordion title="How does Mem0 work?">

@@ -4,7 +4,7 @@ icon: "wrench"
iconType: "solid"
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

## Core features

@@ -3,7 +3,7 @@ title: Overview
description: How to integrate Mem0 into other frameworks
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

Mem0 seamlessly integrates with popular AI frameworks and tools to enhance your LLM-based applications with persistent memory capabilities. By integrating Mem0, your applications benefit from:
@@ -1,7 +1,7 @@
---
title: AgentOps
---
<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

Integrate [**Mem0**](https://github.com/mem0ai/mem0) with [AgentOps](https://agentops.ai), a comprehensive monitoring and analytics platform for AI agents. This integration enables automatic tracking and analysis of memory operations, providing insights into agent performance and memory usage patterns.

@@ -1,7 +1,7 @@
---
title: Agno
---
<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

Integrate [**Mem0**](https://github.com/mem0ai/mem0) with [Agno](https://github.com/agno-agi/agno), a Python framework for building autonomous agents. This integration enables Agno agents to access persistent memory across conversations, enhancing context retention and personalization.

@@ -1,6 +1,6 @@
Build conversational AI agents with memory capabilities. This integration combines AutoGen for creating AI agents with Mem0 for memory management, enabling context-aware and personalized interactions.

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

## Overview

@@ -2,7 +2,7 @@
title: CrewAI
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

Build an AI system that combines CrewAI's agent-based architecture with Mem0's memory capabilities. This integration enables persistent memory across agent interactions and personalized task execution based on user history.

@@ -2,7 +2,7 @@
title: Dify
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

# Integrating Mem0 with Dify AI
@@ -2,7 +2,7 @@
title: ElevenLabs
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

Create voice-based conversational AI agents with memory capabilities by integrating ElevenLabs and Mem0. This integration enables persistent, context-aware voice interactions that remember past conversations.

@@ -2,7 +2,7 @@
title: Flowise
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

The [**Mem0 Memory**](https://github.com/mem0ai/mem0) integration with [Flowise](https://github.com/FlowiseAI/Flowise) enables persistent memory capabilities for your AI chatflows. [Flowise](https://flowiseai.com/) is an open-source low-code tool for developers to build customized LLM orchestration flows & AI agents using a drag & drop interface.

@@ -2,7 +2,7 @@
title: Keywords AI
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

Build AI applications with persistent memory and comprehensive LLM observability by integrating Mem0 with Keywords AI.

@@ -3,7 +3,7 @@ title: Langchain Tools
description: 'Integrate Mem0 with LangChain tools to enable AI agents to store, search, and manage memories through structured interfaces'
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

## Overview

@@ -2,7 +2,7 @@
title: Langchain
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

Build a personalized Travel Agent AI using LangChain for conversation flow and Mem0 for memory retention. This integration enables context-aware and efficient travel planning experiences.
@@ -2,7 +2,7 @@
title: LangGraph
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

Build a personalized Customer Support AI Agent using LangGraph for conversation flow and Mem0 for memory retention. This integration enables context-aware and efficient support experiences.

@@ -2,7 +2,7 @@
title: Livekit
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

This guide demonstrates how to create a memory-enabled voice assistant using LiveKit, Deepgram, OpenAI, and Mem0, focusing on creating an intelligent, context-aware travel planning agent.

@@ -2,7 +2,7 @@
title: LlamaIndex
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

LlamaIndex supports Mem0 as a [memory store](https://llamahub.ai/l/memory/llama-index-memory-mem0). In this guide, we'll show you how to use it.

@@ -2,7 +2,7 @@
title: Mastra
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

The [**Mastra**](https://mastra.ai/) integration demonstrates how to use Mastra's agent system with Mem0 as the memory backend through custom tools. This enables agents to remember and recall information across conversations.

@@ -2,7 +2,7 @@
title: MCP Server
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

## Integrating mem0 as an MCP Server in Cursor
[mem0](https://github.com/mem0ai/mem0-mcp) is a powerful tool designed to enhance AI-driven workflows, particularly in code generation and contextual memory. In this guide, we'll walk through integrating mem0 as an **MCP (Model Context Protocol) server** within [Cursor](https://cursor.sh/), an AI-powered coding editor.

@@ -2,7 +2,7 @@
title: MultiOn
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

Build a personal browser agent that remembers user preferences and automates web tasks. It integrates Mem0 for memory management with MultiOn for executing browser actions, enabling personalized and efficient web interactions.
@@ -3,7 +3,7 @@ title: 'Pipecat'
description: 'Integrate Mem0 with Pipecat for conversational memory in AI agents'
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

# Pipecat Integration

@@ -3,11 +3,11 @@ title: "Raycast Extension"
description: "Mem0 Raycast extension for intelligent memory management"
---

# Mem0
<Snippet file="security-compliance.mdx" />

Mem0 is a self-improving memory layer for LLM applications, enabling personalized AI experiences that save costs and delight users. This extension lets you store and retrieve text snippets using Mem0's intelligent memory system. Find Mem0 in the [Raycast Store](https://www.raycast.com/dev_khant/mem0) to start using it.

## 🚀 Getting Started
## Getting Started

**Get your API Key**: You'll need a Mem0 API key to use this extension:

@@ -2,7 +2,7 @@
title: Vercel AI SDK
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

The [**Mem0 AI SDK Provider**](https://www.npmjs.com/package/@mem0/vercel-ai-provider) is a library developed by **Mem0** to integrate with the Vercel AI SDK. This library brings enhanced AI interaction capabilities to your applications by introducing persistent memory functionality.
@@ -5,6 +5,8 @@ icon: "bolt"
iconType: "solid"
---

<Snippet file="security-compliance.mdx" />

## AsyncMemory

The `AsyncMemory` class is a direct asynchronous interface to Mem0's in-process memory operations. Unlike the memory client, which interacts with an API, `AsyncMemory` works directly with the underlying storage systems. This makes it ideal for applications where you want to embed Mem0 directly into your codebase.
@@ -5,7 +5,7 @@ icon: "pencil"
iconType: "solid"
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

## Introduction to Custom Fact Extraction Prompt

@@ -4,7 +4,7 @@ icon: "pencil"
iconType: "solid"
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

The update memory prompt is used to determine the action to be performed on the memory.
By customizing this prompt, you can control how the memory is updated.

@@ -5,7 +5,7 @@ icon: "image"
iconType: "solid"
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

Mem0 extends its capabilities beyond text by supporting multimodal data. With this feature, users can seamlessly integrate images into their interactions, allowing Mem0 to extract relevant information.

@@ -4,7 +4,7 @@ icon: "code"
iconType: "solid"
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

Mem0 can be easily integrated into chat applications to enhance conversational agents with structured memory. Mem0's APIs are designed to be compatible with OpenAI's, with the goal of making it easy to leverage Mem0 in applications you may have already built.
@@ -4,6 +4,8 @@ icon: "server"
iconType: "solid"
---

<Snippet file="security-compliance.mdx" />

Mem0 provides a REST API server (written using FastAPI). Users can perform all operations through REST endpoints. The API also includes OpenAPI documentation, accessible at `/docs` when the server is running.

<Frame caption="APIs supported by Mem0 REST API Server">

@@ -5,7 +5,7 @@ icon: "list-check"
iconType: "solid"
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

Graph Memory is a powerful feature that allows users to create and utilize complex relationships between pieces of information.
@@ -5,7 +5,7 @@ icon: "info"
iconType: "solid"
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

Mem0 now supports **Graph Memory**.
With Graph Memory, users can now create and utilize complex relationships between pieces of information, allowing for more nuanced and context-aware responses.

@@ -4,7 +4,7 @@ icon: "image"
iconType: "solid"
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

Mem0 extends its capabilities beyond text by supporting multimodal data, including images. Users can seamlessly integrate images into their interactions, allowing Mem0 to extract pertinent information from visual content and enrich the memory system.

@@ -5,7 +5,7 @@ icon: "node"
iconType: "solid"
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

> Welcome to the Mem0 quickstart guide. This guide will help you get up and running with Mem0 in no time.

@@ -4,7 +4,7 @@ icon: "eye"
iconType: "solid"
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

Welcome to Mem0 Open Source - a powerful, self-hosted memory management solution for AI agents and assistants. With Mem0 OSS, you get full control over your infrastructure while maintaining complete customization flexibility.

@@ -5,7 +5,7 @@ icon: "python"
iconType: "solid"
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

> Welcome to the Mem0 quickstart guide. This guide will help you get up and running with Mem0 in no time.
@@ -4,6 +4,8 @@ icon: "plug"
iconType: "solid"
---

<Snippet file="security-compliance.mdx" />

## Connecting an MCP Client

Once your OpenMemory server is running locally, you can connect any compatible MCP client to your personal memory stream. This enables a seamless memory layer integration for AI tools and agents.

@@ -4,7 +4,7 @@ icon: "info"
iconType: "solid"
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

## 🚀 Hosted OpenMemory MCP Now Available!

@@ -4,7 +4,7 @@ icon: "terminal"
iconType: "solid"
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

## 🚀 Hosted OpenMemory MCP Now Available!
@@ -4,10 +4,7 @@ icon: "info"
iconType: "solid"
---

<Snippet file="paper-release.mdx" />
<Note type="info">
🎉 We're excited to announce that Claude 4 is now available with Mem0! Check it out [here](components/llms/models/anthropic).
</Note>
<Snippet file="security-compliance.mdx" />

# Introduction

@@ -4,7 +4,7 @@ icon: "magnifying-glass"
iconType: "solid"
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

Mem0's **Advanced Retrieval** provides additional control over how memories are selected and ranked during search. While the default search uses embedding-based semantic similarity, Advanced Retrieval introduces specialized options to improve recall, ranking accuracy, or filtering based on specific use cases.
@@ -5,7 +5,7 @@ icon: "bolt"
iconType: "solid"
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

The `AsyncMemoryClient` is an asynchronous client for interacting with the Mem0 API. It provides similar functionality to the synchronous `MemoryClient` but allows for non-blocking operations, which can be beneficial in applications that require high concurrency.

## Initialization
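For reference, a minimal sketch of non-blocking usage with the platform client, assuming a valid Mem0 API key is available; the content, user ID, and query are illustrative:

```python
import asyncio

from mem0 import AsyncMemoryClient


async def main():
    # Assumes MEM0_API_KEY is set in the environment; an api_key argument can also be passed.
    client = AsyncMemoryClient()

    await client.add(
        [{"role": "user", "content": "I work from Berlin and prefer morning meetings."}],
        user_id="alice",
    )
    memories = await client.search("When does Alice prefer meetings?", user_id="alice")
    print(memories)


asyncio.run(main())
```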
@@ -4,7 +4,7 @@ icon: "square-plus"
iconType: "solid"
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

Mem0 now supports a contextual add version (v2). To use it, set `version="v2"` during the add call. The default version is v1, which is now deprecated. We recommend migrating to `v2` for new applications.

@@ -4,7 +4,7 @@ icon: "magnifying-glass-plus"
iconType: "solid"
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

Mem0's **Criteria Retrieval** feature allows you to retrieve memories based on your defined criteria. It goes beyond generic semantic relevance and ranks memories based on what matters to your application - emotional tone, intent, behavioral signals, or other custom traits.

@@ -5,7 +5,7 @@ icon: "tags"
iconType: "solid"
---

<Snippet file="paper-release.mdx" />
<Snippet file="security-compliance.mdx" />

## How to set custom categories?
Some files were not shown because too many files have changed in this diff.