[Mem0] Update docs and improve readability (#1727)

Author: Deshraj Yadav (committed by GitHub)
Date: 2024-08-21 00:18:43 -07:00
Commit: 7de35b4a68 (parent: 2d66c23116)
21 changed files with 365 additions and 320 deletions


@@ -1,153 +0,0 @@
---
title: Autogen with Mem0
---
This guide demonstrates how to integrate AutoGen with Mem0 to create a conversational AI system with memory capabilities. The system includes a customer service bot and a manager agent, both leveraging Mem0 for context-aware interactions.
## Installation
First, install the required libraries:
```bash
pip install pyautogen mem0ai
```
## Setup
Import the necessary modules and set up your API keys:
```python
import os

from autogen import ConversableAgent
from mem0 import MemoryClient

os.environ["OPENAI_API_KEY"] = "your_openai_api_key_here"
os.environ["MEM0_API_KEY"] = "your_mem0_api_key_here"
```
## Initialize Agents and Memory
Create the conversational agent and Mem0 client:
```python
agent = ConversableAgent(
    "chatbot",
    llm_config={"config_list": [{"model": "gpt-4o", "api_key": os.environ.get("OPENAI_API_KEY")}]},
    code_execution_config=False,
    function_map=None,
    human_input_mode="NEVER",
)

memory = MemoryClient(api_key=os.environ.get("MEM0_API_KEY"))
```
## Storing Conversations in Memory
You can store conversations in Mem0 for future reference:
<Accordion title="Conversation">
```python
conversation = [
    {
        "role": "assistant",
        "content": "Hi, I'm Best Buy's chatbot!\n\nThanks for being a My Best Buy Total™ member.\n\nWhat can I help you with?"
    },
    {
        "role": "user",
        "content": "Seeing horizontal lines on our tv. TV model: Sony - 77\" Class BRAVIA XR A80K OLED 4K UHD Smart Google TV"
    },
    {
        "role": "assistant",
        "content": "Thanks for being a My Best Buy Total™ member. I can connect you to an expert immediately - just one perk of your membership!\n\nSelect the button below when you're ready to chat."
    },
    {
        "role": "assistant",
        "content": "Good evening, thank you for choosing Best Buy, Fnu. My name is Lovely. I hope you are doing well. I'm sorry to hear that you're seeing horizontal lines on your TV.\n\nI'm absolutely committed to exploring all possible ways to assist you to fix this issue.\n\nTo ensure that we are on the right account, may I please have your email address registered with your Best Buy account?"
    },
    {
        "role": "user",
        "content": "dd@gmail.com"
    },
    {
        "role": "assistant",
        "content": "Perfect! Thank you for providing all the details, surely you have made my job easier by doing this. I really appreciate it.\n\nI also want to take a moment to express our heartfelt appreciation for your trust and loyalty. Thank you for being an amazing customer of BestBuy Total.\n\nCould you please help me with the order number or product's details to check it quickly?\n\nSamsung - 49\" Odyssey OLED G9 (G95SC) DQHD 240Hz 0.03ms G-Sync Compatible Curved Smart Gaming Monitor - Silver - just to confirm this is the item, right?"
    },
    {
        "role": "user",
        "content": "Order number: 112217629"
    },
    {
        "role": "assistant",
        "content": "Superb! Thank you for confirmation.\n\nThank you for your patience. After exploring all possible solutions, I can help you to arrange a home repair appointment for your device. Our Geek Squad experts will visit your home to inspect and fix your device.\n\nIt's great that you have a protection plan - rest assured, we've got your back! As a valued Total member, you can avail this service at a minimal service fee. This fee, applicable to all repairs, covers the cost of diagnosing the issue and any small parts needed for the repair. It's part of our 24-month free protection plan.\n\nPlease click here to review the service fee and plan coverage details -\n\nhttps://www.bestbuy.com/site/best-buy-membership/best-buy-protection/pcmcat1608643232014.c?id=pcmcat1608643232014#jl-servicefees\n\nFnu - just to confirm shall I proceed to schedule the appointment?"
    },
    {
        "role": "user",
        "content": "Yes please"
    }
]
```
</Accordion>
```python
memory.add(messages=conversation, user_id="customer_service_bot")
```
## Retrieving and Using Memory
When you need to answer a question, retrieve relevant memories and use them for context:
```python
data = "Which TV am I using?"
relevant_memories = memory.search(data, user_id="customer_service_bot")
flatten_relevant_memories = "\n".join([m["memory"] for m in relevant_memories])
prompt = f"""Answer the user question considering the memories.
Memories:
{flatten_relevant_memories}
Question: {data}
"""
reply = agent.generate_reply(messages=[{"content": prompt, "role": "user"}])
print(reply)
```
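Semantic search returns only the entries most relevant to the query. If you want to inspect everything stored for a user while debugging, the Mem0 client also exposes a `get_all` method; the snippet below is a small sketch and assumes the results carry the same `memory` field as search results:
```python
# Sketch: list every memory stored for this user (useful for debugging).
all_memories = memory.get_all(user_id="customer_service_bot")
for m in all_memories:
    print(m["memory"])
```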
## Multi-Agent Conversation
You can create multiple agents for more complex interactions:
```python
manager = ConversableAgent(
    "manager",
    system_message="You are a manager who helps in resolving customer issues.",
    llm_config={"config_list": [{"model": "gpt-4", "temperature": 0, "api_key": os.environ.get("OPENAI_API_KEY")}]},
    human_input_mode="NEVER"
)

customer_bot = ConversableAgent(
    "customer_bot",
    system_message="You are a customer service bot who gathers information on issues customers are facing.",
    llm_config={"config_list": [{"model": "gpt-4", "temperature": 0, "api_key": os.environ.get("OPENAI_API_KEY")}]},
    human_input_mode="NEVER"
)

data = "What appointment is booked?"
relevant_memories = memory.search(data, user_id="customer_service_bot")
flatten_relevant_memories = "\n".join([m["memory"] for m in relevant_memories])

prompt = f"""
Context:
{flatten_relevant_memories}
Question: {data}
"""

result = manager.send(prompt, customer_bot, request_reply=True)
```
This setup allows a manager agent to interact with a customer service bot, with both agents drawing on the shared memory stored in Mem0.
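To keep that shared memory current, you can write the new exchange back to Mem0 once the agents finish. The sketch below is illustrative only: it assumes AutoGen's `last_message` helper returns the customer bot's latest reply to the manager; adapt it to however you capture the reply in your setup.
```python
# Illustrative only: persist the follow-up exchange so later searches can use it.
# Assumes ConversableAgent.last_message() returns the most recent reply from customer_bot.
followup = [
    {"role": "user", "content": data},
    {"role": "assistant", "content": manager.last_message(customer_bot)["content"]},
]
memory.add(messages=followup, user_id="customer_service_bot")
```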
## Conclusion
By integrating AutoGen with Mem0, you can create sophisticated conversational AI systems that maintain context across interactions. This approach is particularly useful for customer service applications, where understanding user history and preferences is crucial for providing personalized assistance.


@@ -1,123 +0,0 @@
---
title: LangGraph with Mem0
---
This guide demonstrates how to create a personalized Customer Support AI Agent using LangGraph and Mem0. The agent retains information across interactions, enabling a personalized and efficient support experience.
## Overview
The Customer Support AI Agent leverages LangGraph for conversational flow and Mem0 for memory retention, creating a more context-aware and personalized support experience.
## Setup
Install the necessary packages using pip:
```bash
pip install langgraph langchain-openai mem0ai
```
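Both `ChatOpenAI` and Mem0's default `Memory()` configuration call the OpenAI API, so make your API key available before running the example. A minimal setup, shown here as an environment variable (adjust if you configure providers differently):
```python
import os

os.environ["OPENAI_API_KEY"] = "your_openai_api_key_here"
```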
## Full Code Example
Below is the complete code to create and interact with a Customer Support AI Agent using LangGraph and Mem0:
```python
from typing import Annotated, TypedDict, List
from langgraph.graph import StateGraph, START
from langgraph.graph.message import add_messages
from langchain_openai import ChatOpenAI
from mem0 import Memory
from langchain_core.messages import SystemMessage, HumanMessage, AIMessage

llm = ChatOpenAI(model="gpt-4o")
mem0 = Memory()

# Define the State
class State(TypedDict):
    messages: Annotated[List[HumanMessage | AIMessage], add_messages]
    mem0_user_id: str

graph = StateGraph(State)

def chatbot(state: State):
    messages = state["messages"]
    user_id = state["mem0_user_id"]

    # Retrieve relevant memories
    memories = mem0.search(messages[-1].content, user_id=user_id)

    context = "Relevant information from previous conversations:\n"
    for memory in memories:
        context += f"- {memory['memory']}\n"

    system_message = SystemMessage(content=f"""You are a helpful customer support assistant. Use the provided context to personalize your responses and remember user preferences and past interactions.
{context}""")

    full_messages = [system_message] + messages
    response = llm.invoke(full_messages)

    # Store the interaction in Mem0
    mem0.add(f"User: {messages[-1].content}\nAssistant: {response.content}", user_id=user_id)

    return {"messages": [response]}

# Add nodes to the graph
graph.add_node("chatbot", chatbot)

# Add edge from START to chatbot
graph.add_edge(START, "chatbot")

# Add edge from chatbot back to itself
graph.add_edge("chatbot", "chatbot")

compiled_graph = graph.compile()

def run_conversation(user_input: str, mem0_user_id: str):
    config = {"configurable": {"thread_id": mem0_user_id}}
    state = {"messages": [HumanMessage(content=user_input)], "mem0_user_id": mem0_user_id}

    for event in compiled_graph.stream(state, config):
        for value in event.values():
            if value.get("messages"):
                print("Customer Support:", value["messages"][-1].content)
                return  # Exit after printing the response

if __name__ == "__main__":
    print("Welcome to Customer Support! How can I assist you today?")
    mem0_user_id = "test123"

    while True:
        user_input = input("You: ")
        if user_input.lower() in ['quit', 'exit', 'bye']:
            print("Customer Support: Thank you for contacting us. Have a great day!")
            break
        run_conversation(user_input, mem0_user_id)
```
## Key Components
1. **State Definition**: The `State` class defines the structure of the conversation state, including messages and user ID.
2. **Chatbot Node**: The `chatbot` function handles the core logic, including:
   - Retrieving relevant memories
   - Preparing context and system message
   - Generating responses
   - Storing interactions in Mem0
3. **Graph Setup**: The code sets up a `StateGraph` with the chatbot node and necessary edges.
4. **Conversation Runner**: The `run_conversation` function manages the flow of the conversation, processing user input and displaying responses.
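The `add_messages` annotation on `State.messages` is what lets the chatbot node return only the new reply: LangGraph runs the reducer to merge it into the existing history. A tiny illustrative sketch of that behavior:
```python
from langchain_core.messages import HumanMessage, AIMessage
from langgraph.graph.message import add_messages

# Sketch: the reducer appends new messages to the existing list, which is why
# the chatbot node only needs to return {"messages": [response]}.
existing = [HumanMessage(content="Hi, my TV is flickering.")]
update = [AIMessage(content="Sorry to hear that! Which model do you have?")]
merged = add_messages(existing, update)
print(len(merged))  # 2: the original message plus the new reply
```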
## Usage
To use the Customer Support AI Agent:
1. Run the script.
2. Enter your queries when prompted.
3. Type 'quit', 'exit', or 'bye' to end the conversation.
## Key Points
- **Memory Integration**: Mem0 is used to store and retrieve relevant information from past interactions.
- **Personalization**: The agent uses past interactions to provide more contextual and personalized responses.
- **Flexible Architecture**: The LangGraph structure allows for easy expansion and modification of the conversation flow.
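To illustrate the last point, extending the flow mostly means adding nodes and edges before compiling. The sketch below is a hypothetical variant of the graph above (built without the chatbot self-loop); the `follow_up` node and its message are made up for illustration:
```python
from langgraph.graph import END

# Hypothetical variant: add a follow-up node after the chatbot, then recompile.
def follow_up(state: State):
    return {"messages": [AIMessage(content="Is there anything else I can help you with?")]}

graph = StateGraph(State)
graph.add_node("chatbot", chatbot)
graph.add_node("follow_up", follow_up)
graph.add_edge(START, "chatbot")
graph.add_edge("chatbot", "follow_up")
graph.add_edge("follow_up", END)
compiled_graph = graph.compile()
```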
## Conclusion
This Customer Support AI Agent demonstrates the power of combining LangGraph for conversation management and Mem0 for memory retention. As the conversation progresses, the agent's responses become increasingly personalized, providing an improved support experience.


@@ -14,19 +14,19 @@ With Mem0, you can create stateful LLM-based applications such as chatbots, virt
 Here are some examples of how Mem0 can be integrated into various applications:
-## Example Use Cases
+## Examples
-<CardGroup cols={1}>
-  <Card title="Personal AI Tutor" icon="square-1" href="/examples/personal-ai-tutor">
-    <img width="100%" src="/images/ai-tutor.png" />
+<CardGroup cols={2}>
+  <Card title="Mem0 with Ollama" icon="square-1" href="/examples/mem0-with-ollama">
+    Run Mem0 locally with Ollama.
+  </Card>
+  <Card title="Personal AI Tutor" icon="square-2" href="/examples/personal-ai-tutor">
     Create a Personalized AI Tutor that adapts to student progress and learning preferences.
   </Card>
-  <Card title="Personal Travel Assistant" icon="square-2" href="/examples/personal-travel-assistant">
-    <img src="/images/personal-travel-agent.png" />
+  <Card title="Personal Travel Assistant" icon="square-3" href="/examples/personal-travel-assistant">
     Build a Personalized AI Travel Assistant that understands your travel preferences and past itineraries.
   </Card>
-  <Card title="Customer Support Agent" icon="square-3" href="/examples/customer-support-agent">
-    <img width="100%" src="/images/customer-support-agent.png" />
+  <Card title="Customer Support Agent" icon="square-4" href="/examples/customer-support-agent">
     Develop a Personal AI Assistant that remembers user preferences, past interactions, and context to provide personalized and efficient assistance.
   </Card>
 </CardGroup>