[Mem0] Update docs and improve readability (#1727)
docs/integrations/autogen.mdx (new file, 130 lines)
@@ -0,0 +1,130 @@
---
title: AutoGen
---

Build conversational AI agents with memory capabilities. This integration combines AutoGen for creating AI agents with Mem0 for memory management, enabling context-aware and personalized interactions.

## Overview

In this guide, we'll explore an example of creating a conversational AI system with memory:

- A customer service bot that can recall previous interactions and provide personalized responses.

## Setup and Configuration

Install the necessary libraries:

```bash
pip install pyautogen mem0ai openai
```

First, we'll import the necessary libraries and set up our configurations.

<Note>Remember to get the Mem0 API key from [Mem0 Platform](https://app.mem0.ai).</Note>
```python
import os

from autogen import ConversableAgent
from mem0 import MemoryClient

# Configuration
OPENAI_API_KEY = 'sk-xxx'  # Replace with your actual OpenAI API key
MEM0_API_KEY = 'your-mem0-key'  # Replace with your actual Mem0 API key from https://app.mem0.ai
USER_ID = "customer_service_bot"

# Set up OpenAI API key
os.environ['OPENAI_API_KEY'] = OPENAI_API_KEY

# Initialize Mem0 client and AutoGen agent
memory_client = MemoryClient(api_key=MEM0_API_KEY)
agent = ConversableAgent(
    "chatbot",
    llm_config={"config_list": [{"model": "gpt-4", "api_key": OPENAI_API_KEY}]},
    code_execution_config=False,
    human_input_mode="NEVER",
)
```
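Hardcoding keys is fine for a quick demo, but in practice you may prefer to read them from the environment. A minimal sketch (the environment variable names are conventional, not required by either library):

```python
import os

# Illustrative: read keys from the environment, falling back to demo placeholders.
OPENAI_API_KEY = os.environ.get("OPENAI_API_KEY", "sk-xxx")
MEM0_API_KEY = os.environ.get("MEM0_API_KEY", "your-mem0-key")
```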

## Storing Conversations in Memory

Add conversation history to Mem0 for future reference:

```python
conversation = [
    {"role": "assistant", "content": "Hi, I'm Best Buy's chatbot! How can I help you?"},
    {"role": "user", "content": "I'm seeing horizontal lines on my TV."},
    {"role": "assistant", "content": "I'm sorry to hear that. Can you provide your TV model?"},
    {"role": "user", "content": "It's a Sony - 77\" Class BRAVIA XR A80K OLED 4K UHD Smart Google TV"},
    {"role": "assistant", "content": "Thank you for the information. Let's troubleshoot this issue..."}
]

memory_client.add(messages=conversation, user_id=USER_ID)
print("Conversation added to memory.")
```

## Retrieving and Using Memory

Create a function that produces context-aware responses based on the user's question and previous interactions:

```python
def get_context_aware_response(question):
    relevant_memories = memory_client.search(question, user_id=USER_ID)
    context = "\n".join([m["memory"] for m in relevant_memories])

    prompt = f"""Answer the user question considering the previous interactions:
    Previous interactions:
    {context}

    Question: {question}
    """

    reply = agent.generate_reply(messages=[{"content": prompt, "role": "user"}])
    return reply

# Example usage
question = "What was the issue with my TV?"
answer = get_context_aware_response(question)
print("Context-aware answer:", answer)
```
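The prompt assembly above can be exercised offline by stubbing the search results. A minimal sketch with hypothetical data (`build_prompt` and `fake_memories` are illustrative helpers standing in for `memory_client.search`, not part of the Mem0 API):

```python
def build_prompt(question, memories):
    # Assemble the same prompt shape used by get_context_aware_response,
    # from a list of search hits of the form {"memory": "..."}.
    context = "\n".join(m["memory"] for m in memories)
    return (
        "Answer the user question considering the previous interactions:\n"
        f"Previous interactions:\n{context}\n\n"
        f"Question: {question}\n"
    )

# Hypothetical search results, standing in for memory_client.search(...)
fake_memories = [
    {"memory": "User owns a Sony 77\" BRAVIA XR A80K OLED TV."},
    {"memory": "User reported horizontal lines on the screen."},
]
prompt = build_prompt("What was the issue with my TV?", fake_memories)
print(prompt)
```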

## Multi-Agent Conversation

For more complex scenarios, you can create multiple agents:

```python
manager = ConversableAgent(
    "manager",
    system_message="You are a manager who helps in resolving complex customer issues.",
    llm_config={"config_list": [{"model": "gpt-4", "api_key": OPENAI_API_KEY}]},
    human_input_mode="NEVER"
)

def escalate_to_manager(question):
    relevant_memories = memory_client.search(question, user_id=USER_ID)
    context = "\n".join([m["memory"] for m in relevant_memories])

    prompt = f"""
    Context from previous interactions:
    {context}

    Customer question: {question}

    As a manager, how would you address this issue?
    """

    manager_response = manager.generate_reply(messages=[{"content": prompt, "role": "user"}])
    return manager_response

# Example usage
complex_question = "I'm not satisfied with the troubleshooting steps. What else can be done?"
manager_answer = escalate_to_manager(complex_question)
print("Manager's response:", manager_answer)
```
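The guide doesn't prescribe when to escalate; one simple option is a keyword heuristic that routes each question to either the regular bot or the manager. A hedged sketch with stub responders (the keyword list, `route_question`, and the lambdas are illustrative, not part of the Mem0 or AutoGen APIs):

```python
ESCALATION_KEYWORDS = ("not satisfied", "refund", "complaint", "manager")

def route_question(question, respond, escalate):
    # Route to the manager path when the question matches an escalation cue;
    # otherwise answer with the regular context-aware responder.
    q = question.lower()
    if any(keyword in q for keyword in ESCALATION_KEYWORDS):
        return escalate(question)
    return respond(question)

# Stubs standing in for get_context_aware_response / escalate_to_manager
answer = route_question(
    "I'm not satisfied with the troubleshooting steps.",
    respond=lambda q: "bot",
    escalate=lambda q: "manager",
)
print(answer)  # manager
```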

## Conclusion

By integrating AutoGen with Mem0, you've created a conversational AI system with memory capabilities. This example demonstrates a customer service bot that can recall previous interactions and provide context-aware responses, with the ability to escalate complex issues to a manager agent.

This integration enables the creation of more intelligent and personalized AI agents for various applications, such as customer support, virtual assistants, and interactive chatbots.

## Help

In case of any questions, please feel free to reach out to us using one of the following methods:

<Snippet file="get-help.mdx" />

docs/integrations/langgraph.mdx (new file, 144 lines)
@@ -0,0 +1,144 @@
---
title: LangGraph
---

Build a personalized Customer Support AI Agent using LangGraph for conversation flow and Mem0 for memory retention. This integration enables context-aware and efficient support experiences.

## Overview

In this guide, we'll create a Customer Support AI Agent that:

1. Uses LangGraph to manage conversation flow
2. Leverages Mem0 to store and retrieve relevant information from past interactions
3. Provides personalized responses based on user history

## Setup and Configuration

Install the necessary libraries:

```bash
pip install langgraph langchain-openai mem0ai
```

Import the required modules and set up configurations:

<Note>Remember to get the Mem0 API key from [Mem0 Platform](https://app.mem0.ai).</Note>

```python
from typing import Annotated, TypedDict, List

from langgraph.graph import StateGraph, START
from langgraph.graph.message import add_messages
from langchain_openai import ChatOpenAI
from langchain_core.messages import SystemMessage, HumanMessage, AIMessage
from mem0 import MemoryClient

# Configuration
OPENAI_API_KEY = 'sk-xxx'  # Replace with your actual OpenAI API key
MEM0_API_KEY = 'your-mem0-key'  # Replace with your actual Mem0 API key

# Initialize the LLM and the Mem0 client
llm = ChatOpenAI(model="gpt-4", api_key=OPENAI_API_KEY)
mem0 = MemoryClient(api_key=MEM0_API_KEY)
```

## Define State and Graph

Set up the conversation state and the LangGraph structure:

```python
class State(TypedDict):
    messages: Annotated[List[HumanMessage | AIMessage], add_messages]
    mem0_user_id: str

graph = StateGraph(State)
```

## Create Chatbot Function

Define the core logic for the Customer Support AI Agent:

```python
def chatbot(state: State):
    messages = state["messages"]
    user_id = state["mem0_user_id"]

    # Retrieve relevant memories
    memories = mem0.search(messages[-1].content, user_id=user_id)

    context = "Relevant information from previous conversations:\n"
    for memory in memories:
        context += f"- {memory['memory']}\n"

    system_message = SystemMessage(content=f"""You are a helpful customer support assistant. Use the provided context to personalize your responses and remember user preferences and past interactions.
{context}""")

    full_messages = [system_message] + messages
    response = llm.invoke(full_messages)

    # Store the interaction in Mem0
    mem0.add(f"User: {messages[-1].content}\nAssistant: {response.content}", user_id=user_id)
    return {"messages": [response]}
```

## Set Up Graph Structure

Configure the graph with the appropriate nodes and edges:

```python
graph.add_node("chatbot", chatbot)
graph.add_edge(START, "chatbot")
graph.add_edge("chatbot", "chatbot")

compiled_graph = graph.compile()
```

## Create Conversation Runner

Implement a function to manage the conversation flow:

```python
def run_conversation(user_input: str, mem0_user_id: str):
    config = {"configurable": {"thread_id": mem0_user_id}}
    state = {"messages": [HumanMessage(content=user_input)], "mem0_user_id": mem0_user_id}

    for event in compiled_graph.stream(state, config):
        for value in event.values():
            if value.get("messages"):
                print("Customer Support:", value["messages"][-1].content)
                return
```
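The event structure that `run_conversation` walks can be illustrated offline with a mocked stream. In this sketch, the sample event dict is hypothetical but shaped like what `compiled_graph.stream` yields (a mapping from node name to that node's returned state update), and `FakeMessage`/`first_reply` are illustrative stand-ins:

```python
class FakeMessage:
    def __init__(self, content):
        self.content = content

def first_reply(events):
    # Mirror run_conversation's traversal: each event maps node names to
    # state updates; return the first update that carries messages.
    for event in events:
        for value in event.values():
            if value.get("messages"):
                return value["messages"][-1].content
    return None

# Mocked stream output: a single event produced by the "chatbot" node
events = [{"chatbot": {"messages": [FakeMessage("Happy to help with your order!")]}}]
print(first_reply(events))  # Happy to help with your order!
```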

## Main Interaction Loop

Set up the main program loop for user interaction:

```python
if __name__ == "__main__":
    print("Welcome to Customer Support! How can I assist you today?")
    mem0_user_id = "customer_123"  # You can generate or retrieve this based on your user management system
    while True:
        user_input = input("You: ")
        if user_input.lower() in ['quit', 'exit', 'bye']:
            print("Customer Support: Thank you for contacting us. Have a great day!")
            break
        run_conversation(user_input, mem0_user_id)
```

## Key Features

1. **Memory Integration**: Uses Mem0 to store and retrieve relevant information from past interactions.
2. **Personalization**: Provides context-aware responses based on user history.
3. **Flexible Architecture**: The LangGraph structure allows for easy expansion of the conversation flow.
4. **Continuous Learning**: Each interaction is stored, improving future responses.

## Conclusion

By integrating LangGraph with Mem0, you can build a personalized Customer Support AI Agent that maintains context across interactions and provides personalized assistance.

## Help

- For more details on LangGraph, visit the [LangChain documentation](https://python.langchain.com/docs/langgraph).
- For Mem0 documentation, refer to the [Mem0 Platform](https://app.mem0.ai/).
- If you need further assistance, please feel free to reach out to us through one of the following methods:

<Snippet file="get-help.mdx" />