Fix CI issues related to missing dependency (#3096)
@@ -1,271 +1,267 @@
{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "ApdaLD4Qi30H"
   },
   "source": [
    "# Neo4j as Graph Memory"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "l7bi3i21i30I"
   },
   "source": [
    "## Prerequisites\n",
    "\n",
    "### 1. Install Mem0 with Graph Memory support\n",
    "\n",
    "To use Mem0 with Graph Memory support, install it using pip:\n",
    "\n",
    "```bash\n",
    "pip install \"mem0ai[graph]\"\n",
    "```\n",
    "\n",
    "This command installs Mem0 along with the necessary dependencies for graph functionality.\n",
    "\n",
    "### 2. Install Neo4j\n",
    "\n",
    "To utilize Neo4j as Graph Memory, run it with Docker:\n",
    "\n",
    "```bash\n",
    "docker run \\\n",
    "    -p 7474:7474 -p 7687:7687 \\\n",
    "    -e NEO4J_AUTH=neo4j/password \\\n",
    "    neo4j:5\n",
    "```\n",
    "\n",
    "This command starts Neo4j with default credentials (`neo4j` / `password`) and exposes both the HTTP (7474) and Bolt (7687) ports.\n",
    "\n",
    "You can access the Neo4j browser at [http://localhost:7474](http://localhost:7474).\n",
    "\n",
    "Additional information can be found in the [Neo4j documentation](https://neo4j.com/docs/).\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "DkeBdFEpi30I"
   },
   "source": [
    "## Configuration\n",
    "\n",
    "Do all the imports and configure OpenAI (enter your OpenAI API key):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 1,
   "metadata": {
    "id": "d99EfBpii30I"
   },
   "outputs": [],
   "source": [
    "from mem0 import Memory\n",
    "\n",
    "import os\n",
    "\n",
-   "os.environ[\"OPENAI_API_KEY\"] = (\n",
-   "    \"\"\n",
-   ")"
+   "os.environ[\"OPENAI_API_KEY\"] = \"\""
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "QTucZJjIi30J"
   },
   "source": [
    "Set up configuration to use the embedder model and Neo4j as a graph store:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {
    "id": "QSE0RFoSi30J"
   },
   "outputs": [],
   "source": [
    "config = {\n",
    "    \"embedder\": {\n",
    "        \"provider\": \"openai\",\n",
    "        \"config\": {\"model\": \"text-embedding-3-large\", \"embedding_dims\": 1536},\n",
    "    },\n",
    "    \"graph_store\": {\n",
    "        \"provider\": \"neo4j\",\n",
    "        \"config\": {\n",
    "            \"url\": \"bolt://54.87.227.131:7687\",\n",
    "            \"username\": \"neo4j\",\n",
    "            \"password\": \"causes-bins-vines\",\n",
    "        },\n",
    "    },\n",
    "}"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "OioTnv6xi30J"
   },
   "source": [
    "## Graph Memory initialization\n",
    "\n",
    "Initialize Neo4j as a Graph Memory store:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "metadata": {
    "id": "fX-H9vgNi30J"
   },
   "outputs": [],
   "source": [
    "m = Memory.from_config(config_dict=config)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "kr1fVMwEi30J"
   },
   "source": [
    "## Store memories\n",
    "\n",
    "Create memories:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "metadata": {
    "id": "sEfogqp_i30J"
   },
   "outputs": [],
   "source": [
    "messages = [\n",
    "    {\n",
    "        \"role\": \"user\",\n",
    "        \"content\": \"I'm planning to watch a movie tonight. Any recommendations?\",\n",
    "    },\n",
    "    {\n",
    "        \"role\": \"assistant\",\n",
    "        \"content\": \"How about thriller movies? They can be quite engaging.\",\n",
    "    },\n",
    "    {\n",
    "        \"role\": \"user\",\n",
    "        \"content\": \"I'm not a big fan of thriller movies but I love sci-fi movies.\",\n",
    "    },\n",
    "    {\n",
    "        \"role\": \"assistant\",\n",
    "        \"content\": \"Got it! I'll avoid thriller recommendations and suggest sci-fi movies in the future.\",\n",
    "    },\n",
-   "]\n"
+   "]"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "gtBHCyIgi30J"
   },
   "source": [
    "Store memories in Neo4j:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "metadata": {
    "id": "BMVGgZMFi30K"
   },
   "outputs": [],
   "source": [
    "# Store inferred memories (default behavior)\n",
-   "result = m.add(\n",
-   "    messages, user_id=\"alice\"\n",
-   ")"
+   "result = m.add(messages, user_id=\"alice\")"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "lQRptOywi30K"
   },
   "source": [
    ""
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "LBXW7Gv-i30K"
   },
   "source": [
    "## Search memories"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 6,
   "metadata": {
    "colab": {
     "base_uri": "https://localhost:8080/"
    },
    "id": "UHFDeQBEi30K",
    "outputId": "2c69de7d-a79a-48f6-e3c4-bd743067857c"
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Loves sci-fi movies 0.3153664287340898\n",
      "Planning to watch a movie tonight 0.09683349296551162\n",
      "Not a big fan of thriller movies 0.09468540071789466\n"
     ]
    }
   ],
   "source": [
    "for result in m.search(\"what does alice love?\", user_id=\"alice\")[\"results\"]:\n",
    "    print(result[\"memory\"], result[\"score\"])"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 6,
   "metadata": {
    "id": "2jXEIma9kK_Q"
   },
   "outputs": [],
   "source": []
  }
 ],
 "metadata": {
  "colab": {
   "provenance": []
  },
  "kernelspec": {
   "display_name": ".venv",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.13.2"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 0
}
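Taken together, the reformatted cells are equivalent to a short plain-Python script. The following is a minimal sketch for readers without the notebook; it assumes the local Docker container from the Prerequisites cell (`bolt://localhost:7687`, user `neo4j`, password `password`) rather than the remote host and password hard-coded in the notebook:

```python
import os

from mem0 import Memory

os.environ["OPENAI_API_KEY"] = ""  # enter your OpenAI API key

config = {
    "embedder": {
        "provider": "openai",
        "config": {"model": "text-embedding-3-large", "embedding_dims": 1536},
    },
    "graph_store": {
        "provider": "neo4j",
        # Assumes the local container started in the Prerequisites cell.
        "config": {"url": "bolt://localhost:7687", "username": "neo4j", "password": "password"},
    },
}

# Initialize graph memory, store a preference, then search it back.
m = Memory.from_config(config_dict=config)
m.add([{"role": "user", "content": "I love sci-fi movies."}], user_id="alice")
for result in m.search("what does alice love?", user_id="alice")["results"]:
    print(result["memory"], result["score"])
```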
@@ -45,11 +45,7 @@ def get_food_recommendation(user_query: str, user_id):
    """Get food recommendation with memory context"""

    # Search memory for relevant food preferences
-    memories_result = memory_client.search(
-        query=user_query,
-        user_id=user_id,
-        limit=5
-    )
+    memories_result = memory_client.search(query=user_query, user_id=user_id, limit=5)

    # Add memory context to the message
    memories = [f"- {result['memory']}" for result in memories_result]
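The hunk ends just before the retrieved memories are folded into the prompt. For orientation, a hypothetical continuation might look like the following; `memory_context` and the message layout are illustrative assumptions, not the file's actual code:

```python
# Illustrative only: join the retrieved memories into a system prompt
# so the model can personalize the recommendation.
memory_context = "\n".join(memories) if memories else "No stored preferences."
messages = [
    {"role": "system", "content": f"Known food preferences:\n{memory_context}"},
    {"role": "user", "content": user_query},
]
```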
@@ -71,6 +67,7 @@ def get_food_recommendation(user_query: str, user_id):
    # Save audio file
    if response.audio:
        import time
+
        timestamp = int(time.time())
        filename = f"food_recommendation_{timestamp}.mp3"
        write_audio_to_file(
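The diff calls `write_audio_to_file` but never shows its body. A plausible minimal implementation, purely an assumption for readers following along, would decode the base64 audio payload and write it to disk:

```python
import base64


def write_audio_to_file(audio_data: str, filename: str) -> None:
    # Hypothetical helper (not the repository's actual code): chat
    # completion audio typically arrives base64-encoded, so decode
    # it before writing the MP3 bytes.
    with open(filename, "wb") as f:
        f.write(base64.b64decode(audio_data))
```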
@@ -118,7 +115,11 @@ def initialize_food_memory(user_id):
# Initialize the memory for the user once in order for the agent to learn the user preference
initialize_food_memory(user_id=USER_ID)

-print(get_food_recommendation("Which type of restaurants should I go tonight for dinner and cuisines preferred?", user_id=USER_ID))
+print(
+    get_food_recommendation(
+        "Which type of restaurants should I go tonight for dinner and cuisines preferred?", user_id=USER_ID
+    )
+)
# OUTPUT: 🎵 Audio saved as food_recommendation_1750162610.mp3
# For dinner tonight, considering your love for healthy spicy options, you could try a nice Thai, Indian, or Mexican restaurant.
# You might find dishes with quinoa, chickpeas, tofu, and fresh herbs delightful. Enjoy your dinner!
@@ -1,4 +1,4 @@
-from agents import Agent, Runner, function_tool, handoffs, enable_verbose_stdout_logging
+from agents import Agent, Runner, function_tool, enable_verbose_stdout_logging
from dotenv import load_dotenv

from mem0 import MemoryClient
@@ -35,7 +35,7 @@ travel_agent = Agent(
    understand the user's travel preferences and history before making recommendations.
    After providing your response, use store_conversation to save important details.""",
    tools=[search_memory, save_memory],
-    model="gpt-4o"
+    model="gpt-4o",
)

health_agent = Agent(
@@ -44,7 +44,7 @@ health_agent = Agent(
    understand the user's health goals and dietary preferences.
    After providing advice, use store_conversation to save relevant information.""",
    tools=[search_memory, save_memory],
-    model="gpt-4o"
+    model="gpt-4o",
)

# Triage agent with handoffs
@@ -55,7 +55,7 @@ triage_agent = Agent(
    For health-related questions (fitness, diet, wellness, exercise), hand off to Health Advisor.
    For general questions, you can handle them directly using available tools.""",
    handoffs=[travel_agent, health_agent],
-    model="gpt-4o"
+    model="gpt-4o",
)

@@ -74,10 +74,7 @@ def chat_with_handoffs(user_input: str, user_id: str) -> str:
    result = Runner.run_sync(triage_agent, user_input)

    # Store the original conversation in memory
-    conversation = [
-        {"role": "user", "content": user_input},
-        {"role": "assistant", "content": result.final_output}
-    ]
+    conversation = [{"role": "user", "content": user_input}, {"role": "assistant", "content": result.final_output}]
    mem0.add(conversation, user_id=user_id)

    return result.final_output
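For context, `chat_with_handoffs` takes a prompt and a user id and persists each turn to Mem0 under that id. A usage sketch based on the signature shown above; the prompts and user id are illustrative:

```python
# The triage agent routes travel questions to the Travel Planner and
# health questions to the Health Advisor; both share the same memory.
print(chat_with_handoffs("Plan a weekend trip to Kyoto", user_id="user_42"))
print(chat_with_handoffs("What should I eat after a long run?", user_id="user_42"))
```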
@@ -34,96 +34,91 @@ config = {
            "api_key": "vllm-api-key",
            "temperature": 0.7,
            "max_tokens": 100,
-        }
-    },
-    "embedder": {
-        "provider": "openai",
-        "config": {
-            "model": "text-embedding-3-small"
-        }
-    },
+        },
+    },
+    "embedder": {"provider": "openai", "config": {"model": "text-embedding-3-small"}},
    "vector_store": {
        "provider": "qdrant",
-        "config": {
-            "collection_name": "vllm_memories",
-            "host": "localhost",
-            "port": 6333
-        }
-    }
+        "config": {"collection_name": "vllm_memories", "host": "localhost", "port": 6333},
+    },
}


def main():
    """
    Demonstrate vLLM integration with mem0
    """
    print("--> Initializing mem0 with vLLM...")

    # Initialize memory with vLLM
    memory = Memory.from_config(config)

    print("--> Memory initialized successfully!")

    # Example conversations to store
    conversations = [
        {
            "messages": [
                {"role": "user", "content": "I love playing chess on weekends"},
-                {"role": "assistant", "content": "That's great! Chess is an excellent strategic game that helps improve critical thinking."}
+                {
+                    "role": "assistant",
+                    "content": "That's great! Chess is an excellent strategic game that helps improve critical thinking.",
+                },
            ],
-            "user_id": "user_123"
+            "user_id": "user_123",
        },
        {
            "messages": [
                {"role": "user", "content": "I'm learning Python programming"},
-                {"role": "assistant", "content": "Python is a fantastic language for beginners! What specific areas are you focusing on?"}
+                {
+                    "role": "assistant",
+                    "content": "Python is a fantastic language for beginners! What specific areas are you focusing on?",
+                },
            ],
-            "user_id": "user_123"
+            "user_id": "user_123",
        },
        {
            "messages": [
                {"role": "user", "content": "I prefer working late at night, I'm more productive then"},
-                {"role": "assistant", "content": "Many people find they're more creative and focused during nighttime hours. It's important to maintain a consistent schedule that works for you."}
+                {
+                    "role": "assistant",
+                    "content": "Many people find they're more creative and focused during nighttime hours. It's important to maintain a consistent schedule that works for you.",
+                },
            ],
-            "user_id": "user_123"
-        }
+            "user_id": "user_123",
+        },
    ]

    print("\n--> Adding memories using vLLM...")

    # Add memories - now powered by vLLM's high-performance inference
    for i, conversation in enumerate(conversations, 1):
-        result = memory.add(
-            messages=conversation["messages"],
-            user_id=conversation["user_id"]
-        )
+        result = memory.add(messages=conversation["messages"], user_id=conversation["user_id"])
        print(f"Memory {i} added: {result}")

    print("\n🔍 Searching memories...")

    # Search memories - vLLM will process the search and memory operations
    search_queries = [
        "What does the user like to do on weekends?",
        "What is the user learning?",
-        "When is the user most productive?"
+        "When is the user most productive?",
    ]

    for query in search_queries:
        print(f"\nQuery: {query}")
-        memories = memory.search(
-            query=query,
-            user_id="user_123"
-        )
+        memories = memory.search(query=query, user_id="user_123")

        for memory_item in memories:
            print(f" - {memory_item['memory']}")

    print("\n--> Getting all memories for user...")
    all_memories = memory.get_all(user_id="user_123")
    print(f"Total memories stored: {len(all_memories)}")

    for memory_item in all_memories:
        print(f" - {memory_item['memory']}")

    print("\n--> vLLM integration demo completed successfully!")
    print("\nBenefits of using vLLM:")
    print(" -> 2.7x higher throughput compared to standard implementations")