Docs: Integration/Vercel AI SDK (#2009)

This commit is contained in:
Saket Aryan
2024-11-04 20:49:51 +05:30
committed by GitHub
parent 2e74667cc6
commit 6a00643bfa
3 changed files with 170 additions and 1 deletions

@@ -0,0 +1,168 @@
---
title: Vercel AI SDK
---
The [**Mem0 AI SDK Provider**](https://www.npmjs.com/package/@mem0/vercel-ai-provider) is a library developed by **Mem0** to integrate with the Vercel AI SDK. This library brings enhanced AI interaction capabilities to your applications by introducing persistent memory functionality.
<Note type="info">
🎉 Exciting news! Mem0 AI SDK now supports **OpenAI**, **Anthropic**, **Cohere**, and **Groq** providers.
</Note>
## Overview
In this guide, we'll create a Travel Agent AI using the Mem0 AI SDK Provider, which:
1. 🧠 Offers persistent memory storage for conversational AI
2. 🔄 Enables smooth integration with the Vercel AI SDK
3. 🚀 Ensures compatibility with multiple LLM providers
4. 📝 Supports structured message formats for clarity
5. ⚡ Facilitates streaming response capabilities
## Setup and Configuration
Install the SDK provider using npm:
```bash
npm install @mem0/vercel-ai-provider
```
## Getting Started
### Setting Up Mem0
1. Get your **Mem0 API Key** from the [Mem0 Dashboard](https://app.mem0.ai/dashboard/api-keys).
2. Initialize the Mem0 provider in your application:
```typescript
import { createMem0 } from "@mem0/vercel-ai-provider";
const mem0 = createMem0({
  provider: "openai",
  mem0ApiKey: "m0-xxx",
  apiKey: "provider-api-key",
  config: {
    compatibility: "strict",
  },
});
```
> **Note**: `openai` is the default provider. For security, consider supplying `MEM0_API_KEY` and `OPENAI_API_KEY` as environment variables instead of hard-coding them.
3. Add Memories to Enhance Context:
```typescript
import { LanguageModelV1Prompt } from "ai";
import { addMemories } from "@mem0/vercel-ai-provider";
const messages: LanguageModelV1Prompt = [
  { role: "user", content: [{ type: "text", text: "I love red cars." }] },
];
await addMemories(messages, { user_id: "borat" });
```
### Standalone Features
```typescript
await addMemories(messages, { user_id: "borat", mem0ApiKey: "m0-xxx" });
await retrieveMemories(prompt, { user_id: "borat", mem0ApiKey: "m0-xxx" });
```
> **Note**: For standalone features, such as `addMemories` and `retrieveMemories`, you must either set `MEM0_API_KEY` as an environment variable or pass it directly in the function call.
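For local development, the API key can be exported once in the shell instead of being passed on every call (a minimal sketch; the placeholder value below is hypothetical — substitute your real Mem0 API key):

```bash
# Hypothetical placeholder key; replace with your actual Mem0 API key
export MEM0_API_KEY="m0-xxx"
```

With the variable set, `addMemories` and `retrieveMemories` can be called without the `mem0ApiKey` option.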
### 1. Basic Text Generation with Memory Context
```typescript
import { generateText } from "ai";
import { createMem0 } from "@mem0/vercel-ai-provider";
const mem0 = createMem0();
const { text } = await generateText({
  model: mem0("gpt-4-turbo", { user_id: "borat" }),
  prompt: "Suggest me a good car to buy!",
});
```
### 2. Combining OpenAI Provider with Memory Utils
```typescript
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
import { retrieveMemories } from "@mem0/vercel-ai-provider";
const prompt = "Suggest me a good car to buy.";
const memories = await retrieveMemories(prompt, { user_id: "borat" });
const { text } = await generateText({
  model: openai("gpt-4-turbo"),
  prompt: prompt,
  system: memories,
});
```
### 3. Structured Message Format with Memory
```typescript
import { generateText } from "ai";
import { createMem0 } from "@mem0/vercel-ai-provider";
const mem0 = createMem0();
const { text } = await generateText({
  model: mem0("gpt-4-turbo", { user_id: "borat" }),
  messages: [
    {
      role: "user",
      content: [
        { type: "text", text: "Suggest me a good car to buy." },
        { type: "text", text: "Why is it better than the other cars for me?" },
      ],
    },
  ],
});
```
### 4. Streaming Responses with Memory Context
```typescript
import { streamText } from "ai";
import { createMem0 } from "@mem0/vercel-ai-provider";
const mem0 = createMem0();
const { textStream } = await streamText({
  model: mem0("gpt-4-turbo", {
    user_id: "borat",
  }),
  prompt: "Suggest me a good car to buy! Why is it better than the other cars for me? Give options for every price range.",
});
for await (const textPart of textStream) {
  process.stdout.write(textPart);
}
```
## Key Features
- `createMem0()`: Initializes a new Mem0 provider instance.
- `retrieveMemories()`: Retrieves memory context for prompts.
- `addMemories()`: Adds user memories to enhance contextual responses.
## Best Practices
1. **User Identification**: Use a unique `user_id` for consistent memory retrieval.
2. **Memory Cleanup**: Regularly clean up unused memory data.
> **Note**: We also have support for `agent_id`, `app_id`, and `run_id`. Refer to the [docs](/api-reference/memory/add-memories).
## Conclusion
Mem0's Vercel AI SDK provider enables the creation of intelligent, context-aware applications with persistent memory and seamless integration.
## Help
- For more details on the Vercel AI SDK, visit the [Vercel AI SDK documentation](https://sdk.vercel.ai/docs/introduction).
- For Mem0 documentation, refer to the [Mem0 Platform](https://app.mem0.ai/).
- If you need further assistance, please feel free to reach out to us through the following methods:
<Snippet file="get-help.mdx" />

@@ -207,6 +207,7 @@
{
"group": "Integrations",
"pages": [
"integrations/vercel-ai-sdk",
"integrations/multion",
"integrations/autogen",
"integrations/langchain",

@@ -3,7 +3,7 @@ title: Overview
---
<Note type="info">
🎉 Exciting news! Mem0 now supports all the latest [Claude models](https://www.anthropic.com/news/3-5-models-and-computer-use).
🎉 Exciting news! Mem0 now supports [Vercel's AI SDK](/integrations/vercel-ai-sdk).
</Note>
[Mem0](https://mem0.dev/wd) (pronounced "mem-zero") enhances AI assistants and agents with an intelligent memory layer, enabling personalized AI interactions. Mem0 remembers user preferences and traits and continuously updates over time, making it ideal for applications like customer support chatbots and AI assistants.