---
title: Vercel AI SDK
---

The [**Mem0 AI SDK Provider**](https://www.npmjs.com/package/@mem0/vercel-ai-provider) is a library developed by **Mem0** to integrate with the Vercel AI SDK. This library brings enhanced AI interaction capabilities to your applications by introducing persistent memory functionality.

<Note type="info">
🎉 Exciting news! Mem0 AI SDK now supports **OpenAI**, **Anthropic**, **Cohere**, and **Groq** providers.
</Note>

## Overview

In this guide, we'll create a Travel Agent AI that:
1. 🧠 Offers persistent memory storage for conversational AI
2. 🔄 Enables smooth integration with the Vercel AI SDK
3. 🚀 Ensures compatibility with multiple LLM providers
4. 📝 Supports structured message formats for clarity
5. ⚡ Facilitates streaming response capabilities

## Setup and Configuration

Install the SDK provider using npm:

```bash
npm install @mem0/vercel-ai-provider
```

## Getting Started

### Setting Up Mem0

1. Get your **Mem0 API Key** from the [Mem0 Dashboard](https://app.mem0.ai/dashboard/api-keys).

2. Initialize the Mem0 Client in your application:

```typescript
import { createMem0 } from "@mem0/vercel-ai-provider";

const mem0 = createMem0({
  provider: "openai",
  mem0ApiKey: "m0-xxx",
  apiKey: "provider-api-key",
  config: {
    compatibility: "strict",
  },
});
```

> **Note**: The `openai` provider is set as default. Consider using `MEM0_API_KEY` and `OPENAI_API_KEY` as environment variables for security.

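For local development, a minimal sketch of the environment-based setup (assuming `MEM0_API_KEY` and `OPENAI_API_KEY` are already exported, for example via a `.env` file loaded by your framework) could omit the inline keys entirely:

```typescript
import { createMem0 } from "@mem0/vercel-ai-provider";

// Assumes MEM0_API_KEY and OPENAI_API_KEY are set in the environment,
// so no API keys need to appear in source code.
const mem0 = createMem0();
```
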
3. Add Memories to Enhance Context:

```typescript
import { LanguageModelV1Prompt } from "ai";
import { addMemories } from "@mem0/vercel-ai-provider";

const messages: LanguageModelV1Prompt = [
  { role: "user", content: [{ type: "text", text: "I love red cars." }] },
];

await addMemories(messages, { user_id: "borat" });
```

### Standalone Features

```typescript
await addMemories(messages, { user_id: "borat", mem0ApiKey: "m0-xxx" });
await retrieveMemories(prompt, { user_id: "borat", mem0ApiKey: "m0-xxx" });
```
> **Note**: For standalone features, such as `addMemories` and `retrieveMemories`, you must either set `MEM0_API_KEY` as an environment variable or pass it directly in the function call.

### 1. Basic Text Generation with Memory Context

```typescript
import { generateText } from "ai";
import { createMem0 } from "@mem0/vercel-ai-provider";

const mem0 = createMem0();

const { text } = await generateText({
  model: mem0("gpt-4-turbo", { user_id: "borat" }),
  prompt: "Suggest me a good car to buy!",
});
```

### 2. Combining OpenAI Provider with Memory Utils

```typescript
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
import { retrieveMemories } from "@mem0/vercel-ai-provider";

const prompt = "Suggest me a good car to buy.";
const memories = await retrieveMemories(prompt, { user_id: "borat" });

const { text } = await generateText({
  model: openai("gpt-4-turbo"),
  prompt: prompt,
  system: memories,
});
```

### 3. Structured Message Format with Memory

```typescript
import { generateText } from "ai";
import { createMem0 } from "@mem0/vercel-ai-provider";

const mem0 = createMem0();

const { text } = await generateText({
  model: mem0("gpt-4-turbo", { user_id: "borat" }),
  messages: [
    {
      role: "user",
      content: [
        { type: "text", text: "Suggest me a good car to buy." },
        { type: "text", text: "Why is it better than the other cars for me?" },
      ],
    },
  ],
});
```

### 4. Streaming Responses with Memory Context

```typescript
import { streamText } from "ai";
import { createMem0 } from "@mem0/vercel-ai-provider";

const mem0 = createMem0();

const { textStream } = await streamText({
  model: mem0("gpt-4-turbo", {
    user_id: "borat",
  }),
  prompt: "Suggest me a good car to buy! Why is it better than the other cars for me? Give options for every price range.",
});

for await (const textPart of textStream) {
  process.stdout.write(textPart);
}
```

## Key Features

- `createMem0()`: Initializes a new Mem0 provider instance.
- `retrieveMemories()`: Retrieves memory context for prompts.
- `addMemories()`: Adds user memories to enhance contextual responses.

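Taken together, a rough sketch of a single chat turn using these utilities might look like the following (the flow itself is an illustration rather than a prescribed pattern, and it assumes `MEM0_API_KEY` and `OPENAI_API_KEY` are set in the environment):

```typescript
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
import { addMemories, retrieveMemories } from "@mem0/vercel-ai-provider";

const user_id = "borat";
const prompt = "Plan a weekend road trip for me.";

// 1. Pull memory context relevant to this prompt.
const memories = await retrieveMemories(prompt, { user_id });

// 2. Generate a reply with the retrieved memories supplied as system context.
const { text } = await generateText({
  model: openai("gpt-4-turbo"),
  prompt: prompt,
  system: memories,
});

// 3. Store the new user message so future turns can build on it.
await addMemories(
  [{ role: "user", content: [{ type: "text", text: prompt }] }],
  { user_id }
);
```
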
## Best Practices

1. **User Identification**: Use a unique `user_id` for consistent memory retrieval.
2. **Memory Cleanup**: Regularly clean up unused memory data.

> **Note**: We also support `agent_id`, `app_id`, and `run_id`. Refer to the [Docs](/api-reference/memory/add-memories).

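If you scope memories beyond a single user, a hypothetical sketch might pass one of these identifiers alongside `user_id` (we assume here that they are accepted in the same options object; check the linked API reference for the exact supported fields):

```typescript
import { addMemories } from "@mem0/vercel-ai-provider";

// Hypothetical scoping example: agent_id passed alongside user_id.
// Verify the exact supported fields in the add-memories API reference.
await addMemories(
  [{ role: "user", content: [{ type: "text", text: "I prefer window seats." }] }],
  { user_id: "borat", agent_id: "travel-agent" }
);
```
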
## Conclusion

Mem0's Vercel AI SDK provider enables the creation of intelligent, context-aware applications with persistent memory and seamless integration.

## Help

- For more details on the Vercel AI SDK, visit the [Vercel AI SDK documentation](https://sdk.vercel.ai/docs/introduction).
- For Mem0 documentation, refer to the [Mem0 Platform](https://app.mem0.ai/).
- If you need further assistance, please feel free to reach out to us through the following methods:

<Snippet file="get-help.mdx" />