---
title: DeepSeek
---
<Snippet file="paper-release.mdx" />
To use DeepSeek LLM models, you have to set the `DEEPSEEK_API_KEY` environment variable. You can also optionally set `DEEPSEEK_API_BASE` if you need to use a different API endpoint (defaults to "https://api.deepseek.com").
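For example, both variables can be set from Python before the memory instance is created (a minimal sketch; `DEEPSEEK_API_BASE` only needs to be set if you are not using the default endpoint):
```python
import os

os.environ["DEEPSEEK_API_KEY"] = "your-api-key"
# Optional: override the endpoint (defaults to https://api.deepseek.com)
os.environ["DEEPSEEK_API_BASE"] = "https://api.deepseek.com"
```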
## Usage
```python
import os
from mem0 import Memory

os.environ["DEEPSEEK_API_KEY"] = "your-api-key"
os.environ["OPENAI_API_KEY"] = "your-api-key"  # for the embedder model

config = {
    "llm": {
        "provider": "deepseek",
        "config": {
            "model": "deepseek-chat",  # default model
            "temperature": 0.2,
            "max_tokens": 2000,
            "top_p": 1.0
        }
    }
}

m = Memory.from_config(config)
messages = [
    {"role": "user", "content": "I'm planning to watch a movie tonight. Any recommendations?"},
    {"role": "assistant", "content": "How about a thriller movie? They can be quite engaging."},
    {"role": "user", "content": "I'm not a big fan of thriller movies but I love sci-fi movies."},
    {"role": "assistant", "content": "Got it! I'll avoid thriller recommendations and suggest sci-fi movies in the future."}
]
m.add(messages, user_id="alice", metadata={"category": "movies"})
```
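After adding the conversation, you can check what was extracted, for example with a search call. This is a minimal sketch assuming the standard `Memory.search` API; the exact return shape can differ between mem0 versions:
```python
related = m.search("What kind of movies does Alice like?", user_id="alice")

# Depending on the mem0 version, search() returns either a list of memories
# or a dict with a "results" key
entries = related["results"] if isinstance(related, dict) else related
for entry in entries:
    print(entry["memory"])
```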
You can also set the DeepSeek base URL and API key directly in the config instead of relying on environment variables:
```python
config = {
    "llm": {
        "provider": "deepseek",
        "config": {
            "model": "deepseek-chat",
            "deepseek_base_url": "https://your-custom-endpoint.com",
            "api_key": "your-api-key"  # alternative to setting the environment variable
        }
    }
}
```
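The memory instance is then created exactly as above with `Memory.from_config(config)`.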
## Config
All available parameters for the `deepseek` config are present in [Master List of All Params in Config](../config).