DeepSeek Integration (#2173)
@@ -72,6 +72,7 @@ Here's the table based on the provided parameters:
| `ollama_base_url` | Base URL for Ollama API | Ollama |
| `openai_base_url` | Base URL for OpenAI API | OpenAI |
| `azure_kwargs` | Azure LLM args for initialization | AzureOpenAI |
| `deepseek_base_url` | Base URL for DeepSeek API | DeepSeek |

## Supported LLMs
49
docs/components/llms/models/deepseek.mdx
Normal file
@@ -0,0 +1,49 @@
---
title: DeepSeek
---

To use DeepSeek LLM models, set the `DEEPSEEK_API_KEY` environment variable. You can also optionally set `DEEPSEEK_API_BASE` if you need to use a different API endpoint (defaults to `https://api.deepseek.com`).
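The resolution order implied above can be sketched as follows. This is a minimal illustration of the documented precedence (explicit config value, then the `DEEPSEEK_API_BASE` environment variable, then the default endpoint), not the library's actual source; `resolve_base_url` is a hypothetical helper name:

```python
import os

# Default endpoint per the docs above.
DEFAULT_DEEPSEEK_BASE = "https://api.deepseek.com"

def resolve_base_url(config_value=None, env=os.environ):
    """Sketch of the assumed precedence: config value > env var > default."""
    return config_value or env.get("DEEPSEEK_API_BASE") or DEFAULT_DEEPSEEK_BASE
```

For example, `resolve_base_url("https://proxy.example.com")` returns the explicit value even when the environment variable is set.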

## Usage

```python
import os
from mem0 import Memory

os.environ["DEEPSEEK_API_KEY"] = "your-api-key"
os.environ["OPENAI_API_KEY"] = "your-api-key"  # for the embedder model

config = {
    "llm": {
        "provider": "deepseek",
        "config": {
            "model": "deepseek-chat",  # default model
            "temperature": 0.2,
            "max_tokens": 1500,
            "top_p": 1.0
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```

You can also configure the API base URL in the config:

```python
config = {
    "llm": {
        "provider": "deepseek",
        "config": {
            "model": "deepseek-chat",
            "deepseek_base_url": "https://your-custom-endpoint.com",
            "api_key": "your-api-key"  # alternative to the environment variable
        }
    }
}
```

## Config

All available parameters for the `deepseek` config are present in [Master List of All Params in Config](../config).
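Putting the pieces together, a single config can carry all of the parameters shown on this page. The values below are illustrative only; consult the master list for everything the provider accepts:

```python
# Illustrative combined config; every key below appears elsewhere on this page.
deepseek_config = {
    "llm": {
        "provider": "deepseek",
        "config": {
            "model": "deepseek-chat",
            "temperature": 0.2,
            "max_tokens": 1500,
            "top_p": 1.0,
            "deepseek_base_url": "https://api.deepseek.com",
            "api_key": "your-api-key",
        }
    }
}
```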
@@ -24,6 +24,7 @@ To view all supported llms, visit the [Supported LLMs](./models).
<Card title="Google AI" href="/components/llms/models/google_ai"></Card>
<Card title="AWS bedrock" href="/components/llms/models/aws_bedrock"></Card>
<Card title="Gemini" href="/components/llms/models/gemini"></Card>
<Card title="DeepSeek" href="/components/llms/models/deepseek"></Card>
</CardGroup>

## Structured vs Unstructured Outputs