---
title: Azure OpenAI
---
To use Azure OpenAI models, set the `LLM_AZURE_OPENAI_API_KEY`, `LLM_AZURE_ENDPOINT`, `LLM_AZURE_DEPLOYMENT`, and `LLM_AZURE_API_VERSION` environment variables. You can obtain an API key from the [Azure portal](https://azure.microsoft.com/).
> **Note**: The following parameters are currently unsupported with reasoning models: parallel tool calling, `temperature`, `top_p`, `presence_penalty`, `frequency_penalty`, `logprobs`, `top_logprobs`, `logit_bias`, and `max_tokens`.
## Usage
```python
import os

from mem0 import Memory

os.environ["LLM_AZURE_OPENAI_API_KEY"] = "your-api-key"
os.environ["LLM_AZURE_DEPLOYMENT"] = "your-deployment-name"
os.environ["LLM_AZURE_ENDPOINT"] = "your-api-base-url"
os.environ["LLM_AZURE_API_VERSION"] = "version-to-use"

config = {
    "llm": {
        "provider": "azure_openai",
        "config": {
            "model": "your-deployment-name",
            "temperature": 0.1,
            "max_tokens": 2000,
            "azure_kwargs": {
                "azure_deployment": "",
                "api_version": "",
                "azure_endpoint": "",
                "api_key": "",
                "default_headers": {
                    "CustomHeader": "your-custom-header",
                },
            },
        },
    }
}

m = Memory.from_config(config)
messages = [
    {"role": "user", "content": "I'm planning to watch a movie tonight. Any recommendations?"},
    {"role": "assistant", "content": "How about a thriller movie? They can be quite engaging."},
    {"role": "user", "content": "I'm not a big fan of thriller movies but I love sci-fi movies."},
    {"role": "assistant", "content": "Got it! I'll avoid thriller recommendations and suggest sci-fi movies in the future."},
]
m.add(messages, user_id="alice", metadata={"category": "movies"})
```
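Per the note above, reasoning models reject several sampling parameters. A minimal config sketch for such a deployment (the deployment and version names below are placeholders, not real values) simply omits them:

```python
# Hypothetical config for a reasoning-model deployment: the unsupported
# sampling parameters (temperature, top_p, max_tokens, etc.) are left out.
config = {
    "llm": {
        "provider": "azure_openai",
        "config": {
            "model": "your-reasoning-deployment",  # placeholder deployment name
            "azure_kwargs": {
                "azure_deployment": "your-reasoning-deployment",
                "api_version": "your-api-version",
                "azure_endpoint": "your-api-base-url",
                "api_key": "your-api-key",
            },
        },
    }
}
```

This config is passed to `Memory.from_config(config)` exactly as in the example above.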
We also support [OpenAI structured outputs](https://platform.openai.com/docs/guides/structured-outputs/introduction) via the `azure_openai_structured` provider.
```python
import os

from mem0 import Memory

os.environ["LLM_AZURE_OPENAI_API_KEY"] = "your-api-key"
os.environ["LLM_AZURE_DEPLOYMENT"] = "your-deployment-name"
os.environ["LLM_AZURE_ENDPOINT"] = "your-api-base-url"
os.environ["LLM_AZURE_API_VERSION"] = "version-to-use"

config = {
    "llm": {
        "provider": "azure_openai_structured",
        "config": {
            "model": "your-deployment-name",
            "temperature": 0.1,
            "max_tokens": 2000,
            "azure_kwargs": {
                "azure_deployment": "",
                "api_version": "",
                "azure_endpoint": "",
                "api_key": "",
                "default_headers": {
                    "CustomHeader": "your-custom-header",
                },
            },
        },
    }
}

m = Memory.from_config(config)
```
## Config
All available parameters for the `azure_openai` config are listed in the [Master List of All Params in Config](../config).