Adds Azure OpenAI LLM to Mem0 TS SDK (#2536)
@@ -129,6 +129,7 @@ mode: "wide"
<Update label="2025-04-11" description="v2.1.16">

**New Features:**

- **Azure OpenAI:** Added support for Azure OpenAI
- **Mistral LLM:** Added Mistral LLM integration in OSS

**Improvements:**

@@ -2,6 +2,8 @@
title: Azure OpenAI
---

<Note>Mem0 now supports Azure OpenAI models in the TypeScript SDK.</Note>

To use Azure OpenAI models, set the `LLM_AZURE_OPENAI_API_KEY`, `LLM_AZURE_ENDPOINT`, `LLM_AZURE_DEPLOYMENT`, and `LLM_AZURE_API_VERSION` environment variables. You can obtain the API key from the [Azure portal](https://azure.microsoft.com/).

> **Note**: The following are currently unsupported with reasoning models: parallel tool calling, `temperature`, `top_p`, `presence_penalty`, `frequency_penalty`, `logprobs`, `top_logprobs`, `logit_bias`, and `max_tokens`.
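For local development, these variables can be exported in the shell before starting your app. The values below are placeholders, not real credentials; substitute your own deployment details:

```shell
# Placeholder values: replace with your own Azure OpenAI resource details.
export LLM_AZURE_OPENAI_API_KEY="your-api-key"
export LLM_AZURE_ENDPOINT="https://your-resource-name.openai.azure.com"
export LLM_AZURE_DEPLOYMENT="your-deployment-name"
export LLM_AZURE_API_VERSION="your-api-version"
```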
@@ -9,7 +11,8 @@ To use Azure OpenAI models, you have to set the `LLM_AZURE_OPENAI_API_KEY`, `LLM
## Usage

<CodeGroup>
```python Python
import os
from mem0 import Memory

@@ -48,7 +51,38 @@ messages = [
m.add(messages, user_id="alice", metadata={"category": "movies"})
```

```typescript TypeScript
import { Memory } from 'mem0ai/oss';

const config = {
  llm: {
    provider: 'azure_openai',
    config: {
      apiKey: process.env.AZURE_OPENAI_API_KEY || '',
      modelProperties: {
        endpoint: 'https://your-api-base-url',
        deployment: 'your-deployment-name',
        modelName: 'your-model-name',
        apiVersion: 'version-to-use',
        // Any other parameters you want to pass to the Azure OpenAI API
      },
    },
  },
};

const memory = new Memory(config);

const messages = [
  { role: 'user', content: "I'm planning to watch a movie tonight. Any recommendations?" },
  { role: 'assistant', content: 'How about a thriller movie? They can be quite engaging.' },
  { role: 'user', content: "I'm not a big fan of thriller movies but I love sci-fi movies." },
  { role: 'assistant', content: "Got it! I'll avoid thriller recommendations and suggest sci-fi movies in the future." },
];

await memory.add(messages, { userId: 'alice', metadata: { category: 'movies' } });
```
</CodeGroup>
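Rather than hard-coding deployment details, the same config object can be assembled from the environment variables named earlier. The helper below is a hypothetical sketch, not part of the mem0ai SDK; it only builds the plain config object and fails fast when a variable is missing:

```typescript
// Hypothetical helper (not part of mem0ai): builds the Azure OpenAI LLM
// config from the environment variables named in these docs, throwing if
// any of them is unset.
type Env = Record<string, string | undefined>;

function azureLlmConfigFromEnv(env: Env) {
  const required = [
    'LLM_AZURE_OPENAI_API_KEY',
    'LLM_AZURE_ENDPOINT',
    'LLM_AZURE_DEPLOYMENT',
    'LLM_AZURE_API_VERSION',
  ];
  const missing = required.filter((name) => !env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing environment variables: ${missing.join(', ')}`);
  }
  return {
    llm: {
      provider: 'azure_openai',
      config: {
        apiKey: env.LLM_AZURE_OPENAI_API_KEY as string,
        modelProperties: {
          endpoint: env.LLM_AZURE_ENDPOINT as string,
          deployment: env.LLM_AZURE_DEPLOYMENT as string,
          apiVersion: env.LLM_AZURE_API_VERSION as string,
        },
      },
    },
  };
}
```

The resulting object has the same shape as the config in the example above and can be passed straight to `new Memory(...)`.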
We also support the new [OpenAI structured-outputs](https://platform.openai.com/docs/guides/structured-outputs/introduction) model. The TypeScript SDK does not yet support the `azure_openai_structured` model.
```python
import os