---
title: OpenAI
---
To use OpenAI models, set the `OPENAI_API_KEY` environment variable. You can obtain an API key from the [OpenAI Platform](https://platform.openai.com/account/api-keys).
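If you prefer to set the key from within your script rather than your shell, a minimal sketch (the placeholder value is hypothetical; substitute your real key):

```python
import os

# Hypothetical placeholder key; replace it with your real key,
# or export OPENAI_API_KEY in your shell before running.
os.environ.setdefault("OPENAI_API_KEY", "your-api-key")

# Fail fast if the key is missing, rather than at the first API call.
assert os.environ.get("OPENAI_API_KEY"), "OPENAI_API_KEY is not set"
```

`setdefault` leaves an already-exported key untouched, so the same script works both locally and in environments where the key is injected.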
## Usage
```python Python
import os

from mem0 import Memory

os.environ["OPENAI_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o",
            "temperature": 0.2,
            "max_tokens": 2000,
        }
    }
}

# To use OpenRouter instead, pass its API key:
# os.environ["OPENROUTER_API_KEY"] = "your-api-key"
# config = {
#     "llm": {
#         "provider": "openai",
#         "config": {
#             "model": "meta-llama/llama-3.1-70b-instruct",
#         }
#     }
# }

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```
```typescript TypeScript
import { Memory } from 'mem0ai/oss';

const config = {
  llm: {
    provider: 'openai',
    config: {
      apiKey: process.env.OPENAI_API_KEY || '',
      model: 'gpt-4-turbo-preview',
      temperature: 0.2,
      maxTokens: 1500,
    },
  },
};

const memory = new Memory(config);
await memory.add("Likes to play cricket on weekends", { userId: "alice", metadata: { category: "hobbies" } });
```
We also support OpenAI's [structured outputs](https://platform.openai.com/docs/guides/structured-outputs/introduction) feature.
```python
import os

from mem0 import Memory

os.environ["OPENAI_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "openai_structured",
        "config": {
            "model": "gpt-4o-2024-08-06",
            "temperature": 0.0,
        }
    }
}

m = Memory.from_config(config)
```
OpenAI structured outputs are currently available only in the Python implementation.
## Config
All available parameters for the `openai` config are listed in the [Master List of All Params in Config](../config).