---
title: 🤖 Overview
---
## Overview
Mem0 includes built-in support for a range of popular large language models (LLMs). Memory uses whichever LLM you configure, so you can pick the provider and model that best fit your needs.
<CardGroup cols={4}>
  <Card title="OpenAI" href="#openai"></Card>
  <Card title="Groq" href="#groq"></Card>
  <Card title="TogetherAI" href="#togetherai"></Card>
  <Card title="AWS Bedrock" href="#aws-bedrock"></Card>
</CardGroup>
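Every provider below uses the same configuration shape: a top-level `llm` key holding a `provider` name and a provider-specific `config` dict. A small helper (hypothetical, not part of the Mem0 API) makes the pattern explicit:

```python
# Hypothetical helper illustrating the config shape shared by all the
# providers below; it is not part of the Mem0 API.
def make_llm_config(provider, model, temperature=0.2, max_tokens=1500):
    return {
        "llm": {
            "provider": provider,
            "config": {
                "model": model,
                "temperature": temperature,
                "max_tokens": max_tokens,
            },
        }
    }

# Switching providers only changes the provider name and the model string.
openai_config = make_llm_config("openai", "gpt-4o")
groq_config = make_llm_config("groq", "mixtral-8x7b-32768",
                              temperature=0.1, max_tokens=1000)
```

Any of these dicts can then be passed to `Memory.from_config`, as the provider sections below show.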
## OpenAI
To use OpenAI's LLM models, set the `OPENAI_API_KEY` environment variable. You can obtain an API key from the [OpenAI Platform](https://platform.openai.com/account/api-keys).
Once you have obtained the key, you can use it like this:
```python
import os
from mem0 import Memory

os.environ['OPENAI_API_KEY'] = 'xxx'

config = {
    "llm": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o",
            "temperature": 0.2,
            "max_tokens": 1500,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```
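The placeholder `'xxx'` above is just for illustration; in real code you would typically read the key from the environment and fail fast if it is missing. A minimal sketch (the `require_env` helper is hypothetical, not part of Mem0):

```python
import os

def require_env(name):
    # Return the named environment variable, or raise a clear error.
    value = os.environ.get(name)
    if value is None:
        raise RuntimeError(f"Set the {name} environment variable first.")
    return value

os.environ["OPENAI_API_KEY"] = "xxx"  # placeholder, as in the example above
api_key = require_env("OPENAI_API_KEY")
```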
## Groq
[Groq](https://groq.com/) is the creator of the world's first Language Processing Unit (LPU), which delivers exceptional speed for AI workloads running on its LPU Inference Engine.
To use LLMs from Groq, get an API key from their [platform](https://console.groq.com/keys) and set it as the `GROQ_API_KEY` environment variable, as shown in the example below.
```python
import os
from mem0 import Memory

os.environ['GROQ_API_KEY'] = 'xxx'

config = {
    "llm": {
        "provider": "groq",
        "config": {
            "model": "mixtral-8x7b-32768",
            "temperature": 0.1,
            "max_tokens": 1000,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```
## TogetherAI
To use TogetherAI's LLM models, set the `TOGETHER_API_KEY` environment variable. You can obtain an API key from their [account settings page](https://api.together.xyz/settings/api-keys).
Once you have obtained the key, you can use it like this:
```python
import os
from mem0 import Memory

os.environ['TOGETHER_API_KEY'] = 'xxx'

config = {
    "llm": {
        "provider": "togetherai",
        "config": {
            "model": "mistralai/Mixtral-8x7B-Instruct-v0.1",
            "temperature": 0.2,
            "max_tokens": 1500,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```
## AWS Bedrock
### Setup

- Before using the AWS Bedrock LLM, make sure you have been granted access to the appropriate models in the [Bedrock Console](https://us-east-1.console.aws.amazon.com/bedrock/home?region=us-east-1#/modelaccess).
- You will also need to authenticate the `boto3` client using one of the methods described in the [AWS documentation](https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html#configuring-credentials).
- Export the `AWS_REGION`, `AWS_ACCESS_KEY`, and `AWS_SECRET_ACCESS_KEY` environment variables.
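Bedrock models are addressed by ARN, and in an ARN the region is the fourth colon-separated field (`arn:partition:service:region:account-id:resource`), so it should match the `AWS_REGION` you export. A quick consistency check (the `arn_region` helper is illustrative, not part of Mem0):

```python
def arn_region(model_arn):
    # ARN layout: arn:partition:service:region:account-id:resource
    return model_arn.split(":")[3]

model = "arn:aws:bedrock:us-east-1:123456789012:model/your-model-name"
region = arn_region(model)  # "us-east-1"; should equal AWS_REGION
```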
```python
import os
from mem0 import Memory

os.environ['AWS_REGION'] = 'us-east-1'
os.environ["AWS_ACCESS_KEY"] = "xx"
os.environ["AWS_SECRET_ACCESS_KEY"] = "xx"

config = {
    "llm": {
        "provider": "aws_bedrock",
        "config": {
            "model": "arn:aws:bedrock:us-east-1:123456789012:model/your-model-name",
            "temperature": 0.2,
            "max_tokens": 1500,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```