[Groq](https://groq.com/) is the creator of the world's first Language Processing Unit (LPU), providing exceptional speed performance for AI workloads running on their LPU Inference Engine.
To use LLMs from Groq, get an API key from their [platform](https://console.groq.com/keys) and set it as the `GROQ_API_KEY` environment variable, as shown in the example below.
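For instance, the key can be exported in the shell before running your script (a sketch; the key value shown is a placeholder, not a real key):

```shell
# Set the Groq API key for the current shell session
export GROQ_API_KEY="your-api-key"
```

Alternatively, set it from within Python via `os.environ`, as the usage example below does.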
## Usage
```python
import os

from mem0 import Memory

os.environ["OPENAI_API_KEY"] = "your-api-key"  # used for the embedding model
os.environ["GROQ_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "groq",
        "config": {
            "model": "mixtral-8x7b-32768",
            "temperature": 0.1,
            "max_tokens": 1000,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```
## Config
All available parameters for the `groq` config are listed in the [Master List of All Params in Config](../config).