Add Grok Support (#2260)

This commit is contained in:
Dev Khant
2025-02-26 13:34:01 +05:30
committed by GitHub
parent a236aa2315
commit e9bc4cdc95
8 changed files with 95 additions and 4 deletions

@@ -75,7 +75,7 @@ Here's the table based on the provided parameters:
| `openai_base_url` | Base URL for OpenAI API | OpenAI |
| `azure_kwargs` | Azure LLM args for initialization | AzureOpenAI |
| `deepseek_base_url` | Base URL for DeepSeek API | DeepSeek |
| `xai_base_url` | Base URL for XAI API | XAI |
## Supported LLMs

@@ -0,0 +1,31 @@
[xAI](https://x.ai/) is an AI company founded by Elon Musk that develops large language models, including Grok. Grok is trained on real-time data from X (formerly Twitter) and aims to provide accurate, up-to-date responses with a touch of wit and humor.

To use LLMs from xAI, go to their [platform](https://console.x.ai) and get an API key. Set it as the `XAI_API_KEY` environment variable to use the model as shown in the example below.
## Usage
```python
import os
from mem0 import Memory

os.environ["OPENAI_API_KEY"] = "your-api-key"  # used for the embedding model
os.environ["XAI_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "xai",
        "config": {
            "model": "grok-2-latest",
            "temperature": 0.1,
            "max_tokens": 1000,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```
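Once memories are stored, they can be retrieved with the same `Memory` instance. A minimal sketch, assuming the instance `m` from the example above and `mem0`'s standard `search` and `get_all` methods (the query string and its wording here are illustrative):

```python
# Semantic search over alice's stored memories
related = m.search(query="What does alice like to do?", user_id="alice")

# Or list everything stored for this user
all_memories = m.get_all(user_id="alice")
```

Both calls require the same environment variables as the example above, since retrieval uses the configured LLM and embedding model.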
## Config
All available parameters for the `xai` config are present in [Master List of All Params in Config](../config).
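For instance, the `xai_base_url` parameter from the master list can presumably be set to route requests through a proxy or self-hosted gateway. A hedged sketch of the config shape, assuming `xai_base_url` sits alongside the other keys under `llm.config` (the URL shown is only an illustration of the default hosted endpoint):

```python
config = {
    "llm": {
        "provider": "xai",
        "config": {
            "model": "grok-2-latest",
            # Illustrative override; omit to use xAI's default endpoint
            "xai_base_url": "https://api.x.ai/v1",
        }
    }
}
```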

@@ -27,6 +27,7 @@ To view all supported llms, visit the [Supported LLMs](./models).
<Card title="AWS bedrock" href="/components/llms/models/aws_bedrock"></Card>
<Card title="Gemini" href="/components/llms/models/gemini"></Card>
<Card title="DeepSeek" href="/components/llms/models/deepseek"></Card>
<Card title="XAI" href="/components/llms/models/xai"></Card>
</CardGroup>
## Structured vs Unstructured Outputs