Update Docs (#2277)

This commit is contained in:
Saket Aryan
2025-03-01 06:07:05 +05:30
committed by GitHub
parent c1aba35884
commit 5606c3ffb8
30 changed files with 437 additions and 877 deletions


@@ -1,8 +1,13 @@
---
title: Anthropic
---
To use Anthropic's models, set the `ANTHROPIC_API_KEY` environment variable, which you can find on their [Account Settings Page](https://console.anthropic.com/account/keys).
## Usage
<CodeGroup>
```python Python
import os
from mem0 import Memory
@@ -24,6 +29,26 @@ m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```
```typescript TypeScript
import { Memory } from 'mem0ai/oss';
const config = {
llm: {
provider: 'anthropic',
config: {
apiKey: process.env.ANTHROPIC_API_KEY || '',
model: 'claude-3-7-sonnet-latest',
temperature: 0.1,
maxTokens: 2000,
},
},
};
const memory = new Memory(config);
await memory.add("Likes to play cricket on weekends", { userId: "alice", metadata: { category: "hobbies" } });
```
</CodeGroup>
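The Python tab's config dict is truncated by the hunk above; a minimal sketch of its shape, mirroring the TypeScript config (mem0's Python config uses snake_case keys such as `max_tokens`; treat the exact values as illustrative):

```python
# Illustrative Python config mirroring the TypeScript example above.
# mem0 reads the key from the ANTHROPIC_API_KEY environment variable,
# so it does not need to appear in the dict itself.
config = {
    "llm": {
        "provider": "anthropic",
        "config": {
            "model": "claude-3-7-sonnet-latest",
            "temperature": 0.1,
            "max_tokens": 2000,
        },
    },
}
```

`Memory.from_config(config)` then initializes memory exactly as in the Python tab.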
## Config
All available parameters for the `anthropic` config are listed in the [Master List of All Params in Config](../config).


@@ -1,10 +1,15 @@
---
title: Groq
---
[Groq](https://groq.com/) is the creator of the world's first Language Processing Unit (LPU), which delivers exceptional speed for AI workloads running on its LPU Inference Engine.
To use LLMs from Groq, get an API key from their [platform](https://console.groq.com/keys) and set it as the `GROQ_API_KEY` environment variable, as shown in the example below.
## Usage
<CodeGroup>
```python Python
import os
from mem0 import Memory
@@ -26,6 +31,26 @@ m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```
```typescript TypeScript
import { Memory } from 'mem0ai/oss';
const config = {
llm: {
provider: 'groq',
config: {
apiKey: process.env.GROQ_API_KEY || '',
model: 'mixtral-8x7b-32768',
temperature: 0.1,
maxTokens: 1000,
},
},
};
const memory = new Memory(config);
await memory.add("Likes to play cricket on weekends", { userId: "alice", metadata: { category: "hobbies" } });
```
</CodeGroup>
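The TypeScript snippet falls back to an empty string when `GROQ_API_KEY` is unset; in Python the same check can be done up front. A minimal illustration (the helper name is hypothetical, not part of the mem0 API):

```python
import os

def require_api_key(name: str) -> str:
    """Return the named API key from the environment, mirroring the
    TypeScript `process.env[name] || ''` fallback but failing early
    with a clear message when the key is missing."""
    key = os.environ.get(name, "")
    if not key:
        raise RuntimeError(f"Set the {name} environment variable before configuring mem0.")
    return key
```

Failing at startup is usually preferable to a confusing authentication error on the first `m.add(...)` call.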
## Config
All available parameters for the `groq` config are listed in the [Master List of All Params in Config](../config).


@@ -6,7 +6,8 @@ To use OpenAI LLM models, you have to set the `OPENAI_API_KEY` environment varia
## Usage
<CodeGroup>
```python Python
import os
from mem0 import Memory
@@ -38,6 +39,26 @@ m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```
```typescript TypeScript
import { Memory } from 'mem0ai/oss';
const config = {
llm: {
provider: 'openai',
config: {
apiKey: process.env.OPENAI_API_KEY || '',
model: 'gpt-4-turbo-preview',
temperature: 0.2,
maxTokens: 1500,
},
},
};
const memory = new Memory(config);
await memory.add("Likes to play cricket on weekends", { userId: "alice", metadata: { category: "hobbies" } });
```
</CodeGroup>
We also support the new [OpenAI structured-outputs](https://platform.openai.com/docs/guides/structured-outputs/introduction) models.
```python
@@ -59,7 +80,9 @@ config = {
m = Memory.from_config(config)
```
<Note>
OpenAI structured-outputs is currently only available in the Python implementation.
</Note>
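The structured-outputs config truncated in the hunk above follows the same shape as the plain `openai` config; a sketch under stated assumptions (the `openai_structured` provider name and the model string are assumptions based on mem0's Python provider list and OpenAI's structured-outputs docs, not confirmed by this diff):

```python
# Hypothetical sketch: swap the provider to OpenAI's structured-outputs
# variant; the rest of the config keeps the same nested shape.
config = {
    "llm": {
        "provider": "openai_structured",  # assumed provider name
        "config": {
            "model": "gpt-4o-2024-08-06",  # a model that supports structured outputs
            "temperature": 0.0,
        },
    },
}
```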
## Config