---
title: Overview
---
Mem0 includes built-in support for a variety of popular large language models (LLMs). Memory operations can use the LLM you provide, so they run on the model that best fits your needs.
## Usage
To use an LLM, provide a configuration that customizes its behavior. If no configuration is supplied, a default configuration is applied and `OpenAI` is used as the LLM.
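As a sketch, a configuration is a nested dictionary with an `llm` section naming a provider and its settings; the model name and parameter values below are illustrative, not defaults.

```python
# Hypothetical LLM configuration for Mem0 (illustrative values).
config = {
    "llm": {
        "provider": "openai",          # which LLM backend to use
        "config": {
            "model": "gpt-4o",         # example model name
            "temperature": 0.1,        # sampling temperature
            "max_tokens": 2000,        # response length cap
        },
    }
}
```

A dictionary like this can then be passed when constructing the memory instance (e.g. via `Memory.from_config(config)`); omitting it falls back to the OpenAI default described above.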
For a comprehensive list of available LLM configuration parameters, refer to [Config](./config).
To view all supported LLMs, visit [Supported LLMs](./models).
<CardGroup cols={4}>
  <Card title="OpenAI" href="/components/llms/models/openai"></Card>
  <Card title="Ollama" href="/components/llms/models/ollama"></Card>
  <Card title="Azure OpenAI" href="/components/llms/models/azure_openai"></Card>
  <Card title="Anthropic" href="/components/llms/models/anthropic"></Card>
  <Card title="Together" href="/components/llms/models/together"></Card>
  <Card title="Groq" href="/components/llms/models/groq"></Card>
  <Card title="LiteLLM" href="/components/llms/models/litellm"></Card>
  <Card title="Mistral AI" href="/components/llms/models/mistral_ai"></Card>
  <Card title="Google AI" href="/components/llms/models/google_ai"></Card>
  <Card title="AWS Bedrock" href="/components/llms/models/aws_bedrock"></Card>
</CardGroup>