[Mem0] Update docs and improve readability (#1727)
@@ -1,3 +1,7 @@
---
title: AWS Bedrock
---

### Setup

- Before using the AWS Bedrock LLM, make sure you have the appropriate model access from the [Bedrock Console](https://us-east-1.console.aws.amazon.com/bedrock/home?region=us-east-1#/modelaccess).
- You will also need to authenticate the `boto3` client using one of the methods described in the [AWS documentation](https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html#configuring-credentials).
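A minimal sketch of what usage could look like once model access and credentials are in place. The `aws_bedrock` provider key and the Claude model ID below are assumptions for illustration; substitute a model your account actually has access to.

```python
import os
from mem0 import Memory

# Credentials via environment variables (any boto3-supported method also works).
os.environ["AWS_ACCESS_KEY_ID"] = "your-access-key-id"
os.environ["AWS_SECRET_ACCESS_KEY"] = "your-secret-access-key"
os.environ["AWS_DEFAULT_REGION"] = "us-east-1"

config = {
    "llm": {
        "provider": "aws_bedrock",  # assumed provider key
        "config": {
            # Example model ID; use one you have been granted access to
            "model": "anthropic.claude-3-sonnet-20240229-v1:0",
            "temperature": 0.1,
            "max_tokens": 2000,
        },
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice")
```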
@@ -1,3 +1,7 @@
---
title: Azure OpenAI
---

To use Azure OpenAI models, you have to set the `AZURE_OPENAI_API_KEY`, `AZURE_OPENAI_ENDPOINT`, and `OPENAI_API_VERSION` environment variables. You can obtain the Azure OpenAI API key from [Azure](https://azure.microsoft.com/).

## Usage
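A minimal sketch, assuming the provider key is `azure_openai` and that `model` takes the name of your Azure OpenAI deployment; the endpoint, API version, and parameter values are placeholders.

```python
import os
from mem0 import Memory

os.environ["AZURE_OPENAI_API_KEY"] = "your-api-key"
os.environ["AZURE_OPENAI_ENDPOINT"] = "https://your-resource-name.openai.azure.com/"
os.environ["OPENAI_API_VERSION"] = "2024-02-01"

config = {
    "llm": {
        "provider": "azure_openai",  # assumed provider key
        "config": {
            "model": "your-deployment-name",  # name of your Azure OpenAI deployment
            "temperature": 0.1,
            "max_tokens": 2000,
        },
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice")
```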
@@ -1,3 +1,7 @@
---
title: Google AI
---

To use Google AI models, you have to set the `GOOGLE_API_KEY` environment variable. You can obtain the Google API key from [Google MakerSuite](https://makersuite.google.com/app/apikey).

## Usage
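A minimal sketch; the `gemini` provider key and the model name here are assumptions for illustration, so check the supported-LLMs list for the exact key.

```python
import os
from mem0 import Memory

os.environ["GOOGLE_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "gemini",  # assumed provider key for Google AI models
        "config": {
            "model": "gemini-1.5-flash-latest",  # example model name
            "temperature": 0.2,
            "max_tokens": 1500,
        },
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice")
```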
@@ -1,3 +1,7 @@
---
title: Mistral AI
---

To use Mistral AI's models, obtain the Mistral AI API key from their [console](https://console.mistral.ai/) and set the `MISTRAL_API_KEY` environment variable, as shown in the example below.

## Usage
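A minimal sketch; the `mistralai` provider key and the model name are assumptions for illustration, so check the supported-LLMs list for the exact key.

```python
import os
from mem0 import Memory

os.environ["MISTRAL_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "mistralai",  # assumed provider key
        "config": {
            "model": "mistral-small-latest",  # example model name
            "temperature": 0.2,
            "max_tokens": 1500,
        },
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice")
```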
@@ -1,3 +1,7 @@
---
title: OpenAI
---

To use OpenAI LLM models, you have to set the `OPENAI_API_KEY` environment variable. You can obtain the OpenAI API key from the [OpenAI Platform](https://platform.openai.com/account/api-keys).

## Usage
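A minimal usage sketch; the model name and parameter values are illustrative, not required defaults.

```python
import os
from mem0 import Memory

os.environ["OPENAI_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o",  # example model name
            "temperature": 0.2,
            "max_tokens": 1500,
        },
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```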
@@ -11,3 +11,16 @@ To use a llm, you must provide a configuration to customize its usage. If no con
For a comprehensive list of available parameters for LLM configuration, please refer to [Config](./config).
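As a sketch of the shape such a configuration takes (the provider and parameter values here are illustrative; omitting the `llm` block falls back to the default):

```python
from mem0 import Memory

config = {
    "llm": {
        "provider": "openai",  # any provider from the cards below
        "config": {
            "model": "gpt-4o",
            "temperature": 0.2,
            "max_tokens": 1500,
        },
    }
}

m = Memory.from_config(config)
```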
To view all supported LLMs, visit the [Supported LLMs](./models).
<CardGroup cols={4}>
  <Card title="OpenAI" href="/components/llms/models/openai"></Card>
  <Card title="Ollama" href="/components/llms/models/ollama"></Card>
  <Card title="Azure OpenAI" href="/components/llms/models/azure_openai"></Card>
  <Card title="Anthropic" href="/components/llms/models/anthropic"></Card>
  <Card title="Together" href="/components/llms/models/together"></Card>
  <Card title="Groq" href="/components/llms/models/groq"></Card>
  <Card title="Litellm" href="/components/llms/models/litellm"></Card>
  <Card title="Mistral AI" href="/components/llms/models/mistral_ai"></Card>
  <Card title="Google AI" href="/components/llms/models/google_ai"></Card>
  <Card title="AWS Bedrock" href="/components/llms/models/aws_bedrock"></Card>
</CardGroup>