[Mem0] Update docs and improve readability (#1727)
@@ -1,3 +1,7 @@
---
title: Azure OpenAI
---

To use Azure OpenAI embedding models, set the `AZURE_OPENAI_API_KEY` environment variable. You can obtain the Azure OpenAI API key from the Azure portal.

### Usage
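A minimal sketch of what this might look like, assuming the embedder provider key is `azure_openai`; the model name below is a placeholder for your own Azure OpenAI embedding deployment, and additional Azure variables (endpoint, API version) may also be required depending on your setup:

```python
import os
from mem0 import Memory

os.environ["AZURE_OPENAI_API_KEY"] = "your-api-key"

# Provider key and model name are assumptions; replace the model with the
# name of your own Azure OpenAI embedding deployment.
config = {
    "embedder": {
        "provider": "azure_openai",
        "config": {
            "model": "text-embedding-3-small"
        }
    }
}

m = Memory.from_config(config)
m.add("I like to drink coffee in the morning", user_id="alice")
```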
@@ -1,3 +1,7 @@
---
title: Hugging Face
---

You can use embedding models from Hugging Face to run Mem0 locally.

### Usage
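A rough sketch, assuming the embedder provider key is `huggingface`; the model name is illustrative and any sentence-transformers model from the Hugging Face Hub should work in its place:

```python
from mem0 import Memory

# Provider key and model name are illustrative; swap in any
# sentence-transformers model available on the Hugging Face Hub.
config = {
    "embedder": {
        "provider": "huggingface",
        "config": {
            "model": "multi-qa-MiniLM-L6-cos-v1"
        }
    }
}

m = Memory.from_config(config)
m.add("I like to drink coffee in the morning", user_id="alice")
```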
@@ -1,3 +1,7 @@
---
title: OpenAI
---

To use OpenAI embedding models, set the `OPENAI_API_KEY` environment variable. You can obtain the OpenAI API key from the [OpenAI Platform](https://platform.openai.com/account/api-keys).

### Usage
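A minimal sketch of the configuration; `text-embedding-3-small` is used here only as an example model name:

```python
import os
from mem0 import Memory

os.environ["OPENAI_API_KEY"] = "your-api-key"

# "text-embedding-3-small" is an example model name.
config = {
    "embedder": {
        "provider": "openai",
        "config": {
            "model": "text-embedding-3-small"
        }
    }
}

m = Memory.from_config(config)
m.add("I like to drink coffee in the morning", user_id="alice")
```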
@@ -4,12 +4,19 @@ title: Overview
Mem0 offers support for various embedding models, allowing users to choose the one that best suits their needs.

## Supported Embedders

See the list of supported embedders below.

<CardGroup cols={4}>
<Card title="OpenAI" href="/components/embedders/models/openai"></Card>
<Card title="Azure OpenAI" href="/components/embedders/models/azure_openai"></Card>
<Card title="Ollama" href="/components/embedders/models/ollama"></Card>
<Card title="Hugging Face" href="/components/embedders/models/huggingface"></Card>
</CardGroup>

## Usage

To utilize an embedder, you must provide a configuration to customize its usage. If no configuration is supplied, a default configuration will be applied, and `OpenAI` will be used as the embedder.

For a comprehensive list of available parameters for embedder configuration, please refer to [Config](./config).

To view all supported embedders, visit [Supported Embedders](./models).
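As a rough sketch of how an embedder is selected, a custom embedder is chosen by passing an `embedder` block in the config; the provider key and model name below are illustrative:

```python
from mem0 import Memory

# Omitting the "embedder" block entirely falls back to the OpenAI default.
config = {
    "embedder": {
        "provider": "openai",  # any supported provider key goes here
        "config": {
            "model": "text-embedding-3-small"
        }
    }
}

m = Memory.from_config(config)
```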
@@ -1,3 +1,7 @@
---
title: AWS Bedrock
---

### Setup
- Before using the AWS Bedrock LLM, make sure you have the appropriate model access from the [Bedrock Console](https://us-east-1.console.aws.amazon.com/bedrock/home?region=us-east-1#/modelaccess).
- You will also need to authenticate the `boto3` client using one of the methods described in the [AWS documentation](https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html#configuring-credentials).
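A hedged sketch of how this could be wired up: the credentials below use the standard `boto3` environment variables (one of several supported authentication methods), while the Mem0 provider key `aws_bedrock` and the model ID are assumptions. Use a model you have been granted access to in the Bedrock console.

```python
import os
from mem0 import Memory

# Standard boto3 credential environment variables (see the AWS docs above
# for the other supported ways to authenticate).
os.environ["AWS_ACCESS_KEY_ID"] = "your-access-key-id"
os.environ["AWS_SECRET_ACCESS_KEY"] = "your-secret-access-key"
os.environ["AWS_REGION"] = "us-east-1"

# Provider key and model ID are assumptions; replace with a Bedrock model
# you have access to.
config = {
    "llm": {
        "provider": "aws_bedrock",
        "config": {
            "model": "anthropic.claude-3-sonnet-20240229-v1:0"
        }
    }
}

m = Memory.from_config(config)
m.add("I like to drink coffee in the morning", user_id="alice")
```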
@@ -1,3 +1,7 @@
---
title: Azure OpenAI
---

To use Azure OpenAI models, you have to set the `AZURE_OPENAI_API_KEY`, `AZURE_OPENAI_ENDPOINT`, and `OPENAI_API_VERSION` environment variables. You can obtain the Azure API key from the [Azure portal](https://azure.microsoft.com/).

## Usage
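A minimal sketch, assuming the provider key `azure_openai`; the endpoint, API version, and model name are placeholders for your own Azure OpenAI deployment:

```python
import os
from mem0 import Memory

os.environ["AZURE_OPENAI_API_KEY"] = "your-api-key"
os.environ["AZURE_OPENAI_ENDPOINT"] = "https://your-resource.openai.azure.com/"
os.environ["OPENAI_API_VERSION"] = "2024-02-01"

# Provider key is assumed; the model should match the name of your
# Azure OpenAI chat deployment.
config = {
    "llm": {
        "provider": "azure_openai",
        "config": {
            "model": "gpt-4o",
            "temperature": 0.1
        }
    }
}

m = Memory.from_config(config)
m.add("I like to drink coffee in the morning", user_id="alice")
```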
@@ -1,3 +1,7 @@
---
title: Google AI
---

To use Google AI models, you have to set the `GOOGLE_API_KEY` environment variable. You can obtain the Google API key from the [Google Maker Suite](https://makersuite.google.com/app/apikey).

## Usage
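A rough sketch only: the provider key below is assumed to follow the page slug (`google_ai`) and the model name is illustrative; check the [Config](../config) reference for the exact values.

```python
import os
from mem0 import Memory

os.environ["GOOGLE_API_KEY"] = "your-api-key"

# Provider key and model name are assumptions; verify them against the
# Config reference before use.
config = {
    "llm": {
        "provider": "google_ai",
        "config": {
            "model": "gemini-1.5-flash",
            "temperature": 0.1
        }
    }
}

m = Memory.from_config(config)
m.add("I like to drink coffee in the morning", user_id="alice")
```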
@@ -1,3 +1,7 @@
---
title: Mistral AI
---

To use Mistral AI's models, obtain the Mistral AI API key from their [console](https://console.mistral.ai/). Set the `MISTRAL_API_KEY` environment variable to use the model, as shown in the example below.

## Usage
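A rough sketch only: the provider key below is assumed to follow the page slug (`mistral_ai`) and the model name is illustrative; check the [Config](../config) reference for the exact values.

```python
import os
from mem0 import Memory

os.environ["MISTRAL_API_KEY"] = "your-api-key"

# Provider key and model name are assumptions; verify them against the
# Config reference before use.
config = {
    "llm": {
        "provider": "mistral_ai",
        "config": {
            "model": "mistral-large-latest",
            "temperature": 0.1
        }
    }
}

m = Memory.from_config(config)
m.add("I like to drink coffee in the morning", user_id="alice")
```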
@@ -1,3 +1,7 @@
---
title: OpenAI
---

To use OpenAI LLM models, you have to set the `OPENAI_API_KEY` environment variable. You can obtain the OpenAI API key from the [OpenAI Platform](https://platform.openai.com/account/api-keys).

## Usage
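A minimal sketch; the model name and parameter values are illustrative:

```python
import os
from mem0 import Memory

os.environ["OPENAI_API_KEY"] = "your-api-key"

# "gpt-4o" and the temperature value are example settings.
config = {
    "llm": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o",
            "temperature": 0.1
        }
    }
}

m = Memory.from_config(config)
m.add("I like to drink coffee in the morning", user_id="alice")
```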
@@ -11,3 +11,16 @@ To use a llm, you must provide a configuration to customize its usage. If no con
For a comprehensive list of available parameters for LLM configuration, please refer to [Config](./config).

To view all supported LLMs, visit the [Supported LLMs](./models) page.

<CardGroup cols={4}>
<Card title="OpenAI" href="/components/llms/models/openai"></Card>
<Card title="Ollama" href="/components/llms/models/ollama"></Card>
<Card title="Azure OpenAI" href="/components/llms/models/azure_openai"></Card>
<Card title="Anthropic" href="/components/llms/models/anthropic"></Card>
<Card title="Together" href="/components/llms/models/together"></Card>
<Card title="Groq" href="/components/llms/models/groq"></Card>
<Card title="Litellm" href="/components/llms/models/litellm"></Card>
<Card title="Mistral AI" href="/components/llms/models/mistral_ai"></Card>
<Card title="Google AI" href="/components/llms/models/google_ai"></Card>
<Card title="AWS Bedrock" href="/components/llms/models/aws_bedrock"></Card>
</CardGroup>
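As a rough sketch of how an LLM is selected, an `llm` block in the config chooses the provider and model; the provider key and parameter values below are illustrative:

```python
from mem0 import Memory

# Without an "llm" block, Mem0 falls back to its default LLM configuration.
config = {
    "llm": {
        "provider": "openai",  # any supported provider key goes here
        "config": {
            "model": "gpt-4o",
            "temperature": 0.2
        }
    }
}

m = Memory.from_config(config)
m.add("I like to drink coffee in the morning", user_id="alice")
```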
@@ -4,14 +4,22 @@ title: Overview
Mem0 includes built-in support for various popular databases. Memory can utilize the database provided by the user, ensuring efficient use for specific needs.

## Supported Vector Databases

See the list of supported vector databases below.

<CardGroup cols={3}>
<Card title="Qdrant" href="/components/vectordbs/dbs/qdrant"></Card>
<Card title="Chroma" href="/components/vectordbs/dbs/chroma"></Card>
<Card title="Pgvector" href="/components/vectordbs/dbs/pgvector"></Card>
</CardGroup>

## Usage

To utilize a vector database, you must provide a configuration to customize its usage. If no configuration is supplied, a default configuration will be applied, and `Qdrant` will be used as the vector database.

For a comprehensive list of available parameters for vector database configuration, please refer to [Config](./config).

To view all supported vector databases, visit the [Supported Vector Databases](./dbs) page.
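A rough sketch of selecting a vector database via a `vector_store` block; the provider key follows the Qdrant page slug and the connection settings are illustrative, so check the [Config](./config) reference for the exact parameter names:

```python
from mem0 import Memory

# Without a "vector_store" block, Qdrant is used with default settings.
config = {
    "vector_store": {
        "provider": "qdrant",  # provider key assumed from the page slug
        "config": {
            "collection_name": "mem0",  # illustrative values
            "host": "localhost",
            "port": 6333
        }
    }
}

m = Memory.from_config(config)
m.add("I like to drink coffee in the morning", user_id="alice")
```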
## Common issues

### Using a model with different dimensions
@@ -22,3 +30,4 @@ for example 768, you may encounter below error:
`ValueError: shapes (0,1536) and (768,) not aligned: 1536 (dim 1) != 768 (dim 0)`

You can add `"embedding_model_dims": 768` to the `vector_store` config to overcome this issue.
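For example, with a 768-dimensional embedding model the vector store config might look like the sketch below (the provider key and collection name are illustrative):

```python
config = {
    "vector_store": {
        "provider": "qdrant",
        "config": {
            "collection_name": "mem0",
            "embedding_model_dims": 768  # match your embedding model's output size
        }
    }
}
```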