[Docs] Add docs for Azure OpenAI provider (#804)
@@ -8,6 +8,7 @@ Embedchain supports several embedding models from the following providers:

<CardGroup cols={4}>
<Card title="OpenAI" href="#openai"></Card>
<Card title="Azure OpenAI" href="#azure-openai"></Card>
<Card title="GPT4All" href="#gpt4all"></Card>
<Card title="Hugging Face" href="#hugging-face"></Card>
<Card title="Vertex AI" href="#vertex-ai"></Card>
@@ -43,6 +44,45 @@ embedder:

</CodeGroup>

## Azure OpenAI

To use the Azure OpenAI embedding model, set the Azure OpenAI-related environment variables as shown in the code block below:

<CodeGroup>

```python main.py
import os
from embedchain import App

os.environ["OPENAI_API_TYPE"] = "azure"
os.environ["OPENAI_API_BASE"] = "https://xxx.openai.azure.com/"
os.environ["OPENAI_API_KEY"] = "xxx"
os.environ["OPENAI_API_VERSION"] = "xxx"

app = App.from_config(yaml_path="config.yaml")
```

```yaml config.yaml
llm:
  provider: azure_openai
  model: gpt-35-turbo
  config:
    deployment_name: your_llm_deployment_name
    temperature: 0.5
    max_tokens: 1000
    top_p: 1
    stream: false

embedder:
  provider: azure_openai
  config:
    model: text-embedding-ada-002
    deployment_name: your_embedding_model_deployment_name
```
</CodeGroup>

You can find the list of models and deployment names on the [Azure OpenAI Platform](https://oai.azure.com/portal).

## GPT4All

GPT4All supports generating high-quality embeddings of arbitrary-length text documents using a CPU-optimized, contrastively trained Sentence Transformer.
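
For reference, a minimal embedder config for GPT4All might look like the sketch below; the provider string and model name are assumptions and may differ from the library's defaults.

```yaml config.yaml
embedder:
  provider: gpt4all
  config:
    # assumed default sentence-transformer model used by GPT4All embeddings;
    # adjust to whatever your installed version supports
    model: 'all-MiniLM-L6-v2'
```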
@@ -65,7 +65,42 @@ llm:

## Azure OpenAI

_Coming soon_
To use the Azure OpenAI model, set the Azure OpenAI-related environment variables as shown in the code block below:

<CodeGroup>

```python main.py
import os
from embedchain import App

os.environ["OPENAI_API_TYPE"] = "azure"
os.environ["OPENAI_API_BASE"] = "https://xxx.openai.azure.com/"
os.environ["OPENAI_API_KEY"] = "xxx"
os.environ["OPENAI_API_VERSION"] = "xxx"

app = App.from_config(yaml_path="config.yaml")
```

```yaml config.yaml
llm:
  provider: azure_openai
  model: gpt-35-turbo
  config:
    deployment_name: your_llm_deployment_name
    temperature: 0.5
    max_tokens: 1000
    top_p: 1
    stream: false

embedder:
  provider: azure_openai
  config:
    model: text-embedding-ada-002
    deployment_name: your_embedding_model_deployment_name
```
</CodeGroup>

You can find the list of models and deployment names on the [Azure OpenAI Platform](https://oai.azure.com/portal).

## Anthropic

@@ -119,11 +119,17 @@ Install related dependencies using the following command:
pip install --upgrade 'embedchain[milvus]'
```

Set the Zilliz environment variables `ZILLIZ_CLOUD_URI` and `ZILLIZ_CLOUD_TOKEN`, which you can find on their [cloud platform](https://cloud.zilliz.com/).

<CodeGroup>

```python main.py
import os
from embedchain import App

os.environ['ZILLIZ_CLOUD_URI'] = 'https://xxx.zillizcloud.com'
os.environ['ZILLIZ_CLOUD_TOKEN'] = 'xxx'

# load zilliz configuration from yaml file
app = App.from_config(yaml_path="config.yaml")
```
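
A minimal `config.yaml` for Zilliz might look like the sketch below; the provider string and config key shown are assumptions and should be checked against the vector database configuration reference.

```yaml config.yaml
vectordb:
  provider: zilliz
  config:
    # assumed config key; the URI and token are picked up from the
    # environment variables set in main.py above
    collection_name: 'zilliz_app'
```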
@@ -147,8 +153,16 @@ _Coming soon_

## Pinecone

Install Pinecone-related dependencies using the following command:

```bash
pip install --upgrade 'embedchain[pinecone]'
```

In order to use Pinecone as the vector database, set the environment variables `PINECONE_API_KEY` and `PINECONE_ENV`, which you can find on the [Pinecone dashboard](https://app.pinecone.io/).

<CodeGroup>

```python main.py
from embedchain import App

@@ -165,6 +179,8 @@ vectordb:
collection_name: my-pinecone-index
```

</CodeGroup>

## Qdrant

_Coming soon_