Adding proxy server settings to Azure OpenAI (#1753)

Pranav Puranik
2024-08-29 04:48:50 -05:00
committed by GitHub
parent deeb4f2250
commit fee3c27af3
7 changed files with 46 additions and 6 deletions


@@ -48,6 +48,7 @@ Here's a comprehensive list of all parameters that can be used across different
| `model` | Embedding model to use |
| `api_key` | API key of the provider |
| `embedding_dims` | Dimensions of the embedding model |
| `http_client_proxies` | Allow proxy server settings |
| `ollama_base_url` | Base URL for the Ollama embedding model |
| `model_kwargs` | Key-Value arguments for the Huggingface embedding model |

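For context, the new `http_client_proxies` entry lets embedding requests be routed through a proxy server. Below is a minimal sketch of how it could be set in a mem0 config, assuming the documented `Memory.from_config` layout and the Azure OpenAI embedder provider; the model name, API key, and proxy URL are placeholders, not values from this commit:

```python
from mem0 import Memory

# Sketch only: provider/config keys follow mem0's documented config layout;
# the deployment name, API key, and proxy URL are placeholders.
config = {
    "embedder": {
        "provider": "azure_openai",
        "config": {
            "model": "text-embedding-3-small",
            "api_key": "your-azure-openai-key",
            # New in this commit: proxy settings handed to the HTTP client.
            # A single proxy URL string is the simplest form.
            "http_client_proxies": "http://proxy.example.com:8080",
        },
    },
}

memory = Memory.from_config(config)
```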

@@ -53,6 +53,7 @@ Here's the table based on the provided parameters:
| `max_tokens` | Tokens to generate | All |
| `top_p` | Probability threshold for nucleus sampling | All |
| `top_k` | Number of highest probability tokens to keep | All |
| `http_client_proxies` | Allow proxy server settings | AzureOpenAI |
| `models` | List of models | Openrouter |
| `route` | Routing strategy | Openrouter |
| `openrouter_base_url` | Base URL for Openrouter API | Openrouter |
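
On the LLM side the table lists `http_client_proxies` for AzureOpenAI only. A hedged sketch of a config that routes chat completions through a proxy follows; the proxy value is shown as a per-scheme mapping in the style accepted by httpx clients, a single URL string may also work depending on the underlying client, and every other value is a placeholder:

```python
from mem0 import Memory

# Sketch: assumes the AzureOpenAI LLM provider accepts http_client_proxies
# in its config dict, per the table above. All values are placeholders.
config = {
    "llm": {
        "provider": "azure_openai",
        "config": {
            "model": "gpt-4o",
            "max_tokens": 1000,
            "top_p": 1.0,
            # Per-scheme proxy mapping (httpx-style).
            "http_client_proxies": {
                "http://": "http://proxy.example.com:8080",
                "https://": "http://proxy.example.com:8080",
            },
        },
    },
}

memory = Memory.from_config(config)
memory.add("User prefers dark mode.", user_id="alice")
```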