[Improvements] Add support for creating app from YAML string config (#980)
---
title: ❓ FAQs
description: 'Collections of all the frequently asked questions'
---

<AccordionGroup>
<Accordion title="Does Embedchain support OpenAI's Assistant APIs?">
Yes, it does. Please refer to the [OpenAI Assistant docs page](/get-started/openai-assistant).
</Accordion>
<Accordion title="How to use MistralAI language model?">
Use the model hosted on Hugging Face: `mistralai/Mistral-7B-v0.1`
<CodeGroup>

```python main.py
import os
from embedchain import Pipeline as App

os.environ["OPENAI_API_KEY"] = "sk-xxx"
os.environ["HUGGINGFACE_ACCESS_TOKEN"] = "hf_your_token"

app = App.from_config("huggingface.yaml")
```
```yaml huggingface.yaml
llm:
  provider: huggingface
  config:
    model: 'mistralai/Mistral-7B-v0.1'
    temperature: 0.5
    max_tokens: 1000
    top_p: 0.5
    stream: false
```
</CodeGroup>
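If it helps to see what the YAML above deserializes to, here is the same configuration written out as the Python mapping a YAML parser would produce, plus a small hypothetical `looks_usable` helper (not part of embedchain) for a structural sanity check:

```python
# The huggingface.yaml above, as the nested Python mapping a YAML parser
# would produce. Note YAML's `false` becomes Python's False.
huggingface_config = {
    "llm": {
        "provider": "huggingface",
        "config": {
            "model": "mistralai/Mistral-7B-v0.1",
            "temperature": 0.5,
            "max_tokens": 1000,
            "top_p": 0.5,
            "stream": False,
        },
    }
}

# Hypothetical helper (not an embedchain API): check that the two fields
# every config needs -- llm.provider and llm.config.model -- are present.
def looks_usable(cfg: dict) -> bool:
    llm = cfg.get("llm", {})
    return bool(llm.get("provider")) and bool(llm.get("config", {}).get("model"))

print(looks_usable(huggingface_config))  # -> True
```

A check like this catches a misindented key before the app ever tries to call the model.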
</Accordion>
<Accordion title="How to use ChatGPT 4 turbo model released on OpenAI DevDay?">
Use the `gpt-4-turbo` model provided by OpenAI.
<CodeGroup>

```python main.py
import os
from embedchain import Pipeline as App

os.environ['OPENAI_API_KEY'] = 'xxx'

# load llm configuration from gpt4_turbo.yaml file
app = App.from_config(config_path="gpt4_turbo.yaml")
```
```yaml gpt4_turbo.yaml
llm:
  provider: openai
  config:
    model: 'gpt-4-turbo'
    temperature: 0.5
    max_tokens: 1000
    top_p: 1
    stream: false
```

</CodeGroup>
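A YAML config does not have to start life as a file on disk: a string held in code can be written out and then passed as `config_path`. A stdlib-only sketch of that workflow (hypothetical glue, not an embedchain API; the `from_config` call is commented out since it needs the package installed and a real API key):

```python
import pathlib
import tempfile

# A minimal gpt-4-turbo config held inline as a YAML string.
CONFIG_YAML = """\
llm:
  provider: openai
  config:
    model: 'gpt-4-turbo'
    top_p: 1
    stream: false
"""

# Write the string to a real file so it can be used as a config_path.
cfg_path = pathlib.Path(tempfile.mkdtemp()) / "gpt4_turbo.yaml"
cfg_path.write_text(CONFIG_YAML)

# from embedchain import Pipeline as App
# app = App.from_config(config_path=str(cfg_path))
print(cfg_path.read_text() == CONFIG_YAML)  # -> True
```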
</Accordion>
<Accordion title="How to use GPT-4 as the LLM model?">
<CodeGroup>

```python main.py
import os
from embedchain import Pipeline as App

os.environ['OPENAI_API_KEY'] = 'xxx'

# load llm configuration from gpt4.yaml file
app = App.from_config(config_path="gpt4.yaml")
```
```yaml gpt4.yaml
llm:
  provider: openai
  config:
    model: 'gpt-4'
    temperature: 0.5
    max_tokens: 1000
    top_p: 1
    stream: false
```

</CodeGroup>
</Accordion>
<Accordion title="I don't have OpenAI credits. How can I use some open source model?">
<CodeGroup>

```python main.py
import os
from embedchain import Pipeline as App

os.environ['OPENAI_API_KEY'] = 'xxx'

# load llm configuration from opensource.yaml file
app = App.from_config(config_path="opensource.yaml")
```
```yaml opensource.yaml
llm:
  provider: gpt4all
  config:
    model: 'orca-mini-3b.ggmlv3.q4_0.bin'
    temperature: 0.5
    max_tokens: 1000
    top_p: 1
    stream: false

embedder:
  provider: gpt4all
```
</CodeGroup>
</Accordion>
</AccordionGroup>

#### Need more help?
If the docs aren't sufficient, please feel free to reach out to us using one of the following methods:

<Snippet file="get-help.mdx" />