---
title: '⚡ Quickstart'
description: '💡 Create a RAG app on your own data in a minute'
---

## Installation

First, install the Python package:

```bash
pip install embedchain
```
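
If you want to confirm the install worked, you can print the installed version. This is a minimal sketch using only the Python standard library; it assumes Python 3.8+ and that the package is installed under the PyPI name `embedchain`.

```python
from importlib.metadata import version

# prints the installed embedchain version to confirm the install worked
print(version("embedchain"))
```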

Once you have installed the package, you can use either of the following, depending on your preference:

<CardGroup cols={2}>
  <Card title="Open Source Models" icon="osi" href="#open-source-models">
    This includes open-source LLMs like Mistral, Llama, etc.<br/>
    They are free to use and run locally on your machine.
  </Card>
  <Card title="Paid Models" icon="dollar-sign" href="#paid-models" color="#4A154B">
    This includes paid LLMs like GPT-4, Claude, etc.<br/>
    They cost money and are accessible via an API.
  </Card>
</CardGroup>

## Open Source Models

This section gives a quickstart example that uses Mistral as the open-source LLM and Sentence Transformers as the open-source embedding model. These models are free and run mostly on your local machine.

We are using Mistral hosted on Hugging Face, so you will need a Hugging Face token to run this example. It's *free*, and you can create one [here](https://huggingface.co/docs/hub/security-tokens).

<CodeGroup>

```python quickstart.py
import os

# replace this with your HF key
os.environ["HUGGINGFACE_ACCESS_TOKEN"] = "hf_xxxx"

from embedchain import App

app = App.from_config("mistral.yaml")
app.add("https://www.forbes.com/profile/elon-musk")
app.add("https://en.wikipedia.org/wiki/Elon_Musk")
app.query("What is the net worth of Elon Musk today?")
# Answer: The net worth of Elon Musk today is $258.7 billion.
```

```yaml mistral.yaml
llm:
  provider: huggingface
  config:
    model: 'mistralai/Mistral-7B-v0.1'
embedder:
  provider: huggingface
  config:
    model: 'sentence-transformers/all-mpnet-base-v2'
```

</CodeGroup>
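
Beyond one-off `query` calls, Embedchain apps also support multi-turn conversation. The snippet below is a minimal sketch that reuses the `app` object from `quickstart.py` above and assumes your installed version exposes `App.chat`, which keeps conversation history across calls; check the API reference if your version differs.

```python
# first question is answered from the sources added above
app.chat("How many companies does Elon Musk run?")

# follow-up question: chat() keeps the history, so "them" refers to the previous answer
app.chat("Which of them is the most valuable?")
```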

## Paid Models

In this section, we will use both the LLM and the embedding model from OpenAI.

```python quickstart.py
import os

# replace this with your OpenAI key
os.environ["OPENAI_API_KEY"] = "sk-xxxx"

from embedchain import App

app = App()
app.add("https://www.forbes.com/profile/elon-musk")
app.add("https://en.wikipedia.org/wiki/Elon_Musk")
app.query("What is the net worth of Elon Musk today?")
# Answer: The net worth of Elon Musk today is $258.7 billion.
```
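
By default, `App()` uses OpenAI for both the LLM and the embedder. If you want to pin a specific model, you can drive the paid setup from a config file, just like the open-source example above. The sketch below is illustrative only: `openai.yaml` is a hypothetical file name, and the `provider`/`model` values are assumptions you should check against the configuration reference.

```python
from embedchain import App

# hypothetical openai.yaml, mirroring mistral.yaml above:
#   llm:
#     provider: openai
#     config:
#       model: 'gpt-4'
#   embedder:
#     provider: openai
app = App.from_config("openai.yaml")
```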

## Next Steps

Now that you have created your first app, you can follow any of these links:

* [Introduction](/get-started/introduction)
* [Customization](/components/introduction)
* [Use cases](/use-cases/introduction)
* [Deployment](/get-started/deployment)