feat: Add private ai example (#1101)
26
examples/private-ai/README.md
Normal file
@@ -0,0 +1,26 @@
# Private AI

In this example, we will create a private AI using embedchain.

Private AI is useful when you want to chat with your data without spending money on API calls, while keeping your data on your own machine.

## How to install

First, create and activate a virtual environment, then install the requirements by running

```bash
pip install -r requirements.txt
```

## How to use

* Open the `privateai.py` file and change the `app.add` line to point to your directory or data source.
* If you want to add any other data type, you can browse the supported data types [here](https://docs.embedchain.ai/components/data-sources/overview); a short sketch follows this list.

* Now simply run the file with

```bash
python privateai.py
```

* You can now type questions about your data at the prompt.
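As a rough sketch of the "other data source" step above, `app.add` can point at sources besides a local folder. The URL and file path below are placeholders, and the `data_type` values are taken from the data sources documentation linked above:

```python
from embedchain import App

app = App.from_config("config.yaml")

# Placeholder sources; swap in your own URL or file path.
app.add("https://example.com/my-page", data_type="web_page")  # index a web page
app.add("/path/to/your/report.pdf", data_type="pdf_file")     # index a local PDF
```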
10
examples/private-ai/config.yaml
Normal file
@@ -0,0 +1,10 @@
llm:
  provider: gpt4all
  config:
    model: 'orca-mini-3b-gguf2-q4_0.gguf'
    max_tokens: 1000
    top_p: 1
embedder:
  provider: huggingface
  config:
    model: 'sentence-transformers/all-MiniLM-L6-v2'
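The first run downloads the GPT4All model named in this config. If you want to sanity-check the local model on its own, here is a minimal sketch, assuming the `gpt4all` package that `embedchain[opensource]` pulls in is available:

```python
from gpt4all import GPT4All

# Downloads the model on first use, then generates fully offline.
model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf")
print(model.generate("Say hello in one short sentence.", max_tokens=32))
```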
15
examples/private-ai/privateai.py
Normal file
@@ -0,0 +1,15 @@
from embedchain import App

app = App.from_config("config.yaml")
app.add("/path/to/your/folder", data_type="directory")

while True:
    user_input = input("Enter your question (type 'exit' to quit): ")

    # Break the loop if the user types 'exit'
    if user_input.lower() == 'exit':
        break

    # Process the input and provide a response
    response = app.chat(user_input)
    print(response)
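For one-off questions you don't need the interactive loop; a minimal sketch using `app.query`, assuming the same `config.yaml` and data source as above:

```python
from embedchain import App

app = App.from_config("config.yaml")
app.add("/path/to/your/folder", data_type="directory")

# app.query answers a single question without keeping chat history,
# whereas app.chat (used in the loop above) is conversational.
print(app.query("What are the main topics covered in these documents?"))
```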
1
examples/private-ai/requirements.txt
Normal file
@@ -0,0 +1 @@
"embedchain[opensource]"