Rename embedchain to mem0 and open sourcing code for long term memory (#1474)

Co-authored-by: Deshraj Yadav <deshrajdry@gmail.com>
Author: Taranjeet Singh
Date: 2024-07-12 07:51:33 -07:00
Committed by: GitHub
Parent: 83e8c97295
Commit: f842a92e25
665 changed files with 9427 additions and 6592 deletions


@@ -0,0 +1,26 @@
# Private AI
In this example, we will create a private AI using embedchain.
A private AI is useful when you want to chat with your own data without paying for hosted APIs and while keeping the data on your machine.
## How to install
First create a virtual environment and install the requirements by running
```bash
pip install -r requirements.txt
```
## How to use
* Now open the `privateai.py` file and change the `app.add` line to point to your directory or data source.
* If you want to add any other data type, you can browse the supported data types [here](https://docs.embedchain.ai/components/data-sources/overview); a short sketch of adding other sources follows this list.
* Now simply run the file with
```bash
python privateai.py
```
* You can now enter questions and get answers from your data.
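
The `app.add` call is not limited to local folders. Below is a minimal sketch of pointing it at other sources; the URL, file path, and exact `data_type` names are illustrative, so check the data-sources docs linked above for what your embedchain version supports:

```python
from embedchain import App

app = App.from_config("config.yaml")

# Local folder, as used in privateai.py.
app.add("/path/to/your/folder", data_type="directory")

# Hypothetical examples of other sources; swap in your own URLs and paths.
# app.add("https://example.com/blog-post", data_type="web_page")
# app.add("/path/to/report.pdf", data_type="pdf_file")
```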


@@ -0,0 +1,10 @@
llm:
  provider: gpt4all
  config:
    model: 'orca-mini-3b-gguf2-q4_0.gguf'
    max_tokens: 1000
    top_p: 1
embedder:
  provider: huggingface
  config:
    model: 'sentence-transformers/all-MiniLM-L6-v2'


@@ -0,0 +1,15 @@
from embedchain import App

# Load the local LLM and embedder settings from config.yaml.
app = App.from_config("config.yaml")

# Point this at your own directory or data source.
app.add("/path/to/your/folder", data_type="directory")

while True:
    user_input = input("Enter your question (type 'exit' to quit): ")

    # Break the loop if the user types 'exit'
    if user_input.lower() == "exit":
        break

    # Process the input and provide a response
    response = app.chat(user_input)
    print(response)


@@ -0,0 +1 @@
"embedchain[opensource]"