t6_mem0/configs/ollama.yaml
llm:
  provider: ollama
  config:
    model: 'llama2'
    temperature: 0.5
    top_p: 1
    stream: true
    base_url: http://localhost:11434
embedder:
  provider: huggingface
  config:
    model: 'BAAI/bge-small-en-v1.5'
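
A minimal sketch of how this config might be used, assuming the mem0ai and pyyaml packages are installed and a local Ollama server has the llama2 model pulled; the add/search calls below are illustrative usage, not part of this repo:

    # Load the YAML above and build a mem0 Memory instance from it.
    # Assumes: `pip install mem0ai pyyaml`, Ollama running on localhost:11434.
    import yaml
    from mem0 import Memory

    with open("t6_mem0/configs/ollama.yaml") as f:
        config = yaml.safe_load(f)

    # mem0 accepts a provider/config dict like the one in this file.
    memory = Memory.from_config(config)

    # Example usage (hypothetical data): store a fact, then query it back.
    memory.add("I prefer dark roast coffee", user_id="alice")
    print(memory.search("coffee preference", user_id="alice"))

With this setup the LLM calls go to the local Ollama endpoint given by base_url, while embeddings are computed locally with the BAAI/bge-small-en-v1.5 model via the huggingface embedder.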