t6_mem0/configs/llama2.yaml
2023-10-13 15:38:15 -07:00


```yaml
llm:
  provider: llama2
  model: 'a16z-infra/llama13b-v2-chat:df7690f1994d94e96ad9d568eac121aecf50684a0b0963b25a41cc40061269e5'
  config:
    temperature: 0.5
    max_tokens: 1000
    top_p: 0.5
    stream: false
```
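
The `model` value looks like a Replicate model slug (`owner/name:version-hash`), which is how Llama 2 is typically addressed when served through Replicate. As a minimal sketch of how such a config is consumed, the snippet below parses the file with PyYAML and reads out the nested `llm` settings; the field names come from the config itself, and the loader shown here is illustrative, not the library's own code:

```python
import yaml

# The llama2.yaml contents shown above, inlined as a string for a
# self-contained example (in practice you would open() the file).
CONFIG = """
llm:
  provider: llama2
  model: 'a16z-infra/llama13b-v2-chat:df7690f1994d94e96ad9d568eac121aecf50684a0b0963b25a41cc40061269e5'
  config:
    temperature: 0.5
    max_tokens: 1000
    top_p: 0.5
    stream: false
"""

cfg = yaml.safe_load(CONFIG)
llm = cfg["llm"]

# Nested keys map directly to provider selection and sampling parameters.
print(llm["provider"])              # llama2
print(llm["config"]["temperature"]) # 0.5
```

Note that `stream: false` parses to the Python boolean `False`, and the quoted `model` string stays intact, colon and version hash included, because YAML quoting prevents the `:` from being read as a key separator.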