t6_mem0/configs/llama2.yaml
2023-11-03 00:32:51 -07:00

llm:
  provider: llama2
  config:
    model: 'a16z-infra/llama13b-v2-chat:df7690f1994d94e96ad9d568eac121aecf50684a0b0963b25a41cc40061269e5'
    temperature: 0.5
    max_tokens: 1000
    top_p: 0.5
    stream: false
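A minimal sketch of how this config parses, assuming PyYAML is available; the YAML is inlined here rather than read from `t6_mem0/configs/llama2.yaml` so the snippet is self-contained:

```python
import yaml

# Same content as llama2.yaml above, inlined for illustration.
raw = """
llm:
  provider: llama2
  config:
    model: 'a16z-infra/llama13b-v2-chat:df7690f1994d94e96ad9d568eac121aecf50684a0b0963b25a41cc40061269e5'
    temperature: 0.5
    max_tokens: 1000
    top_p: 0.5
    stream: false
"""

config = yaml.safe_load(raw)
llm = config["llm"]

# YAML scalars parse to native types: floats, ints, and booleans.
print(llm["provider"])                  # llama2
print(llm["config"]["max_tokens"])      # 1000
```

Note that `stream: false` parses to a Python `bool`, and the quoted `model` string keeps the Replicate-style `owner/name:version-hash` identifier intact.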