docs: update LlamaCpp max_tokens args (#9238)

This PR updates documentation only: `max_length` should be `max_tokens`
according to the latest LlamaCpp API doc:
https://api.python.langchain.com/en/latest/llms/langchain.llms.llamacpp.LlamaCpp.html
fanyou-wbd 2023-08-15 00:50:20 -07:00 committed by GitHub
parent a8aa1aba1c
commit 5e43768f61

@@ -242,7 +242,7 @@
 "llm = LlamaCpp(\n",
 " model_path=\"/Users/rlm/Desktop/Code/llama/llama-2-7b-ggml/llama-2-7b-chat.ggmlv3.q4_0.bin\",\n",
 " temperature=0.75,\n",
-" max_length=2000,\n",
+" max_tokens=2000,\n",
 " top_p=1,\n",
 " callback_manager=callback_manager,\n",
 " verbose=True,\n",