Change with_history option to False for ChatGLM by default (#8076)

By default, the ChatGLM LLM integration accumulates conversation history (with_history=True) and sends it to the ChatGLM backend API, which is not expected in most cases. This PR sets with_history=False by default; users must explicitly set llm.with_history=True to turn this feature on. Related PRs: #8048, #7774
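The effect of the changed default can be sketched with a minimal stand-in class (hypothetical, mirroring only the field touched by this diff, not the real ChatGLM implementation):

```python
class ChatGLM:
    """Stand-in for the LangChain ChatGLM LLM, showing the new default."""

    def __init__(self, with_history: bool = False):
        # History accumulation is now off by default (was True before this PR).
        self.with_history = with_history

llm = ChatGLM()                       # history is not accumulated
opt_in = ChatGLM(with_history=True)   # users must opt in explicitly
```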

---------

Co-authored-by: mlot <limpo2000@gmail.com>
Co-authored-by: Bagatur <baskaryan@gmail.com>
Author: Liu Ming
Date: 2023-07-25 06:46:02 +08:00 (committed by GitHub)
Parent: 1f055775f8
Commit: 24f889f2bc
2 changed files with 6 additions and 18 deletions

@@ -37,7 +37,7 @@ class ChatGLM(LLM):
     """History of the conversation"""
     top_p: float = 0.7
     """Top P for nucleus sampling from 0 to 1"""
-    with_history: bool = True
+    with_history: bool = False
     """Whether to use history or not"""
     @property