langchain/docs/extras/integrations
Liu Ming 24f889f2bc
Change with_history option to False for ChatGLM by default (#8076)
By default, the ChatGLM LLM integration accumulates conversation history
(with_history=True) and sends it to the ChatGLM backend API, which is not
expected in most cases. This PR sets with_history=False by default; users
should explicitly set llm.with_history=True to turn this feature on. Related
PRs: #8048 #7774
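
A minimal sketch of how a caller would opt back in after this change, assuming the ChatGLM wrapper in langchain.llms and a locally hosted ChatGLM API server (the endpoint URL below is illustrative, not part of this PR):

```python
from langchain.llms import ChatGLM

# Endpoint URL is illustrative; point it at your own ChatGLM API server.
llm = ChatGLM(endpoint_url="http://127.0.0.1:8000")

# After this change the wrapper no longer accumulates conversation history
# by default (with_history=False). Opt back in explicitly if the backend
# should receive the running conversation with every request:
llm.with_history = True

print(llm("What is the capital of France?"))
```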

---------

Co-authored-by: mlot <limpo2000@gmail.com>
Co-authored-by: Bagatur <baskaryan@gmail.com>
2023-07-24 15:46:02 -07:00
callbacks mv module integrations docs (#8101) 2023-07-23 23:23:16 -07:00
chat mv module integrations docs (#8101) 2023-07-23 23:23:16 -07:00
document_loaders Extend Cube Semantic Loader functionality (#8186) 2023-07-24 12:11:58 -07:00
document_transformers mv module integrations docs (#8101) 2023-07-23 23:23:16 -07:00
llms Change with_history option to False for ChatGLM by default (#8076) 2023-07-24 15:46:02 -07:00
memory mv module integrations docs (#8101) 2023-07-23 23:23:16 -07:00
providers ArangoDB/AQL support for Graph QA Chain (#7880) 2023-07-24 15:16:52 -07:00
retrievers mv module integrations docs (#8101) 2023-07-23 23:23:16 -07:00
text_embedding Update SageMaker Endpoint Embeddings docs to be up to date with current requirements (#8103) 2023-07-24 13:35:06 -07:00
toolkits mv module integrations docs (#8101) 2023-07-23 23:23:16 -07:00
tools mv module integrations docs (#8101) 2023-07-23 23:23:16 -07:00
vectorstores mv module integrations docs (#8101) 2023-07-23 23:23:16 -07:00