langchain: init_chat_model() to support ChatOllama from langchain-ollama (#24818)

Description: Since moving away from `langchain-community` is
recommended, `init_chat_model()` should import ChatOllama from
`langchain-ollama` instead.
Jerron Lim 2024-07-30 22:17:38 +08:00 committed by GitHub
parent 4fab8996cf
commit 5abfc85fec
GPG Key ID: B5690EEEBB952194


@@ -118,7 +118,7 @@ def init_chat_model(
         - mistralai (langchain-mistralai)
         - huggingface (langchain-huggingface)
         - groq (langchain-groq)
-        - ollama (langchain-community)
+        - ollama (langchain-ollama)

     Will attempt to infer model_provider from model if not specified. The
     following providers will be inferred based on these model prefixes:
@@ -336,8 +336,8 @@ def _init_chat_model_helper(
         return ChatFireworks(model=model, **kwargs)
     elif model_provider == "ollama":
-        _check_pkg("langchain_community")
-        from langchain_community.chat_models import ChatOllama
+        _check_pkg("langchain_ollama")
+        from langchain_ollama import ChatOllama
         return ChatOllama(model=model, **kwargs)
     elif model_provider == "together":