feat: add support for IBM WatsonX AI chat models (#29688)

**Description:** Updates `init_chat_model` to support Granite models
deployed on IBM WatsonX via the new `ibm` provider.
**Dependencies:**
[langchain-ibm](https://github.com/langchain-ai/langchain-ibm)
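
A hedged usage sketch of the new provider path (the model id, endpoint URL, and `project_id` parameter here are illustrative assumptions; actually building the model requires `langchain-ibm` and valid WatsonX credentials):

```python
def make_granite_chat(project_id: str, url: str):
    """Build a WatsonX Granite chat model via init_chat_model.

    Requires the langchain-ibm package and valid WatsonX credentials,
    so this is illustrative rather than runnable offline.
    """
    # Import kept inside the function: langchain is only needed at call time.
    from langchain.chat_models import init_chat_model

    return init_chat_model(
        "ibm/granite-13b-chat-v2",  # example id; forwarded as model_id (assumption)
        model_provider="ibm",
        url=url,                    # WatsonX endpoint (assumption)
        project_id=project_id,      # WatsonX project (assumption)
    )
```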

Tagging @baskaryan @efriis for review when you get a chance.
Committed by Jawahar S on 2025-02-12 05:04:29 +05:30 via GitHub
parent c7d74eb7a3
commit 994c5465e0


@@ -118,6 +118,7 @@ def init_chat_model(
- 'ollama' -> langchain-ollama
- 'google_anthropic_vertex' -> langchain-google-vertexai
- 'deepseek' -> langchain-deepseek
- 'ibm' -> langchain-ibm
- 'nvidia' -> langchain-nvidia-ai-endpoints
Will attempt to infer model_provider from model if not specified. The
@@ -428,6 +429,11 @@ def _init_chat_model_helper(
        from langchain_nvidia_ai_endpoints import ChatNVIDIA

        return ChatNVIDIA(model=model, **kwargs)
    elif model_provider == "ibm":
        _check_pkg("langchain_ibm")
        from langchain_ibm import ChatWatsonx

        return ChatWatsonx(model_id=model, **kwargs)
    else:
        supported = ", ".join(_SUPPORTED_PROVIDERS)
        raise ValueError(
@@ -453,6 +459,7 @@ _SUPPORTED_PROVIDERS = {
    "bedrock_converse",
    "google_anthropic_vertex",
    "deepseek",
    "ibm",
}