Mirror of https://github.com/hwchase17/langchain.git, synced 2025-07-09 14:35:50 +00:00
Fixed the assignment of custom_llm_provider argument (#11628)
- **Description:** Assign `custom_llm_provider` in the default params function so that it is passed through to litellm.
- **Issue:** Although the `custom_llm_provider` argument is defined, it is never assigned anywhere in the code, so it is not passed to litellm. As a result, any litellm call that requires `custom_llm_provider` fails. This parameter is mainly used by litellm when running inference via a custom API server: https://docs.litellm.ai/docs/providers/custom_openai_proxy
- **Dependencies:** None.

@krrishdholakia, @baskaryan

---------

Co-authored-by: Bagatur <baskaryan@gmail.com>
parent db67ccb0bb
commit c9d4d53545
@@ -207,6 +207,7 @@ class ChatLiteLLM(BaseChatModel):
             "stream": self.streaming,
             "n": self.n,
             "temperature": self.temperature,
+            "custom_llm_provider": self.custom_llm_provider,
             **self.model_kwargs,
         }
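For context, a minimal usage sketch (not part of this commit) of the scenario the fix targets: pointing `ChatLiteLLM` at a custom OpenAI-compatible proxy so that `custom_llm_provider` is forwarded to litellm. The proxy URL and model name are hypothetical, and `api_base` is assumed to be an existing `ChatLiteLLM` field; only `custom_llm_provider` is confirmed by this diff.

```python
# Usage sketch, not part of this commit: ChatLiteLLM against a custom
# OpenAI-compatible proxy (https://docs.litellm.ai/docs/providers/custom_openai_proxy).
from langchain.chat_models import ChatLiteLLM
from langchain.schema import HumanMessage

chat = ChatLiteLLM(
    model="my-proxied-model",          # hypothetical model name served by the proxy
    api_base="http://localhost:8000",  # hypothetical custom API server (assumed field)
    custom_llm_provider="openai",      # the field this commit starts passing to litellm
    temperature=0,
)

# With the fix, litellm receives custom_llm_provider and routes the call
# using the OpenAI-style protocol against the custom server.
print(chat([HumanMessage(content="Hello!")]))
```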