ENH: Add llm_kwargs for Xinference LLMs (#10354)

- This PR adds `llm_kwargs` to the initialization of Xinference LLMs
(integrated in #8171).
- With this enhancement, users can provide a `generate_config` not only
when calling the LLM for generation but also during initialization.
This allows users to include custom configurations when using
LangChain features like LLMChain.
- It also fixes some formatting issues in the docstrings.
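A minimal sketch of the pattern this PR describes: extra keyword arguments supplied at initialization are collected into `llm_kwargs` and merged with any per-call `generate_config`, with the per-call values taking precedence. The class and method names here are illustrative, not the actual LangChain `Xinference` wrapper, and no server connection is made.

```python
class XinferenceSketch:
    """Illustrative stand-in for an Xinference LLM wrapper (hypothetical names)."""

    def __init__(self, server_url, model_uid, **kwargs):
        self.server_url = server_url
        self.model_uid = model_uid
        # Any remaining kwargs become default generation settings (llm_kwargs).
        self.llm_kwargs = kwargs

    def __call__(self, prompt, generate_config=None):
        # Per-call generate_config overrides the defaults given at init time.
        config = {**self.llm_kwargs, **(generate_config or {})}
        # A real client would send prompt + config to the Xinference server;
        # here we just return the merged config to show the behavior.
        return config


llm = XinferenceSketch(
    "http://127.0.0.1:9997", "my-model-uid", temperature=0.7, max_tokens=128
)
print(llm("hello", generate_config={"temperature": 0.2}))
# → {'temperature': 0.2, 'max_tokens': 128}
```

The benefit is that defaults set once at initialization flow through constructs like LLMChain, which invoke the LLM without passing a `generate_config` themselves.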
Authored by Jiayi Ni on 2023-09-18 23:36:29 +08:00, committed by GitHub.
parent 1eefb9052b · commit ce61840e3b
3 changed files with 57 additions and 26 deletions

@@ -93,10 +93,10 @@ llm(
 ### Usage
 For more information and detailed examples, refer to the
-[example notebook for xinference](../modules/models/llms/integrations/xinference.ipynb)
+[example for xinference LLMs](/docs/integrations/llms/xinference.html)
 ### Embeddings
 Xinference also supports embedding queries and documents. See
-[example notebook for xinference embeddings](../modules/data_connection/text_embedding/integrations/xinference.ipynb)
+[example for xinference embeddings](/docs/integrations/text_embedding/xinference.html)
 for a more detailed demo.