langchain/docs/extras

Latest commit a616e19975 by Massimiliano Pronesti: feat(llms): add support for vLLM (#8806)
Hello langchain maintainers, this PR integrates [vllm](https://vllm.readthedocs.io/en/latest/#) into langchain. It closes #8729.

This feature depends on `vllm`, which is not added to `pyproject.toml`. Other model integrations already rely on packages that are not listed there (e.g. `gpt4all`, `text-generation`), so I have treated `vllm` the same way, as an optional dependency.
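
For reference, here is a minimal usage sketch of the new wrapper. It assumes the class is exposed as `VLLM` under `langchain.llms`, that the constructor accepts the parameters shown (`model`, `trust_remote_code`, `max_new_tokens`, `temperature`), and that the `vllm` package has been installed separately; the model name is purely illustrative.

```python
# Minimal sketch, not necessarily the exact API added by this PR:
# assumes a `VLLM` class under `langchain.llms` and that the `vllm`
# package has been installed separately (it is not in pyproject.toml).
from langchain.llms import VLLM

llm = VLLM(
    model="mosaicml/mpt-7b",   # illustrative Hugging Face model id
    trust_remote_code=True,    # required by some Hub models
    max_new_tokens=128,
    temperature=0.8,
)

# LangChain LLM wrappers are callable on a prompt string.
print(llm("What is the capital of France?"))
```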

@hwchase17, @baskaryan

---------

Co-authored-by: Harrison Chase <hw.chase.17@gmail.com>
2023-08-07 07:32:02 -07:00
| Directory | Latest commit | Date |
| --- | --- | --- |
| _templates | Update Integrations links (#8206) | 2023-07-24 21:20:32 -07:00 |
| additional_resources | Link to use cases from tutorials (#8371) | 2023-07-27 11:54:04 -07:00 |
| ecosystem | | |
| guides | add example of memory and returning retrieved docs (#8830) | 2023-08-06 15:25:12 -07:00 |
| integrations | feat(llms): add support for vLLM (#8806) | 2023-08-07 07:32:02 -07:00 |
| modules | Fix typo in long_context_reorder.ipynb (#8811) | 2023-08-06 15:31:38 -07:00 |
| use_cases | Update links on QA Use Case docs (#8784) | 2023-08-05 17:30:56 -07:00 |