docs: consolidate llms and chat models (#27525)
Consolidate the docs. After this, we'll potentially need to set up redirects and update links to the existing LLM pages. Still not sure this is a good idea. It may be better to rename the concept page and redirect appropriately, since we do still have string-in / string-out LLM integrations.
This commit is contained in: parent e442e485fe · commit acefe02334
@@ -40,6 +40,7 @@ Please review the [chat model integrations](/docs/integrations/chat/) for a list
Models whose names do **not** include the "Chat" prefix, or that include "LLM" as a suffix, typically refer to older models that do not follow the chat model interface and instead use an interface that takes a string as input and returns a string as output.
:::
## Interface
LangChain chat models implement the [BaseChatModel](https://python.langchain.com/api_reference/core/language_models/langchain_core.language_models.chat_models.BaseChatModel.html) interface. Because `BaseChatModel` also implements the [Runnable Interface](/docs/concepts/runnables), chat models support a [standard streaming interface](/docs/concepts/streaming), [async programming](/docs/concepts/async), optimized [batching](/docs/concepts/runnables#batch), and more. Please see the [Runnable Interface](/docs/concepts/runnables) for more details.
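For example, a minimal sketch of these Runnable methods, assuming the `langchain-openai` package is installed and `OPENAI_API_KEY` is set (any chat model integration exposes the same methods):

```python
from langchain_openai import ChatOpenAI

model = ChatOpenAI(model="gpt-4o-mini")

# invoke: one input in, one AIMessage out
response = model.invoke("Why do parrots talk?")

# stream: yield output chunks as they are generated
for chunk in model.stream("Why do parrots talk?"):
    print(chunk.content, end="", flush=True)

# batch: run several inputs, with parallelism handled for you
responses = model.batch(["Why do parrots talk?", "Why do cats purr?"])
```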
@@ -48,6 +49,13 @@ Many of the key methods of chat models operate on [messages](/docs/concepts/mess
Chat models offer a standard set of parameters that can be used to configure the model. These parameters are typically used to control the behavior of the model, such as the temperature of the output, the maximum number of tokens in the response, and the maximum time to wait for a response. Please see the [standard parameters](#standard-parameters) section for more details.
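As a rough sketch of how these parameters are typically passed at construction time, assuming `ChatOpenAI` (exact parameter names and support vary by provider):

```python
from langchain_openai import ChatOpenAI

model = ChatOpenAI(
    model="gpt-4o-mini",
    temperature=0,    # lower values give more deterministic output
    max_tokens=256,   # cap the length of the response
    timeout=30,       # seconds to wait for a response
    max_retries=2,    # retry transient API errors
)
```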
:::note
In documentation, we will often use the terms "LLM" and "Chat Model" interchangeably. This is because most modern LLMs are exposed to users via a chat model interface.
However, LangChain also has implementations of older LLMs that do not follow the chat model interface and instead use an interface that takes a string as input and returns a string as output. These models are typically named without the "Chat" prefix (e.g., `Ollama`, `Anthropic`, `OpenAI`, etc.).
These models implement the [BaseLLM](https://python.langchain.com/api_reference/core/language_models/langchain_core.language_models.llms.BaseLLM.html#langchain_core.language_models.llms.BaseLLM) interface and may be named with the "LLM" suffix (e.g., `OllamaLLM`, `AnthropicLLM`, `OpenAILLM`, etc.). Generally, users should not use these models.
:::
### Key Methods
The key methods of a chat model are:
@@ -157,6 +165,3 @@ Please see the [how to cache chat model responses](/docs/how_to/#chat-model-cach
* [Multimodality](/docs/concepts/multimodality)
* [Structured outputs](/docs/concepts#structured_output)
* [Tokens](/docs/concepts/tokens)
@@ -1,39 +1,3 @@
# Large Language Models (LLMs)
Large Language Models (LLMs) are advanced machine learning models that excel in a wide range of language-related tasks such as
text generation, translation, summarization, question answering, and more, without needing task-specific tuning for every scenario.
## Chat Models
Modern LLMs are typically exposed to users via a [Chat Model interface](/docs/concepts/chat_models). These models process sequences of [messages](/docs/concepts/messages) as input and output messages.
Popular chat models support native [tool calling](/docs/concepts#tool-calling) capabilities, which allows building applications
that can interact with external services, APIs, databases, extract structured information from unstructured text, and more.
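For instance, a hedged sketch of tool calling with a made-up `get_weather` tool, assuming the `langchain-openai` package:

```python
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI

@tool
def get_weather(city: str) -> str:
    """Return the current weather for a city."""
    return f"It is sunny in {city}."

model = ChatOpenAI(model="gpt-4o-mini").bind_tools([get_weather])

ai_message = model.invoke("What's the weather in Paris?")
print(ai_message.tool_calls)  # structured tool call(s) chosen by the model
```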
Modern LLMs are not limited to processing natural language text. They can also process other types of data, such as images, audio, and video. This is known as [multimodality](/docs/concepts/multimodality). Please see the [Chat Model Concept Guide](/docs/concepts/chat_models) page for more information.
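As an illustrative sketch, assuming a multimodal-capable model and a hypothetical image URL, an image can be passed alongside text in a single message:

```python
from langchain_core.messages import HumanMessage
from langchain_openai import ChatOpenAI

model = ChatOpenAI(model="gpt-4o-mini")
message = HumanMessage(
    content=[
        {"type": "text", "text": "Describe this image in one sentence."},
        # hypothetical image URL used purely for illustration
        {"type": "image_url", "image_url": {"url": "https://example.com/cat.png"}},
    ]
)
print(model.invoke([message]).content)
```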
## Terminology
In documentation, we will often use the terms "LLM" and "Chat Model" interchangeably. This is because most modern LLMs are exposed to users via a chat model interface.
However, users should be aware that there are two distinct interfaces for LLMs in LangChain:
1. Modern LLMs implement the [BaseChatModel](https://python.langchain.com/api_reference/core/language_models/langchain_core.language_models.chat_models.BaseChatModel.html) interface. These are chat models that process sequences of messages as input and output messages. Such models will typically be named with a convention that prefixes "Chat" to their class names (e.g., `ChatOllama`, `ChatAnthropic`, `ChatOpenAI`, etc.).
2. Older LLMs implement the [BaseLLM](https://python.langchain.com/api_reference/core/language_models/langchain_core.language_models.llms.BaseLLM.html#langchain_core.language_models.llms.BaseLLM) interface. These are LLMs that take text strings as input and output text strings. Such models are typically named using just the provider's name (e.g., `Ollama`, `Anthropic`, `OpenAI`, etc.). Generally, users should not use these models; see the sketch after this list for a side-by-side comparison.
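A minimal sketch contrasting the two interfaces, assuming the `langchain-openai` package; other providers follow the same naming pattern:

```python
from langchain_openai import ChatOpenAI, OpenAI

# 1. Chat model (BaseChatModel): messages in, message out
chat_model = ChatOpenAI(model="gpt-4o-mini")
ai_message = chat_model.invoke([("human", "Tell me a joke about bears.")])
print(ai_message.content)

# 2. Older text-in, text-out LLM (BaseLLM): string in, string out
llm = OpenAI(model="gpt-3.5-turbo-instruct")
print(llm.invoke("Tell me a joke about bears."))
```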
## Related Resources
Modern LLMs (aka Chat Models):
* [Conceptual Guide about Chat Models](/docs/concepts/chat_models/)
* [Chat Model Integrations](/docs/integrations/chat/)
* How-to Guides: [Chat Models](/docs/how_to/#chat_models)
Text-in, text-out LLMs (older or lower-level models):
:::caution
Unless you have a specific use case that requires using these models, you should use the chat models instead.
:::
* [LLM Integrations](/docs/integrations/llms/)
* How-to Guides: [LLMs](/docs/how_to/#llms)
Please see the [Chat Model Concept Guide](/docs/concepts/chat_models) page for more information.