docs: fix language_models docstring (#26268)

Bagatur
2024-09-10 10:41:28 -07:00
committed by GitHub
parent 301be2d40a
commit 162d3ff54b
2 changed files with 8 additions and 3 deletions


@@ -17,7 +17,10 @@ def process_toc_h3_elements(html_content: str) -> str:
     # Process each element
     for element in toc_h3_elements:
-        element = element.a.code.span
+        try:
+            element = element.a.code.span
+        except Exception:
+            continue
         # Get the text content of the element
         content = element.get_text()
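
For context, a minimal sketch of what the patched loop does, assuming the script walks API-reference TOC entries with BeautifulSoup. The `li.toc-h3` selector and the prefix-stripping step below are illustrative guesses, not the exact repo code; the point is that entries lacking the nested `<a><code><span>` structure are now skipped instead of raising.

```python
from bs4 import BeautifulSoup


def process_toc_h3_elements(html_content: str) -> str:
    """Shorten TOC h3 entries; skip entries without the expected tag nesting."""
    soup = BeautifulSoup(html_content, "html.parser")
    toc_h3_elements = soup.select("li.toc-h3")  # assumed selector for TOC entries

    # Process each element
    for element in toc_h3_elements:
        try:
            element = element.a.code.span
        except Exception:
            # Plain-text headings have no <a><code><span>; skip them instead of crashing.
            continue
        # Get the text content of the element
        content = element.get_text()
        # Illustrative post-processing: keep only the last dotted component.
        element.string = content.split(".")[-1]

    return str(soup)
```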


@@ -4,7 +4,7 @@ text prompts.
 LangChain has two main classes to work with language models: **Chat Models**
 and "old-fashioned" **LLMs**.
-## Chat Models
+**Chat Models**
 Language models that use a sequence of messages as inputs and return chat messages
 as outputs (as opposed to using plain text). These are traditionally newer models (
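
The Chat Models paragraph above describes a messages-in / messages-out interface. As a quick sketch, here is that interface exercised with the fake chat model that langchain_core ships for testing, so no real provider is needed:

```python
from langchain_core.language_models import FakeListChatModel
from langchain_core.messages import HumanMessage

# A chat model takes a sequence of messages and returns a chat message.
chat_model = FakeListChatModel(responses=["Hello!"])
result = chat_model.invoke([HumanMessage(content="Hi there")])
print(type(result).__name__, result.content)  # AIMessage Hello!
```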
@@ -21,7 +21,7 @@ the following guide for more information on how to implement a custom Chat Model
 https://python.langchain.com/v0.2/docs/how_to/custom_chat_model/
-## LLMs
+**LLMs**
 Language models that takes a string as input and returns a string.
 These are traditionally older models (newer models generally are Chat Models, see below).
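
The LLM interface, by contrast, is string-in / string-out. A minimal sketch, again using a test fake from langchain_core rather than a real provider integration:

```python
from langchain_core.language_models import FakeListLLM

# An LLM takes a string prompt and returns a string completion.
llm = FakeListLLM(responses=["42"])
print(llm.invoke("What is the answer?"))  # 42
```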
@@ -35,6 +35,8 @@ To implement a custom LLM, inherit from `BaseLLM` or `LLM`.
Please see the following guide for more information on how to implement a custom LLM:
https://python.langchain.com/v0.2/docs/how_to/custom_llm/
""" # noqa: E501
from langchain_core.language_models.base import (
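
For readers following the custom_llm guide linked in the docstring, a minimal custom LLM roughly amounts to subclassing `LLM` and implementing `_call` and `_llm_type`. The `EchoLLM` below is a made-up toy for illustration, not part of this commit:

```python
from typing import Any, List, Optional

from langchain_core.callbacks import CallbackManagerForLLMRun
from langchain_core.language_models.llms import LLM


class EchoLLM(LLM):
    """Toy LLM that echoes the prompt back; illustrative only."""

    @property
    def _llm_type(self) -> str:
        return "echo"

    def _call(
        self,
        prompt: str,
        stop: Optional[List[str]] = None,
        run_manager: Optional[CallbackManagerForLLMRun] = None,
        **kwargs: Any,
    ) -> str:
        # A real implementation would call a model API here.
        return prompt


print(EchoLLM().invoke("hello"))  # hello
```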