Mirror of https://github.com/hwchase17/langchain.git, synced 2026-02-12 20:20:08 +00:00
Compare commits
130 Commits
```
c0e2f08f78 4dab5fafc0 e6dde2b99c 126a337082 13c13c4bac 6dc06da1bb d9ec4c5cc7 d62d77925c 7ab4a7841a 6e968fd23c
640d85c60f fa7789d6c2 889e8b6de8 5cb0501c59 5838e3e8e5 fbd96c688a 2085f69d68 df2ec0ca38 51e1447c9e bac96fe33f
d8b08a1ecd 9b5e00f578 8c22e69491 d62b4499ad f8bb3f0d19 8284e278d6 3a846eeb8d d273341249 db49a14a34 ab7eda236e
d418cbdf44 b93d2f7f3a a763ebe86c 5fa1094451 dd4de696b8 809a0216a5 44ec72fa0d 5459ff1ee3 f95669aa0a 3a465d635b
0b51de4cab 5904cbea89 c9590ef79d c972552c40 e16feb93b9 2bb57d45d2 d7cce2f469 48b77752d0 6f2d16e6be a9eda18e1e
a89c549cb0 a336afaecd af07949d13 a10e880c00 7b5e839be3 740842485c 08bb74f148 7d78ed9b53 7ccff656eb 002d623f2d
34f8031bd9 a541b5bee1 3e970506ba d1b0196faa aac69839a9 64141072a3 0795be2a04 9c97597175 eed0f6c289 729637a347
3325196be1 f402fdcea3 ca9217c02d f9bae40475 839a18e112 33a6def762 c456c8ae51 54ea62050b 986302322f a5137b0a3e
5bea28393d c3fed20940 6d418ba983 12daba63ff eaf8dce7c2 f82de1a8d7 e3efd1e891 d6769cf032 7ab2e0dd3b 81319ad3f0
e3f3c13b75 c30844fce4 c9eb3bdb2d e97baeb9a6 3a6046b157 8fdc619f75 729bfe8369 9b624a79b2 c60c5a91cb d9e0c212e0
f015526e42 57d931532f 50012d95e2 33f06875cb e5730307e7 4783a9c18e ee4d84de7c 092dd5e174 dd81e1c3fb 135a5b97e6
b92b394804 083bb3cdd7 2e9291cdd7 4f8a76b571 05ba941230 ae4976896e 504ef96500 d99a02bb27 793de80429 7d4e9d8cda
54dca494cf 7b30e58386 e62b541dfd 8699980d09 79e536b0d6 b5720ff17a 48b05224ad 89079ad411 2c95586f2a 9c1285cf5b
```
**.github/ISSUE_TEMPLATE/bug-report.yml** (vendored, 4 changes)

```diff
@@ -119,7 +119,3 @@ body:
         python -m langchain_core.sys_info
   validations:
     required: true
-
-
-
-
```
**.github/ISSUE_TEMPLATE/feature-request.yml** (vendored, 26 changes)

````diff
@@ -42,11 +42,11 @@ body:
       label: Feature Description
       description: |
         Please provide a clear and concise description of the feature you would like to see added to LangChain.
 
         What specific functionality are you requesting? Be as detailed as possible.
       placeholder: |
         I would like LangChain to support...
 
         This feature would allow users to...
   - type: textarea
     id: use-case
@@ -56,13 +56,13 @@ body:
       label: Use Case
       description: |
         Describe the specific use case or problem this feature would solve.
 
         Why do you need this feature? What problem does it solve for you or other users?
       placeholder: |
         I'm trying to build an application that...
 
         Currently, I have to work around this by...
 
         This feature would help me/users to...
   - type: textarea
     id: proposed-solution
@@ -72,13 +72,13 @@ body:
       label: Proposed Solution
       description: |
         If you have ideas about how this feature could be implemented, please describe them here.
 
         This is optional but can be helpful for maintainers to understand your vision.
       placeholder: |
         I think this could be implemented by...
 
         The API could look like...
 
         ```python
         # Example of how the feature might work
         ```
@@ -90,15 +90,15 @@ body:
       label: Alternatives Considered
       description: |
         Have you considered any alternative solutions or workarounds?
 
         What other approaches have you tried or considered?
       placeholder: |
         I've tried using...
 
         Alternative approaches I considered:
         1. ...
         2. ...
 
         But these don't work because...
   - type: textarea
     id: additional-context
@@ -110,9 +110,9 @@ body:
         Add any other context, screenshots, examples, or references that would help explain your feature request.
       placeholder: |
         Related issues: #...
 
         Similar features in other libraries:
         - ...
 
         Additional context or examples:
         - ...
````
**.github/actions/people/action.yml** (vendored, 2 changes)

```diff
@@ -1,4 +1,6 @@
 # Adapted from https://github.com/tiangolo/fastapi/blob/master/.github/actions/people/action.yml
+# TODO: fix this, migrate to new docs repo?
+
 name: "Generate LangChain People"
 description: "Generate the data for the LangChain People page"
 author: "Jacob Lee <jacob@langchain.dev>"
```
**.github/actions/uv_setup/action.yml** (vendored, 8 changes)

```diff
@@ -1,3 +1,5 @@
+# Helper to set up Python and uv with caching
+
 name: uv-install
 description: Set up Python and uv with caching
 
@@ -8,15 +10,15 @@ inputs:
   enable-cache:
     description: Enable caching for uv dependencies
     required: false
-    default: 'true'
+    default: "true"
   cache-suffix:
     description: Custom cache key suffix for cache invalidation
     required: false
-    default: ''
+    default: ""
   working-directory:
     description: Working directory for cache glob scoping
     required: false
-    default: '**'
+    default: "**"
 
 env:
   UV_VERSION: "0.5.25"
```
**.github/pr-file-labeler.yml**

```diff
@@ -1,4 +1,4 @@
-# GitHub PR Labeler Configuration for LangChain
+# Label PRs (config)
 # Automatically applies labels based on changed files and branch patterns
 
 # Core packages
```
**.github/pr-title-labeler.yml** (vendored, 46 changes)

```diff
@@ -1,27 +1,41 @@
-# PR Title Labeler Configuration
+# PR title labeler config
 #
 # Labels PRs based on conventional commit patterns in titles
 #
 # Format: type(scope): description or type!: description (breaking)
 
 add-missing-labels: true
 clear-prexisting: false
 include-commits: false
 include-title: true
 label-for-breaking-changes: breaking
 
 label-mapping:
-  # Features and enhancements
-  feature: ["feat"]
-
-  # Bug fixes
-  fix: ["fix"]
-
-  # Documentation
-  documentation: ["docs"]
-
-  # Infrastructure and tooling
-  infra: ["chore", "ci", "build", "infra"]
-
-  # Integration partners - detected by scope
-  integration: ["anthropic", "chroma", "deepseek", "exa", "fireworks", "groq", "huggingface", "mistralai", "nomic", "ollama", "openai", "perplexity", "prompty", "qdrant", "xai"]
-
-  # Releases
+  feature: ["feat"]
+  fix: ["fix"]
+  infra: ["build", "ci", "chore"]
+  integration:
+    [
+      "anthropic",
+      "chroma",
+      "deepseek",
+      "exa",
+      "fireworks",
+      "groq",
+      "huggingface",
+      "mistralai",
+      "nomic",
+      "ollama",
+      "openai",
+      "perplexity",
+      "prompty",
+      "qdrant",
+      "xai",
+    ]
+  linting: ["style"]
+  performance: ["perf"]
+  refactor: ["refactor"]
+  release: ["release"]
+  revert: ["revert"]
+  tests: ["test"]
```
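To illustrate the mapping, and assuming the action parses conventional-commit prefixes and scopes as the comments describe, made-up PR titles would label roughly as follows:

```text
feat(core): add multi-tenant support    -> feature
fix(openai): handle empty responses     -> fix, integration
feat!: drop Python 3.8 support          -> feature, breaking
```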
**.github/scripts/check_diff.py** (vendored, 26 changes)

```diff
@@ -1,3 +1,18 @@
+"""Analyze git diffs to determine which directories need to be tested.
+
+Intelligently determines which LangChain packages and directories need to be tested,
+linted, or built based on the changes. Handles dependency relationships between
+packages, maps file changes to appropriate CI job configurations, and outputs JSON
+configurations for GitHub Actions.
+
+- Maps changed files to affected package directories (libs/core, libs/partners/*, etc.)
+- Builds dependency graph to include dependent packages when core components change
+- Generates test matrix configurations with appropriate Python versions
+- Handles special cases for Pydantic version testing and performance benchmarks
+
+Used as part of the check_diffs workflow.
+"""
+
 import glob
 import json
 import os
@@ -17,7 +32,7 @@ LANGCHAIN_DIRS = [
     "libs/langchain_v1",
 ]
 
-# when set to True, we are ignoring core dependents
+# When set to True, we are ignoring core dependents
 # in order to be able to get CI to pass for each individual
 # package that depends on core
 # e.g. if you touch core, we don't then add textsplitters/etc to CI
@@ -49,9 +64,9 @@ def all_package_dirs() -> Set[str]:
 
 
 def dependents_graph() -> dict:
-    """
-    Construct a mapping of package -> dependents, such that we can
-    run tests on all dependents of a package when a change is made.
-    """
+    """Construct a mapping of package -> dependents
+
+    Done such that we can run tests on all dependents of a package when a change is made.
+    """
     dependents = defaultdict(set)
@@ -123,9 +138,6 @@ def _get_configs_for_single_dir(job: str, dir_: str) -> List[Dict[str, str]]:
     elif dir_ == "libs/core":
         py_versions = ["3.9", "3.10", "3.11", "3.12", "3.13"]
     # custom logic for specific directories
-    elif dir_ == "libs/partners/milvus":
-        # milvus doesn't allow 3.12 because they declare deps in funny way
-        py_versions = ["3.9", "3.11"]
-
     elif dir_ in PY_312_MAX_PACKAGES:
         py_versions = ["3.9", "3.12"]
```
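The `dependents_graph` mapping above inverts the dependency edges so a change to one package fans out to everything that depends on it. A minimal sketch of that inversion, with made-up edges (the real script derives them by parsing each package's pyproject.toml, which is elided here):

```python
from collections import defaultdict

# Hypothetical package -> dependencies edges (illustrative only)
dependencies = {
    "libs/partners/openai": {"libs/core"},
    "libs/langchain": {"libs/core", "libs/text-splitters"},
}

# Invert the edges: package -> set of packages that depend on it
dependents = defaultdict(set)
for package, requirements in dependencies.items():
    for requirement in requirements:
        dependents[requirement].add(package)

# A change to libs/core now fans out to both dependents:
assert dependents["libs/core"] == {"libs/partners/openai", "libs/langchain"}
```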
**.github/scripts/check_prerelease_dependencies.py**

```diff
@@ -1,3 +1,5 @@
+"""Check that no dependencies allow prereleases unless we're releasing a prerelease."""
+
 import sys
 
 import tomllib
@@ -6,15 +8,14 @@ if __name__ == "__main__":
     # Get the TOML file path from the command line argument
     toml_file = sys.argv[1]
 
-    # read toml file
     with open(toml_file, "rb") as file:
         toml_data = tomllib.load(file)
 
-    # see if we're releasing an rc
+    # See if we're releasing an rc or dev version
     version = toml_data["project"]["version"]
     releasing_rc = "rc" in version or "dev" in version
 
-    # if not, iterate through dependencies and make sure none allow prereleases
+    # If not, iterate through dependencies and make sure none allow prereleases
     if not releasing_rc:
         dependencies = toml_data["project"]["dependencies"]
         for dep_version in dependencies:
```
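The loop body is cut off in this view. A plausible sketch of what such a check does, an assumption rather than the actual code, is to reject any specifier that names a prerelease segment:

```python
# Hypothetical completion of the truncated loop above (not the actual code)
for dep_version in dependencies:
    if "rc" in dep_version or "dev" in dep_version:
        msg = f"Dependency specifier allows a prerelease: {dep_version}"
        raise ValueError(msg)
```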
**.github/scripts/get_min_versions.py** (vendored, 45 changes)

```diff
@@ -1,3 +1,5 @@
+"""Get minimum versions of dependencies from a pyproject.toml file."""
+
 import sys
 from collections import defaultdict
 from typing import Optional
@@ -5,7 +7,7 @@ from typing import Optional
 if sys.version_info >= (3, 11):
     import tomllib
 else:
-    # for python 3.10 and below, which doesnt have stdlib tomllib
+    # For Python 3.10 and below, which doesnt have stdlib tomllib
     import tomli as tomllib
 
 import re
@@ -34,14 +36,13 @@ SKIP_IF_PULL_REQUEST = [
 
 
 def get_pypi_versions(package_name: str) -> List[str]:
-    """
-    Fetch all available versions for a package from PyPI.
+    """Fetch all available versions for a package from PyPI.
 
     Args:
-        package_name (str): Name of the package
+        package_name: Name of the package
 
     Returns:
-        List[str]: List of all available versions
+        List of all available versions
 
     Raises:
         requests.exceptions.RequestException: If PyPI API request fails
```
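The function body is elided in this hunk. For reference, a minimal sketch of fetching versions via PyPI's JSON API, which is assumed to be roughly what the helper does given the `requests.exceptions.RequestException` it documents:

```python
import requests

def get_pypi_versions_sketch(package_name: str) -> list[str]:
    # PyPI's JSON API lists every published release under the "releases" key
    response = requests.get(f"https://pypi.org/pypi/{package_name}/json", timeout=30)
    response.raise_for_status()
    return list(response.json()["releases"].keys())
```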
```diff
@@ -54,24 +55,23 @@ def get_pypi_versions(package_name: str) -> List[str]:
 
 
 def get_minimum_version(package_name: str, spec_string: str) -> Optional[str]:
-    """
-    Find the minimum published version that satisfies the given constraints.
+    """Find the minimum published version that satisfies the given constraints.
 
     Args:
-        package_name (str): Name of the package
-        spec_string (str): Version specification string (e.g., ">=0.2.43,<0.4.0,!=0.3.0")
+        package_name: Name of the package
+        spec_string: Version specification string (e.g., ">=0.2.43,<0.4.0,!=0.3.0")
 
     Returns:
-        Optional[str]: Minimum compatible version or None if no compatible version found
+        Minimum compatible version or None if no compatible version found
     """
-    # rewrite occurrences of ^0.0.z to 0.0.z (can be anywhere in constraint string)
+    # Rewrite occurrences of ^0.0.z to 0.0.z (can be anywhere in constraint string)
     spec_string = re.sub(r"\^0\.0\.(\d+)", r"0.0.\1", spec_string)
-    # rewrite occurrences of ^0.y.z to >=0.y.z,<0.y+1 (can be anywhere in constraint string)
+    # Rewrite occurrences of ^0.y.z to >=0.y.z,<0.y+1 (can be anywhere in constraint string)
     for y in range(1, 10):
         spec_string = re.sub(
             rf"\^0\.{y}\.(\d+)", rf">=0.{y}.\1,<0.{y + 1}", spec_string
         )
-    # rewrite occurrences of ^x.y.z to >=x.y.z,<x+1.0.0 (can be anywhere in constraint string)
+    # Rewrite occurrences of ^x.y.z to >=x.y.z,<x+1.0.0 (can be anywhere in constraint string)
     for x in range(1, 10):
         spec_string = re.sub(
             rf"\^{x}\.(\d+)\.(\d+)", rf">={x}.\1.\2,<{x + 1}", spec_string
```
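Taken together, the three rewrites translate Poetry-style caret constraints into plain PEP 440 ranges. A small self-contained demonstration built from the substitutions shown above (the sample specifiers are made up):

```python
import re

def rewrite_carets(spec: str) -> str:
    # ^0.0.z pins the exact version
    spec = re.sub(r"\^0\.0\.(\d+)", r"0.0.\1", spec)
    # ^0.y.z -> >=0.y.z,<0.(y+1)
    for y in range(1, 10):
        spec = re.sub(rf"\^0\.{y}\.(\d+)", rf">=0.{y}.\1,<0.{y + 1}", spec)
    # ^x.y.z -> >=x.y.z,<(x+1)
    for x in range(1, 10):
        spec = re.sub(rf"\^{x}\.(\d+)\.(\d+)", rf">={x}.\1.\2,<{x + 1}", spec)
    return spec

assert rewrite_carets("^0.0.5") == "0.0.5"
assert rewrite_carets("^0.2.43,!=0.3.0") == ">=0.2.43,<0.3,!=0.3.0"
assert rewrite_carets("^1.2.3") == ">=1.2.3,<2"
```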
```diff
@@ -154,22 +154,25 @@ def get_min_version_from_toml(
 
 
 def check_python_version(version_string, constraint_string):
-    """
-    Check if the given Python version matches the given constraints.
+    """Check if the given Python version matches the given constraints.
 
-    :param version_string: A string representing the Python version (e.g. "3.8.5").
-    :param constraint_string: A string representing the package's Python version constraints (e.g. ">=3.6, <4.0").
-    :return: True if the version matches the constraints, False otherwise.
+    Args:
+        version_string: A string representing the Python version (e.g. "3.8.5").
+        constraint_string: A string representing the package's Python version
+            constraints (e.g. ">=3.6, <4.0").
+
+    Returns:
+        True if the version matches the constraints
     """
 
-    # rewrite occurrences of ^0.0.z to 0.0.z (can be anywhere in constraint string)
+    # Rewrite occurrences of ^0.0.z to 0.0.z (can be anywhere in constraint string)
     constraint_string = re.sub(r"\^0\.0\.(\d+)", r"0.0.\1", constraint_string)
-    # rewrite occurrences of ^0.y.z to >=0.y.z,<0.y+1.0 (can be anywhere in constraint string)
+    # Rewrite occurrences of ^0.y.z to >=0.y.z,<0.y+1.0 (can be anywhere in constraint string)
     for y in range(1, 10):
         constraint_string = re.sub(
             rf"\^0\.{y}\.(\d+)", rf">=0.{y}.\1,<0.{y + 1}.0", constraint_string
         )
-    # rewrite occurrences of ^x.y.z to >=x.y.z,<x+1.0.0 (can be anywhere in constraint string)
+    # Rewrite occurrences of ^x.y.z to >=x.y.z,<x+1.0.0 (can be anywhere in constraint string)
     for x in range(1, 10):
         constraint_string = re.sub(
             rf"\^{x}\.0\.(\d+)", rf">={x}.0.\1,<{x + 1}.0.0", constraint_string
```
**.github/scripts/prep_api_docs_build.py** (vendored, 20 changes)

```diff
@@ -1,5 +1,8 @@
 #!/usr/bin/env python
-"""Script to sync libraries from various repositories into the main langchain repository."""
+"""Sync libraries from various repositories into this monorepo.
+
+Moves cloned partner packages into libs/partners structure.
+"""
 
 import os
 import shutil
@@ -10,7 +13,7 @@ import yaml
 
 
 def load_packages_yaml() -> Dict[str, Any]:
-    """Load and parse the packages.yml file."""
+    """Load and parse packages.yml."""
     with open("langchain/libs/packages.yml", "r") as f:
         return yaml.safe_load(f)
 
@@ -61,12 +64,15 @@ def move_libraries(packages: list) -> None:
 
 
 def main():
-    """Main function to orchestrate the library sync process."""
+    """Orchestrate the library sync process."""
     try:
         # Load packages configuration
         package_yaml = load_packages_yaml()
 
-        # Clean target directories
+        # Clean/empty target directories in preparation for moving new ones
+        #
+        # Only for packages in the langchain-ai org or explicitly included via
+        # include_in_api_ref, excluding 'langchain' itself and 'langchain-ai21'
         clean_target_directories(
             [
                 p
@@ -80,7 +86,9 @@ def main():
             ]
         )
 
-        # Move libraries to their new locations
+        # Move cloned libraries to their new locations, only for packages in the
+        # langchain-ai org or explicitly included via include_in_api_ref,
+        # excluding 'langchain' itself and 'langchain-ai21'
         move_libraries(
             [
                 p
@@ -95,7 +103,7 @@ def main():
             ]
        )
 
-        # Delete ones without a pyproject.toml
+        # Delete partner packages without a pyproject.toml
         for partner in Path("langchain/libs/partners").iterdir():
             if partner.is_dir() and not (partner / "pyproject.toml").exists():
                 print(f"Removing {partner} as it does not have a pyproject.toml")
```
**.github/workflows/_compile_integration_test.yml**

```diff
@@ -1,3 +1,11 @@
+# Validates that a package's integration tests compile without syntax or import errors.
+#
+# (If an integration test fails to compile, it won't run.)
+#
+# Called as part of check_diffs.yml workflow
+#
+# Runs pytest with compile marker to check syntax/imports.
+
 name: '🔗 Compile Integration Tests'
 
 on:
```
**.github/workflows/_integration_test.yml** (vendored, 9 changes)

```diff
@@ -1,3 +1,10 @@
+# Runs `make integration_tests` on the specified package.
+#
+# Manually triggered via workflow_dispatch for testing with real APIs.
+#
+# Installs integration test dependencies and executes full test suite.
+
 name: '🚀 Integration Tests'
 run-name: 'Test ${{ inputs.working-directory }} on Python ${{ inputs.python-version }}'
 
@@ -83,7 +90,7 @@ jobs:
         run: |
           make integration_tests
 
-      - name: Ensure the tests did not create any additional files
+      - name: 'Ensure testing did not create/modify files'
        shell: bash
        run: |
          set -eu
```
**.github/workflows/_lint.yml** (vendored, 32 changes)

```diff
@@ -1,6 +1,11 @@
-name: '🧹 Code Linting'
-# Runs code quality checks using ruff, mypy, and other linting tools
-# Checks both package code and test code for consistency
+# Runs linting.
+#
+# Uses the package's Makefile to run the checks, specifically the
+# `lint_package` and `lint_tests` targets.
+#
+# Called as part of check_diffs.yml workflow.
+
+name: '🧹 Linting'
 
 on:
   workflow_call:
@@ -43,14 +48,6 @@ jobs:
         working-directory: ${{ inputs.working-directory }}
 
       - name: '📦 Install Lint & Typing Dependencies'
-        # Also installs dev/lint/test/typing dependencies, to ensure we have
-        # type hints for as many of our libraries as possible.
-        # This helps catch errors that require dependencies to be spotted, for example:
-        # https://github.com/langchain-ai/langchain/pull/10249/files#diff-935185cd488d015f026dcd9e19616ff62863e8cde8c0bee70318d3ccbca98341
-        #
-        # If you change this configuration, make sure to change the `cache-key`
-        # in the `poetry_setup` action above to stop using the old cache.
-        # It doesn't matter how you change it, any change will cause a cache-bust.
         working-directory: ${{ inputs.working-directory }}
         run: |
           uv sync --group lint --group typing
@@ -60,20 +57,13 @@ jobs:
         run: |
           make lint_package
 
-      - name: '📦 Install Unit Test Dependencies'
-        # Also installs dev/lint/test/typing dependencies, to ensure we have
-        # type hints for as many of our libraries as possible.
-        # This helps catch errors that require dependencies to be spotted, for example:
-        # https://github.com/langchain-ai/langchain/pull/10249/files#diff-935185cd488d015f026dcd9e19616ff62863e8cde8c0bee70318d3ccbca98341
-        #
-        # If you change this configuration, make sure to change the `cache-key`
-        # in the `poetry_setup` action above to stop using the old cache.
-        # It doesn't matter how you change it, any change will cause a cache-bust.
+      - name: '📦 Install Test Dependencies (non-partners)'
+        # (For directories NOT starting with libs/partners/)
         if: ${{ ! startsWith(inputs.working-directory, 'libs/partners/') }}
         working-directory: ${{ inputs.working-directory }}
         run: |
           uv sync --inexact --group test
-      - name: '📦 Install Unit + Integration Test Dependencies'
+      - name: '📦 Install Test Dependencies'
         if: ${{ startsWith(inputs.working-directory, 'libs/partners/') }}
         working-directory: ${{ inputs.working-directory }}
         run: |
```
**.github/workflows/_release.yml** (vendored, 34 changes)

```diff
@@ -1,3 +1,9 @@
+# Builds and publishes LangChain packages to PyPI.
+#
+# Manually triggered, though can be used as a reusable workflow (workflow_call).
+#
+# Handles version bumping, building, and publishing to PyPI with authentication.
+
 name: '🚀 Package Release'
 run-name: 'Release ${{ inputs.working-directory }} ${{ inputs.release-version }}'
 on:
@@ -52,8 +58,8 @@ jobs:
 
   # We want to keep this build stage *separate* from the release stage,
   # so that there's no sharing of permissions between them.
-  # The release stage has trusted publishing and GitHub repo contents write access,
-  # and we want to keep the scope of that access limited just to the release job.
+  # (Release stage has trusted publishing and GitHub repo contents write access,
+  #
   # Otherwise, a malicious `build` step (e.g. via a compromised dependency)
   # could get access to our GitHub or PyPI credentials.
   #
@@ -288,16 +294,19 @@ jobs:
         run: |
           VIRTUAL_ENV=.venv uv pip install dist/*.whl
 
-      - name: Run unit tests
-        run: make tests
-        working-directory: ${{ inputs.working-directory }}
-
       - name: Check for prerelease versions
+        # Block release if any dependencies allow prerelease versions
+        # (unless this is itself a prerelease version)
         working-directory: ${{ inputs.working-directory }}
         run: |
           uv run python $GITHUB_WORKSPACE/.github/scripts/check_prerelease_dependencies.py pyproject.toml
 
+      - name: Run unit tests
+        run: make tests
+        working-directory: ${{ inputs.working-directory }}
+
       - name: Get minimum versions
+        # Find the minimum published versions that satisfies the given constraints
         working-directory: ${{ inputs.working-directory }}
         id: min-version
         run: |
@@ -322,6 +331,7 @@ jobs:
         working-directory: ${{ inputs.working-directory }}
 
       - name: Run integration tests
+        # Uses the Makefile's `integration_tests` target for the specified package
         if: ${{ startsWith(inputs.working-directory, 'libs/partners/') }}
         env:
           AI21_API_KEY: ${{ secrets.AI21_API_KEY }}
@@ -362,7 +372,11 @@ jobs:
         working-directory: ${{ inputs.working-directory }}
 
+  # Test select published packages against new core
+  # Done when code changes are made to langchain-core
   test-prior-published-packages-against-new-core:
+    # Installs the new core with old partners: Installs the new unreleased core
+    # alongside the previously published partner packages and runs integration tests
     if: github.ref != 'refs/heads/v0.3'
     needs:
       - build
       - release-notes
@@ -390,6 +404,7 @@ jobs:
 
       # We implement this conditional as Github Actions does not have good support
       # for conditionally needing steps. https://github.com/actions/runner/issues/491
+      # TODO: this seems to be resolved upstream, so we can probably remove this workaround
      - name: Check if libs/core
        run: |
          if [ "${{ startsWith(inputs.working-directory, 'libs/core') }}" != "true" ]; then
@@ -417,7 +432,7 @@ jobs:
           git ls-remote --tags origin "langchain-${{ matrix.partner }}*" \
             | awk '{print $2}' \
             | sed 's|refs/tags/||' \
-            | grep -E '[0-9]+\.[0-9]+\.[0-9]+$' \
+            | grep -E '==0\.3\.[0-9]+$' \
             | sort -Vr \
             | head -n 1
         )"
@@ -444,12 +459,12 @@ jobs:
           make integration_tests
 
   publish:
+    # Publishes the package to PyPI
     needs:
       - build
       - release-notes
       - test-pypi-publish
       - pre-release-checks
       - test-prior-published-packages-against-new-core
     runs-on: ubuntu-latest
     permissions:
       # This permission is used for trusted publishing:
@@ -486,6 +501,7 @@ jobs:
         attestations: false
 
   mark-release:
+    # Marks the GitHub release with the new version tag
     needs:
       - build
       - release-notes
@@ -495,7 +511,7 @@ jobs:
     runs-on: ubuntu-latest
     permissions:
       # This permission is needed by `ncipollo/release-action` to
-      # create the GitHub release.
+      # create the GitHub release/tag
       contents: write
 
     defaults:
```
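The build/publish separation described in the comments above is the standard way to keep publishing credentials out of the job that executes arbitrary build code. A minimal sketch of the pattern, with illustrative job names and steps rather than the actual workflow:

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    permissions:
      contents: read    # no publish credentials are in scope for build code
    steps:
      - run: echo "build the wheel and upload it as a workflow artifact"

  publish:
    needs: build
    runs-on: ubuntu-latest
    permissions:
      id-token: write   # OIDC token for PyPI trusted publishing, only in this job
    steps:
      - run: echo "download the artifact and publish it to PyPI"
```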
**.github/workflows/_test.yml** (vendored, 5 changes)

```diff
@@ -1,6 +1,7 @@
-name: '🧪 Unit Testing'
 # Runs unit tests with both current and minimum supported dependency versions
-# to ensure compatibility across the supported range
+# to ensure compatibility across the supported range.
+
+name: '🧪 Unit Testing'
 
 on:
   workflow_call:
```
**.github/workflows/_test_doc_imports.yml** (vendored, 7 changes)

```diff
@@ -1,3 +1,10 @@
+# Validates that all import statements in `.ipynb` notebooks are correct and functional.
+#
+# Called as part of check_diffs.yml.
+#
+# Installs test dependencies and LangChain packages in editable mode and
+# runs check_imports.py.
+
 name: '📑 Documentation Import Testing'
 
 on:
```
**.github/workflows/_test_pydantic.yml** (vendored, 2 changes)

```diff
@@ -1,3 +1,5 @@
+# Facilitate unit testing against different Pydantic versions for a provided package.
+
 name: '🐍 Pydantic Version Testing'
 
 on:
```
**.github/workflows/api_doc_build.yml** (vendored, 46 changes)

```diff
@@ -1,11 +1,19 @@
+# Build the API reference documentation.
+#
+# Runs daily. Can also be triggered manually for immediate updates.
+#
+# Built HTML pushed to langchain-ai/langchain-api-docs-html.
+#
+# Looks for langchain-ai org repos in packages.yml and checks them out.
+# Calls prep_api_docs_build.py.
+
 name: '📚 API Docs'
 run-name: 'Build & Deploy API Reference'
-# Runs daily or can be triggered manually for immediate updates
 
 on:
   workflow_dispatch:
   schedule:
-    - cron: '0 13 * * *' # Daily at 1PM UTC
+    - cron: '0 13 * * *' # Runs daily at 1PM UTC (9AM EDT/6AM PDT)
 
 env:
   PYTHON_VERSION: "3.11"
@@ -31,6 +39,8 @@ jobs:
         uses: mikefarah/yq@master
         with:
           cmd: |
+            # Extract repos from packages.yml that are in the langchain-ai org
+            # (excluding 'langchain' itself)
             yq '
               .packages[]
               | select(
@@ -77,24 +87,31 @@ jobs:
         with:
           python-version: ${{ env.PYTHON_VERSION }}
 
-      - name: '📦 Install Initial Python Dependencies'
+      - name: '📦 Install Initial Python Dependencies using uv'
         working-directory: langchain
         run: |
           python -m pip install -U uv
           python -m uv pip install --upgrade --no-cache-dir pip setuptools pyyaml
 
       - name: '📦 Organize Library Directories'
+        # Places cloned partner packages into libs/partners structure
         run: python langchain/.github/scripts/prep_api_docs_build.py
 
-      - name: '🧹 Remove Old HTML Files'
+      - name: '🧹 Clear Prior Build'
         run:
+          # Remove artifacts from prior docs build
           rm -rf langchain-api-docs-html/api_reference_build/html
 
-      - name: '📦 Install Documentation Dependencies'
+      - name: '📦 Install Documentation Dependencies using uv'
         working-directory: langchain
         run: |
-          python -m uv pip install $(ls ./libs/partners | xargs -I {} echo "./libs/partners/{}") --overrides ./docs/vercel_overrides.txt
+          # Install all partner packages in editable mode with overrides
+          python -m uv pip install $(ls ./libs/partners | xargs -I {} echo "./libs/partners/{}") --overrides ./docs/vercel_overrides.txt --prerelease=allow
+
+          # Install core langchain and other main packages
           python -m uv pip install libs/core libs/langchain libs/text-splitters libs/community libs/experimental libs/standard-tests
+
+          # Install Sphinx and related packages for building docs
           python -m uv pip install -r docs/api_reference/requirements.txt
 
       - name: '🔧 Configure Git Settings'
@@ -106,14 +123,29 @@ jobs:
       - name: '📚 Build API Documentation'
         working-directory: langchain
         run: |
+          # Generate the API reference RST files
           python docs/api_reference/create_api_rst.py
 
+          # Build the HTML documentation using Sphinx
+          # -T: show full traceback on exception
+          # -E: don't use cached environment (force rebuild, ignore cached doctrees)
+          # -b html: build HTML docs (vs PDS, etc.)
+          # -d: path for the cached environment (parsed document trees / doctrees)
+          #   - Separate from output dir for faster incremental builds
+          # -c: path to conf.py
+          # -j auto: parallel build using all available CPU cores
           python -m sphinx -T -E -b html -d ../langchain-api-docs-html/_build/doctrees -c docs/api_reference docs/api_reference ../langchain-api-docs-html/api_reference_build/html -j auto
 
+          # Post-process the generated HTML
           python docs/api_reference/scripts/custom_formatter.py ../langchain-api-docs-html/api_reference_build/html
 
+          # Default index page is blank so we copy in the actual home page.
           cp ../langchain-api-docs-html/api_reference_build/html/{reference,index}.html
 
+          # Removes Sphinx's intermediate build artifacts after the build is complete.
+          rm -rf ../langchain-api-docs-html/_build/
 
       # https://github.com/marketplace/actions/add-commit
+      # Commit and push changes to langchain-api-docs-html repo
       - uses: EndBug/add-and-commit@v9
         with:
           cwd: langchain-api-docs-html
```
**.github/workflows/check-broken-links.yml** (vendored, 6 changes)

```diff
@@ -1,9 +1,11 @@
+# Runs broken link checker in /docs on a daily schedule.
+
 name: '🔗 Check Broken Links'
 
 on:
   workflow_dispatch:
   schedule:
-    - cron: '0 13 * * *'
+    - cron: '0 13 * * *' # Runs daily at 1PM UTC (9AM EDT/6AM PDT)
 
 permissions:
   contents: read
@@ -15,7 +17,7 @@ jobs:
     steps:
       - uses: actions/checkout@v5
       - name: '🟢 Setup Node.js 18.x'
-        uses: actions/setup-node@v4
+        uses: actions/setup-node@v5
         with:
           node-version: 18.x
           cache: "yarn"
```
**.github/workflows/check_core_versions.yml** (vendored, 8 changes)

```diff
@@ -1,6 +1,8 @@
-name: '🔍 Check `core` Version Equality'
-# Ensures version numbers in pyproject.toml and version.py stay in sync
-# Prevents releases with mismatched version numbers
+# Ensures version numbers in pyproject.toml and version.py stay in sync.
+#
+# (Prevents releases with mismatched version numbers)
+
+name: '🔍 Check Version Equality'
 
 on:
   pull_request:
```
**.github/workflows/check_diffs.yml** (vendored, 19 changes)

```diff
@@ -1,3 +1,18 @@
+# Primary CI workflow.
+#
+# Only runs against packages that have changed files.
+#
+# Runs:
+#   - Linting (_lint.yml)
+#   - Unit Tests (_test.yml)
+#   - Pydantic compatibility tests (_test_pydantic.yml)
+#   - Documentation import tests (_test_doc_imports.yml)
+#   - Integration test compilation checks (_compile_integration_test.yml)
+#   - Extended test suites that require additional dependencies
+#   - Codspeed benchmarks (if not labeled 'codspeed-ignore')
+#
+# Reports status to GitHub checks and PR status.
+
 name: '🔧 CI'
 
 on:
@@ -11,8 +26,8 @@ on:
 # cancel the earlier run in favor of the next run.
 #
 # There's no point in testing an outdated version of the code. GitHub only allows
-# a limited number of job runners to be active at the same time, so it's better to cancel
-# pointless jobs early so that more useful jobs can run sooner.
+# a limited number of job runners to be active at the same time, so it's better to
+# cancel pointless jobs early so that more useful jobs can run sooner.
 concurrency:
   group: ${{ github.workflow }}-${{ github.ref }}
   cancel-in-progress: true
```
**.github/workflows/check_new_docs.yml** (vendored, 3 changes)

```diff
@@ -1,3 +1,6 @@
+# For integrations, we run check_templates.py to ensure that new docs use the correct
+# templates based on their type. See the script for more details.
+
 name: '📑 Integration Docs Lint'
 
 on:
```
**.github/workflows/people.yml** (vendored, 6 changes)

```diff
@@ -1,9 +1,11 @@
+# Updates the LangChain People data by fetching the latest info from the LangChain Git.
+# TODO: broken/not used
+
 name: '👥 LangChain People'
 run-name: 'Update People Data'
-# This workflow updates the LangChain People data by fetching the latest information from the LangChain Git
 on:
   schedule:
-    - cron: "0 14 1 * *"
+    - cron: "0 14 1 * *" # Runs at 14:00 UTC on the 1st of every month (10AM EDT/7AM PDT)
   push:
     branches: [jacob/people]
   workflow_dispatch:
```
```diff
@@ -1,8 +1,14 @@
+# Label PRs based on changed files.
+#
+# See `.github/pr-file-labeler.yml` to see rules for each label/directory.
+
 name: "🏷️ Pull Request Labeler"
 
 on:
-  # Safe since we're not checking out or running the PR's code
+  # Never check out the PR's head in a pull_request_target job
   pull_request_target:
-    types: [opened, synchronize, reopened]
+    types: [opened, synchronize, reopened, edited]
 
 jobs:
   labeler:
@@ -18,5 +24,5 @@ jobs:
         uses: actions/labeler@v6
         with:
           repo-token: "${{ secrets.GITHUB_TOKEN }}"
-          configuration-path: .github/labeler.yml
+          configuration-path: .github/pr-file-labeler.yml
           sync-labels: false
```
```diff
@@ -1,7 +1,13 @@
+# Label PRs based on their titles.
+#
+# See `.github/pr-title-labeler.yml` to see rules for each label/title pattern.
+
 name: "🏷️ PR Title Labeler"
 
 on:
-  pull_request:
-  # Safe since we're not checking out or running the PR's code
+  # Never check out the PR's head in a pull_request_target job
+  pull_request_target:
     types: [opened, synchronize, reopened, edited]
 
 jobs:
@@ -15,7 +21,8 @@ jobs:
 
     steps:
       - name: Label PR based on title
-        uses: grafana/pr-labeler-action@v0.1.0
+        # Archived repo; latest commit (v0.1.0)
+        uses: grafana/pr-labeler-action@f19222d3ef883d2ca5f04420fdfe8148003763f0
         with:
           token: ${{ secrets.GITHUB_TOKEN }}
           configuration-path: .github/pr-title-labeler.yml
```
**.github/workflows/pr_lint.yml** (vendored, 60 changes)

```diff
@@ -1,29 +1,30 @@
 # -----------------------------------------------------------------------------
-# PR Title Lint Workflow
+# PR title linting.
 #
-# Purpose:
-# Enforces Conventional Commits format for pull request titles to maintain a
-# clear, consistent, and machine-readable change history across our repository.
-# This helps with automated changelog generation and semantic versioning.
+# FORMAT (Conventional Commits 1.0.0):
 #
-# Enforced Commit Message Format (Conventional Commits 1.0.0):
 #   <type>[optional scope]: <description>
 #   [optional body]
 #   [optional footer(s)]
 #
 # Examples:
 #   feat(core): add multi‐tenant support
 #   fix(cli): resolve flag parsing error
 #   docs: update API usage examples
 #   docs(openai): update API usage examples
 #
 # Allowed Types:
-#   • feat — a new feature (MINOR bump)
-#   • fix — a bug fix (PATCH bump)
-#   • docs — documentation only changes
-#   • style — formatting, missing semi-colons, etc.; no code change
-#   • refactor — code change that neither fixes a bug nor adds a feature
-#   • perf — code change that improves performance
-#   • test — adding missing tests or correcting existing tests
-#   • build — changes that affect the build system or external dependencies
-#   • ci — continuous integration/configuration changes
-#   • chore — other changes that don't modify src or test files
-#   • revert — reverts a previous commit
-#   • release — prepare a new release
+#   * feat — a new feature (MINOR)
+#   * fix — a bug fix (PATCH)
+#   * docs — documentation only changes (either in /docs or code comments)
+#   * style — formatting, linting, etc.; no code change or typing refactors
+#   * refactor — code change that neither fixes a bug nor adds a feature
+#   * perf — code change that improves performance
+#   * test — adding tests or correcting existing
+#   * build — changes that affect the build system/external dependencies
+#   * ci — continuous integration/configuration changes
+#   * chore — other changes that don't modify source or test files
+#   * revert — reverts a previous commit
+#   * release — prepare a new release
 #
 # Allowed Scopes (optional):
 #   core, cli, langchain, langchain_v1, langchain_legacy, standard-tests,
@@ -31,21 +32,12 @@
 #   huggingface, mistralai, nomic, ollama, openai, perplexity, prompty, qdrant,
 #   xai, infra
 #
-# Rules & Tips for New Committers:
-#   1. Subject (type) must start with a lowercase letter and, if possible, be
-#      followed by a scope wrapped in parenthesis `(scope)`
-#   2. Breaking changes:
-#      – Append "!" after type/scope (e.g., feat!: drop Node 12 support)
-#      – Or include a footer "BREAKING CHANGE: <details>"
-#   3. Example PR titles:
-#      feat(core): add multi‐tenant support
-#      fix(cli): resolve flag parsing error
-#      docs: update API usage examples
-#      docs(openai): update API usage examples
+# Rules:
+#   1. The 'Type' must start with a lowercase letter.
+#   2. Breaking changes: append "!" after type/scope (e.g., feat!: drop x support)
 #
 # Resources:
 #   • Conventional Commits spec: https://www.conventionalcommits.org/en/v1.0.0/
-# -----------------------------------------------------------------------------
+#
+# Enforces Conventional Commits format for pull request titles to maintain a clear and
+# machine-readable change history.
 
 name: '🏷️ PR Title Lint'

@@ -57,7 +49,7 @@ on:
   types: [opened, edited, synchronize]
 
 jobs:
-  # Validates that PR title follows Conventional Commits specification
+  # Validates that PR title follows Conventional Commits 1.0.0 specification
   lint-pr-title:
     name: 'validate format'
     runs-on: ubuntu-latest
```
**.github/workflows/run_notebooks.yml** (vendored, 2 changes)

```diff
@@ -1,3 +1,5 @@
+# Integration tests for documentation notebooks.
+
 name: '📓 Validate Documentation Notebooks'
 run-name: 'Test notebooks in ${{ inputs.working-directory }}'
 on:
```
**.github/workflows/scheduled_test.yml** (vendored, 14 changes)

```diff
@@ -1,8 +1,14 @@
+# Routine integration tests against partner libraries with live API credentials.
+#
+# Uses `make integration_tests` for each library in the matrix.
+#
+# Runs daily. Can also be triggered manually for immediate updates.
+
 name: '⏰ Scheduled Integration Tests'
 run-name: "Run Integration Tests - ${{ inputs.working-directory-force || 'all libs' }} (Python ${{ inputs.python-version-force || '3.9, 3.11' }})"
 
 on:
-  workflow_dispatch: # Allows maintainers to trigger the workflow manually in GitHub UI
+  workflow_dispatch:
     inputs:
       working-directory-force:
         type: string
@@ -54,13 +60,13 @@ jobs:
           echo $matrix
           echo "matrix=$matrix" >> $GITHUB_OUTPUT
 
-  # Run integration tests against partner libraries with live API credentials
-  # Tests are run with both Poetry and UV depending on the library's setup
+  # Tests are run with Poetry or UV depending on the library's setup
   build:
     if: github.repository_owner == 'langchain-ai' || github.event_name != 'schedule'
     name: '🐍 Python ${{ matrix.python-version }}: ${{ matrix.working-directory }}'
     runs-on: ubuntu-latest
     needs: [compute-matrix]
-    timeout-minutes: 20
+    timeout-minutes: 30
     strategy:
       fail-fast: false
       matrix:
@@ -161,7 +167,7 @@ jobs:
           make integration_tests
 
       - name: '🧹 Clean up External Libraries'
-        # Clean up external libraries to avoid affecting git status check
+        # Clean up external libraries to avoid affecting the following git status check
         run: |
           rm -rf \
             langchain/libs/partners/google-genai \
```
**.github/workflows/v1_changes.md** (vendored, new file, 9 lines)

```diff
@@ -0,0 +1,9 @@
+With the deprecation of v0 docs, the following files will need to be migrated/supported
+in the new docs repo:
+
+- run_notebooks.yml: New repo should run Integration tests on code snippets?
+- people.yml: Need to fix and somehow display on the new docs site
+  - Subsequently, `.github/actions/people/`
+- _test_doc_imports.yml
+- check_new_docs.yml
+- check-broken-links.yml
```
```diff
@@ -1,25 +0,0 @@
-# Read the Docs configuration file
-# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details
-version: 2
-
-# Set the version of Python and other tools you might need
-build:
-  os: ubuntu-22.04
-  tools:
-    python: "3.11"
-  commands:
-    - mkdir -p $READTHEDOCS_OUTPUT
-    - cp -r api_reference_build/* $READTHEDOCS_OUTPUT
-
-# Build documentation in the docs/ directory with Sphinx
-sphinx:
-  configuration: docs/api_reference/conf.py
-
-# If using Sphinx, optionally build your docs in additional formats such as PDF
-formats:
-  - pdf
-
-# Optionally declare the Python requirements required to build your docs
-python:
-  install:
-    - requirements: docs/api_reference/requirements.txt
```
**.vscode/settings.json** (vendored, 7 changes)

```diff
@@ -78,5 +78,10 @@
     "editor.insertSpaces": true
   },
   "python.terminal.activateEnvironment": false,
-  "python.defaultInterpreterPath": "./.venv/bin/python"
+  "python.defaultInterpreterPath": "./.venv/bin/python",
+  "github.copilot.chat.commitMessageGeneration.instructions": [
+    {
+      "file": ".github/workflows/pr_lint.yml"
+    }
+  ]
 }
```
**AGENTS.md** (new file, 325 lines)

# Global Development Guidelines for LangChain Projects

## Core Development Principles

### 1. Maintain Stable Public Interfaces ⚠️ CRITICAL

**Always attempt to preserve function signatures, argument positions, and names for exported/public methods.**

❌ **Bad - Breaking Change:**

```python
def get_user(id, verbose=False):  # Changed from `user_id`
    pass
```

✅ **Good - Stable Interface:**

```python
def get_user(user_id: str, verbose: bool = False) -> User:
    """Retrieve user by ID with optional verbose output."""
    pass
```

**Before making ANY changes to public APIs:**

- Check if the function/class is exported in `__init__.py`
- Look for existing usage patterns in tests and examples
- Use keyword-only arguments for new parameters: `*, new_param: str = "default"` (see the sketch below)
- Mark experimental features clearly with docstring warnings (using reStructuredText, like `.. warning::`)

🧠 *Ask yourself:* "Would this change break someone's code if they used it last week?"
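For example, a new option can be threaded into the `get_user` signature above without breaking positional callers by making it keyword-only; `include_deleted` here is a hypothetical parameter used only for illustration:

```python
def get_user(user_id: str, verbose: bool = False, *, include_deleted: bool = False) -> User:
    """Retrieve user by ID; `include_deleted` is a new keyword-only option."""
    ...
```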
### 2. Code Quality Standards

**All Python code MUST include type hints and return types.**

❌ **Bad:**

```python
def p(u, d):
    return [x for x in u if x not in d]
```

✅ **Good:**

```python
def filter_unknown_users(users: list[str], known_users: set[str]) -> list[str]:
    """Filter out users that are not in the known users set.

    Args:
        users: List of user identifiers to filter.
        known_users: Set of known/valid user identifiers.

    Returns:
        List of users that are not in the known_users set.
    """
    return [user for user in users if user not in known_users]
```

**Style Requirements:**

- Use descriptive, **self-explanatory variable names**. Avoid overly short or cryptic identifiers.
- Attempt to break up complex functions (>20 lines) into smaller, focused functions where it makes sense
- Avoid unnecessary abstraction or premature optimization
- Follow existing patterns in the codebase you're modifying

### 3. Testing Requirements

**Every new feature or bugfix MUST be covered by unit tests.**

**Test Organization:**

- Unit tests: `tests/unit_tests/` (no network calls allowed)
- Integration tests: `tests/integration_tests/` (network calls permitted)
- Use `pytest` as the testing framework

**Test Quality Checklist:**

- [ ] Tests fail when your new logic is broken
- [ ] Happy path is covered
- [ ] Edge cases and error conditions are tested
- [ ] Use fixtures/mocks for external dependencies
- [ ] Tests are deterministic (no flaky tests)

Checklist questions:

- [ ] Does the test suite fail if your new logic is broken?
- [ ] Are all expected behaviors exercised (happy path, invalid input, etc)?
- [ ] Do tests use fixtures or mocks where needed?

```python
def test_filter_unknown_users():
    """Test filtering unknown users from a list."""
    users = ["alice", "bob", "charlie"]
    known_users = {"alice", "bob"}

    result = filter_unknown_users(users, known_users)

    assert result == ["charlie"]
    assert len(result) == 1
```

### 4. Security and Risk Assessment

**Security Checklist:**
- No `eval()`, `exec()`, or `pickle` on user-controlled input
- Proper exception handling (no bare `except:`); use a `msg` variable for error messages
- Remove unreachable/commented code before committing
- Watch for race conditions and resource leaks (file handles, sockets, threads)
- Ensure proper resource cleanup (file handles, connections)
❌ **Bad:**

```python
def load_config(path):
    with open(path) as f:
        return eval(f.read())  # ⚠️ Never eval config
```

✅ **Good:**

```python
import json

def load_config(path: str) -> dict:
    with open(path) as f:
        return json.load(f)
```

### 5. Documentation Standards

**Use Google-style docstrings with Args section for all public functions.**

❌ **Insufficient Documentation:**

```python
def send_email(to, msg):
    """Send an email to a recipient."""
```

✅ **Complete Documentation:**

```python
def send_email(to: str, msg: str, *, priority: str = "normal") -> bool:
    """
    Send an email to a recipient with specified priority.

    Args:
        to: The email address of the recipient.
        msg: The message body to send.
        priority: Email priority level (``'low'``, ``'normal'``, ``'high'``).

    Returns:
        True if email was sent successfully, False otherwise.

    Raises:
        InvalidEmailError: If the email address format is invalid.
        SMTPConnectionError: If unable to connect to email server.
    """
```

**Documentation Guidelines:**

- Types go in function signatures, NOT in docstrings
- Focus on "why" rather than "what" in descriptions
- Document all parameters, return values, and exceptions
- Keep descriptions concise but clear
- Use reStructuredText for docstrings to enable rich formatting

📌 *Tip:* Keep descriptions concise but clear. Only document return values if non-obvious.

### 6. Architectural Improvements

**When you encounter code that could be improved, suggest better designs:**

❌ **Poor Design:**

```python
def process_data(data, db_conn, email_client, logger):
    # Function doing too many things
    validated = validate_data(data)
    result = db_conn.save(validated)
    email_client.send_notification(result)
    logger.log(f"Processed {len(data)} items")
    return result
```

✅ **Better Design:**
```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ProcessingResult:
    """Result of data processing operation."""
    items_processed: int
    success: bool
    errors: List[str] = field(default_factory=list)

class DataProcessor:
    """Handles data validation, storage, and notification."""

    def __init__(self, db_conn: Database, email_client: EmailClient):
        self.db = db_conn
        self.email = email_client

    def process(self, data: List[dict]) -> ProcessingResult:
        """Process and store data with notifications."""
        validated = self._validate_data(data)
        result = self.db.save(validated)
        self._notify_completion(result)
        return result
```
**Design Improvement Areas:**

If there's a **cleaner**, **more scalable**, or **simpler** design, highlight it and suggest improvements that would:

- Reduce code duplication through shared utilities
- Improve separation of concerns (single responsibility)
- Make unit testing easier through dependency injection
- Add clarity without adding complexity
- Prefer dataclasses for structured data
## Development Tools & Commands

### Package Management

```bash
# Add package
uv add package-name

# Sync project dependencies
uv sync
uv lock
```

### Testing

```bash
# Run unit tests (no network)
make test

# Don't run integration tests, as API keys must be set

# Run specific test file
uv run --group test pytest tests/unit_tests/test_specific.py
```

### Code Quality

```bash
# Lint code
make lint

# Format code
make format

# Type checking
uv run --group lint mypy .
```

### Dependency Management Patterns

**Local Development Dependencies:**

```toml
[tool.uv.sources]
langchain-core = { path = "../core", editable = true }
langchain-tests = { path = "../standard-tests", editable = true }
```

**For tools, use the `@tool` decorator from `langchain_core.tools`:**

```python
from langchain_core.tools import tool

@tool
def search_database(query: str) -> str:
    """Search the database for relevant information.

    Args:
        query: The search query string.
    """
    # Implementation here
    return results
```

## Commit Standards

**Use Conventional Commits format for PR titles:**

- `feat(core): add multi-tenant support`
- `fix(cli): resolve flag parsing error`
- `docs: update API usage examples`
- `docs(openai): update API usage examples`

## Framework-Specific Guidelines

- Follow the existing patterns in `langchain-core` for base abstractions
- Use `langchain_core.callbacks` for execution tracking
- Implement proper streaming support where applicable
- Avoid deprecated components like legacy `LLMChain`

### Partner Integrations

- Follow the established patterns in existing partner libraries
- Implement standard interfaces (`BaseChatModel`, `BaseEmbeddings`, etc.)
- Include comprehensive integration tests
- Document API key requirements and authentication

---

## Quick Reference Checklist

Before submitting code changes:

- [ ] **Breaking Changes**: Verified no public API changes
- [ ] **Type Hints**: All functions have complete type annotations
- [ ] **Tests**: New functionality is fully tested
- [ ] **Security**: No dangerous patterns (eval, silent failures, etc.)
- [ ] **Documentation**: Google-style docstrings for public functions
- [ ] **Code Quality**: `make lint` and `make format` pass
- [ ] **Architecture**: Suggested improvements where applicable
- [ ] **Commit Message**: Follows Conventional Commits format
108
README.md
108
README.md
@@ -1,83 +1,75 @@
<p align="center">
  <picture>
    <source media="(prefers-color-scheme: light)" srcset="docs/static/img/logo-dark.svg">
    <source media="(prefers-color-scheme: dark)" srcset="docs/static/img/logo-light.svg">
    <img alt="LangChain Logo" src="docs/static/img/logo-dark.svg" width="80%">
  </picture>
</p>

<div>
  <br>
</div>
<p align="center">
  The platform for reliable agents.
</p>

<p align="center">
  <a href="https://opensource.org/licenses/MIT" target="_blank">
    <img src="https://img.shields.io/pypi/l/langchain-core?style=flat-square" alt="PyPI - License">
  </a>
  <a href="https://pypistats.org/packages/langchain-core" target="_blank">
    <img src="https://img.shields.io/pepy/dt/langchain" alt="PyPI - Downloads">
  </a>
  <a href="https://vscode.dev/redirect?url=vscode://ms-vscode-remote.remote-containers/cloneInVolume?url=https://github.com/langchain-ai/langchain" target="_blank">
    <img src="https://img.shields.io/static/v1?label=Dev%20Containers&message=Open&color=blue&logo=visualstudiocode&style=flat-square" alt="Open in Dev Containers">
  </a>
  <a href="https://codespaces.new/langchain-ai/langchain" target="_blank">
    <img src="https://github.com/codespaces/badge.svg" alt="Open in Github Codespace" title="Open in Github Codespace" width="150" height="20">
  </a>
  <a href="https://codspeed.io/langchain-ai/langchain" target="_blank">
    <img src="https://img.shields.io/endpoint?url=https://codspeed.io/badge.json" alt="CodSpeed Badge">
  </a>
  <a href="https://twitter.com/langchainai" target="_blank">
    <img src="https://img.shields.io/twitter/url/https/twitter.com/langchainai.svg?style=social&label=Follow%20%40LangChainAI" alt="Twitter / X">
  </a>
</p>

LangChain is a framework for building LLM-powered applications. It helps you chain together interoperable components and third-party integrations to simplify AI application development — all while future-proofing decisions as the underlying technology evolves.

```bash
pip install -U langchain
```

---

**Documentation**: To learn more about LangChain, check out [the docs](https://python.langchain.com/docs/introduction/).

If you're looking for more advanced customization or agent orchestration, check out [LangGraph](https://langchain-ai.github.io/langgraph/), our framework for building controllable agent workflows.

> [!NOTE]
> Looking for the JS/TS library? Check out [LangChain.js](https://github.com/langchain-ai/langchainjs).

## Why use LangChain?

LangChain helps developers build applications powered by LLMs through a standard interface for models, embeddings, vector stores, and more.

Use LangChain for:

- **Real-time data augmentation**. Easily connect LLMs to diverse data sources and external/internal systems, drawing from LangChain’s vast library of integrations with model providers, tools, vector stores, retrievers, and more.
- **Model interoperability**. Swap models in and out as your engineering team experiments to find the best choice for your application’s needs. As the industry frontier evolves, adapt quickly — LangChain’s abstractions keep you moving without losing momentum.

## LangChain’s ecosystem

While the LangChain framework can be used standalone, it also integrates seamlessly with any LangChain product, giving developers a full suite of tools when building LLM applications.

To improve your LLM application development, pair LangChain with:

- [LangSmith](https://www.langchain.com/langsmith) - Helpful for agent evals and observability. Debug poor-performing LLM app runs, evaluate agent trajectories, gain visibility in production, and improve performance over time.
- [LangGraph](https://langchain-ai.github.io/langgraph/) - Build agents that can reliably handle complex tasks with LangGraph, our low-level agent orchestration framework. LangGraph offers customizable architecture, long-term memory, and human-in-the-loop workflows — and is trusted in production by companies like LinkedIn, Uber, Klarna, and GitLab.
- [LangGraph Platform](https://docs.langchain.com/langgraph-platform) - Deploy and scale agents effortlessly with a purpose-built deployment platform for long-running, stateful workflows. Discover, reuse, configure, and share agents across teams — and iterate quickly with visual prototyping in [LangGraph Studio](https://langchain-ai.github.io/langgraph/concepts/langgraph_studio/).

## Additional resources

- [Tutorials](https://python.langchain.com/docs/tutorials/): Simple walkthroughs with guided examples on getting started with LangChain.
- [How-to Guides](https://python.langchain.com/docs/how_to/): Quick, actionable code snippets for topics such as tool calling, RAG use cases, and more.
- [Conceptual Guides](https://python.langchain.com/docs/concepts/): Explanations of key concepts behind the LangChain framework.
- [LangChain Forum](https://forum.langchain.com/): Connect with the community and share all of your technical questions, ideas, and feedback.
- [API Reference](https://python.langchain.com/api_reference/): Detailed reference on navigating base packages and integrations for LangChain.
- [Chat LangChain](https://chat.langchain.com/): Ask questions & chat with our documentation.

@@ -35,7 +35,7 @@ open source projects at [huntr](https://huntr.com/bounties/disclose/?target=http

Before reporting a vulnerability, please review:

1) In-Scope Targets and Out-of-Scope Targets below.
2) The [langchain-ai/langchain](https://docs.langchain.com/oss/python/contributing/code#supporting-packages) monorepo structure.
3) The [Best Practices](#best-practices) above to understand what we consider to be a security vulnerability vs. developer responsibility.

### In-Scope Targets

docs/README.md
@@ -1,3 +1,154 @@
# LangChain Documentation

For more information on contributing to our documentation, see the [Documentation Contributing Guide](https://python.langchain.com/docs/contributing/how_to/documentation).

## Structure

The primary documentation is located in the `docs/` directory. This directory contains both the source files for the main documentation and the API reference doc build process.

### API Reference

API reference documentation is located in `docs/api_reference/` and is generated from the codebase using Sphinx.

The API reference has additional build steps that differ from the main documentation.

#### Deployment Process

Currently, the build process roughly follows these steps:

1. Using the `api_doc_build.yml` GitHub workflow, the API reference docs are [built](#build-technical-details) and copied to the `langchain-api-docs-html` repository. This workflow is triggered either (1) on a routine cron interval or (2) manually (see the trigger sketch after this list).

   In short, the workflow extracts all `langchain-ai`-org-owned repos defined in `langchain/libs/packages.yml`, clones them locally (in the workflow runner's file system), and then builds the API reference RST files (using `create_api_rst.py`). Following post-processing, the HTML files are pushed to the `langchain-api-docs-html` repository.
2. After the HTML files are in the `langchain-api-docs-html` repository, they are **not** automatically published to the [live docs site](https://python.langchain.com/api_reference/).

   The docs site is served by Vercel. The Vercel deployment process copies the HTML files from the `langchain-api-docs-html` repository and deploys them to the live site. Deployments are triggered on each new commit pushed to `v0.3`.
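
For reference, the cron-or-manual trigger in step 1 maps onto standard GitHub Actions syntax. A minimal sketch (the schedule below is illustrative, not the actual value in `api_doc_build.yml`):

```yaml
# Sketch of the trigger block for a workflow like api_doc_build.yml.
on:
  schedule:
    - cron: "0 13 * * *"  # hypothetical routine interval (daily)
  workflow_dispatch: {}   # enables manual runs from the Actions tab
```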
#### Build Technical Details

The build process creates a virtual monorepo by syncing multiple repositories, then generates comprehensive API documentation:

1. **Repository Sync Phase:**
   - `.github/scripts/prep_api_docs_build.py` - Clones external partner repos and organizes them into the `libs/partners/` structure to create a virtual monorepo for documentation building

2. **RST Generation Phase:**
   - `docs/api_reference/create_api_rst.py` - Main script that **generates RST files** from Python source code
   - Scans `libs/` directories and extracts classes/functions from each module (using `inspect`; see the sketch after this list)
   - Creates `.rst` files using specialized templates for different object types
   - Templates in `docs/api_reference/templates/` (`pydantic.rst`, `runnable_pydantic.rst`, etc.)

3. **HTML Build Phase:**
   - Sphinx-based, uses `sphinx.ext.autodoc` (auto-extracts docstrings from the codebase)
   - `docs/api_reference/conf.py` (Sphinx config) configures `autodoc` and other extensions
   - `sphinx-build` processes the generated `.rst` files into HTML using autodoc
   - `docs/api_reference/scripts/custom_formatter.py` - Post-processes the generated HTML
   - Copies `reference.html` to `index.html` to create the default landing page (artifact? might not need to do this - just put everything in index directly?)

4. **Deployment:**
   - `.github/workflows/api_doc_build.yml` - Workflow responsible for orchestrating the entire build and deployment process
   - Built HTML files are committed and pushed to the `langchain-api-docs-html` repository
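
To make the extraction idea in phase 2 concrete, here is a minimal sketch of pulling classes and functions out of a module with `importlib` and `inspect`. The real logic lives in `create_api_rst.py`; the `classify_members` name below is illustrative:

```python
import importlib
import inspect


def classify_members(module_name: str) -> dict:
    """Sketch: collect public classes and functions defined in a module."""
    module = importlib.import_module(module_name)
    classes, functions = [], []
    for name, obj in inspect.getmembers(module):
        if name.startswith("_"):  # private members are treated as hidden
            continue
        # Keep only objects actually defined in this module, not re-exports.
        if inspect.isclass(obj) and obj.__module__ == module.__name__:
            classes.append(name)
        elif inspect.isfunction(obj) and obj.__module__ == module.__name__:
            functions.append(name)
    return {"classes": classes, "functions": functions}


print(classify_members("json"))  # e.g. {'functions': ['dump', 'dumps', ...], ...}
```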
#### Local Build

For local development and testing of API documentation, use the Makefile targets in the repository root:

```bash
# Full build
make api_docs_build
```

Like the CI process, this target:

- Installs the CLI package in editable mode
- Generates RST files for all packages using `create_api_rst.py`
- Builds HTML documentation with Sphinx
- Post-processes the HTML with `custom_formatter.py`
- Opens the built documentation (`reference.html`) in your browser

**Quick Preview:**

```bash
make api_docs_quick_preview API_PKG=openai
```

- Generates RST files for only the specified package (default: `text-splitters`)
- Builds and post-processes HTML documentation
- Opens the preview in your browser

Both targets automatically clean previous builds and handle the complete build pipeline locally, mirroring the CI process while allowing faster iteration during development.

#### Documentation Standards

**Docstring Format:**

The API reference uses **Google-style docstrings** with reStructuredText markup. Sphinx processes these through the `sphinx.ext.napoleon` extension to generate documentation.

**Required format:**

```python
def example_function(param1: str, param2: int = 5) -> bool:
    """Brief description of the function.

    Longer description can go here. Use reStructuredText syntax for
    rich formatting like **bold** and *italic*.

    TODO: code: figure out what works?

    Args:
        param1: Description of the first parameter.
        param2: Description of the second parameter with default value.

    Returns:
        Description of the return value.

    Raises:
        ValueError: When param1 is empty.
        TypeError: When param2 is not an integer.

    .. warning::
        This function is experimental and may change.
    """
```
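
Napoleon is what turns the `Args:`/`Returns:`/`Raises:` sections above into rendered field lists. A minimal sketch of the relevant `conf.py` settings (these are real Sphinx/napoleon option names, though the exact values used in this repo may differ):

```python
# Sketch: enabling Google-style docstring parsing in a Sphinx conf.py
extensions = [
    "sphinx.ext.autodoc",   # pulls docstrings out of the code
    "sphinx.ext.napoleon",  # parses Google-style docstring sections
]
napoleon_google_docstring = True   # accept Google-style docstrings
napoleon_numpy_docstring = False   # stick to a single docstring format
```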

**Special Markers:**

- `:private:` in docstrings excludes members from documentation (see the sketch below)
- `.. warning::` adds warning admonitions
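
The `:private:` filtering is wired up through Sphinx's `autodoc-skip-member` event (`conf.py` registers a `skip_private_members` callback for it). A minimal sketch of what such a hook can look like; the actual implementation in this repo may differ:

```python
def skip_private_members(app, what, name, obj, skip, options):
    """Sketch: hide members marked :private: or prefixed with an underscore."""
    doc = getattr(obj, "__doc__", None) or ""
    if ":private:" in doc or name.startswith("_"):
        return True  # returning True tells autodoc to skip this member
    return skip  # otherwise defer to autodoc's default decision


def setup(app):
    app.connect("autodoc-skip-member", skip_private_members)
```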
#### Site Styling and Assets

**Theme and Styling:**

- Uses the [**PyData Sphinx Theme**](https://pydata-sphinx-theme.readthedocs.io/en/stable/index.html) (`pydata_sphinx_theme`)
- Custom CSS in `docs/api_reference/_static/css/custom.css` with LangChain-specific (illustrated below):
  - Color palette
  - Inter font family
  - Custom navbar height and sidebar formatting
  - Deprecated/beta feature styling
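
As a rough illustration of the kind of overrides `custom.css` applies on top of the PyData theme (selectors and values below are illustrative, not copied from the file):

```css
/* Sketch: the PyData theme exposes CSS variables that custom.css can override. */
html {
  --pst-font-family-base: "Inter", sans-serif; /* Inter font family */
}

/* Illustrative navbar-height tweak; the real rules may target other selectors. */
.bd-header {
  min-height: 3.5rem;
}
```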

**Static Assets:**

- Logos: `_static/wordmark-api.svg` (light) and `_static/wordmark-api-dark.svg` (dark mode)
- Favicon: `_static/img/brand/favicon.png`
- Custom CSS: `_static/css/custom.css`

**Post-Processing:**

- `scripts/custom_formatter.py` cleans up generated HTML:
  - Shortens TOC entries from `ClassName.method()` to `method()` (see the sketch below)
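
A minimal sketch of that TOC-shortening pass, assuming BeautifulSoup (which is in the docs requirements); the function name and selector are illustrative:

```python
from bs4 import BeautifulSoup


def shorten_toc_entries(html: str) -> str:
    """Sketch: rewrite 'ClassName.method()' TOC links to just 'method()'."""
    soup = BeautifulSoup(html, "html.parser")
    for link in soup.select("nav a"):  # illustrative selector for TOC links
        text = link.get_text()
        if "." in text and text.endswith("()"):
            link.string = text.split(".")[-1]  # keep only 'method()'
    return str(soup)
```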

**Analytics and Integration:**

- GitHub integration (source links, edit buttons)
- Example backlinking through custom `ExampleLinksDirective`

@@ -50,7 +50,7 @@ class GalleryGridDirective(SphinxDirective):
    individual cards + ["image", "header", "content", "title"].

    Danger:
        This directive can only be used in the context of a MyST documentation page as
        the templates use Markdown flavored formatting.
    """

@@ -394,3 +394,21 @@ p {
  font-size: 0.9rem;
  margin-bottom: 0.5rem;
}

/* Deprecation announcement banner styling */
.bd-header-announcement {
  background-color: #790000 !important;
  color: white !important;
  font-weight: 600;
  padding: 0.75rem 1rem;
  text-align: center;
}

.bd-header-announcement a {
  color: white !important;
  text-decoration: underline !important;
}

.bd-header-announcement a:hover {
  color: #f0f0f0 !important;
}

@@ -1,7 +1,5 @@
"""Configuration file for the Sphinx documentation builder."""
@@ -20,16 +18,18 @@ from docutils.parsers.rst.directives.admonitions import BaseAdmonition
from docutils.statemachine import StringList
from sphinx.util.docutils import SphinxDirective

# Add paths to Python import system so Sphinx can import LangChain modules
# This allows autodoc to introspect and document the actual code
_DIR = Path(__file__).parent.absolute()
sys.path.insert(0, os.path.abspath("."))  # Current directory
sys.path.insert(0, os.path.abspath("../../libs/langchain"))  # LangChain main package

# Load package metadata from pyproject.toml (for version info, etc.)
with (_DIR.parents[1] / "libs" / "langchain" / "pyproject.toml").open("r") as f:
    data = toml.load(f)

# Load mapping of classes to example notebooks for backlinking
# This file is generated by scripts that scan our tutorial/example notebooks
with (_DIR / "guide_imports.json").open("r") as f:
    imported_classes = json.load(f)

@@ -86,6 +86,7 @@ class Beta(BaseAdmonition):


def setup(app):
    """Register custom directives and hooks with Sphinx."""
    app.add_directive("example_links", ExampleLinksDirective)
    app.add_directive("beta", Beta)
    app.connect("autodoc-skip-member", skip_private_members)
@@ -125,7 +126,7 @@ extensions = [
    "sphinx.ext.viewcode",
    "sphinxcontrib.autodoc_pydantic",
    "IPython.sphinxext.ipython_console_highlighting",
    "myst_parser",  # For generated index.md and reference.md
    "_extensions.gallery_directive",
    "sphinx_design",
    "sphinx_copybutton",
@@ -217,7 +218,7 @@ html_theme_options = {
    # # Use :html_theme.sidebar_secondary.remove: for file-wide removal
    # "secondary_sidebar_items": {"**": ["page-toc", "sourcelink"]},
    # "show_version_warning_banner": True,
    "announcement": "⚠️ THESE DOCS ARE OUTDATED. <a href='https://docs.langchain.com/oss/python/langchain/overview' target='_blank' style='color: white; text-decoration: underline;'>Visit the new v1.0 docs</a> and new <a href='https://reference.langchain.com/python' target='_blank' style='color: white; text-decoration: underline;'>reference docs</a>",
    "icon_links": [
        {
            # Label for this link
@@ -258,6 +259,7 @@ html_static_path = ["_static"]
html_css_files = ["css/custom.css"]
html_use_index = False

# Only used on the generated index.md and reference.md files
myst_enable_extensions = ["colon_fence"]

# generate autosummary even if no references
@@ -268,11 +270,11 @@ autosummary_ignore_module_all = False
html_copy_source = False
html_show_sourcelink = False

# Set canonical URL from the Read the Docs Domain
html_baseurl = os.environ.get("READTHEDOCS_CANONICAL_URL", "")

googleanalytics_id = "G-9B66JQQH2F"

# Tell Jinja2 templates the build is running on Read the Docs
if os.environ.get("READTHEDOCS", "") == "True":
    html_context["READTHEDOCS"] = True

@@ -1,4 +1,41 @@
"""Auto-generate API reference documentation (RST files) for LangChain packages.

* Automatically discovers all packages in `libs/` and `libs/partners/`
* For each package, recursively walks the filesystem to:
    * Load Python modules using importlib
    * Extract classes and functions using Python's inspect module
    * Classify objects by type (Pydantic models, Runnables, TypedDicts, etc.)
    * Filter out private members (names starting with '_') and deprecated items
* Creates structured RST files with:
    * Module-level documentation pages with autosummary tables
    * Different Sphinx templates based on object type (see templates/ directory)
    * Proper cross-references and navigation structure
    * Separation of current vs deprecated APIs
* Generates a directory tree like:
    ```
    docs/api_reference/
    ├── index.md          # Main landing page with package gallery
    ├── reference.md      # Package overview and navigation
    ├── core/             # langchain-core documentation
    │   ├── index.rst
    │   ├── callbacks.rst
    │   └── ...
    ├── langchain/        # langchain documentation
    │   ├── index.rst
    │   └── ...
    └── partners/         # Integration packages
        ├── openai/
        ├── anthropic/
        └── ...
    ```

## Key Features

* Respects privacy markers:
    * Modules with `:private:` in docstring are excluded entirely
    * Objects with `:private:` in docstring are filtered out
    * Names starting with '_' are treated as private
"""

import importlib
import inspect

@@ -177,12 +214,13 @@ def _load_package_modules(
    Traversal based on the file system makes it easy to determine which
    of the modules/packages are part of the package vs. 3rd party or built-in.

    Args:
        package_directory: Path to the package directory.
        submodule: Optional name of submodule to load.

    Returns:
        A dictionary where keys are module names and values are `ModuleMembers`
        objects.
    """
    package_path = (
        Path(package_directory)
@@ -199,12 +237,13 @@ def _load_package_modules(
        package_path = package_path / submodule

    for file_path in package_path.rglob("*.py"):
        # Skip private modules
        if file_path.name.startswith("_"):
            continue

        # Skip integration_template and project_template directories (for libs/cli)
        if "integration_template" in file_path.parts:
            continue

        if "project_template" in file_path.parts:
            continue

@@ -215,8 +254,13 @@
            continue

        # Get the full namespace of the module
        # Example: langchain_core/schema/output_parsers.py ->
        #   langchain_core.schema.output_parsers
        namespace = str(relative_module_name).replace(".py", "").replace("/", ".")

        # Keep only the top level namespace
        # Example: langchain_core.schema.output_parsers ->
        #   langchain_core
        top_namespace = namespace.split(".")[0]

        try:
@@ -253,16 +297,16 @@ def _construct_doc(
    members_by_namespace: Dict[str, ModuleMembers],
    package_version: str,
) -> List[typing.Tuple[str, str]]:
    """Construct the contents of the `reference.rst` for the given package.

    Args:
        package_namespace: The package top level namespace
        members_by_namespace: The members of the package dict organized by top level.
            Module contains a list of classes and functions inside of the top level
            namespace.

    Returns:
        The string contents of the reference.rst file.
    """
    docs = []
    index_doc = f"""\

@@ -465,10 +509,13 @@


def _build_rst_file(package_name: str = "langchain") -> None:
    """Create a rst file for a given package.

    Args:
        package_name: Name of the package to create the rst file for.

    Returns:
        The rst file is created in the same directory as this script.
    """
    package_dir = _package_dir(package_name)
    package_members = _load_package_modules(package_dir)
@@ -500,7 +547,10 @@ def _package_namespace(package_name: str) -> str:


def _package_dir(package_name: str = "langchain") -> Path:
    """Return the path to the directory containing the documentation.

    Attempts to find the package in `libs/` first, then `libs/partners/`.
    """
    if (ROOT_DIR / "libs" / package_name).exists():
        return ROOT_DIR / "libs" / package_name / _package_namespace(package_name)
    else:
@@ -514,7 +564,7 @@ def _package_dir(package_name: str = "langchain") -> Path:


def _get_package_version(package_dir: Path) -> str:
    """Return the version of the package by reading the `pyproject.toml`."""
    try:
        with open(package_dir.parent / "pyproject.toml", "r") as f:
            pyproject = toml.load(f)
@@ -540,6 +590,15 @@ def _out_file_path(package_name: str) -> Path:


def _build_index(dirs: List[str]) -> None:
    """Build the index.md file for the API reference.

    Args:
        dirs: List of package directories to include in the index.

    Returns:
        The index.md file is created in the same directory as this script.
    """
    custom_names = {
        "aws": "AWS",
        "ai21": "AI21",
@@ -556,12 +615,17 @@ def _build_index(dirs: List[str]) -> None:
    integrations = sorted(dir_ for dir_ in dirs if dir_ not in main_)
    doc = """# LangChain Python API Reference

Welcome to the LangChain v0.3 Python API reference. This is a reference for all
`langchain-x` packages.

```{danger}
These pages refer to the v0.3 versions of LangChain packages and integrations. To
visit the documentation for the latest versions of LangChain, visit [https://docs.langchain.com](https://docs.langchain.com)
and [https://reference.langchain.com/python/](https://reference.langchain.com/python/) (for references).

For the legacy API reference (<v0.3) hosted on ReadTheDocs see [https://api.python.langchain.com/](https://api.python.langchain.com/).
```
"""

    if main_:
@@ -647,9 +711,14 @@ See the full list of integrations in the Section Navigation.
{integration_tree}
```
"""
    # Write the reference.md file
    with open(HERE / "reference.md", "w") as f:
        f.write(doc)

    # Write a dummy index.md file that points to reference.md
    # Sphinx requires an index file to exist in each doc directory
    # TODO: investigate why we don't just put everything in index.md directly?
    # if it works it works I guess
    dummy_index = """\
# API reference

@@ -665,8 +734,11 @@ Reference<reference>


def main(dirs: Optional[list] = None) -> None:
    """Generate the `api_reference.rst` file for each package.

    If dirs is None, generate for all packages in `libs/` and `libs/partners/`.
    Otherwise generate only for the specified package(s).
    """
    if not dirs:
        dirs = [
            p.parent.name
@@ -675,18 +747,17 @@ def main(dirs: Optional[list] = None) -> None:
            if p.parent.parent.name in ("libs", "partners")
        ]
    for dir_ in sorted(dirs):
        # Skip any hidden directories prefixed with a dot
        # Some of these could be present by mistake in the code base
        # (e.g., .pytest_cache from running tests from the wrong location)
        if dir_.startswith("."):
            print("Skipping dir:", dir_)
            continue
        else:
            print("Building:", dir_)
            _build_rst_file(package_name=dir_)

    _build_index(sorted(dirs))
    print("API reference files built.")


if __name__ == "__main__":

@@ -1,12 +1,12 @@
autodoc_pydantic>=2,<3
sphinx>=8,<9
sphinx-autobuild>=2024
sphinx-design
sphinx-copybutton
sphinxcontrib-googleanalytics
pydata-sphinx-theme>=0.15
myst-parser>=3
myst-nb>=1.1.1
toml>=0.10.2
pyyaml
beautifulsoup4

@@ -1,3 +1,10 @@
"""Post-process generated HTML files to clean up table-of-contents headers.

Runs after Sphinx generates the API reference HTML. It finds TOC entries like
"ClassName.method_name()" and shortens them to just "method_name()" for better
readability in the sidebar navigation.
"""

import sys
from glob import glob
from pathlib import Path

@@ -1,3 +1,7 @@
:::danger
⚠️ THESE DOCS ARE OUTDATED. <a href='https://docs.langchain.com/oss/python/langchain/overview' target='_blank'>Visit the new v1.0 docs</a>
:::

# Conceptual guide

This guide provides explanations of the key concepts behind the LangChain framework and AI applications more broadly.

@@ -3,6 +3,10 @@ sidebar_position: 0
sidebar_class_name: hidden
---

:::danger
⚠️ THESE DOCS ARE OUTDATED. <a href='https://docs.langchain.com/oss/python/langchain/overview' target='_blank'>Visit the new v1.0 docs</a>
:::

# How-to guides

Here you’ll find answers to "How do I….?" types of questions.

@@ -72,7 +76,7 @@ See [supported integrations](/docs/integrations/chat/) for details on getting st

### Example selectors

[Example Selectors](/docs/concepts/example_selectors) are responsible for selecting the correct few-shot examples to pass to the prompt.

- [How to: use example selectors](/docs/how_to/example_selectors)
- [How to: select examples by length](/docs/how_to/example_selectors_length_based)

@@ -168,7 +172,7 @@ See [supported integrations](/docs/integrations/vectorstores/) for details on ge

Indexing is the process of keeping your vectorstore in-sync with the underlying data source.

- [How to: reindex data to keep your vectorstore in-sync with the underlying data source](/docs/how_to/indexing)

### Tools

@@ -1191,6 +1191,40 @@
   "response.content"
  ]
 },
 {
  "cell_type": "markdown",
  "id": "74247a07-b153-444f-9c56-77659aeefc88",
  "metadata": {},
  "source": [
   "## Context management\n",
   "\n",
   "Anthropic supports a context editing feature that will automatically manage the model's context window (e.g., by clearing tool results).\n",
   "\n",
   "See [Anthropic documentation](https://docs.claude.com/en/docs/build-with-claude/context-editing) for details and configuration options.\n",
   "\n",
   ":::info\n",
   "Requires ``langchain-anthropic>=0.3.21``\n",
   ":::"
  ]
 },
 {
  "cell_type": "code",
  "execution_count": null,
  "id": "cbb79c5d-37b5-4212-b36f-f27366192cf9",
  "metadata": {},
  "outputs": [],
  "source": [
   "from langchain_anthropic import ChatAnthropic\n",
   "\n",
   "llm = ChatAnthropic(\n",
   "    model=\"claude-sonnet-4-5-20250929\",\n",
   "    betas=[\"context-management-2025-06-27\"],\n",
   "    context_management={\"edits\": [{\"type\": \"clear_tool_uses_20250919\"}]},\n",
   ")\n",
   "llm_with_tools = llm.bind_tools([{\"type\": \"web_search_20250305\", \"name\": \"web_search\"}])\n",
   "response = llm_with_tools.invoke(\"Search for recent developments in AI\")"
  ]
 },
 {
  "cell_type": "markdown",
  "id": "cbfec7a9-d9df-4d12-844e-d922456dd9bf",

@@ -1457,6 +1491,38 @@
   "</details>"
  ]
 },
 {
  "cell_type": "markdown",
  "id": "29405da2-d2ef-415c-b674-6e29073cd05e",
  "metadata": {},
  "source": [
   "### Memory tool\n",
   "\n",
   "Claude supports a memory tool for client-side storage and retrieval of context across conversational threads. See docs [here](https://docs.claude.com/en/docs/agents-and-tools/tool-use/memory-tool) for details.\n",
   "\n",
   ":::info\n",
   "Requires ``langchain-anthropic>=0.3.21``\n",
   ":::"
  ]
 },
 {
  "cell_type": "code",
  "execution_count": null,
  "id": "bbd76eaa-041f-4fb8-8346-ca8fe0001c01",
  "metadata": {},
  "outputs": [],
  "source": [
   "from langchain_anthropic import ChatAnthropic\n",
   "\n",
   "llm = ChatAnthropic(\n",
   "    model=\"claude-sonnet-4-5-20250929\",\n",
   "    betas=[\"context-management-2025-06-27\"],\n",
   ")\n",
   "llm_with_tools = llm.bind_tools([{\"type\": \"memory_20250818\", \"name\": \"memory\"}])\n",
   "\n",
   "response = llm_with_tools.invoke(\"What are my interests?\")"
  ]
 },
 {
  "cell_type": "markdown",
  "id": "040f381a-1768-479a-9a5e-aa2d7d77e0d5",

@@ -3,6 +3,10 @@ sidebar_position: 0
sidebar_class_name: hidden
---

:::danger
⚠️ THESE DOCS ARE OUTDATED. <a href='https://docs.langchain.com/oss/python/langchain/overview' target='_blank'>Visit the new v1.0 docs</a>
:::

# Introduction

**LangChain** is a framework for developing applications powered by large language models (LLMs).

@@ -2,6 +2,11 @@
sidebar_position: 0
sidebar_class_name: hidden
---

:::danger
⚠️ THESE DOCS ARE OUTDATED. <a href='https://docs.langchain.com/oss/python/langchain/overview' target='_blank'>Visit the new v1.0 docs</a>
:::

# Tutorials

New to LangChain or LLM app development in general? Read this material to quickly get up and running building your first applications.

@@ -1,6 +1,6 @@
# LangChain v0.3

*Last updated: 09.16.2024*

## What's changed

@@ -87,7 +87,7 @@ const config = {
  ({
    docs: {
      editUrl:
        "https://github.com/langchain-ai/langchain/edit/v0.3/docs/",
      sidebarPath: require.resolve("./sidebars.js"),
      remarkPlugins: [
        [require("@docusaurus/remark-plugin-npm2yarn"), { sync: true }],
@@ -142,8 +142,8 @@ const config = {
      respectPrefersColorScheme: true,
    },
    announcementBar: {
      content: "⚠️ THESE DOCS ARE OUTDATED. <a href='https://docs.langchain.com/oss/python/langchain/overview' target='_blank'>Visit the new v1.0 docs</a>",
      backgroundColor: "#790000ff",
    },
    prism: {
      theme: {

@@ -16,14 +16,13 @@ fi

if { \
  [ "$VERCEL_ENV" == "production" ] || \
  [ "$VERCEL_GIT_COMMIT_REF" == "master" ] || \
  [ "$VERCEL_GIT_COMMIT_REF" == "v0.1" ] || \
  [ "$VERCEL_GIT_COMMIT_REF" == "v0.2" ] || \
  [ "$VERCEL_GIT_COMMIT_REF" == "v0.3rc" ]; \
} && [ "$VERCEL_GIT_REPO_OWNER" == "langchain-ai" ]
then
  # Vercel's ignored-build-step convention: exit 1 proceeds with the build,
  # exit 0 skips it.
  echo "✅ Production build - proceeding with build"
  exit 1
fi

@@ -1,4 +1,20 @@
"""Check documentation for broken import statements.

Validates that all import statements in Jupyter notebooks within the documentation
directory are functional and can be successfully imported.

- Scans all `.ipynb` files in `docs/`
- Extracts import statements from code cells
- Tests each import to ensure it works
- Reports any broken imports that would fail for users

Usage:
    python docs/scripts/check_imports.py

Exit codes:
    0: All imports are valid
    1: Found broken imports (ImportError raised)
"""

import importlib
import json

@@ -204,7 +204,7 @@ def get_vectorstore_table():
        "similarity_search_with_score": True,
        "asearch": True,
        "Passes Standard Tests": True,
        "Multi Tenancy": True,
        "Local/Cloud": "Local",
        "IDs in add Documents": True,
    },

@@ -40,15 +40,15 @@ const FEATURE_TABLES = {
        "apiLink": "https://python.langchain.com/api_reference/mistralai/chat_models/langchain_mistralai.chat_models.ChatMistralAI.html"
    },
    {
        "name": "ChatAIMLAPI",
        "package": "langchain-aimlapi",
        "link": "aimlapi/",
        "structured_output": true,
        "tool_calling": true,
        "json_mode": true,
        "multimodal": true,
        "local": false,
        "apiLink": "https://python.langchain.com/api_reference/aimlapi/chat_models/langchain_aimlapi.chat_models.ChatAIMLAPI.html"
    },
    {
        "name": "ChatFireworks",
@@ -1199,7 +1199,7 @@ const FEATURE_TABLES = {
        searchWithScore: true,
        async: true,
        passesStandardTests: false,
        multiTenancy: true,
        local: true,
        idsInAddDocuments: true,
    },
@@ -1230,17 +1230,17 @@ const FEATURE_TABLES = {
        idsInAddDocuments: true,
    },
    {
        name: "PGVectorStore",
        link: "pgvectorstore",
        deleteById: true,
        filtering: true,
        searchByVector: true,
        searchWithScore: true,
        async: true,
        passesStandardTests: true,
        multiTenancy: false,
        local: true,
        idsInAddDocuments: true,
    },
    {
        name: "PineconeVectorStore",

@@ -2910,4 +2910,4 @@ const suggestedLinks = {
        "/v0.1/docs/integrations/toolkits/"
    ]
}
}

docs/vercel.json
@@ -7,160 +7,682 @@
{ "source": "/docs/integrations(/?)", "destination": "/docs/integrations/platforms/" },
{ "source": "/v0.1", "destination": "https://langchain-v01.vercel.app/v0.1" },
{ "source": "/v0.1/:path(.*/?)*", "destination": "https://langchain-v01.vercel.app/v0.1/:path*" },
{ "source": "/v0.2", "destination": "https://langchain-v02.vercel.app/v0.2" },
{ "source": "/v0.2/:path(.*/?)*", "destination": "https://langchain-v02.vercel.app/v0.2/:path*" }
],
"redirects": [
{ "source": "/v0.3/docs/:path(.*/?)*", "destination": "https://docs.langchain.com/oss/python/langchain/overview" },
{ "source": "/docs/integrations/platforms/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/providers/:path" },
{ "source": "/docs/troubleshooting/errors/GRAPH_RECURSION_LIMIT", "destination": "https://docs.langchain.com/oss/python/langgraph/errors/GRAPH_RECURSION_LIMIT" },
{ "source": "/docs/troubleshooting/errors/GRAPH_RECURSION_LIMIT/", "destination": "https://docs.langchain.com/oss/python/langgraph/errors/GRAPH_RECURSION_LIMIT" },
{ "source": "/docs/troubleshooting/errors/INVALID_CONCURRENT_GRAPH_UPDATE", "destination": "https://docs.langchain.com/oss/python/langgraph/errors/INVALID_CONCURRENT_GRAPH_UPDATE" },
{ "source": "/docs/troubleshooting/errors/INVALID_CONCURRENT_GRAPH_UPDATE/", "destination": "https://docs.langchain.com/oss/python/langgraph/errors/INVALID_CONCURRENT_GRAPH_UPDATE" },
{ "source": "/docs/troubleshooting/errors/INVALID_GRAPH_NODE_RETURN_VALUE", "destination": "https://docs.langchain.com/oss/python/langgraph/errors/INVALID_GRAPH_NODE_RETURN_VALUE" },
{ "source": "/docs/troubleshooting/errors/INVALID_GRAPH_NODE_RETURN_VALUE/", "destination": "https://docs.langchain.com/oss/python/langgraph/errors/INVALID_GRAPH_NODE_RETURN_VALUE" },
{ "source": "/docs/troubleshooting/errors/MULTIPLE_SUBGRAPHS", "destination": "https://docs.langchain.com/oss/python/langgraph/errors/MULTIPLE_SUBGRAPHS" },
{ "source": "/docs/troubleshooting/errors/MULTIPLE_SUBGRAPHS/", "destination": "https://docs.langchain.com/oss/python/langgraph/errors/MULTIPLE_SUBGRAPHS" },
{ "source": "/docs/troubleshooting/errors/INVALID_CHAT_HISTORY", "destination": "https://docs.langchain.com/oss/python/langgraph/errors/INVALID_CHAT_HISTORY" },
{ "source": "/docs/how_to/graph_prompting(/?)", "destination": "/docs/tutorials/graph#few-shot-prompting" },
{ "source": "/docs/how_to/HTML_header_metadata_splitter(/?)", "destination": "/docs/how_to/split_html#using-htmlheadertextsplitter" },
{ "source": "/docs/how_to/HTML_section_aware_splitter(/?)", "destination": "/docs/how_to/split_html#using-htmlsectionsplitter" },
{ "source": "/docs/tutorials/data_generation", "destination": "https://python.langchain.com/v0.2/docs/tutorials/data_generation/" },
{ "source": "/docs/langsmith(/?)", "destination": "https://docs.smith.langchain.com/" },
{ "source": "/docs/langgraph(/?)", "destination": "https://langchain-ai.github.io/langgraph" },
{ "source": "/", "destination": "/docs/introduction/" },
{ "source": "/docs(/?)", "destination": "/docs/introduction/" },
{ "source": "/docs/get_started/introduction(/?)", "destination": "/docs/introduction/" },
{ "source": "/docs/how_to/migrate_chains(/?)", "destination": "/docs/versions/migrating_chains" },
{ "source": "/v0.2/docs/templates/:path(.*/?)*", "destination": "https://github.com/langchain-ai/langchain/tree/v0.2/templates/:path*" },
{ "source": "/docs/integrations/providers/mlflow_ai_gateway(/?)", "destination": "/docs/integrations/providers/mlflow/" },
{ "source": "/docs/integrations/platforms/:path((?:anthropic|aws|google|huggingface|microsoft|openai)?/?)*", "destination": "/docs/integrations/providers/:path*" },
{ "source": "/docs/troubleshooting/errors/INVALID_CHAT_HISTORY/", "destination": "https://docs.langchain.com/oss/python/langgraph/errors/INVALID_CHAT_HISTORY" },
{ "source": "/docs/contributing/:path((?:code|documentation|integrations|testing)(?:/|/.*/?)?)", "destination": "https://docs.langchain.com/oss/python/langchain/overview" },
{ "source": "/docs/contributing/:path((?:faq|repo_structure|review_process)/?)", "destination": "/docs/contributing/reference/:path" },
{ "source": "/docs/integrations/retrievers/weaviate-hybrid(/?)", "destination": "/docs/integrations/vectorstores/weaviate/#search-mechanism" },
{ "source": "/docs/integrations/vectorstores/singlestoredb(/?)", "destination": "https://python.langchain.com/v0.2/docs/integrations/vectorstores/singlestoredb/" },
{ "source": "/docs/integrations/providers/singlestoredb(/?)", "destination": "/docs/integrations/providers/singlestore/" },
{ "source": "/docs/integrations/retrievers/singlestoredb(/?)", "destination": "https://python.langchain.com/v0.2/docs/integrations/retrievers/singlestoredb/" },
{ "source": "/docs/integrations/providers/dspy(/?)", "destination": "https://docs.langchain.com/oss/python/langchain/overview" },
{ "source": "/api_reference/mongodb/:path(.*/?)*", "destination": "https://docs.langchain.com/oss/python/integrations/providers/overview" },
{ "source": "/api_reference/tests/:path(.*/?)*", "destination": "https://reference.langchain.com/python/" },
{ "source": "/en/latest/modules/models/llms/integrations/:path(.*)", "destination": "https://docs.langchain.com/oss/python/langchain/overview" },
{ "source": "/docs/modules/data_connection/document_transformers/text_splitters/recursive_text_splitter", "destination": "https://reference.langchain.com/python/langchain_text_splitters/#langchain_text_splitters.RecursiveCharacterTextSplitter" },
{ "source": "/docs/modules/data_connection/document_loaders/", "destination": "https://reference.langchain.com/python/langchain_core/document_loaders/" },
{ "source": "/docs/modules/data_connection/retrievers/parent_document_retriever", "destination": "https://reference.langchain.com/python/langchain_core/retrievers/" },
{ "source": "/docs/templates/:path(.*)", "destination": "https://docs.langchain.com/oss/python/langchain/overview" },
{ "source": "/docs/guides/productionization/:path(.*)", "destination": "https://docs.langchain.com/langsmith/home" },
{ "source": "/docs/guides/:path(.*)", "destination": "https://docs.langchain.com/oss/python/langchain/overview" },
{ "source": "/docs/langsmith/:path(.*)", "destination": "https://docs.langchain.com/langsmith/home" },
{ "source": "/docs/integrations/toolkits/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/tools/:path" },
{ "source": "/docs/integrations/providers/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/providers/:path" },
{ "source": "/docs/integrations/retrievers/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/retrievers/:path" },
{ "source": "/docs/modules/agents/:path(.*)", "destination": "https://reference.langchain.com/python/langchain/agents/" },
{ "source": "/docs/modules/memory/:path(.*)", "destination": "https://docs.langchain.com/oss/python/langchain/short-term-memory" },
{ "source": "/docs/modules/model_io/:path(.*)", "destination": "https://docs.langchain.com/oss/python/langchain/models" },
{ "source": "/docs/use_cases/:path(.*)", "destination": "https://docs.langchain.com/oss/python/langchain/rag" },
{ "source": "/docs/expression_language/:path(.*)", "destination": "https://docs.langchain.com/oss/python/langchain/overview" },
{ "source": "/docs/integrations/llms/:path(.*)", "destination": "https://docs.langchain.com/oss/python/langchain/models" },
{ "source": "/docs/tutorials/:path(.*)", "destination": "https://docs.langchain.com/oss/python/langchain/rag" },
{ "source": "/docs/integrations/document_loaders/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/document_loaders/:path" },
{ "source": "/docs/integrations/chat/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/chat/:path" },
{ "source": "/docs/integrations/vectorstores/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/vectorstores/:path" },
{ "source": "/docs/integrations/stores/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/stores/:path" },
{ "source": "/docs/integrations/text_embedding/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/text_embedding/:path" },
{ "source": "/docs/integrations/splitters/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/splitters/:path" },
{ "source": "/docs/modules/chains/:path(.*)", "destination": "https://docs.langchain.com/oss/python/langchain/overview" },
{ "source": "/v0.2/api_reference/google_community/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/providers/overview" },
{ "source": "/v0.2/api_reference/google_vertexai/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/providers/overview" },
{ "source": "/v0.2/api_reference/cohere/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/providers/overview" },
{ "source": "/v0.2/api_reference/text_splitters/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/splitters" },
{ "source": "/v0.2/api_reference/prompty/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/providers/overview" },
{ "source": "/v0.2/api_reference/ai21/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/providers/overview" },
{ "source": "/v0.2/api_reference/nvidia_ai_endpoints/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/providers/overview" },
{ "source": "/v0.2/api_reference/google_genai/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/providers/overview" },
{ "source": "/v0.2/api_reference/anthropic/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/providers/overview" },
{ "source": "/v0.2/api_reference/qdrant/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/providers/overview" },
{ "source": "/v0.2/api_reference/openai/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/providers/overview" },
{ "source": "/v0.2/api_reference/huggingface/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/providers/overview" },
{ "source": "/v0.2/api_reference/couchbase/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/providers/overview" },
{ "source": "/v0.2/api_reference/postgres/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/providers/overview" },
{ "source": "/v0.2/api_reference/ollama/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/providers/overview" },
{ "source": "/v0.2/api_reference/fireworks/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/providers/overview" },
{ "source": "/v0.2/api_reference/mistralai/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/providers/overview" },
{ "source": "/v0.2/api_reference/pinecone/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/providers/overview" },
{ "source": "/v0.2/api_reference/together/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/providers/overview" },
{ "source": "/v0.2/api_reference/chroma/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/providers/overview" },
{ "source": "/v0.2/api_reference/voyageai/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/providers/overview" },
{ "source": "/v0.2/api_reference/azure_dynamic_sessions/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/providers/overview" },
{ "source": "/v0.2/api_reference/community/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/providers/overview" },
{ "source": "/v0.2/api_reference/core/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/providers/overview" },
{ "source": "/v0.2/api_reference/langchain/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/providers/overview" },
{ "source": "/v0.2/api_reference/experimental/:path(.*)", "destination": "https://docs.langchain.com/oss/python/integrations/providers/overview" },
{ "source": "/v0.2/api_reference/_modules/:path(.*)",
"destination": "https://docs.langchain.com/oss/python/integrations/providers/overview"
|
||||
},
|
||||
{
|
||||
"source": "/v0.2/api_reference/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/"
|
||||
},
|
||||
{
|
||||
"source": "/v0.1/api_reference/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/core/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/langchain_core/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/core",
|
||||
"destination": "https://reference.langchain.com/python/langchain_core/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/community/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/community",
|
||||
"destination": "https://reference.langchain.com/python/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/langchain/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/langchain/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/langchain",
|
||||
"destination": "https://reference.langchain.com/python/langchain/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/experimental/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/experimental",
|
||||
"destination": "https://reference.langchain.com/python/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/text_splitters/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/langchain_text_splitters/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/text_splitters",
|
||||
"destination": "https://reference.langchain.com/python/langchain_text_splitters/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/anthropic/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_anthropic/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/anthropic",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_anthropic/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/openai/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_openai/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/openai",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_openai/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/cohere/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_cohere/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/cohere",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_cohere/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/google_community/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_google_community/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/google_community",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_google_community/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/google_vertexai/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_google_vertexai/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/google_vertexai",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_google_vertexai/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/google_genai/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_google_genai/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/google_genai",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_google_genai/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/huggingface/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_huggingface/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/huggingface",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_huggingface/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/aws/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_aws/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/aws",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_aws/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/azure_dynamic_sessions/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/integrations/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/azure_dynamic_sessions",
|
||||
"destination": "https://reference.langchain.com/python/integrations/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/pinecone/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_pinecone/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/pinecone",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_pinecone/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/qdrant/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_qdrant/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/qdrant",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_qdrant/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/chroma/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_chroma/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/chroma",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_chroma/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/milvus/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_milvus/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/milvus",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_milvus/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/postgres/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_postgres/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/postgres",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_postgres/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/mongodb/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_mongodb/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/mongodb",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_mongodb/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/elasticsearch/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_elasticsearch/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/elasticsearch",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_elasticsearch/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/astradb/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_astradb/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/astradb",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_astradb/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/couchbase/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/integrations/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/couchbase",
|
||||
"destination": "https://reference.langchain.com/python/integrations/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/ollama/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_ollama/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/ollama",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_ollama/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/fireworks/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_fireworks/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/fireworks",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_fireworks/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/mistralai/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_mistralai/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/mistralai",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_mistralai/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/together/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_together/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/together",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_together/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/voyageai/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/integrations/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/voyageai",
|
||||
"destination": "https://reference.langchain.com/python/integrations/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/ai21/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/integrations/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/ai21",
|
||||
"destination": "https://reference.langchain.com/python/integrations/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/nvidia_ai_endpoints/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_nvidia_ai_endpoints/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/nvidia_ai_endpoints",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_nvidia_ai_endpoints/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/prompty/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_prompty/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/prompty",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_prompty/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/box/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/integrations/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/box",
|
||||
"destination": "https://reference.langchain.com/python/integrations/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/exa/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_exa/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/exa",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_exa/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/robocorp/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/integrations/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/robocorp",
|
||||
"destination": "https://reference.langchain.com/python/integrations/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/unstructured/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_unstructured/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/unstructured",
|
||||
"destination": "https://reference.langchain.com/python/integrations/langchain_unstructured/"
|
||||
},
|
||||
{
|
||||
"source": "/api_reference/:path(.*)",
|
||||
"destination": "https://reference.langchain.com/python/"
|
||||
},
|
||||
{
|
||||
"source": "/v0.2/:path((?!api_reference).*)",
|
||||
"destination": "https://docs.langchain.com/oss/python/langchain/overview"
|
||||
},
|
||||
{
|
||||
"source": "/v0.1/:path((?!api_reference).*)",
|
||||
"destination": "https://docs.langchain.com/oss/python/langchain/overview"
|
||||
},
|
||||
{
|
||||
"source": "/en/latest/:path(.*)",
|
||||
"destination": "https://docs.langchain.com/oss/python/langchain/overview"
|
||||
},
|
||||
{
|
||||
"source": "/docs/modules/:path(.*)",
|
||||
"destination": "https://docs.langchain.com/oss/python/langchain/overview"
|
||||
},
|
||||
{
|
||||
"source": "/docs/integrations/:path(.*)",
|
||||
"destination": "https://docs.langchain.com/oss/python/integrations/providers/overview"
|
||||
},
|
||||
{
|
||||
"source": "/docs/how_to/graph_mapping(/?)",
|
||||
"destination": "https://docs.langchain.com/oss/python/langchain/knowledge-base"
|
||||
},
|
||||
{
|
||||
"source": "/docs/how_to/chat_model_rate_limiting(/?)",
|
||||
"destination": "https://docs.langchain.com/oss/python/langchain/models#rate-limiting"
|
||||
},
|
||||
{
|
||||
"source": "/docs/how_to/structured_output(/?)",
|
||||
"destination": "https://docs.langchain.com/oss/python/langchain/structured-output"
|
||||
},
|
||||
{
|
||||
"source": "/docs/how_to/graph_prompting(/?)",
|
||||
"destination": "https://docs.langchain.com/oss/python/langchain/knowledge-base"
|
||||
},
|
||||
{
|
||||
"source": "/docs/how_to/multimodal_inputs(/?)",
|
||||
"destination": "https://docs.langchain.com/oss/python/langchain/messages#multimodal"
|
||||
},
|
||||
{
|
||||
"source": "/docs/concepts/chat_models(/?)",
|
||||
"destination": "https://docs.langchain.com/oss/python/langchain/models"
|
||||
},
|
||||
{
|
||||
"source": "/docs/langgraph(/?)",
|
||||
"destination": "https://docs.langchain.com/oss/python/langgraph/overview"
|
||||
},
|
||||
{
|
||||
"source": "/",
|
||||
"destination": "https://docs.langchain.com/oss/python/langchain/overview"
|
||||
},
|
||||
{
|
||||
"source": "/docs/how_to/migrate_chains(/?)",
|
||||
"destination": "https://docs.langchain.com/oss/python/migrate/langchain-v1"
|
||||
},
|
||||
{
|
||||
"source": "/docs/",
|
||||
"destination": "https://docs.langchain.com/oss/python/langchain/overview"
|
||||
},
|
||||
{
|
||||
"source": "/docs/security/",
|
||||
"destination": "https://docs.langchain.com/oss/python/security-policy"
|
||||
},
|
||||
{
|
||||
"source": "docs/introduction/%23%EF%B8%8F-langgraph",
|
||||
"destination": "https://docs.langchain.com/oss/python/langgraph/overview"
|
||||
},
|
||||
{
|
||||
"source": "docs/introduction/%23%EF%B8%8F-langsmith",
"destination": "https://docs.langchain.com/langsmith/home"
},
{
"source": "/docs/get_started/introduction",
"destination": "https://docs.langchain.com/oss/python/langchain/quickstart"
},

{
"source": "/docs/troubleshooting/errors/INVALID_PROMPT_INPUT/",
"destination": "https://docs.langchain.com/oss/python/langchain/errors/INVALID_PROMPT_INPUT"
},
{
"source": "/docs/troubleshooting/errors/INVALID_TOOL_RESULTS/",
"destination": "https://docs.langchain.com/oss/python/langchain/errors/INVALID_TOOL_RESULTS"
},
{
"source": "/docs/troubleshooting/errors/MESSAGE_COERCION_FAILURE",
"destination": "https://docs.langchain.com/oss/python/langchain/errors/MESSAGE_COERCION_FAILURE"
},
{
"source": "/docs/troubleshooting/errors/MODEL_AUTHENTICATION",
"destination": "https://docs.langchain.com/oss/python/langchain/errors/MODEL_AUTHENTICATION"
},
{
"source": "/docs/troubleshooting/errors/MODEL_NOT_FOUND",
"destination": "https://docs.langchain.com/oss/python/langchain/errors/MODEL_NOT_FOUND"
},
{
"source": "/docs/troubleshooting/errors/MODEL_RATE_LIMIT",
"destination": "https://docs.langchain.com/oss/python/langchain/errors/MODEL_RATE_LIMIT"
},
{
"source": "/docs/troubleshooting/errors/OUTPUT_PARSING_FAILURE",
"destination": "https://docs.langchain.com/oss/python/langchain/errors/OUTPUT_PARSING_FAILURE"
},
{
"source": "/:path((?!api_reference).*)",
"destination": "https://docs.langchain.com/oss/python/langchain/overview"
}
]
}
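The rules above use Vercel's path-to-regexp source syntax: `:path(.*)` captures the remainder of the request path, a bare `:path` in the destination splices it back in, and the `((?!api_reference).*)` entries act as negative-lookahead catch-alls; rules are evaluated top to bottom and the first match wins. A minimal Python sketch of that resolution behavior, using `re` as a stand-in for Vercel's matcher (the `resolve` helper and the three-rule subset are illustrative, not part of this config):

import re

# Illustrative subset of the rewrite table above (not the full rule set).
RULES = [
    (r"^/docs/integrations/chat/(?P<path>.*)$",
     "https://docs.langchain.com/oss/python/integrations/chat/{path}"),
    (r"^/api_reference/core/(?P<path>.*)$",
     "https://reference.langchain.com/python/langchain_core/"),
    (r"^/(?!api_reference)(?P<path>.*)$",
     "https://docs.langchain.com/oss/python/langchain/overview"),
]

def resolve(url: str) -> str | None:
    """Return the destination for the first rule matching `url`, in order."""
    for pattern, destination in RULES:
        match = re.match(pattern, url)
        if match:
            # Splice captured groups into any `{path}` placeholder,
            # mirroring how `:path` is substituted in the destination.
            return destination.format(**match.groupdict())
    return None

print(resolve("/docs/integrations/chat/openai"))
# -> https://docs.langchain.com/oss/python/integrations/chat/openai

Because matching is first-wins, the broad `((?!api_reference).*)` fallbacks have to sit at the bottom of the table, after every more specific rule.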
@@ -1,3 +0,0 @@
-# Contributing to langchain-cli
-
-Update CLI versions with `poe bump` to ensure that version commands display correctly.
@@ -18,7 +18,7 @@ def create_demo_server(

     Args:
         config_keys: Optional sequence of config keys to expose in the playground.
-        playground_type: The type of playground to use. Can be `'default'` or `'chat'`.
+        playground_type: The type of playground to use.

     Returns:
         The demo server.
@@ -46,7 +46,7 @@ class __ModuleName__Retriever(BaseRetriever):

         retriever.invoke(query)

-        .. code-block:: none
+        .. code-block::

            # TODO: Example output.

@@ -80,7 +80,7 @@ class __ModuleName__Retriever(BaseRetriever):

         chain.invoke("...")

-        .. code-block:: none
+        .. code-block::

            # TODO: Example output.

@@ -42,7 +42,7 @@ class __ModuleName__Toolkit(BaseToolkit):

         toolkit.get_tools()

-        .. code-block:: none
+        .. code-block::

            # TODO: Example output.

@@ -62,7 +62,7 @@ class __ModuleName__Toolkit(BaseToolkit):

         for event in events:
             event["messages"][-1].pretty_print()

-        .. code-block:: none
+        .. code-block::

            # TODO: Example output.
@@ -18,7 +18,5 @@ class Test__ModuleName__Retriever(RetrieversIntegrationTests):

     @property
     def retriever_query_example(self) -> str:
-        """
-        Returns a str representing the "query" of an example retriever call.
-        """
+        """Returns a str representing the "query" of an example retriever call."""
         return "example query"
@@ -21,7 +21,7 @@ class TestParrotMultiplyToolIntegration(ToolsIntegrationTests):
         """
         Returns a dictionary representing the "args" of an example tool call.

-        This should NOT be a ToolCall dict - i.e. it should not
-        have {"name", "id", "args"} keys.
+        This should NOT be a ToolCall dict - i.e. it should not have
+        `{"name", "id", "args"}` keys.
         """
         return {"a": 2, "b": 3}
@@ -11,7 +11,7 @@ class TestParrotMultiplyToolUnit(ToolsUnitTests):

     @property
     def tool_constructor_params(self) -> dict:
-        # if your tool constructor instead required initialization arguments like
+        # If your tool constructor instead required initialization arguments like
         # `def __init__(self, some_arg: int):`, you would return those here
         # as a dictionary, e.g.: `return {'some_arg': 42}`
         return {}
@@ -21,7 +21,7 @@ class TestParrotMultiplyToolUnit(ToolsUnitTests):
         """
         Returns a dictionary representing the "args" of an example tool call.

-        This should NOT be a ToolCall dict - i.e. it should not
-        have {"name", "id", "args"} keys.
+        This should NOT be a ToolCall dict - i.e. it should not have
+        `{"name", "id", "args"}` keys.
         """
         return {"a": 2, "b": 3}
@@ -159,8 +159,8 @@ def add(
     """Add the specified template to the current LangServe app.

     e.g.:
-        langchain app add extraction-openai-functions
-        langchain app add git+ssh://git@github.com/efriis/simple-pirate.git
+        `langchain app add extraction-openai-functions`
+        `langchain app add git+ssh://git@github.com/efriis/simple-pirate.git`
     """
     if branch is None:
         branch = []
@@ -116,17 +116,17 @@ def new(
         typer.echo(f"Folder {destination_dir} exists.")
         raise typer.Exit(code=1)

-    # copy over template from ../integration_template
+    # Copy over template from ../integration_template
     shutil.copytree(project_template_dir, destination_dir, dirs_exist_ok=False)

-    # folder movement
+    # Folder movement
     package_dir = destination_dir / replacements["__module_name__"]
     shutil.move(destination_dir / "integration_template", package_dir)

-    # replacements in files
+    # Replacements in files
     replace_glob(destination_dir, "**/*", cast("dict[str, str]", replacements))

-    # dependency install
+    # Dependency install
     try:
         # Use --no-progress to avoid tty issues in CI/test environments
         env = os.environ.copy()
@@ -149,7 +149,7 @@ def new(
             "`uv sync --dev` manually in the package directory.",
         )
     else:
-        # confirm src and dst are the same length
+        # Confirm src and dst are the same length
         if not src:
             typer.echo("Cannot provide --dst without --src.")
             raise typer.Exit(code=1)
@@ -158,7 +158,7 @@ def new(
         typer.echo("Number of --src and --dst arguments must match.")
         raise typer.Exit(code=1)
     if not dst:
-        # assume we're in a package dir, copy to equivalent path
+        # Assume we're in a package dir, copy to equivalent path
         dst_paths = [destination_dir / p for p in src]
     else:
         dst_paths = [Path.cwd() / p for p in dst]
@@ -169,7 +169,7 @@ def new(
         for p in dst_paths
     ]

-    # confirm no duplicate dst_paths
+    # Confirm no duplicate dst_paths
     if len(dst_paths) != len(set(dst_paths)):
         typer.echo(
             "Duplicate destination paths provided or computed - please "
@@ -177,7 +177,7 @@ def new(
         )
         raise typer.Exit(code=1)

-    # confirm no files exist at dst_paths
+    # Confirm no files exist at dst_paths
     for dst_path in dst_paths:
         if dst_path.exists():
             typer.echo(f"File {dst_path} exists.")
@@ -75,7 +75,7 @@ def generate_raw_migrations(
 def generate_top_level_imports(pkg: str) -> list[tuple[str, str]]:
     """Look at all the top level modules in langchain_community.

-    Attempt to import everything from each ``__init__`` file. For example,
+    Attempt to import everything from each `__init__` file. For example,

     langchain_community/
         chat_models/
@@ -83,16 +83,15 @@ def generate_top_level_imports(pkg: str) -> list[tuple[str, str]]:
         llm/
             __init__.py # <-- import everything from here

-
     It'll collect all the imports, import the classes / functions it can find
     there. It'll return a list of 2-tuples

     Each tuple will contain the fully qualified path of the class / function to where
-    its logic is defined
-    (e.g., ``langchain_community.chat_models.xyz_implementation.ver2.XYZ``)
+    its logic is defined.
+    (e.g., `langchain_community.chat_models.xyz_implementation.ver2.XYZ`)
     and the second tuple will contain the path
     to importing it from the top level namespaces
-    (e.g., ``langchain_community.chat_models.XYZ``)
+    (e.g., `langchain_community.chat_models.XYZ`)

     Args:
         pkg: The package to scan.
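A rough sketch of the scan that docstring describes, offered as an illustration rather than the CLI's actual `generate_top_level_imports` (the `top_level_import_pairs` helper below is hypothetical and only considers names re-exported via `__all__`):

import importlib
import pkgutil

def top_level_import_pairs(pkg_name: str) -> list[tuple[str, str]]:
    # Illustrative sketch only - not langchain-cli's implementation.
    pairs: list[tuple[str, str]] = []
    pkg = importlib.import_module(pkg_name)
    for mod_info in pkgutil.iter_modules(pkg.__path__):
        module = importlib.import_module(f"{pkg_name}.{mod_info.name}")
        # Anything re-exported from the subpackage's __init__ is a candidate.
        for attr_name in getattr(module, "__all__", []):
            obj = getattr(module, attr_name)
            defined_in = getattr(obj, "__module__", module.__name__)
            pairs.append(
                (f"{defined_in}.{attr_name}",                 # where the logic lives
                 f"{pkg_name}.{mod_info.name}.{attr_name}")   # top-level import path
            )
    return pairs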
@@ -28,7 +28,6 @@ def get_migrations_for_partner_package(pkg_name: str) -> list[tuple[str, str]]:

     Returns:
         List of 2-tuples containing old and new import paths.
-
     """
     package = importlib.import_module(pkg_name)
     classes_ = find_subclasses_in_module(
@@ -38,19 +38,19 @@ def parse_dependency_string(
     branch: str | None,
     api_path: str | None,
 ) -> DependencySource:
-    """Parse a dependency string into a DependencySource.
+    """Parse a dependency string into a `DependencySource`.

     Args:
-        dep: the dependency string.
-        repo: optional repository.
-        branch: optional branch.
-        api_path: optional API path.
+        dep: The dependency string
+        repo: Optional repository
+        branch: Optional branch
+        api_path: Optional API path

     Returns:
-        The parsed dependency source information.
+        The parsed dependency source information

     Raises:
-        ValueError: if the dependency string is invalid.
+        ValueError: If the dependency string is invalid
     """
     if dep is not None and dep.startswith("git+"):
         if repo is not None or branch is not None:
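For intuition, a toy parser for the `git+...` form the docstring above refers to; the `DepSource` dataclass and `parse_dep` helper are illustrative stand-ins, not langchain-cli's `DependencySource` or `parse_dependency_string`:

from dataclasses import dataclass

@dataclass
class DepSource:
    # Hypothetical, pared-down stand-in for the CLI's DependencySource.
    git: str
    subdirectory: str | None

def parse_dep(dep: str) -> DepSource:
    """Parse a 'git+<url>#subdirectory=<path>' dependency string (toy version)."""
    if not dep.startswith("git+"):
        raise ValueError(f"not a git dependency: {dep!r}")
    url, _, fragment = dep.removeprefix("git+").partition("#")
    sub = None
    if fragment.startswith("subdirectory="):
        sub = fragment.removeprefix("subdirectory=")
    return DepSource(git=url, subdirectory=sub)

print(parse_dep("git+ssh://git@github.com/efriis/simple-pirate.git"))
# -> DepSource(git='ssh://git@github.com/efriis/simple-pirate.git', subdirectory=None)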
@@ -147,8 +147,8 @@ def parse_dependencies(
     """Parse dependencies.

     Args:
-        dependencies: the dependencies to parse
-        repo: the repositories to use
+        dependencies: The dependencies to parse
+        repo: The repositories to use
         branch: the branches to use
         api_path: the api paths to use

@@ -244,7 +244,7 @@ def copy_repo(
 ) -> None:
     """Copy a repo, ignoring git folders.

-    Raises FileNotFound error if it can't find source
+    Raises `FileNotFound` if it can't find source
     """

     def ignore_func(_: str, files: list[str]) -> list[str]:
@@ -37,13 +37,12 @@ def get_package_root(cwd: Path | None = None) -> Path:


 class LangServeExport(TypedDict):
-    """Fields from pyproject.toml that are relevant to LangServe.
+    """Fields from `pyproject.toml` that are relevant to LangServe.

     Attributes:
-        module: The module to import from, tool.langserve.export_module
-        attr: The attribute to import from the module, tool.langserve.export_attr
-        package_name: The name of the package, tool.poetry.name
-
+        module: The module to import from, `tool.langserve.export_module`
+        attr: The attribute to import from the module, `tool.langserve.export_attr`
+        package_name: The name of the package, `tool.poetry.name`
     """

     module: str
@@ -19,7 +19,7 @@ def add_dependencies_to_pyproject_toml(
     pyproject_toml: Path,
     local_editable_dependencies: Iterable[tuple[str, Path]],
 ) -> None:
-    """Add dependencies to pyproject.toml."""
+    """Add dependencies to `pyproject.toml`."""
     with pyproject_toml.open(encoding="utf-8") as f:
         # tomlkit types aren't amazing - treat as Dict instead
         pyproject: dict[str, Any] = load(f)
@@ -37,7 +37,7 @@ def remove_dependencies_from_pyproject_toml(
     pyproject_toml: Path,
     local_editable_dependencies: Iterable[str],
 ) -> None:
-    """Remove dependencies from pyproject.toml."""
+    """Remove dependencies from `pyproject.toml`."""
     with pyproject_toml.open(encoding="utf-8") as f:
         pyproject: dict[str, Any] = load(f)
         # tomlkit types aren't amazing - treat as Dict instead
@@ -8,7 +8,7 @@ license = { text = "MIT" }
 requires-python = ">=3.10.0,<4.0.0"
 dependencies = [
     "typer>=0.17.0,<1.0.0",
-    "gitpython>=3,<4.0.0",
+    "gitpython>=3.0.0,<4.0.0",
     "langserve[all]>=0.0.51,<1.0.0",
     "uvicorn>=0.23.0,<1.0.0",
     "tomlkit>=0.12.0,<1.0.0",
@@ -29,9 +29,18 @@ langchain = "langchain_cli.cli:app"
 langchain-cli = "langchain_cli.cli:app"

 [dependency-groups]
-dev = ["pytest>=7.4.2,<9.0.0", "pytest-watcher>=0.3.4,<1.0.0"]
-lint = ["ruff>=0.12.2,<0.13", "mypy>=1.18.1,<1.19"]
-test = ["langchain-core", "langchain"]
+dev = [
+    "pytest>=7.4.2,<9.0.0",
+    "pytest-watcher>=0.3.4,<1.0.0"
+]
+lint = [
+    "ruff>=0.13.1,<0.14",
+    "mypy>=1.18.1,<1.19"
+]
+test = [
+    "langchain-core",
+    "langchain"
+]
 typing = ["langchain"]
 test_integration = []
@@ -70,11 +79,14 @@ flake8-annotations.allow-star-arg-any = true
 flake8-annotations.mypy-init-return = true
 flake8-type-checking.runtime-evaluated-base-classes = ["pydantic.BaseModel","langchain_core.load.serializable.Serializable","langchain_core.runnables.base.RunnableSerializable"]
 pep8-naming.classmethod-decorators = [ "classmethod", "langchain_core.utils.pydantic.pre_init", "pydantic.field_validator", "pydantic.v1.root_validator",]
-pydocstyle.convention = "google"
 pyupgrade.keep-runtime-typing = true

+[tool.ruff.lint.pydocstyle]
+convention = "google"
+ignore-var-parameters = true # ignore missing documentation for *args and **kwargs parameters
+
 [tool.ruff.lint.per-file-ignores]
-"tests/**" = [ "D1", "DOC", "S", "SLF",]
+"tests/**" = [ "D1", "S", "SLF",]
 "scripts/**" = [ "INP", "S",]

 [tool.pytest.ini_options]
@@ -1 +0,0 @@
-"""Scripts."""
@@ -9,7 +9,7 @@ from langchain_cli.namespaces.migrate.generate.generic import (

 @pytest.mark.xfail(reason="Unknown reason")
 def test_create_json_agent_migration() -> None:
-    """Test the migration of create_json_agent from langchain to langchain_community."""
+    """Test migration of `create_json_agent` from langchain to `langchain_community`."""
     with sup1(), sup2():
         raw_migrations = generate_simplified_migrations(
             from_package="langchain",
@@ -40,7 +40,7 @@ def test_create_json_agent_migration() -> None:

 @pytest.mark.xfail(reason="Unknown reason")
 def test_create_single_store_retriever_db() -> None:
-    """Test migration from langchain to langchain_core."""
+    """Test migration from `langchain` to `langchain_core`."""
     with sup1(), sup2():
         raw_migrations = generate_simplified_migrations(
             from_package="langchain",
libs/cli/uv.lock (generated, 162 changes)
@@ -13,7 +13,7 @@ wheels = [

 [[package]]
 name = "anyio"
-version = "4.10.0"
+version = "4.11.0"
 source = { registry = "https://pypi.org/simple" }
 dependencies = [
     { name = "exceptiongroup", marker = "python_full_version < '3.11'" },
@@ -21,9 +21,9 @@ dependencies = [
     { name = "sniffio" },
     { name = "typing-extensions", marker = "python_full_version < '3.13'" },
 ]
-sdist = { url = "https://files.pythonhosted.org/packages/f1/b4/636b3b65173d3ce9a38ef5f0522789614e590dab6a8d505340a4efe4c567/anyio-4.10.0.tar.gz", hash = "sha256:3f3fae35c96039744587aa5b8371e7e8e603c0702999535961dd336026973ba6", size = 213252, upload-time = "2025-08-04T08:54:26.451Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/c6/78/7d432127c41b50bccba979505f272c16cbcadcc33645d5fa3a738110ae75/anyio-4.11.0.tar.gz", hash = "sha256:82a8d0b81e318cc5ce71a5f1f8b5c4e63619620b63141ef8c995fa0db95a57c4", size = 219094, upload-time = "2025-09-23T09:19:12.58Z" }
 wheels = [
-    { url = "https://files.pythonhosted.org/packages/6f/12/e5e0282d673bb9746bacfb6e2dba8719989d3660cdb2ea79aee9a9651afb/anyio-4.10.0-py3-none-any.whl", hash = "sha256:60e474ac86736bbfd6f210f7a61218939c318f43f9972497381f1c5e930ed3d1", size = 107213, upload-time = "2025-08-04T08:54:24.882Z" },
+    { url = "https://files.pythonhosted.org/packages/15/b3/9b1a8074496371342ec1e796a96f99c82c945a339cd81a8e73de28b4cf9e/anyio-4.11.0-py3-none-any.whl", hash = "sha256:0287e96f4d26d4149305414d4e3bc32f0dcd0862365a4bddea19d7a1ec38c4fc", size = 109097, upload-time = "2025-09-23T09:19:10.601Z" },
 ]

 [[package]]
@@ -110,14 +110,14 @@ wheels = [

 [[package]]
 name = "click"
-version = "8.2.1"
+version = "8.3.0"
 source = { registry = "https://pypi.org/simple" }
 dependencies = [
     { name = "colorama", marker = "sys_platform == 'win32'" },
 ]
-sdist = { url = "https://files.pythonhosted.org/packages/60/6c/8ca2efa64cf75a977a0d7fac081354553ebe483345c734fb6b6515d96bbc/click-8.2.1.tar.gz", hash = "sha256:27c491cc05d968d271d5a1db13e3b5a184636d9d930f148c50b038f0d0646202", size = 286342, upload-time = "2025-05-20T23:19:49.832Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/46/61/de6cd827efad202d7057d93e0fed9294b96952e188f7384832791c7b2254/click-8.3.0.tar.gz", hash = "sha256:e7b8232224eba16f4ebe410c25ced9f7875cb5f3263ffc93cc3e8da705e229c4", size = 276943, upload-time = "2025-09-18T17:32:23.696Z" }
 wheels = [
-    { url = "https://files.pythonhosted.org/packages/85/32/10bb5764d90a8eee674e9dc6f4db6a0ab47c8c4d0d83c27f7c39ac415a4d/click-8.2.1-py3-none-any.whl", hash = "sha256:61a3265b914e850b85317d0b3109c7f8cd35a670f963866005d6ef1d5175a12b", size = 102215, upload-time = "2025-05-20T23:19:47.796Z" },
+    { url = "https://files.pythonhosted.org/packages/db/d3/9dcc0f5797f070ec8edf30fbadfb200e71d9db6b84d211e3b2085a7589a0/click-8.3.0-py3-none-any.whl", hash = "sha256:9b9f285302c6e3064f4330c05f05b81945b2a39544279343e6e7c5f27a9baddc", size = 107295, upload-time = "2025-09-18T17:32:22.42Z" },
 ]

 [[package]]
@@ -143,16 +143,16 @@ wheels = [

 [[package]]
 name = "fastapi"
-version = "0.116.2"
+version = "0.117.1"
 source = { registry = "https://pypi.org/simple" }
 dependencies = [
     { name = "pydantic" },
     { name = "starlette" },
     { name = "typing-extensions" },
 ]
-sdist = { url = "https://files.pythonhosted.org/packages/01/64/1296f46d6b9e3b23fb22e5d01af3f104ef411425531376212f1eefa2794d/fastapi-0.116.2.tar.gz", hash = "sha256:231a6af2fe21cfa2c32730170ad8514985fc250bec16c9b242d3b94c835ef529", size = 298595, upload-time = "2025-09-16T18:29:23.058Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/7e/7e/d9788300deaf416178f61fb3c2ceb16b7d0dc9f82a08fdb87a5e64ee3cc7/fastapi-0.117.1.tar.gz", hash = "sha256:fb2d42082d22b185f904ca0ecad2e195b851030bd6c5e4c032d1c981240c631a", size = 307155, upload-time = "2025-09-20T20:16:56.663Z" }
 wheels = [
-    { url = "https://files.pythonhosted.org/packages/32/e4/c543271a8018874b7f682bf6156863c416e1334b8ed3e51a69495c5d4360/fastapi-0.116.2-py3-none-any.whl", hash = "sha256:c3a7a8fb830b05f7e087d920e0d786ca1fc9892eb4e9a84b227be4c1bc7569db", size = 95670, upload-time = "2025-09-16T18:29:21.329Z" },
+    { url = "https://files.pythonhosted.org/packages/6d/45/d9d3e8eeefbe93be1c50060a9d9a9f366dba66f288bb518a9566a23a8631/fastapi-0.117.1-py3-none-any.whl", hash = "sha256:33c51a0d21cab2b9722d4e56dbb9316f3687155be6b276191790d8da03507552", size = 95959, upload-time = "2025-09-20T20:16:53.661Z" },
 ]

 [[package]]
@@ -372,7 +372,7 @@ dev = [
 lint = [
     { name = "cffi", marker = "python_full_version < '3.10'", specifier = "<1.17.1" },
     { name = "cffi", marker = "python_full_version >= '3.10'" },
-    { name = "ruff", specifier = ">=0.12.2,<0.13.0" },
+    { name = "ruff", specifier = ">=0.13.1,<0.14.0" },
 ]
 test = [
     { name = "blockbuster", specifier = ">=1.5.18,<1.6.0" },
@@ -459,7 +459,7 @@ typing = [

 [package.metadata]
 requires-dist = [
-    { name = "gitpython", specifier = ">=3,<4.0.0" },
+    { name = "gitpython", specifier = ">=3.0.0,<4.0.0" },
     { name = "gritql", specifier = ">=0.2.0,<1.0.0" },
     { name = "langserve", extras = ["all"], specifier = ">=0.0.51,<1.0.0" },
     { name = "tomlkit", specifier = ">=0.12.0,<1.0.0" },
@@ -474,7 +474,7 @@ dev = [
 ]
 lint = [
     { name = "mypy", specifier = ">=1.18.1,<1.19" },
-    { name = "ruff", specifier = ">=0.12.2,<0.13" },
+    { name = "ruff", specifier = ">=0.13.1,<0.14" },
 ]
 test = [
     { name = "langchain", editable = "../langchain" },
@@ -485,7 +485,7 @@ typing = [{ name = "langchain", editable = "../langchain" }]

 [[package]]
 name = "langchain-core"
-version = "0.3.76"
+version = "0.3.78"
 source = { editable = "../core" }
 dependencies = [
     { name = "jsonpatch" },
@@ -514,7 +514,7 @@ dev = [
     { name = "jupyter", specifier = ">=1.0.0,<2.0.0" },
     { name = "setuptools", specifier = ">=67.6.1,<68.0.0" },
 ]
-lint = [{ name = "ruff", specifier = ">=0.12.2,<0.13.0" }]
+lint = [{ name = "ruff", specifier = ">=0.13.1,<0.14.0" }]
 test = [
     { name = "blockbuster", specifier = ">=1.5.18,<1.6.0" },
     { name = "freezegun", specifier = ">=1.2.2,<2.0.0" },
@@ -559,7 +559,7 @@ dev = [
 ]
 lint = [
     { name = "langchain-core", editable = "../core" },
-    { name = "ruff", specifier = ">=0.12.8,<0.13.0" },
+    { name = "ruff", specifier = ">=0.13.1,<0.14.0" },
 ]
 test = [
     { name = "freezegun", specifier = ">=1.2.2,<2.0.0" },
@@ -574,6 +574,8 @@ test = [
 test-integration = [
     { name = "en-core-web-sm", url = "https://github.com/explosion/spacy-models/releases/download/en_core_web_sm-3.8.0/en_core_web_sm-3.8.0-py3-none-any.whl" },
     { name = "nltk", specifier = ">=3.9.1,<4.0.0" },
+    { name = "scipy", marker = "python_full_version == '3.12.*'", specifier = ">=1.7.0,<2.0.0" },
+    { name = "scipy", marker = "python_full_version >= '3.13'", specifier = ">=1.14.1,<2.0.0" },
     { name = "sentence-transformers", specifier = ">=3.0.1,<4.0.0" },
     { name = "spacy", specifier = ">=3.8.7,<4.0.0" },
     { name = "thinc", specifier = ">=8.3.6,<9.0.0" },
@@ -590,7 +592,7 @@ typing = [

 [[package]]
 name = "langserve"
-version = "0.3.1"
+version = "0.3.2"
 source = { registry = "https://pypi.org/simple" }
 dependencies = [
     { name = "httpx" },
@@ -598,9 +600,9 @@ dependencies = [
     { name = "orjson" },
     { name = "pydantic" },
 ]
-sdist = { url = "https://files.pythonhosted.org/packages/8c/4f/465c21b7ab4ab18d9bf59f2804f7dc599a52462beb53ba84180f7d2bf581/langserve-0.3.1.tar.gz", hash = "sha256:17bf7f7d7c182623298748c2eab176c4c0e1872e20f88be89ba89a962c94988d", size = 1141070, upload-time = "2024-12-27T02:40:21.417Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/5a/fb/86e1f5049fb3593743f0fb049c4991f4984020cda00b830ae31f2c47b46b/langserve-0.3.2.tar.gz", hash = "sha256:134b78b1d897c6bcd1fb8a6258e30cf0fb318294505e4ea59c2bea72fa152129", size = 1141270, upload-time = "2025-09-17T20:01:22.183Z" }
 wheels = [
-    { url = "https://files.pythonhosted.org/packages/89/e4/6a26851d96c445d783d188c330cb871b56f03b18824ad8fadf6452d18a88/langserve-0.3.1-py3-none-any.whl", hash = "sha256:c860435ebbcc2c051c3e34c349c9d6020921a8af9216ad17700b58dbd01be9a2", size = 1173074, upload-time = "2024-12-27T02:40:18.328Z" },
+    { url = "https://files.pythonhosted.org/packages/bd/f0/193c34bf61e1dee8bd637dbeddcc644c46d14e8b03068792ca60b1909bc1/langserve-0.3.2-py3-none-any.whl", hash = "sha256:d9c4cd19d12f6362b82ceecb10357b339b3640a858b9bc30643d5f8a0a036bce", size = 1173213, upload-time = "2025-09-17T20:01:20.603Z" },
 ]

 [package.optional-dependencies]
@@ -611,7 +613,7 @@ all = [

 [[package]]
 name = "langsmith"
-version = "0.4.28"
+version = "0.4.31"
 source = { registry = "https://pypi.org/simple" }
 dependencies = [
     { name = "httpx" },
@@ -622,9 +624,9 @@ dependencies = [
     { name = "requests-toolbelt" },
     { name = "zstandard" },
 ]
-sdist = { url = "https://files.pythonhosted.org/packages/fd/70/a3e9824f7c4823f3ed3f89f024fa25887bb82b64ab4f565dffd02b1f27f9/langsmith-0.4.28.tar.gz", hash = "sha256:8734e6d3e16ce0085b5f7235633b0e14bc8e0c160b1c1d8ce2588f83a936e171", size = 956001, upload-time = "2025-09-15T16:59:46.095Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/55/f5/edbdf89a162ee025348b3b2080fb3b88f4a1040a5a186f32d34aca913994/langsmith-0.4.31.tar.gz", hash = "sha256:5fb3729e22bd9a225391936cb9d1080322e6c375bb776514af06b56d6c46ed3e", size = 959698, upload-time = "2025-09-25T04:18:19.55Z" }
 wheels = [
-    { url = "https://files.pythonhosted.org/packages/2f/b1/cf3a4d37b7b2e9dd1f35a3d89246b0d5851aa1caff9cbf73872a106ef7f7/langsmith-0.4.28-py3-none-any.whl", hash = "sha256:0440968566d56d38d889afa202e1ff56a238e1493aea87ceb5c3c28d41d01144", size = 384724, upload-time = "2025-09-15T16:59:44.118Z" },
+    { url = "https://files.pythonhosted.org/packages/3e/8e/e7a43d907a147e1f87eebdd6737483f9feba52a5d4b20f69d0bd6f2fa22f/langsmith-0.4.31-py3-none-any.whl", hash = "sha256:64f340bdead21defe5f4a6ca330c11073e35444989169f669508edf45a19025f", size = 386347, upload-time = "2025-09-25T04:18:16.69Z" },
 ]

 [[package]]
@@ -650,7 +652,7 @@ wheels = [

 [[package]]
 name = "mypy"
-version = "1.18.1"
+version = "1.18.2"
 source = { registry = "https://pypi.org/simple" }
 dependencies = [
     { name = "mypy-extensions" },
@@ -658,39 +660,39 @@ dependencies = [
     { name = "tomli", marker = "python_full_version < '3.11'" },
     { name = "typing-extensions" },
 ]
-sdist = { url = "https://files.pythonhosted.org/packages/14/a3/931e09fc02d7ba96da65266884da4e4a8806adcdb8a57faaacc6edf1d538/mypy-1.18.1.tar.gz", hash = "sha256:9e988c64ad3ac5987f43f5154f884747faf62141b7f842e87465b45299eea5a9", size = 3448447, upload-time = "2025-09-11T23:00:47.067Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/c0/77/8f0d0001ffad290cef2f7f216f96c814866248a0b92a722365ed54648e7e/mypy-1.18.2.tar.gz", hash = "sha256:06a398102a5f203d7477b2923dda3634c36727fa5c237d8f859ef90c42a9924b", size = 3448846, upload-time = "2025-09-19T00:11:10.519Z" }
 wheels = [
-    { url = "https://files.pythonhosted.org/packages/fc/06/29ea5a34c23938ae93bc0040eb2900eb3f0f2ef4448cc59af37ab3ddae73/mypy-1.18.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:2761b6ae22a2b7d8e8607fb9b81ae90bc2e95ec033fd18fa35e807af6c657763", size = 12811535, upload-time = "2025-09-11T22:58:55.399Z" },
-    { url = "https://files.pythonhosted.org/packages/a8/40/04c38cb04fa9f1dc224b3e9634021a92c47b1569f1c87dfe6e63168883bb/mypy-1.18.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:5b10e3ea7f2eec23b4929a3fabf84505da21034a4f4b9613cda81217e92b74f3", size = 11897559, upload-time = "2025-09-11T22:59:48.041Z" },
-    { url = "https://files.pythonhosted.org/packages/46/bf/4c535bd45ea86cebbc1a3b6a781d442f53a4883f322ebd2d442db6444d0b/mypy-1.18.1-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:261fbfced030228bc0f724d5d92f9ae69f46373bdfd0e04a533852677a11dbea", size = 12507430, upload-time = "2025-09-11T22:59:30.415Z" },
-    { url = "https://files.pythonhosted.org/packages/e2/e1/cbefb16f2be078d09e28e0b9844e981afb41f6ffc85beb68b86c6976e641/mypy-1.18.1-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4dc6b34a1c6875e6286e27d836a35c0d04e8316beac4482d42cfea7ed2527df8", size = 13243717, upload-time = "2025-09-11T22:59:11.297Z" },
-    { url = "https://files.pythonhosted.org/packages/65/e8/3e963da63176f16ca9caea7fa48f1bc8766de317cd961528c0391565fd47/mypy-1.18.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:1cabb353194d2942522546501c0ff75c4043bf3b63069cb43274491b44b773c9", size = 13492052, upload-time = "2025-09-11T23:00:09.29Z" },
-    { url = "https://files.pythonhosted.org/packages/4b/09/d5d70c252a3b5b7530662d145437bd1de15f39fa0b48a27ee4e57d254aa1/mypy-1.18.1-cp310-cp310-win_amd64.whl", hash = "sha256:738b171690c8e47c93569635ee8ec633d2cdb06062f510b853b5f233020569a9", size = 9765846, upload-time = "2025-09-11T22:58:26.198Z" },
-    { url = "https://files.pythonhosted.org/packages/32/28/47709d5d9e7068b26c0d5189c8137c8783e81065ad1102b505214a08b548/mypy-1.18.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6c903857b3e28fc5489e54042684a9509039ea0aedb2a619469438b544ae1961", size = 12734635, upload-time = "2025-09-11T23:00:24.983Z" },
-    { url = "https://files.pythonhosted.org/packages/7c/12/ee5c243e52497d0e59316854041cf3b3130131b92266d0764aca4dec3c00/mypy-1.18.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:2a0c8392c19934c2b6c65566d3a6abdc6b51d5da7f5d04e43f0eb627d6eeee65", size = 11817287, upload-time = "2025-09-11T22:59:07.38Z" },
-    { url = "https://files.pythonhosted.org/packages/48/bd/2aeb950151005fe708ab59725afed7c4aeeb96daf844f86a05d4b8ac34f8/mypy-1.18.1-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f85eb7efa2ec73ef63fc23b8af89c2fe5bf2a4ad985ed2d3ff28c1bb3c317c92", size = 12430464, upload-time = "2025-09-11T22:58:48.084Z" },
-    { url = "https://files.pythonhosted.org/packages/71/e8/7a20407aafb488acb5734ad7fb5e8c2ef78d292ca2674335350fa8ebef67/mypy-1.18.1-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:82ace21edf7ba8af31c3308a61dc72df30500f4dbb26f99ac36b4b80809d7e94", size = 13164555, upload-time = "2025-09-11T23:00:13.803Z" },
-    { url = "https://files.pythonhosted.org/packages/e8/c9/5f39065252e033b60f397096f538fb57c1d9fd70a7a490f314df20dd9d64/mypy-1.18.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:a2dfd53dfe632f1ef5d161150a4b1f2d0786746ae02950eb3ac108964ee2975a", size = 13359222, upload-time = "2025-09-11T23:00:33.469Z" },
-    { url = "https://files.pythonhosted.org/packages/85/b6/d54111ef3c1e55992cd2ec9b8b6ce9c72a407423e93132cae209f7e7ba60/mypy-1.18.1-cp311-cp311-win_amd64.whl", hash = "sha256:320f0ad4205eefcb0e1a72428dde0ad10be73da9f92e793c36228e8ebf7298c0", size = 9760441, upload-time = "2025-09-11T23:00:44.826Z" },
-    { url = "https://files.pythonhosted.org/packages/e7/14/1c3f54d606cb88a55d1567153ef3a8bc7b74702f2ff5eb64d0994f9e49cb/mypy-1.18.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:502cde8896be8e638588b90fdcb4c5d5b8c1b004dfc63fd5604a973547367bb9", size = 12911082, upload-time = "2025-09-11T23:00:41.465Z" },
-    { url = "https://files.pythonhosted.org/packages/90/83/235606c8b6d50a8eba99773add907ce1d41c068edb523f81eb0d01603a83/mypy-1.18.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:7509549b5e41be279afc1228242d0e397f1af2919a8f2877ad542b199dc4083e", size = 11919107, upload-time = "2025-09-11T22:58:40.903Z" },
-    { url = "https://files.pythonhosted.org/packages/ca/25/4e2ce00f8d15b99d0c68a2536ad63e9eac033f723439ef80290ec32c1ff5/mypy-1.18.1-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5956ecaabb3a245e3f34100172abca1507be687377fe20e24d6a7557e07080e2", size = 12472551, upload-time = "2025-09-11T22:58:37.272Z" },
-    { url = "https://files.pythonhosted.org/packages/32/bb/92642a9350fc339dd9dcefcf6862d171b52294af107d521dce075f32f298/mypy-1.18.1-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:8750ceb014a96c9890421c83f0db53b0f3b8633e2864c6f9bc0a8e93951ed18d", size = 13340554, upload-time = "2025-09-11T22:59:38.756Z" },
-    { url = "https://files.pythonhosted.org/packages/cd/ee/38d01db91c198fb6350025d28f9719ecf3c8f2c55a0094bfbf3ef478cc9a/mypy-1.18.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:fb89ea08ff41adf59476b235293679a6eb53a7b9400f6256272fb6029bec3ce5", size = 13530933, upload-time = "2025-09-11T22:59:20.228Z" },
-    { url = "https://files.pythonhosted.org/packages/da/8d/6d991ae631f80d58edbf9d7066e3f2a96e479dca955d9a968cd6e90850a3/mypy-1.18.1-cp312-cp312-win_amd64.whl", hash = "sha256:2657654d82fcd2a87e02a33e0d23001789a554059bbf34702d623dafe353eabf", size = 9828426, upload-time = "2025-09-11T23:00:21.007Z" },
-    { url = "https://files.pythonhosted.org/packages/e4/ec/ef4a7260e1460a3071628a9277a7579e7da1b071bc134ebe909323f2fbc7/mypy-1.18.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:d70d2b5baf9b9a20bc9c730015615ae3243ef47fb4a58ad7b31c3e0a59b5ef1f", size = 12918671, upload-time = "2025-09-11T22:58:29.814Z" },
-    { url = "https://files.pythonhosted.org/packages/a1/82/0ea6c3953f16223f0b8eda40c1aeac6bd266d15f4902556ae6e91f6fca4c/mypy-1.18.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:b8367e33506300f07a43012fc546402f283c3f8bcff1dc338636affb710154ce", size = 11913023, upload-time = "2025-09-11T23:00:29.049Z" },
-    { url = "https://files.pythonhosted.org/packages/ae/ef/5e2057e692c2690fc27b3ed0a4dbde4388330c32e2576a23f0302bc8358d/mypy-1.18.1-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:913f668ec50c3337b89df22f973c1c8f0b29ee9e290a8b7fe01cc1ef7446d42e", size = 12473355, upload-time = "2025-09-11T23:00:04.544Z" },
-    { url = "https://files.pythonhosted.org/packages/98/43/b7e429fc4be10e390a167b0cd1810d41cb4e4add4ae50bab96faff695a3b/mypy-1.18.1-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:1a0e70b87eb27b33209fa4792b051c6947976f6ab829daa83819df5f58330c71", size = 13346944, upload-time = "2025-09-11T22:58:23.024Z" },
-    { url = "https://files.pythonhosted.org/packages/89/4e/899dba0bfe36bbd5b7c52e597de4cf47b5053d337b6d201a30e3798e77a6/mypy-1.18.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:c378d946e8a60be6b6ede48c878d145546fb42aad61df998c056ec151bf6c746", size = 13512574, upload-time = "2025-09-11T22:59:52.152Z" },
-    { url = "https://files.pythonhosted.org/packages/f5/f8/7661021a5b0e501b76440454d786b0f01bb05d5c4b125fcbda02023d0250/mypy-1.18.1-cp313-cp313-win_amd64.whl", hash = "sha256:2cd2c1e0f3a7465f22731987fff6fc427e3dcbb4ca5f7db5bbeaff2ff9a31f6d", size = 9837684, upload-time = "2025-09-11T22:58:44.454Z" },
-    { url = "https://files.pythonhosted.org/packages/bf/87/7b173981466219eccc64c107cf8e5ab9eb39cc304b4c07df8e7881533e4f/mypy-1.18.1-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:ba24603c58e34dd5b096dfad792d87b304fc6470cbb1c22fd64e7ebd17edcc61", size = 12900265, upload-time = "2025-09-11T22:59:03.4Z" },
-    { url = "https://files.pythonhosted.org/packages/ae/cc/b10e65bae75b18a5ac8f81b1e8e5867677e418f0dd2c83b8e2de9ba96ebd/mypy-1.18.1-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:ed36662fb92ae4cb3cacc682ec6656208f323bbc23d4b08d091eecfc0863d4b5", size = 11942890, upload-time = "2025-09-11T23:00:00.607Z" },
-    { url = "https://files.pythonhosted.org/packages/39/d4/aeefa07c44d09f4c2102e525e2031bc066d12e5351f66b8a83719671004d/mypy-1.18.1-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:040ecc95e026f71a9ad7956fea2724466602b561e6a25c2e5584160d3833aaa8", size = 12472291, upload-time = "2025-09-11T22:59:43.425Z" },
-    { url = "https://files.pythonhosted.org/packages/c6/07/711e78668ff8e365f8c19735594ea95938bff3639a4c46a905e3ed8ff2d6/mypy-1.18.1-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:937e3ed86cb731276706e46e03512547e43c391a13f363e08d0fee49a7c38a0d", size = 13318610, upload-time = "2025-09-11T23:00:17.604Z" },
-    { url = "https://files.pythonhosted.org/packages/ca/85/df3b2d39339c31d360ce299b418c55e8194ef3205284739b64962f6074e7/mypy-1.18.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:1f95cc4f01c0f1701ca3b0355792bccec13ecb2ec1c469e5b85a6ef398398b1d", size = 13513697, upload-time = "2025-09-11T22:58:59.534Z" },
-    { url = "https://files.pythonhosted.org/packages/b1/df/462866163c99ea73bb28f0eb4d415c087e30de5d36ee0f5429d42e28689b/mypy-1.18.1-cp314-cp314-win_amd64.whl", hash = "sha256:e4f16c0019d48941220ac60b893615be2f63afedaba6a0801bdcd041b96991ce", size = 9985739, upload-time = "2025-09-11T22:58:51.644Z" },
-    { url = "https://files.pythonhosted.org/packages/e0/1d/4b97d3089b48ef3d904c9ca69fab044475bd03245d878f5f0b3ea1daf7ce/mypy-1.18.1-py3-none-any.whl", hash = "sha256:b76a4de66a0ac01da1be14ecc8ae88ddea33b8380284a9e3eae39d57ebcbe26e", size = 2352212, upload-time = "2025-09-11T22:59:26.576Z" },
+    { url = "https://files.pythonhosted.org/packages/03/6f/657961a0743cff32e6c0611b63ff1c1970a0b482ace35b069203bf705187/mypy-1.18.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:c1eab0cf6294dafe397c261a75f96dc2c31bffe3b944faa24db5def4e2b0f77c", size = 12807973, upload-time = "2025-09-19T00:10:35.282Z" },
+    { url = "https://files.pythonhosted.org/packages/10/e9/420822d4f661f13ca8900f5fa239b40ee3be8b62b32f3357df9a3045a08b/mypy-1.18.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:7a780ca61fc239e4865968ebc5240bb3bf610ef59ac398de9a7421b54e4a207e", size = 11896527, upload-time = "2025-09-19T00:10:55.791Z" },
+    { url = "https://files.pythonhosted.org/packages/aa/73/a05b2bbaa7005f4642fcfe40fb73f2b4fb6bb44229bd585b5878e9a87ef8/mypy-1.18.2-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:448acd386266989ef11662ce3c8011fd2a7b632e0ec7d61a98edd8e27472225b", size = 12507004, upload-time = "2025-09-19T00:11:05.411Z" },
+    { url = "https://files.pythonhosted.org/packages/4f/01/f6e4b9f0d031c11ccbd6f17da26564f3a0f3c4155af344006434b0a05a9d/mypy-1.18.2-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f9e171c465ad3901dc652643ee4bffa8e9fef4d7d0eece23b428908c77a76a66", size = 13245947, upload-time = "2025-09-19T00:10:46.923Z" },
+    { url = "https://files.pythonhosted.org/packages/d7/97/19727e7499bfa1ae0773d06afd30ac66a58ed7437d940c70548634b24185/mypy-1.18.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:592ec214750bc00741af1f80cbf96b5013d81486b7bb24cb052382c19e40b428", size = 13499217, upload-time = "2025-09-19T00:09:39.472Z" },
+    { url = "https://files.pythonhosted.org/packages/9f/4f/90dc8c15c1441bf31cf0f9918bb077e452618708199e530f4cbd5cede6ff/mypy-1.18.2-cp310-cp310-win_amd64.whl", hash = "sha256:7fb95f97199ea11769ebe3638c29b550b5221e997c63b14ef93d2e971606ebed", size = 9766753, upload-time = "2025-09-19T00:10:49.161Z" },
+    { url = "https://files.pythonhosted.org/packages/88/87/cafd3ae563f88f94eec33f35ff722d043e09832ea8530ef149ec1efbaf08/mypy-1.18.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:807d9315ab9d464125aa9fcf6d84fde6e1dc67da0b6f80e7405506b8ac72bc7f", size = 12731198, upload-time = "2025-09-19T00:09:44.857Z" },
+    { url = "https://files.pythonhosted.org/packages/0f/e0/1e96c3d4266a06d4b0197ace5356d67d937d8358e2ee3ffac71faa843724/mypy-1.18.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:776bb00de1778caf4db739c6e83919c1d85a448f71979b6a0edd774ea8399341", size = 11817879, upload-time = "2025-09-19T00:09:47.131Z" },
+    { url = "https://files.pythonhosted.org/packages/72/ef/0c9ba89eb03453e76bdac5a78b08260a848c7bfc5d6603634774d9cd9525/mypy-1.18.2-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1379451880512ffce14505493bd9fe469e0697543717298242574882cf8cdb8d", size = 12427292, upload-time = "2025-09-19T00:10:22.472Z" },
+    { url = "https://files.pythonhosted.org/packages/1a/52/ec4a061dd599eb8179d5411d99775bec2a20542505988f40fc2fee781068/mypy-1.18.2-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:1331eb7fd110d60c24999893320967594ff84c38ac6d19e0a76c5fd809a84c86", size = 13163750, upload-time = "2025-09-19T00:09:51.472Z" },
+    { url = "https://files.pythonhosted.org/packages/c4/5f/2cf2ceb3b36372d51568f2208c021870fe7834cf3186b653ac6446511839/mypy-1.18.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:3ca30b50a51e7ba93b00422e486cbb124f1c56a535e20eff7b2d6ab72b3b2e37", size = 13351827, upload-time = "2025-09-19T00:09:58.311Z" },
+    { url = "https://files.pythonhosted.org/packages/c8/7d/2697b930179e7277529eaaec1513f8de622818696857f689e4a5432e5e27/mypy-1.18.2-cp311-cp311-win_amd64.whl", hash = "sha256:664dc726e67fa54e14536f6e1224bcfce1d9e5ac02426d2326e2bb4e081d1ce8", size = 9757983, upload-time = "2025-09-19T00:10:09.071Z" },
+    { url = "https://files.pythonhosted.org/packages/07/06/dfdd2bc60c66611dd8335f463818514733bc763e4760dee289dcc33df709/mypy-1.18.2-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:33eca32dd124b29400c31d7cf784e795b050ace0e1f91b8dc035672725617e34", size = 12908273, upload-time = "2025-09-19T00:10:58.321Z" },
+    { url = "https://files.pythonhosted.org/packages/81/14/6a9de6d13a122d5608e1a04130724caf9170333ac5a924e10f670687d3eb/mypy-1.18.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:a3c47adf30d65e89b2dcd2fa32f3aeb5e94ca970d2c15fcb25e297871c8e4764", size = 11920910, upload-time = "2025-09-19T00:10:20.043Z" },
+    { url = "https://files.pythonhosted.org/packages/5f/a9/b29de53e42f18e8cc547e38daa9dfa132ffdc64f7250e353f5c8cdd44bee/mypy-1.18.2-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5d6c838e831a062f5f29d11c9057c6009f60cb294fea33a98422688181fe2893", size = 12465585, upload-time = "2025-09-19T00:10:33.005Z" },
+    { url = "https://files.pythonhosted.org/packages/77/ae/6c3d2c7c61ff21f2bee938c917616c92ebf852f015fb55917fd6e2811db2/mypy-1.18.2-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:01199871b6110a2ce984bde85acd481232d17413868c9807e95c1b0739a58914", size = 13348562, upload-time = "2025-09-19T00:10:11.51Z" },
+    { url = "https://files.pythonhosted.org/packages/4d/31/aec68ab3b4aebdf8f36d191b0685d99faa899ab990753ca0fee60fb99511/mypy-1.18.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:a2afc0fa0b0e91b4599ddfe0f91e2c26c2b5a5ab263737e998d6817874c5f7c8", size = 13533296, upload-time = "2025-09-19T00:10:06.568Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/9f/83/abcb3ad9478fca3ebeb6a5358bb0b22c95ea42b43b7789c7fb1297ca44f4/mypy-1.18.2-cp312-cp312-win_amd64.whl", hash = "sha256:d8068d0afe682c7c4897c0f7ce84ea77f6de953262b12d07038f4d296d547074", size = 9828828, upload-time = "2025-09-19T00:10:28.203Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/5f/04/7f462e6fbba87a72bc8097b93f6842499c428a6ff0c81dd46948d175afe8/mypy-1.18.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:07b8b0f580ca6d289e69209ec9d3911b4a26e5abfde32228a288eb79df129fcc", size = 12898728, upload-time = "2025-09-19T00:10:01.33Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/99/5b/61ed4efb64f1871b41fd0b82d29a64640f3516078f6c7905b68ab1ad8b13/mypy-1.18.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:ed4482847168439651d3feee5833ccedbf6657e964572706a2adb1f7fa4dfe2e", size = 11910758, upload-time = "2025-09-19T00:10:42.607Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/3c/46/d297d4b683cc89a6e4108c4250a6a6b717f5fa96e1a30a7944a6da44da35/mypy-1.18.2-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c3ad2afadd1e9fea5cf99a45a822346971ede8685cc581ed9cd4d42eaf940986", size = 12475342, upload-time = "2025-09-19T00:11:00.371Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/83/45/4798f4d00df13eae3bfdf726c9244bcb495ab5bd588c0eed93a2f2dd67f3/mypy-1.18.2-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a431a6f1ef14cf8c144c6b14793a23ec4eae3db28277c358136e79d7d062f62d", size = 13338709, upload-time = "2025-09-19T00:11:03.358Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/d7/09/479f7358d9625172521a87a9271ddd2441e1dab16a09708f056e97007207/mypy-1.18.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:7ab28cc197f1dd77a67e1c6f35cd1f8e8b73ed2217e4fc005f9e6a504e46e7ba", size = 13529806, upload-time = "2025-09-19T00:10:26.073Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/71/cf/ac0f2c7e9d0ea3c75cd99dff7aec1c9df4a1376537cb90e4c882267ee7e9/mypy-1.18.2-cp313-cp313-win_amd64.whl", hash = "sha256:0e2785a84b34a72ba55fb5daf079a1003a34c05b22238da94fcae2bbe46f3544", size = 9833262, upload-time = "2025-09-19T00:10:40.035Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/5a/0c/7d5300883da16f0063ae53996358758b2a2df2a09c72a5061fa79a1f5006/mypy-1.18.2-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:62f0e1e988ad41c2a110edde6c398383a889d95b36b3e60bcf155f5164c4fdce", size = 12893775, upload-time = "2025-09-19T00:10:03.814Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/50/df/2cffbf25737bdb236f60c973edf62e3e7b4ee1c25b6878629e88e2cde967/mypy-1.18.2-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:8795a039bab805ff0c1dfdb8cd3344642c2b99b8e439d057aba30850b8d3423d", size = 11936852, upload-time = "2025-09-19T00:10:51.631Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/be/50/34059de13dd269227fb4a03be1faee6e2a4b04a2051c82ac0a0b5a773c9a/mypy-1.18.2-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6ca1e64b24a700ab5ce10133f7ccd956a04715463d30498e64ea8715236f9c9c", size = 12480242, upload-time = "2025-09-19T00:11:07.955Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/5b/11/040983fad5132d85914c874a2836252bbc57832065548885b5bb5b0d4359/mypy-1.18.2-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d924eef3795cc89fecf6bedc6ed32b33ac13e8321344f6ddbf8ee89f706c05cb", size = 13326683, upload-time = "2025-09-19T00:09:55.572Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/e9/ba/89b2901dd77414dd7a8c8729985832a5735053be15b744c18e4586e506ef/mypy-1.18.2-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:20c02215a080e3a2be3aa50506c67242df1c151eaba0dcbc1e4e557922a26075", size = 13514749, upload-time = "2025-09-19T00:10:44.827Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/25/bc/cc98767cffd6b2928ba680f3e5bc969c4152bf7c2d83f92f5a504b92b0eb/mypy-1.18.2-cp314-cp314-win_amd64.whl", hash = "sha256:749b5f83198f1ca64345603118a6f01a4e99ad4bf9d103ddc5a3200cc4614adf", size = 9982959, upload-time = "2025-09-19T00:10:37.344Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/87/e3/be76d87158ebafa0309946c4a73831974d4d6ab4f4ef40c3b53a385a66fd/mypy-1.18.2-py3-none-any.whl", hash = "sha256:22a1748707dd62b58d2ae53562ffc4d7f8bcc727e8ac7cbc69c053ddc874d47e", size = 2352367, upload-time = "2025-09-19T00:10:15.489Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
@@ -1034,28 +1036,28 @@ wheels = [
|
||||
|
||||
[[package]]
|
||||
name = "ruff"
|
||||
version = "0.12.12"
|
||||
version = "0.13.1"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/a8/f0/e0965dd709b8cabe6356811c0ee8c096806bb57d20b5019eb4e48a117410/ruff-0.12.12.tar.gz", hash = "sha256:b86cd3415dbe31b3b46a71c598f4c4b2f550346d1ccf6326b347cc0c8fd063d6", size = 5359915, upload-time = "2025-09-04T16:50:18.273Z" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/ab/33/c8e89216845615d14d2d42ba2bee404e7206a8db782f33400754f3799f05/ruff-0.13.1.tar.gz", hash = "sha256:88074c3849087f153d4bb22e92243ad4c1b366d7055f98726bc19aa08dc12d51", size = 5397987, upload-time = "2025-09-18T19:52:44.33Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/09/79/8d3d687224d88367b51c7974cec1040c4b015772bfbeffac95face14c04a/ruff-0.12.12-py3-none-linux_armv6l.whl", hash = "sha256:de1c4b916d98ab289818e55ce481e2cacfaad7710b01d1f990c497edf217dafc", size = 12116602, upload-time = "2025-09-04T16:49:18.892Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/c3/c3/6e599657fe192462f94861a09aae935b869aea8a1da07f47d6eae471397c/ruff-0.12.12-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:7acd6045e87fac75a0b0cdedacf9ab3e1ad9d929d149785903cff9bb69ad9727", size = 12868393, upload-time = "2025-09-04T16:49:23.043Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/e8/d2/9e3e40d399abc95336b1843f52fc0daaceb672d0e3c9290a28ff1a96f79d/ruff-0.12.12-py3-none-macosx_11_0_arm64.whl", hash = "sha256:abf4073688d7d6da16611f2f126be86523a8ec4343d15d276c614bda8ec44edb", size = 12036967, upload-time = "2025-09-04T16:49:26.04Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/e9/03/6816b2ed08836be272e87107d905f0908be5b4a40c14bfc91043e76631b8/ruff-0.12.12-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:968e77094b1d7a576992ac078557d1439df678a34c6fe02fd979f973af167577", size = 12276038, upload-time = "2025-09-04T16:49:29.056Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/9f/d5/707b92a61310edf358a389477eabd8af68f375c0ef858194be97ca5b6069/ruff-0.12.12-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:42a67d16e5b1ffc6d21c5f67851e0e769517fb57a8ebad1d0781b30888aa704e", size = 11901110, upload-time = "2025-09-04T16:49:32.07Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/9d/3d/f8b1038f4b9822e26ec3d5b49cf2bc313e3c1564cceb4c1a42820bf74853/ruff-0.12.12-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b216ec0a0674e4b1214dcc998a5088e54eaf39417327b19ffefba1c4a1e4971e", size = 13668352, upload-time = "2025-09-04T16:49:35.148Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/98/0e/91421368ae6c4f3765dd41a150f760c5f725516028a6be30e58255e3c668/ruff-0.12.12-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:59f909c0fdd8f1dcdbfed0b9569b8bf428cf144bec87d9de298dcd4723f5bee8", size = 14638365, upload-time = "2025-09-04T16:49:38.892Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/74/5d/88f3f06a142f58ecc8ecb0c2fe0b82343e2a2b04dcd098809f717cf74b6c/ruff-0.12.12-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9ac93d87047e765336f0c18eacad51dad0c1c33c9df7484c40f98e1d773876f5", size = 14060812, upload-time = "2025-09-04T16:49:42.732Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/13/fc/8962e7ddd2e81863d5c92400820f650b86f97ff919c59836fbc4c1a6d84c/ruff-0.12.12-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:01543c137fd3650d322922e8b14cc133b8ea734617c4891c5a9fccf4bfc9aa92", size = 13050208, upload-time = "2025-09-04T16:49:46.434Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/53/06/8deb52d48a9a624fd37390555d9589e719eac568c020b27e96eed671f25f/ruff-0.12.12-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2afc2fa864197634e549d87fb1e7b6feb01df0a80fd510d6489e1ce8c0b1cc45", size = 13311444, upload-time = "2025-09-04T16:49:49.931Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/2a/81/de5a29af7eb8f341f8140867ffb93f82e4fde7256dadee79016ac87c2716/ruff-0.12.12-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:0c0945246f5ad776cb8925e36af2438e66188d2b57d9cf2eed2c382c58b371e5", size = 13279474, upload-time = "2025-09-04T16:49:53.465Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/7f/14/d9577fdeaf791737ada1b4f5c6b59c21c3326f3f683229096cccd7674e0c/ruff-0.12.12-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:a0fbafe8c58e37aae28b84a80ba1817f2ea552e9450156018a478bf1fa80f4e4", size = 12070204, upload-time = "2025-09-04T16:49:56.882Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/77/04/a910078284b47fad54506dc0af13839c418ff704e341c176f64e1127e461/ruff-0.12.12-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:b9c456fb2fc8e1282affa932c9e40f5ec31ec9cbb66751a316bd131273b57c23", size = 11880347, upload-time = "2025-09-04T16:49:59.729Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/df/58/30185fcb0e89f05e7ea82e5817b47798f7fa7179863f9d9ba6fd4fe1b098/ruff-0.12.12-py3-none-musllinux_1_2_i686.whl", hash = "sha256:5f12856123b0ad0147d90b3961f5c90e7427f9acd4b40050705499c98983f489", size = 12891844, upload-time = "2025-09-04T16:50:02.591Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/21/9c/28a8dacce4855e6703dcb8cdf6c1705d0b23dd01d60150786cd55aa93b16/ruff-0.12.12-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:26a1b5a2bf7dd2c47e3b46d077cd9c0fc3b93e6c6cc9ed750bd312ae9dc302ee", size = 13360687, upload-time = "2025-09-04T16:50:05.8Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/c8/fa/05b6428a008e60f79546c943e54068316f32ec8ab5c4f73e4563934fbdc7/ruff-0.12.12-py3-none-win32.whl", hash = "sha256:173be2bfc142af07a01e3a759aba6f7791aa47acf3604f610b1c36db888df7b1", size = 12052870, upload-time = "2025-09-04T16:50:09.121Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/85/60/d1e335417804df452589271818749d061b22772b87efda88354cf35cdb7a/ruff-0.12.12-py3-none-win_amd64.whl", hash = "sha256:e99620bf01884e5f38611934c09dd194eb665b0109104acae3ba6102b600fd0d", size = 13178016, upload-time = "2025-09-04T16:50:12.559Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/28/7e/61c42657f6e4614a4258f1c3b0c5b93adc4d1f8575f5229d1906b483099b/ruff-0.12.12-py3-none-win_arm64.whl", hash = "sha256:2a8199cab4ce4d72d158319b63370abf60991495fb733db96cd923a34c52d093", size = 12256762, upload-time = "2025-09-04T16:50:15.737Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/f3/41/ca37e340938f45cfb8557a97a5c347e718ef34702546b174e5300dbb1f28/ruff-0.13.1-py3-none-linux_armv6l.whl", hash = "sha256:b2abff595cc3cbfa55e509d89439b5a09a6ee3c252d92020bd2de240836cf45b", size = 12304308, upload-time = "2025-09-18T19:51:56.253Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/ff/84/ba378ef4129415066c3e1c80d84e539a0d52feb250685091f874804f28af/ruff-0.13.1-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:4ee9f4249bf7f8bb3984c41bfaf6a658162cdb1b22e3103eabc7dd1dc5579334", size = 12937258, upload-time = "2025-09-18T19:52:00.184Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/8d/b6/ec5e4559ae0ad955515c176910d6d7c93edcbc0ed1a3195a41179c58431d/ruff-0.13.1-py3-none-macosx_11_0_arm64.whl", hash = "sha256:5c5da4af5f6418c07d75e6f3224e08147441f5d1eac2e6ce10dcce5e616a3bae", size = 12214554, upload-time = "2025-09-18T19:52:02.753Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/70/d6/cb3e3b4f03b9b0c4d4d8f06126d34b3394f6b4d764912fe80a1300696ef6/ruff-0.13.1-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:80524f84a01355a59a93cef98d804e2137639823bcee2931f5028e71134a954e", size = 12448181, upload-time = "2025-09-18T19:52:05.279Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/d2/ea/bf60cb46d7ade706a246cd3fb99e4cfe854efa3dfbe530d049c684da24ff/ruff-0.13.1-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ff7f5ce8d7988767dd46a148192a14d0f48d1baea733f055d9064875c7d50389", size = 12104599, upload-time = "2025-09-18T19:52:07.497Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/2d/3e/05f72f4c3d3a69e65d55a13e1dd1ade76c106d8546e7e54501d31f1dc54a/ruff-0.13.1-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c55d84715061f8b05469cdc9a446aa6c7294cd4bd55e86a89e572dba14374f8c", size = 13791178, upload-time = "2025-09-18T19:52:10.189Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/81/e7/01b1fc403dd45d6cfe600725270ecc6a8f8a48a55bc6521ad820ed3ceaf8/ruff-0.13.1-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:ac57fed932d90fa1624c946dc67a0a3388d65a7edc7d2d8e4ca7bddaa789b3b0", size = 14814474, upload-time = "2025-09-18T19:52:12.866Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/fa/92/d9e183d4ed6185a8df2ce9faa3f22e80e95b5f88d9cc3d86a6d94331da3f/ruff-0.13.1-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c366a71d5b4f41f86a008694f7a0d75fe409ec298685ff72dc882f882d532e36", size = 14217531, upload-time = "2025-09-18T19:52:15.245Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/3b/4a/6ddb1b11d60888be224d721e01bdd2d81faaf1720592858ab8bac3600466/ruff-0.13.1-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f4ea9d1b5ad3e7a83ee8ebb1229c33e5fe771e833d6d3dcfca7b77d95b060d38", size = 13265267, upload-time = "2025-09-18T19:52:17.649Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/81/98/3f1d18a8d9ea33ef2ad508f0417fcb182c99b23258ec5e53d15db8289809/ruff-0.13.1-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b0f70202996055b555d3d74b626406476cc692f37b13bac8828acff058c9966a", size = 13243120, upload-time = "2025-09-18T19:52:20.332Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/8d/86/b6ce62ce9c12765fa6c65078d1938d2490b2b1d9273d0de384952b43c490/ruff-0.13.1-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:f8cff7a105dad631085d9505b491db33848007d6b487c3c1979dd8d9b2963783", size = 13443084, upload-time = "2025-09-18T19:52:23.032Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/a1/6e/af7943466a41338d04503fb5a81b2fd07251bd272f546622e5b1599a7976/ruff-0.13.1-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:9761e84255443316a258dd7dfbd9bfb59c756e52237ed42494917b2577697c6a", size = 12295105, upload-time = "2025-09-18T19:52:25.263Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/3f/97/0249b9a24f0f3ebd12f007e81c87cec6d311de566885e9309fcbac5b24cc/ruff-0.13.1-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:3d376a88c3102ef228b102211ef4a6d13df330cb0f5ca56fdac04ccec2a99700", size = 12072284, upload-time = "2025-09-18T19:52:27.478Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/f6/85/0b64693b2c99d62ae65236ef74508ba39c3febd01466ef7f354885e5050c/ruff-0.13.1-py3-none-musllinux_1_2_i686.whl", hash = "sha256:cbefd60082b517a82c6ec8836989775ac05f8991715d228b3c1d86ccc7df7dae", size = 12970314, upload-time = "2025-09-18T19:52:30.212Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/96/fc/342e9f28179915d28b3747b7654f932ca472afbf7090fc0c4011e802f494/ruff-0.13.1-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:dd16b9a5a499fe73f3c2ef09a7885cb1d97058614d601809d37c422ed1525317", size = 13422360, upload-time = "2025-09-18T19:52:32.676Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/37/54/6177a0dc10bce6f43e392a2192e6018755473283d0cf43cc7e6afc182aea/ruff-0.13.1-py3-none-win32.whl", hash = "sha256:55e9efa692d7cb18580279f1fbb525146adc401f40735edf0aaeabd93099f9a0", size = 12178448, upload-time = "2025-09-18T19:52:35.545Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/64/51/c6a3a33d9938007b8bdc8ca852ecc8d810a407fb513ab08e34af12dc7c24/ruff-0.13.1-py3-none-win_amd64.whl", hash = "sha256:3a3fb595287ee556de947183489f636b9f76a72f0fa9c028bdcabf5bab2cc5e5", size = 13286458, upload-time = "2025-09-18T19:52:38.198Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/fd/04/afc078a12cf68592345b1e2d6ecdff837d286bac023d7a22c54c7a698c5b/ruff-0.13.1-py3-none-win_arm64.whl", hash = "sha256:c0bae9ffd92d54e03c2bf266f466da0a65e145f298ee5b5846ed435f6a00518a", size = 12437893, upload-time = "2025-09-18T19:52:41.283Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
@@ -1217,7 +1219,7 @@ wheels = [
|
||||
|
||||
[[package]]
|
||||
name = "typer"
|
||||
version = "0.17.4"
|
||||
version = "0.19.2"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "click" },
|
||||
@@ -1225,9 +1227,9 @@ dependencies = [
|
||||
{ name = "shellingham" },
|
||||
{ name = "typing-extensions" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/92/e8/2a73ccf9874ec4c7638f172efc8972ceab13a0e3480b389d6ed822f7a822/typer-0.17.4.tar.gz", hash = "sha256:b77dc07d849312fd2bb5e7f20a7af8985c7ec360c45b051ed5412f64d8dc1580", size = 103734, upload-time = "2025-09-05T18:14:40.746Z" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/21/ca/950278884e2ca20547ff3eb109478c6baf6b8cf219318e6bc4f666fad8e8/typer-0.19.2.tar.gz", hash = "sha256:9ad824308ded0ad06cc716434705f691d4ee0bfd0fb081839d2e426860e7fdca", size = 104755, upload-time = "2025-09-23T09:47:48.256Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/93/72/6b3e70d32e89a5cbb6a4513726c1ae8762165b027af569289e19ec08edd8/typer-0.17.4-py3-none-any.whl", hash = "sha256:015534a6edaa450e7007eba705d5c18c3349dcea50a6ad79a5ed530967575824", size = 46643, upload-time = "2025-09-05T18:14:39.166Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/00/22/35617eee79080a5d071d0f14ad698d325ee6b3bf824fc0467c03b30e7fa8/typer-0.19.2-py3-none-any.whl", hash = "sha256:755e7e19670ffad8283db353267cb81ef252f595aa6834a0d1ca9312d9326cb9", size = 46748, upload-time = "2025-09-23T09:47:46.777Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
@@ -1262,16 +1264,16 @@ wheels = [
|
||||
|
||||
[[package]]
|
||||
name = "uvicorn"
|
||||
version = "0.35.0"
|
||||
version = "0.37.0"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "click" },
|
||||
{ name = "h11" },
|
||||
{ name = "typing-extensions", marker = "python_full_version < '3.11'" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/5e/42/e0e305207bb88c6b8d3061399c6a961ffe5fbb7e2aa63c9234df7259e9cd/uvicorn-0.35.0.tar.gz", hash = "sha256:bc662f087f7cf2ce11a1d7fd70b90c9f98ef2e2831556dd078d131b96cc94a01", size = 78473, upload-time = "2025-06-28T16:15:46.058Z" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/71/57/1616c8274c3442d802621abf5deb230771c7a0fec9414cb6763900eb3868/uvicorn-0.37.0.tar.gz", hash = "sha256:4115c8add6d3fd536c8ee77f0e14a7fd2ebba939fed9b02583a97f80648f9e13", size = 80367, upload-time = "2025-09-23T13:33:47.486Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/d2/e2/dc81b1bd1dcfe91735810265e9d26bc8ec5da45b4c0f6237e286819194c3/uvicorn-0.35.0-py3-none-any.whl", hash = "sha256:197535216b25ff9b785e29a0b79199f55222193d47f820816e7da751e9bc8d4a", size = 66406, upload-time = "2025-06-28T16:15:44.816Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/85/cd/584a2ceb5532af99dd09e50919e3615ba99aa127e9850eafe5f31ddfdb9a/uvicorn-0.37.0-py3-none-any.whl", hash = "sha256:913b2b88672343739927ce381ff9e2ad62541f9f8289664fa1d1d3803fa2ce6c", size = 67976, upload-time = "2025-09-23T13:33:45.842Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
|
||||
@@ -27,16 +27,6 @@ The LangChain ecosystem is built on top of `langchain-core`. Some of the benefit
- **Stability**: We are committed to a stable versioning scheme, and will communicate any breaking changes with advance notice and version bumps.
- **Battle-tested**: Core components have the largest install base in the LLM ecosystem, and are used in production by many companies.

## 1️⃣ Core Interface: Runnables

The concept of a `Runnable` is central to LangChain Core – it is the interface that most LangChain Core components implement, giving them:

- A common invocation interface (`invoke()`, `batch()`, `stream()`, etc.)
- Built-in utilities for retries, fallbacks, schemas, and runtime configurability
- Easy deployment with [LangGraph](https://github.com/langchain-ai/langgraph)

For more, check out the [`Runnable` docs](https://python.langchain.com/docs/concepts/runnables/). Components that implement the interface include Chat Models, Tools, Retrievers, and Output Parsers.
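
As a quick illustration, here is a minimal sketch of that common invocation interface using `RunnableLambda` (any component implementing `Runnable` exposes the same methods):

```python
from langchain_core.runnables import RunnableLambda

# Wrap a plain function as a Runnable.
greet = RunnableLambda(lambda name: f"Hello, {name}!")

print(greet.invoke("world"))         # single input -> single output
print(greet.batch(["Ada", "Alan"]))  # list of inputs -> list of outputs
for chunk in greet.stream("world"):  # iterate over streamed output chunks
    print(chunk)
```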
## 📕 Releases & Versioning

As `langchain-core` contains the base abstractions and runtime for the whole LangChain ecosystem, we will communicate any breaking changes with advance notice and version bumps. The exception is anything in `langchain_core.beta`: given the rate of change of the field, being able to move quickly is still a priority, and this module is our attempt to do so.
@@ -174,6 +174,7 @@ def beta(
def finalize(_wrapper: Callable[..., Any], new_doc: str) -> Any:
"""Finalize the property."""
return property(fget=_fget, fset=_fset, fdel=_fdel, doc=new_doc)

else:
_name = _name or obj.__qualname__
if not _obj_type:
@@ -226,17 +227,17 @@ def warn_beta(
) -> None:
"""Display a standardized beta annotation.

Arguments:
message : str, optional
Args:
message:
Override the default beta message. The
%(name)s, %(obj_type)s, %(addendum)s
format specifiers will be replaced by the
values of the respective arguments passed to this function.
name : str, optional
name:
The name of the annotated object.
obj_type : str, optional
obj_type:
The object type being annotated.
addendum : str, optional
addendum:
Additional text appended directly to the final message.
"""
if not message:

@@ -431,35 +431,35 @@ def warn_deprecated(
) -> None:
"""Display a standardized deprecation.

Arguments:
since : str
Args:
since:
The release at which this API became deprecated.
message : str, optional
message:
Override the default deprecation message. The %(since)s,
%(name)s, %(alternative)s, %(obj_type)s, %(addendum)s,
and %(removal)s format specifiers will be replaced by the
values of the respective arguments passed to this function.
name : str, optional
name:
The name of the deprecated object.
alternative : str, optional
alternative:
An alternative API that the user may use in place of the
deprecated API. The deprecation warning will tell the user
about this alternative if provided.
alternative_import: str, optional
alternative_import:
An alternative import that the user may use instead.
pending : bool, optional
pending:
If True, uses a PendingDeprecationWarning instead of a
DeprecationWarning. Cannot be used together with removal.
obj_type : str, optional
obj_type:
The object type being deprecated.
addendum : str, optional
addendum:
Additional text appended directly to the final message.
removal : str, optional
removal:
The expected removal version. With the default (an empty
string), a removal version is automatically computed from
since. Set to other Falsy values to not schedule a removal
date. Cannot be used together with pending.
package: str, optional
package:
The package of the deprecated object.
"""
if not pending:
@@ -6,7 +6,6 @@ import asyncio
import atexit
import functools
import logging
import uuid
from abc import ABC, abstractmethod
from concurrent.futures import ThreadPoolExecutor
from contextlib import asynccontextmanager, contextmanager
@@ -41,6 +40,7 @@ from langchain_core.tracers.langchain import LangChainTracer
from langchain_core.tracers.schemas import Run
from langchain_core.tracers.stdout import ConsoleCallbackHandler
from langchain_core.utils.env import env_var_is_set
from langchain_core.utils.uuid import uuid7

if TYPE_CHECKING:
from collections.abc import AsyncGenerator, Coroutine, Generator, Sequence
@@ -92,7 +92,7 @@ def trace_as_chain_group(
metadata (dict[str, Any], optional): The metadata to apply to all runs.
Defaults to None.

.. note:
.. note::
Must have ``LANGCHAIN_TRACING_V2`` env var set to true to see the trace in
LangSmith.

@@ -179,7 +179,7 @@ async def atrace_as_chain_group(
Yields:
The async callback manager for the chain group.

.. note:
.. note::
Must have ``LANGCHAIN_TRACING_V2`` env var set to true to see the trace in
LangSmith.

@@ -506,7 +506,7 @@ class BaseRunManager(RunManagerMixin):

"""
return cls(
run_id=uuid.uuid4(),
run_id=uuid7(),
handlers=[],
inheritable_handlers=[],
tags=[],
@@ -1339,7 +1339,7 @@ class CallbackManager(BaseCallbackManager):
managers = []
for i, prompt in enumerate(prompts):
# Can't have duplicate runs with the same run ID (if provided)
run_id_ = run_id if i == 0 and run_id is not None else uuid.uuid4()
run_id_ = run_id if i == 0 and run_id is not None else uuid7()
handle_event(
self.handlers,
"on_llm_start",
@@ -1394,7 +1394,7 @@ class CallbackManager(BaseCallbackManager):
run_id_ = run_id
run_id = None
else:
run_id_ = uuid.uuid4()
run_id_ = uuid7()
handle_event(
self.handlers,
"on_chat_model_start",
@@ -1443,7 +1443,7 @@ class CallbackManager(BaseCallbackManager):

"""
if run_id is None:
run_id = uuid.uuid4()
run_id = uuid7()
handle_event(
self.handlers,
"on_chain_start",
@@ -1498,7 +1498,7 @@ class CallbackManager(BaseCallbackManager):

"""
if run_id is None:
run_id = uuid.uuid4()
run_id = uuid7()

handle_event(
self.handlers,
@@ -1547,7 +1547,7 @@ class CallbackManager(BaseCallbackManager):
The callback manager for the retriever run.
"""
if run_id is None:
run_id = uuid.uuid4()
run_id = uuid7()

handle_event(
self.handlers,
@@ -1607,7 +1607,7 @@ class CallbackManager(BaseCallbackManager):
)
raise ValueError(msg)
if run_id is None:
run_id = uuid.uuid4()
run_id = uuid7()

handle_event(
self.handlers,
@@ -1843,7 +1843,7 @@ class AsyncCallbackManager(BaseCallbackManager):
run_id_ = run_id
run_id = None
else:
run_id_ = uuid.uuid4()
run_id_ = uuid7()

if inline_handlers:
inline_tasks.append(
@@ -1928,7 +1928,7 @@ class AsyncCallbackManager(BaseCallbackManager):
run_id_ = run_id
run_id = None
else:
run_id_ = uuid.uuid4()
run_id_ = uuid7()

for handler in self.handlers:
task = ahandle_event(
@@ -1991,7 +1991,7 @@ class AsyncCallbackManager(BaseCallbackManager):
for the chain run.
"""
if run_id is None:
run_id = uuid.uuid4()
run_id = uuid7()

await ahandle_event(
self.handlers,
@@ -2041,7 +2041,7 @@ class AsyncCallbackManager(BaseCallbackManager):
for the tool run.
"""
if run_id is None:
run_id = uuid.uuid4()
run_id = uuid7()

await ahandle_event(
self.handlers,
@@ -2093,7 +2093,7 @@ class AsyncCallbackManager(BaseCallbackManager):
if not self.handlers:
return
if run_id is None:
run_id = uuid.uuid4()
run_id = uuid7()

if kwargs:
msg = (
@@ -2136,7 +2136,7 @@ class AsyncCallbackManager(BaseCallbackManager):
for the retriever run.
"""
if run_id is None:
run_id = uuid.uuid4()
run_id = uuid7()

await ahandle_event(
self.handlers,
@@ -32,7 +32,7 @@ class UsageMetadataCallbackHandler(BaseCallbackHandler):
result_2 = llm_2.invoke("Hello", config={"callbacks": [callback]})
callback.usage_metadata

.. code-block:: none
.. code-block::

{'gpt-4o-mini-2024-07-18': {'input_tokens': 8,
'output_tokens': 10,
@@ -119,7 +119,7 @@ def get_usage_metadata_callback(
llm_2.invoke("Hello")
print(cb.usage_metadata)

.. code-block:: none
.. code-block::

{'gpt-4o-mini-2024-07-18': {'input_tokens': 8,
'output_tokens': 10,

@@ -31,7 +31,7 @@ class LangSmithLoader(BaseLoader):
for doc in loader.lazy_load():
docs.append(doc)

.. code-block:: pycon
.. code-block:: python

# -> [Document("...", metadata={"inputs": {...}, "outputs": {...}, ...}), ...]
@@ -296,7 +296,11 @@ def index(
For the time being, documents are indexed using their hashes, and users
are not able to specify the uid of the document.

Important:
.. versionchanged:: 0.3.25
Added ``scoped_full`` cleanup mode.

.. important::

* In full mode, the loader should be returning
the entire dataset, and not just a subset of the dataset.
Otherwise, the auto_cleanup will remove documents that it is not
@@ -309,7 +313,7 @@ def index(
chunks, and we index them using a batch size of 5, we'll have 3 batches
all with the same source id. In general, to avoid doing too much
redundant work select as big a batch size as possible.
* The `scoped_full` mode is suitable if determining an appropriate batch size
* The ``scoped_full`` mode is suitable if determining an appropriate batch size
is challenging or if your data loader cannot return the entire dataset at
once. This mode keeps track of source IDs in memory, which should be fine
for most use cases. If your dataset is large (10M+ docs), you will likely
@@ -378,10 +382,6 @@ def index(
TypeError: If ``vectorstore`` is not a VectorStore or a DocumentIndex.
AssertionError: If ``source_id`` is None when cleanup mode is incremental.
(should be unreachable code).

.. version_modified:: 0.3.25

* Added `scoped_full` cleanup mode.
"""
# Behavior is deprecated, but we keep it for backwards compatibility.
# # Warn only once per process.
@@ -636,26 +636,30 @@ async def aindex(
documents were deleted, which documents should be skipped.

For the time being, documents are indexed using their hashes, and users
are not able to specify the uid of the document.
are not able to specify the uid of the document.

Important:
* In full mode, the loader should be returning
the entire dataset, and not just a subset of the dataset.
Otherwise, the auto_cleanup will remove documents that it is not
supposed to.
* In incremental mode, if documents associated with a particular
source id appear across different batches, the indexing API
will do some redundant work. This will still result in the
correct end state of the index, but will unfortunately not be
100% efficient. For example, if a given document is split into 15
chunks, and we index them using a batch size of 5, we'll have 3 batches
all with the same source id. In general, to avoid doing too much
redundant work select as big a batch size as possible.
* The `scoped_full` mode is suitable if determining an appropriate batch size
is challenging or if your data loader cannot return the entire dataset at
once. This mode keeps track of source IDs in memory, which should be fine
for most use cases. If your dataset is large (10M+ docs), you will likely
need to parallelize the indexing process regardless.
.. versionchanged:: 0.3.25
Added ``scoped_full`` cleanup mode.

.. important::

* In full mode, the loader should be returning
the entire dataset, and not just a subset of the dataset.
Otherwise, the auto_cleanup will remove documents that it is not
supposed to.
* In incremental mode, if documents associated with a particular
source id appear across different batches, the indexing API
will do some redundant work. This will still result in the
correct end state of the index, but will unfortunately not be
100% efficient. For example, if a given document is split into 15
chunks, and we index them using a batch size of 5, we'll have 3 batches
all with the same source id. In general, to avoid doing too much
redundant work select as big a batch size as possible.
* The ``scoped_full`` mode is suitable if determining an appropriate batch size
is challenging or if your data loader cannot return the entire dataset at
once. This mode keeps track of source IDs in memory, which should be fine
for most use cases. If your dataset is large (10M+ docs), you will likely
need to parallelize the indexing process regardless.

Args:
docs_source: Data loader or iterable of documents to index.
@@ -720,10 +724,6 @@ async def aindex(
TypeError: If ``vector_store`` is not a VectorStore or DocumentIndex.
AssertionError: If ``source_id_key`` is None when cleanup mode is
incremental or ``scoped_full`` (should be unreachable).

.. version_modified:: 0.3.25

* Added `scoped_full` cleanup mode.
"""
# Behavior is deprecated, but we keep it for backwards compatibility.
# # Warn only once per process.
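
As a usage sketch of the cleanup modes discussed above (`record_manager` and `vector_store` are hypothetical stand-ins assumed to be in scope; parameter names follow the docstring):

```python
from langchain_core.documents import Document

docs = [Document(page_content="...", metadata={"source": "a.txt"})]

# scoped_full: full-style cleanup without loading the entire dataset per batch;
# source IDs seen during this run are tracked in memory.
result = index(
    docs,                    # docs_source: loader or iterable of documents
    record_manager,          # hypothetical RecordManager instance
    vector_store,            # hypothetical VectorStore instance
    cleanup="scoped_full",
    source_id_key="source",  # metadata key identifying each document's source
)
print(result)  # summary counts (added / updated / skipped / deleted)
```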
@@ -471,7 +471,7 @@ class BaseChatModel(BaseLanguageModel[BaseMessage], ABC):
**kwargs: Any,
) -> Iterator[BaseMessageChunk]:
if not self._should_stream(async_api=False, **{**kwargs, "stream": True}):
# model doesn't implement streaming, so use default implementation
# Model doesn't implement streaming, so use default implementation
yield cast(
"BaseMessageChunk",
self.invoke(input, config=config, stop=stop, **kwargs),

@@ -19,7 +19,7 @@ from langchain_core.runnables import RunnableConfig


class FakeMessagesListChatModel(BaseChatModel):
"""Fake ChatModel for testing purposes."""
"""Fake ``ChatModel`` for testing purposes."""

responses: list[BaseMessage]
"""List of responses to **cycle** through in order."""
@@ -212,10 +212,11 @@ class GenericFakeChatModel(BaseChatModel):
"""Generic fake chat model that can be used to test the chat model interface.

* Chat model should be usable in both sync and async tests
* Invokes on_llm_new_token to allow for testing of callback related code for new
* Invokes ``on_llm_new_token`` to allow for testing of callback related code for new
tokens.
* Includes logic to break messages into message chunks to facilitate testing of
streaming.

"""

messages: Iterator[Union[AIMessage, str]]
@@ -230,6 +231,7 @@ class GenericFakeChatModel(BaseChatModel):
.. warning::
Streaming is not implemented yet. We should try to implement it in the future by
delegating to invoke and then breaking the resulting output into message chunks.

"""

@override
@@ -351,6 +353,7 @@ class ParrotFakeChatModel(BaseChatModel):
"""Generic fake chat model that can be used to test the chat model interface.

* Chat model should be usable in both sync and async tests

"""

@override
@@ -6,7 +6,7 @@ from langchain_core._import_utils import import_attr

if TYPE_CHECKING:
from langchain_core.load.dump import dumpd, dumps
from langchain_core.load.load import loads
from langchain_core.load.load import InitValidator, loads
from langchain_core.load.serializable import Serializable

# Unfortunately, we have to eagerly import load from langchain_core/load/load.py
@@ -15,11 +15,19 @@ if TYPE_CHECKING:
# the `from langchain_core.load.load import load` absolute import should also work.
from langchain_core.load.load import load

__all__ = ("Serializable", "dumpd", "dumps", "load", "loads")
__all__ = (
"InitValidator",
"Serializable",
"dumpd",
"dumps",
"load",
"loads",
)

_dynamic_imports = {
"dumpd": "dump",
"dumps": "dump",
"InitValidator": "load",
"loads": "load",
"Serializable": "serializable",
}
libs/core/langchain_core/load/_validation.py (new file, 176 lines)
@@ -0,0 +1,176 @@
"""Validation utilities for LangChain serialization.
|
||||
|
||||
Provides escape-based protection against injection attacks in serialized objects. The
|
||||
approach uses an allowlist design: only dicts explicitly produced by
|
||||
`Serializable.to_json()` are treated as LC objects during deserialization.
|
||||
|
||||
## How escaping works
|
||||
|
||||
During serialization, plain dicts (user data) that contain an `'lc'` key are wrapped:
|
||||
|
||||
```python
|
||||
{"lc": 1, ...} # user data that looks like LC object
|
||||
# becomes:
|
||||
{"__lc_escaped__": {"lc": 1, ...}}
|
||||
```
|
||||
|
||||
During deserialization, escaped dicts are unwrapped and returned as plain dicts,
|
||||
NOT instantiated as LC objects.
|
||||
"""
|
||||
|
||||
from typing import Any
|
||||
|
||||
_LC_ESCAPED_KEY = "__lc_escaped__"
|
||||
"""Sentinel key used to mark escaped user dicts during serialization.
|
||||
|
||||
When a plain dict contains 'lc' key (which could be confused with LC objects),
|
||||
we wrap it as {"__lc_escaped__": {...original...}}.
|
||||
"""
|
||||
|
||||
|
||||
def _needs_escaping(obj: dict[str, Any]) -> bool:
|
||||
"""Check if a dict needs escaping to prevent confusion with LC objects.
|
||||
|
||||
A dict needs escaping if:
|
||||
|
||||
1. It has an `'lc'` key (could be confused with LC serialization format)
|
||||
2. It has only the escape key (would be mistaken for an escaped dict)
|
||||
"""
|
||||
return "lc" in obj or (len(obj) == 1 and _LC_ESCAPED_KEY in obj)
|
||||
|
||||
|
||||
def _escape_dict(obj: dict[str, Any]) -> dict[str, Any]:
|
||||
"""Wrap a dict in the escape marker.
|
||||
|
||||
Example:
|
||||
```python
|
||||
{"key": "value"} # becomes {"__lc_escaped__": {"key": "value"}}
|
||||
```
|
||||
"""
|
||||
return {_LC_ESCAPED_KEY: obj}
|
||||
|
||||
|
||||
def _is_escaped_dict(obj: dict[str, Any]) -> bool:
|
||||
"""Check if a dict is an escaped user dict.
|
||||
|
||||
Example:
|
||||
```python
|
||||
{"__lc_escaped__": {...}} # is an escaped dict
|
||||
```
|
||||
"""
|
||||
return len(obj) == 1 and _LC_ESCAPED_KEY in obj
|
||||
|
||||
|
||||
def _serialize_value(obj: Any) -> Any:
|
||||
"""Serialize a value with escaping of user dicts.
|
||||
|
||||
Called recursively on kwarg values to escape any plain dicts that could be confused
|
||||
with LC objects.
|
||||
|
||||
Args:
|
||||
obj: The value to serialize.
|
||||
|
||||
Returns:
|
||||
The serialized value with user dicts escaped as needed.
|
||||
"""
|
||||
from langchain_core.load.serializable import ( # noqa: PLC0415
|
||||
Serializable,
|
||||
to_json_not_implemented,
|
||||
)
|
||||
|
||||
if isinstance(obj, Serializable):
|
||||
# This is an LC object - serialize it properly (not escaped)
|
||||
return _serialize_lc_object(obj)
|
||||
if isinstance(obj, dict):
|
||||
if not all(isinstance(k, (str, int, float, bool, type(None))) for k in obj):
|
||||
# if keys are not json serializable
|
||||
return to_json_not_implemented(obj)
|
||||
# Check if dict needs escaping BEFORE recursing into values.
|
||||
# If it needs escaping, wrap it as-is - the contents are user data that
|
||||
# will be returned as-is during deserialization (no instantiation).
|
||||
# This prevents re-escaping of already-escaped nested content.
|
||||
if _needs_escaping(obj):
|
||||
return _escape_dict(obj)
|
||||
# Safe dict (no 'lc' key) - recurse into values
|
||||
return {k: _serialize_value(v) for k, v in obj.items()}
|
||||
if isinstance(obj, (list, tuple)):
|
||||
return [_serialize_value(item) for item in obj]
|
||||
if isinstance(obj, (str, int, float, bool, type(None))):
|
||||
return obj
|
||||
|
||||
# Non-JSON-serializable object (datetime, custom objects, etc.)
|
||||
return to_json_not_implemented(obj)
|
||||
|
||||
|
||||
def _is_lc_secret(obj: Any) -> bool:
|
||||
"""Check if an object is a LangChain secret marker."""
|
||||
expected_num_keys = 3
|
||||
return (
|
||||
isinstance(obj, dict)
|
||||
and obj.get("lc") == 1
|
||||
and obj.get("type") == "secret"
|
||||
and "id" in obj
|
||||
and len(obj) == expected_num_keys
|
||||
)
|
||||
|
||||
|
||||
def _serialize_lc_object(obj: Any) -> dict[str, Any]:
|
||||
"""Serialize a `Serializable` object with escaping of user data in kwargs.
|
||||
|
||||
Args:
|
||||
obj: The `Serializable` object to serialize.
|
||||
|
||||
Returns:
|
||||
The serialized dict with user data in kwargs escaped as needed.
|
||||
|
||||
Note:
|
||||
Kwargs values are processed with `_serialize_value` to escape user data (like
|
||||
metadata) that contains `'lc'` keys. Secret fields (from `lc_secrets`) are
|
||||
skipped because `to_json()` replaces their values with secret markers.
|
||||
"""
|
||||
from langchain_core.load.serializable import Serializable # noqa: PLC0415
|
||||
|
||||
if not isinstance(obj, Serializable):
|
||||
msg = f"Expected Serializable, got {type(obj)}"
|
||||
raise TypeError(msg)
|
||||
|
||||
serialized: dict[str, Any] = dict(obj.to_json())
|
||||
|
||||
# Process kwargs to escape user data that could be confused with LC objects
|
||||
# Skip secret fields - to_json() already converted them to secret markers
|
||||
if serialized.get("type") == "constructor" and "kwargs" in serialized:
|
||||
serialized["kwargs"] = {
|
||||
k: v if _is_lc_secret(v) else _serialize_value(v)
|
||||
for k, v in serialized["kwargs"].items()
|
||||
}
|
||||
|
||||
return serialized
|
||||
|
||||
|
||||
def _unescape_value(obj: Any) -> Any:
|
||||
"""Unescape a value, processing escape markers in dict values and lists.
|
||||
|
||||
When an escaped dict is encountered (`{"__lc_escaped__": ...}`), it's
|
||||
unwrapped and the contents are returned AS-IS (no further processing).
|
||||
The contents represent user data that should not be modified.
|
||||
|
||||
For regular dicts and lists, we recurse to find any nested escape markers.
|
||||
|
||||
Args:
|
||||
obj: The value to unescape.
|
||||
|
||||
Returns:
|
||||
The unescaped value.
|
||||
"""
|
||||
if isinstance(obj, dict):
|
||||
if _is_escaped_dict(obj):
|
||||
# Unwrap and return the user data as-is (no further unescaping).
|
||||
# The contents are user data that may contain more escape keys,
|
||||
# but those are part of the user's actual data.
|
||||
return obj[_LC_ESCAPED_KEY]
|
||||
|
||||
# Regular dict - recurse into values to find nested escape markers
|
||||
return {k: _unescape_value(v) for k, v in obj.items()}
|
||||
if isinstance(obj, list):
|
||||
return [_unescape_value(item) for item in obj]
|
||||
return obj
|
||||
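
To make the escape/unescape round trip concrete, here is a minimal sketch using the private helpers defined above (shown for understanding; they are internal APIs):

```python
# A user dict that happens to look like the LC serialization format.
payload = {"lc": 1, "type": "constructor", "id": ["evil", "Class"]}

assert _needs_escaping(payload)             # contains an 'lc' key
escaped = _escape_dict(payload)             # {"__lc_escaped__": {...}}
assert _is_escaped_dict(escaped)
assert _unescape_value(escaped) == payload  # returned as plain data, never instantiated
```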
@@ -1,10 +1,26 @@
"""Dump objects to json."""
"""Serialize LangChain objects to JSON.

Provides `dumps` (to JSON string) and `dumpd` (to dict) for serializing
`Serializable` objects.

## Escaping

During serialization, plain dicts (user data) that contain an `'lc'` key are escaped
by wrapping them: `{"__lc_escaped__": {...original...}}`. This prevents injection
attacks where malicious data could trick the deserializer into instantiating
arbitrary classes. The escape marker is removed during deserialization.

This is an allowlist approach: only dicts explicitly produced by
`Serializable.to_json()` are treated as LC objects; everything else is escaped if it
could be confused with the LC format.
"""

import json
from typing import Any

from pydantic import BaseModel

from langchain_core.load._validation import _serialize_value
from langchain_core.load.serializable import Serializable, to_json_not_implemented
from langchain_core.messages import AIMessage
from langchain_core.outputs import ChatGeneration
@@ -25,6 +41,20 @@ def default(obj: Any) -> Any:


def _dump_pydantic_models(obj: Any) -> Any:
"""Convert nested Pydantic models to dicts for JSON serialization.

Handles the special case where a `ChatGeneration` contains an `AIMessage`
with a parsed Pydantic model in `additional_kwargs["parsed"]`. Since
Pydantic models aren't directly JSON serializable, this converts them to
dicts.

Args:
obj: The object to process.

Returns:
A copy of the object with nested Pydantic models converted to dicts, or
the original object unchanged if no conversion was needed.
"""
if (
isinstance(obj, ChatGeneration)
and isinstance(obj.message, AIMessage)
@@ -40,12 +70,18 @@ def _dump_pydantic_models(obj: Any) -> Any:
def dumps(obj: Any, *, pretty: bool = False, **kwargs: Any) -> str:
"""Return a json string representation of an object.

Note:
Plain dicts containing an `'lc'` key are automatically escaped to prevent
confusion with LC serialization format. The escape marker is removed during
deserialization.

Args:
obj: The object to dump.
pretty: Whether to pretty print the json. If true, the json will be
indented with 2 spaces (if no indent is provided as part of kwargs).
Default is False.
kwargs: Additional arguments to pass to json.dumps
pretty: Whether to pretty print the json.

If `True`, the json will be indented by either 2 spaces or the amount
provided in the `indent` kwarg.
**kwargs: Additional arguments to pass to `json.dumps`

Returns:
A json string representation of the object.
@@ -56,25 +92,23 @@ def dumps(obj: Any, *, pretty: bool = False, **kwargs: Any) -> str:
if "default" in kwargs:
msg = "`default` should not be passed to dumps"
raise ValueError(msg)
try:
obj = _dump_pydantic_models(obj)
if pretty:
indent = kwargs.pop("indent", 2)
return json.dumps(obj, default=default, indent=indent, **kwargs)
return json.dumps(obj, default=default, **kwargs)
except TypeError:
if pretty:
indent = kwargs.pop("indent", 2)
return json.dumps(to_json_not_implemented(obj), indent=indent, **kwargs)
return json.dumps(to_json_not_implemented(obj), **kwargs)

obj = _dump_pydantic_models(obj)
serialized = _serialize_value(obj)

if pretty:
indent = kwargs.pop("indent", 2)
return json.dumps(serialized, indent=indent, **kwargs)
return json.dumps(serialized, **kwargs)


def dumpd(obj: Any) -> Any:
"""Return a dict representation of an object.

.. note::
Unfortunately this function is not as efficient as it could be because it first
dumps the object to a json string and then loads it back into a dictionary.
Note:
Plain dicts containing an `'lc'` key are automatically escaped to prevent
confusion with LC serialization format. The escape marker is removed during
deserialization.

Args:
obj: The object to dump.
@@ -82,4 +116,5 @@ def dumpd(obj: Any) -> Any:
Returns:
dictionary that can be serialized to json using json.dumps
"""
return json.loads(dumps(obj))
obj = _dump_pydantic_models(obj)
return _serialize_value(obj)
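
To make the escaping behavior of `dumps`/`dumpd` concrete, a minimal sketch (output shapes follow the docstrings above):

```python
from langchain_core.load import dumpd, dumps
from langchain_core.messages import AIMessage

# An LC object serializes to the constructor format.
print(dumpd(AIMessage(content="hi"))["type"])  # "constructor"

# A plain user dict with an 'lc' key is escaped, not treated as an LC object.
print(dumpd({"lc": 1, "payload": "user data"}))
# -> {"__lc_escaped__": {"lc": 1, "payload": "user data"}}

# dumps returns the JSON string form, optionally pretty-printed.
print(dumps({"note": "plain data"}, pretty=True))
```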
@@ -1,11 +1,85 @@
|
||||
"""Load LangChain objects from JSON strings or objects."""
|
||||
"""Load LangChain objects from JSON strings or objects.
|
||||
|
||||
## How it works
|
||||
|
||||
Each `Serializable` LangChain object has a unique identifier (its "class path"), which
|
||||
is a list of strings representing the module path and class name. For example:
|
||||
|
||||
- `AIMessage` -> `["langchain_core", "messages", "ai", "AIMessage"]`
|
||||
- `ChatPromptTemplate` -> `["langchain_core", "prompts", "chat", "ChatPromptTemplate"]`
|
||||
|
||||
When deserializing, the class path from the JSON `'id'` field is checked against an
|
||||
allowlist. If the class is not in the allowlist, deserialization raises a `ValueError`.
|
||||
|
||||
## Security model
|
||||
|
||||
The `allowed_objects` parameter controls which classes can be deserialized:
|
||||
|
||||
- **`'core'` (default)**: Allow classes defined in the serialization mappings for
|
||||
langchain_core.
|
||||
- **`'all'`**: Allow classes defined in the serialization mappings. This
|
||||
includes core LangChain types (messages, prompts, documents, etc.) and trusted
|
||||
partner integrations. See `langchain_core.load.mapping` for the full list.
|
||||
- **Explicit list of classes**: Only those specific classes are allowed.
|
||||
|
||||
For simple data types like messages and documents, the default allowlist is safe to use.
|
||||
These classes do not perform side effects during initialization.
|
||||
|
||||
!!! note "Side effects in allowed classes"
|
||||
|
||||
Deserialization calls `__init__` on allowed classes. If those classes perform side
|
||||
effects during initialization (network calls, file operations, etc.), those side
|
||||
effects will occur. The allowlist prevents instantiation of classes outside the
|
||||
allowlist, but does not sandbox the allowed classes themselves.
|
||||
|
||||
Import paths are also validated against trusted namespaces before any module is
|
||||
imported.
|
||||
|
||||
### Injection protection (escape-based)
|
||||
|
||||
During serialization, plain dicts that contain an `'lc'` key are escaped by wrapping
|
||||
them: `{"__lc_escaped__": {...}}`. During deserialization, escaped dicts are unwrapped
|
||||
and returned as plain dicts, NOT instantiated as LC objects.
|
||||
|
||||
This is an allowlist approach: only dicts explicitly produced by
|
||||
`Serializable.to_json()` (which are NOT escaped) are treated as LC objects;
|
||||
everything else is user data.
|
||||
|
||||
Even if an attacker's payload includes `__lc_escaped__` wrappers, it will be unwrapped
|
||||
to plain dicts and NOT instantiated as malicious objects.
|
||||
|
||||
## Examples
|
||||
|
||||
```python
|
||||
from langchain_core.load import load
|
||||
from langchain_core.prompts import ChatPromptTemplate
|
||||
from langchain_core.messages import AIMessage, HumanMessage
|
||||
|
||||
# Use default allowlist (classes from mappings) - recommended
|
||||
obj = load(data)
|
||||
|
||||
# Allow only specific classes (most restrictive)
|
||||
obj = load(
|
||||
data,
|
||||
allowed_objects=[
|
||||
ChatPromptTemplate,
|
||||
AIMessage,
|
||||
HumanMessage,
|
||||
],
|
||||
)
|
||||
```
|
||||
"""

from __future__ import annotations

import importlib
import json
import os
from typing import Any, Optional
from collections.abc import Callable, Iterable
from typing import Any, Literal, Optional, cast

from langchain_core._api import beta
from langchain_core.load._validation import _is_escaped_dict, _unescape_value
from langchain_core.load.mapping import (
    _JS_SERIALIZABLE_MAPPING,
    _OG_SERIALIZABLE_MAPPING,
@@ -44,12 +118,167 @@ ALL_SERIALIZABLE_MAPPINGS = {
    **_JS_SERIALIZABLE_MAPPING,
}

# Cache for the default allowed class paths computed from mappings
# Maps mode ("all" or "core") to the cached set of paths
_default_class_paths_cache: dict[str, set[tuple[str, ...]]] = {}


def _get_default_allowed_class_paths(
    allowed_object_mode: Literal["all", "core"],
) -> set[tuple[str, ...]]:
    """Get the default allowed class paths from the serialization mappings.

    This uses the mappings as the source of truth for what classes are allowed
    by default. Both the legacy paths (keys) and current paths (values) are included.

    Args:
        allowed_object_mode: either `'all'` or `'core'`.

    Returns:
        Set of class path tuples that are allowed by default.
    """
    if allowed_object_mode in _default_class_paths_cache:
        return _default_class_paths_cache[allowed_object_mode]

    allowed_paths: set[tuple[str, ...]] = set()
    for key, value in ALL_SERIALIZABLE_MAPPINGS.items():
        if allowed_object_mode == "core" and value[0] != "langchain_core":
            continue
        allowed_paths.add(key)
        allowed_paths.add(value)

    _default_class_paths_cache[allowed_object_mode] = allowed_paths
    return _default_class_paths_cache[allowed_object_mode]


def _block_jinja2_templates(
    class_path: tuple[str, ...],
    kwargs: dict[str, Any],
) -> None:
    """Block jinja2 templates during deserialization for security.

    Jinja2 templates can execute arbitrary code, so they are blocked by default when
    deserializing objects with `template_format='jinja2'`.

    Note:
        We intentionally do NOT check the `class_path` here to keep this simple and
        future-proof. If any new class is added that accepts `template_format='jinja2'`,
        it will be automatically blocked without needing to update this function.

    Args:
        class_path: The class path tuple being deserialized (unused).
        kwargs: The kwargs dict for the class constructor.

    Raises:
        ValueError: If `template_format` is `'jinja2'`.
    """
    _ = class_path  # Unused - see docstring for rationale. Kept to satisfy signature.
    if kwargs.get("template_format") == "jinja2":
        msg = (
            "Jinja2 templates are not allowed during deserialization for security "
            "reasons. Use 'f-string' template format instead, or explicitly allow "
            "jinja2 by providing a custom init_validator."
        )
        raise ValueError(msg)
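A quick illustration of this behavior (a sketch; `_block_jinja2_templates` is a
private helper, so the import path is an assumption based on this module's location):

```python
from langchain_core.load.load import _block_jinja2_templates

class_path = ("langchain_core", "prompts", "prompt", "PromptTemplate")

# f-string templates pass through silently.
_block_jinja2_templates(class_path, {"template_format": "f-string"})

# jinja2 templates are rejected, regardless of which class is being revived.
try:
    _block_jinja2_templates(class_path, {"template_format": "jinja2"})
except ValueError as err:
    print(err)
```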


def default_init_validator(
    class_path: tuple[str, ...],
    kwargs: dict[str, Any],
) -> None:
    """Default init validator that blocks jinja2 templates.

    This is the default validator used by `load()` and `loads()` when no custom
    validator is provided.

    Args:
        class_path: The class path tuple being deserialized.
        kwargs: The kwargs dict for the class constructor.

    Raises:
        ValueError: If template_format is `'jinja2'`.
    """
    _block_jinja2_templates(class_path, kwargs)


AllowedObject = type[Serializable]
"""Type alias for classes that can be included in the `allowed_objects` parameter.

Must be a `Serializable` subclass (the class itself, not an instance).
"""

InitValidator = Callable[[tuple[str, ...], dict[str, Any]], None]
"""Type alias for a callable that validates kwargs during deserialization.

The callable receives:

- `class_path`: A tuple of strings identifying the class being instantiated
  (e.g., `('langchain', 'schema', 'messages', 'AIMessage')`).
- `kwargs`: The kwargs dict that will be passed to the constructor.

The validator should raise an exception if the object should not be deserialized.
"""


def _compute_allowed_class_paths(
    allowed_objects: Iterable[AllowedObject],
    import_mappings: dict[tuple[str, ...], tuple[str, ...]],
) -> set[tuple[str, ...]]:
    """Return allowed class paths from an explicit list of classes.

    A class path is a tuple of strings identifying a serializable class, derived from
    `Serializable.lc_id()`. For example: `('langchain_core', 'messages', 'AIMessage')`.

    Args:
        allowed_objects: Iterable of `Serializable` subclasses to allow.
        import_mappings: Mapping of legacy class paths to current class paths.

    Returns:
        Set of allowed class paths.

    Example:
        ```python
        # Allow a specific class
        _compute_allowed_class_paths([MyPrompt], {}) ->
            {("langchain_core", "prompts", "MyPrompt")}

        # Include legacy paths that map to the same class
        import_mappings = {("old", "Prompt"): ("langchain_core", "prompts", "MyPrompt")}
        _compute_allowed_class_paths([MyPrompt], import_mappings) ->
            {("langchain_core", "prompts", "MyPrompt"), ("old", "Prompt")}
        ```
    """
    allowed_objects_list = list(allowed_objects)

    allowed_class_paths: set[tuple[str, ...]] = set()
    for allowed_obj in allowed_objects_list:
        if not isinstance(allowed_obj, type) or not issubclass(
            allowed_obj, Serializable
        ):
            msg = "allowed_objects must contain Serializable subclasses."
            raise TypeError(msg)

        class_path = tuple(allowed_obj.lc_id())
        allowed_class_paths.add(class_path)
        # Add legacy paths that map to the same class.
        for mapping_key, mapping_value in import_mappings.items():
            if tuple(mapping_value) == class_path:
                allowed_class_paths.add(mapping_key)
    return allowed_class_paths


class Reviver:
    """Reviver for JSON objects."""
    """Reviver for JSON objects.

    Used as the `object_hook` for `json.loads` to reconstruct LangChain objects from
    their serialized JSON representation.

    Only classes in the allowlist can be instantiated.
    """

    def __init__(
        self,
        allowed_objects: Iterable[AllowedObject] | Literal["all", "core"] = "core",
        secrets_map: Optional[dict[str, str]] = None,
        valid_namespaces: Optional[list[str]] = None,
        secrets_from_env: bool = True,  # noqa: FBT001,FBT002
@@ -58,22 +287,51 @@ class Reviver:
        ] = None,
        *,
        ignore_unserializable_fields: bool = False,
        init_validator: InitValidator | None = default_init_validator,
    ) -> None:
        """Initialize the reviver.

        Args:
            secrets_map: A map of secrets to load. If a secret is not found in
                the map, it will be loaded from the environment if `secrets_from_env`
                is True. Defaults to None.
            valid_namespaces: A list of additional namespaces (modules)
                to allow to be deserialized. Defaults to None.
            allowed_objects: Allowlist of classes that can be deserialized.

                - `'core'` (default): Allow classes defined in the serialization
                  mappings for `langchain_core`.
                - `'all'`: Allow classes defined in the serialization mappings.

                  This includes core LangChain types (messages, prompts, documents,
                  etc.) and trusted partner integrations. See
                  `langchain_core.load.mapping` for the full list.
                - Explicit list of classes: Only those specific classes are allowed.
            secrets_map: A map of secrets to load.
                If a secret is not found in the map, it will be loaded from the
                environment if `secrets_from_env` is `True`.

                Defaults to `None`.
            valid_namespaces: Additional namespaces (modules) to allow during
                deserialization, beyond the default trusted namespaces.

                Defaults to `None`.
            secrets_from_env: Whether to load secrets from the environment.
                Defaults to True.
            additional_import_mappings: A dictionary of additional namespace mappings

                Defaults to `True`.
            additional_import_mappings: A dictionary of additional namespace mappings.

                You can use this to override default mappings or add new mappings.
                Defaults to None.

                When `allowed_objects` is `None` (using defaults), paths from these
                mappings are also added to the allowed class paths.

                Defaults to `None`.
            ignore_unserializable_fields: Whether to ignore unserializable fields.
                Defaults to False.

                Defaults to `False`.
            init_validator: Optional callable to validate kwargs before instantiation.

                If provided, this function is called with `(class_path, kwargs)` where
                `class_path` is the class path tuple and `kwargs` is the kwargs dict.
                The validator should raise an exception if the object should not be
                deserialized, otherwise return `None`.

                Defaults to `default_init_validator` which blocks jinja2 templates.
        """
        self.secrets_from_env = secrets_from_env
        self.secrets_map = secrets_map or {}
@@ -92,7 +350,26 @@ class Reviver:
            if self.additional_import_mappings
            else ALL_SERIALIZABLE_MAPPINGS
        )
        # Compute allowed class paths:
        # - "all" -> use default paths from mappings (+ additional_import_mappings)
        # - Explicit list -> compute from those classes
        if allowed_objects in ("all", "core"):
            self.allowed_class_paths: set[tuple[str, ...]] | None = (
                _get_default_allowed_class_paths(
                    cast("Literal['all', 'core']", allowed_objects)
                ).copy()
            )
            # Add paths from additional_import_mappings to the defaults
            if self.additional_import_mappings:
                for key, value in self.additional_import_mappings.items():
                    self.allowed_class_paths.add(key)
                    self.allowed_class_paths.add(value)
        else:
            self.allowed_class_paths = _compute_allowed_class_paths(
                cast("Iterable[AllowedObject]", allowed_objects), self.import_mappings
            )
        self.ignore_unserializable_fields = ignore_unserializable_fields
        self.init_validator = init_validator

    def __call__(self, value: dict[str, Any]) -> Any:
        """Revive the value.
@@ -143,6 +420,20 @@ class Reviver:
            [*namespace, name] = value["id"]
            mapping_key = tuple(value["id"])

            if (
                self.allowed_class_paths is not None
                and mapping_key not in self.allowed_class_paths
            ):
                msg = (
                    f"Deserialization of {mapping_key!r} is not allowed. "
                    "The default (allowed_objects='core') only permits core "
                    "langchain-core classes. To allow trusted partner integrations, "
                    "use allowed_objects='all'. Alternatively, pass an explicit list "
                    "of allowed classes via allowed_objects=[...]. "
                    "See langchain_core.load.mapping for the full allowlist."
                )
                raise ValueError(msg)

            if (
                namespace[0] not in self.valid_namespaces
                # The root namespace ["langchain"] is not a valid identifier.
@@ -150,13 +441,11 @@ class Reviver:
            ):
                msg = f"Invalid namespace: {value}"
                raise ValueError(msg)
            # Has explicit import path.
            # Determine explicit import path
            if mapping_key in self.import_mappings:
                import_path = self.import_mappings[mapping_key]
                # Split into module and name
                import_dir, name = import_path[:-1], import_path[-1]
                # Import module
                mod = importlib.import_module(".".join(import_dir))
            elif namespace[0] in DISALLOW_LOAD_FROM_PATH:
                msg = (
                    "Trying to deserialize something that cannot "
@@ -164,9 +453,16 @@ class Reviver:
                    f"{mapping_key}."
                )
                raise ValueError(msg)
            # Otherwise, treat namespace as path.
            else:
                mod = importlib.import_module(".".join(namespace))
                # Otherwise, treat namespace as path.
                import_dir = namespace

            # Validate import path is in trusted namespaces before importing
            if import_dir[0] not in self.valid_namespaces:
                msg = f"Invalid namespace: {value}"
                raise ValueError(msg)

            mod = importlib.import_module(".".join(import_dir))

            cls = getattr(mod, name)

@@ -178,6 +474,10 @@ class Reviver:
            # We don't need to recurse on kwargs
            # as json.loads will do that for us.
            kwargs = value.get("kwargs", {})

            if self.init_validator is not None:
                self.init_validator(mapping_key, kwargs)

            return cls(**kwargs)

        return value
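The allowlist check above is what callers observe as a `ValueError`. A sketch of the
failure mode (the class path is deliberately fictional; the exact error text may
differ):

```python
from langchain_core.load import load

payload = {
    "lc": 1,
    "type": "constructor",
    "id": ["evil_pkg", "tools", "EvilTool"],  # hypothetical, not in any mapping
    "kwargs": {},
}

try:
    load(payload)  # class path is not in the default 'core' allowlist
except ValueError as err:
    print(err)
```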
@@ -187,43 +487,76 @@ class Reviver:
def loads(
    text: str,
    *,
    allowed_objects: Iterable[AllowedObject] | Literal["all", "core"] = "core",
    secrets_map: Optional[dict[str, str]] = None,
    valid_namespaces: Optional[list[str]] = None,
    secrets_from_env: bool = True,
    additional_import_mappings: Optional[dict[tuple[str, ...], tuple[str, ...]]] = None,
    ignore_unserializable_fields: bool = False,
    init_validator: InitValidator | None = default_init_validator,
) -> Any:
    """Revive a LangChain class from a JSON string.

    Equivalent to `load(json.loads(text))`.

    Only classes in the allowlist can be instantiated. The default allowlist includes
    core LangChain types (messages, prompts, documents, etc.). See
    `langchain_core.load.mapping` for the full list.

    Args:
        text: The string to load.
        secrets_map: A map of secrets to load. If a secret is not found in
            the map, it will be loaded from the environment if `secrets_from_env`
            is True. Defaults to None.
        valid_namespaces: A list of additional namespaces (modules)
            to allow to be deserialized. Defaults to None.
        allowed_objects: Allowlist of classes that can be deserialized.

            - `'core'` (default): Allow classes defined in the serialization mappings
              for langchain_core.
            - `'all'`: Allow classes defined in the serialization mappings.

              This includes core LangChain types (messages, prompts, documents, etc.)
              and trusted partner integrations. See `langchain_core.load.mapping` for
              the full list.
            - Explicit list of classes: Only those specific classes are allowed.
            - `[]`: Disallow all deserialization (will raise on any object).
        secrets_map: A map of secrets to load.

            If a secret is not found in the map, it will be loaded from the environment
            if `secrets_from_env` is `True`. Defaults to None.
        valid_namespaces: Additional namespaces (modules) to allow during
            deserialization, beyond the default trusted namespaces. Defaults to None.
        secrets_from_env: Whether to load secrets from the environment.
            Defaults to True.
        additional_import_mappings: A dictionary of additional namespace mappings
        additional_import_mappings: A dictionary of additional namespace mappings.

            You can use this to override default mappings or add new mappings.
            Defaults to None.

            When `allowed_objects` is `None` (using defaults), paths from these
            mappings are also added to the allowed class paths. Defaults to None.
        ignore_unserializable_fields: Whether to ignore unserializable fields.
            Defaults to False.
        init_validator: Optional callable to validate kwargs before instantiation.

            If provided, this function is called with `(class_path, kwargs)` where
            `class_path` is the class path tuple and `kwargs` is the kwargs dict.
            The validator should raise an exception if the object should not be
            deserialized, otherwise return `None`. Defaults to
            `default_init_validator` which blocks jinja2 templates.

    Returns:
        Revived LangChain objects.

    Raises:
        ValueError: If an object's class path is not in the `allowed_objects` allowlist.
    """
    return json.loads(
        text,
        object_hook=Reviver(
            secrets_map,
            valid_namespaces,
            secrets_from_env,
            additional_import_mappings,
            ignore_unserializable_fields=ignore_unserializable_fields,
        ),
    # Parse JSON and delegate to load() for proper escape handling
    raw_obj = json.loads(text)
    return load(
        raw_obj,
        allowed_objects=allowed_objects,
        secrets_map=secrets_map,
        valid_namespaces=valid_namespaces,
        secrets_from_env=secrets_from_env,
        additional_import_mappings=additional_import_mappings,
        ignore_unserializable_fields=ignore_unserializable_fields,
        init_validator=init_validator,
    )
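A round-trip through `loads` then looks like the following sketch (assuming `dumps`
from the same package produces the matching JSON form, and that message equality
compares field values):

```python
from langchain_core.load import dumps, loads
from langchain_core.messages import HumanMessage

text = dumps(HumanMessage(content="hi"))
msg = loads(text)  # the default 'core' allowlist covers message types
assert msg == HumanMessage(content="hi")
```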


@@ -231,46 +564,107 @@ def loads(
def load(
    obj: Any,
    *,
    allowed_objects: Iterable[AllowedObject] | Literal["all", "core"] = "core",
    secrets_map: Optional[dict[str, str]] = None,
    valid_namespaces: Optional[list[str]] = None,
    secrets_from_env: bool = True,
    additional_import_mappings: Optional[dict[tuple[str, ...], tuple[str, ...]]] = None,
    ignore_unserializable_fields: bool = False,
    init_validator: InitValidator | None = default_init_validator,
) -> Any:
    """Revive a LangChain class from a JSON object.

    Use this if you already have a parsed JSON object,
    eg. from `json.load` or `orjson.loads`.
    Use this if you already have a parsed JSON object, e.g. from `json.load` or
    `orjson.loads`.

    Only classes in the allowlist can be instantiated. The default allowlist includes
    core LangChain types (messages, prompts, documents, etc.). See
    `langchain_core.load.mapping` for the full list.

    Args:
        obj: The object to load.
        secrets_map: A map of secrets to load. If a secret is not found in
            the map, it will be loaded from the environment if `secrets_from_env`
            is True. Defaults to None.
        valid_namespaces: A list of additional namespaces (modules)
            to allow to be deserialized. Defaults to None.
        allowed_objects: Allowlist of classes that can be deserialized.

            - `'core'` (default): Allow classes defined in the serialization mappings
              for langchain_core.
            - `'all'`: Allow classes defined in the serialization mappings.

              This includes core LangChain types (messages, prompts, documents, etc.)
              and trusted partner integrations. See `langchain_core.load.mapping` for
              the full list.
            - Explicit list of classes: Only those specific classes are allowed.
            - `[]`: Disallow all deserialization (will raise on any object).
        secrets_map: A map of secrets to load.

            If a secret is not found in the map, it will be loaded from the environment
            if `secrets_from_env` is `True`. Defaults to None.
        valid_namespaces: Additional namespaces (modules) to allow during
            deserialization, beyond the default trusted namespaces. Defaults to None.
        secrets_from_env: Whether to load secrets from the environment.
            Defaults to True.
        additional_import_mappings: A dictionary of additional namespace mappings
        additional_import_mappings: A dictionary of additional namespace mappings.

            You can use this to override default mappings or add new mappings.
            Defaults to None.

            When `allowed_objects` is `None` (using defaults), paths from these
            mappings are also added to the allowed class paths. Defaults to None.
        ignore_unserializable_fields: Whether to ignore unserializable fields.
            Defaults to False.
        init_validator: Optional callable to validate kwargs before instantiation.

            If provided, this function is called with `(class_path, kwargs)` where
            `class_path` is the class path tuple and `kwargs` is the kwargs dict.
            The validator should raise an exception if the object should not be
            deserialized, otherwise return `None`. Defaults to
            `default_init_validator` which blocks jinja2 templates.

    Returns:
        Revived LangChain objects.

    Raises:
        ValueError: If an object's class path is not in the `allowed_objects` allowlist.

    Example:
        ```python
        from langchain_core.load import load, dumpd
        from langchain_core.messages import AIMessage

        msg = AIMessage(content="Hello")
        data = dumpd(msg)

        # Deserialize using default allowlist
        loaded = load(data)

        # Or with explicit allowlist
        loaded = load(data, allowed_objects=[AIMessage])

        # Or extend defaults with additional mappings
        loaded = load(
            data,
            additional_import_mappings={
                ("my_pkg", "MyClass"): ("my_pkg", "module", "MyClass"),
            },
        )
        ```
    """
    reviver = Reviver(
        allowed_objects,
        secrets_map,
        valid_namespaces,
        secrets_from_env,
        additional_import_mappings,
        ignore_unserializable_fields=ignore_unserializable_fields,
        init_validator=init_validator,
    )

    def _load(obj: Any) -> Any:
        if isinstance(obj, dict):
            # Need to revive leaf nodes before reviving this node
            # Check for escaped dict FIRST (before recursing).
            # Escaped dicts are user data that should NOT be processed as LC objects.
            if _is_escaped_dict(obj):
                return _unescape_value(obj)

            # Not escaped - recurse into children then apply reviver
            loaded_obj = {k: _load(v) for k, v in obj.items()}
            return reviver(loaded_obj)
        if isinstance(obj, list):
@@ -1,21 +1,19 @@
"""Serialization mapping.

This file contains a mapping between the lc_namespace path for a given
subclass that implements from Serializable to the namespace
This file contains a mapping between the `lc_namespace` path for a given
subclass that implements from `Serializable` to the namespace
where that class is actually located.

This mapping helps maintain the ability to serialize and deserialize
well-known LangChain objects even if they are moved around in the codebase
across different LangChain versions.

For example,
For example, the code for the `AIMessage` class is located in
`langchain_core.messages.ai.AIMessage`. This message is associated with the
`lc_namespace` of `["langchain", "schema", "messages", "AIMessage"]`,
because this code was originally in `langchain.schema.messages.AIMessage`.

The code for AIMessage class is located in langchain_core.messages.ai.AIMessage,
This message is associated with the lc_namespace
["langchain", "schema", "messages", "AIMessage"],
because this code was originally in langchain.schema.messages.AIMessage.

The mapping allows us to deserialize an AIMessage created with an older
The mapping allows us to deserialize an `AIMessage` created with an older
version of LangChain where the code was in a different location.
"""
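Concretely, the lookup described above amounts to a dict access. A sketch against the
`SERIALIZABLE_MAPPING` constant shown in the hunks below:

```python
from langchain_core.load.mapping import SERIALIZABLE_MAPPING

legacy = ("langchain", "schema", "messages", "AIMessage")
print(SERIALIZABLE_MAPPING.get(legacy))
# expected: ("langchain_core", "messages", "ai", "AIMessage")
```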
@@ -275,6 +273,11 @@ SERIALIZABLE_MAPPING: dict[tuple[str, ...], tuple[str, ...]] = {
        "chat_models",
        "ChatGroq",
    ),
    ("langchain_xai", "chat_models", "ChatXAI"): (
        "langchain_xai",
        "chat_models",
        "ChatXAI",
    ),
    ("langchain", "chat_models", "fireworks", "ChatFireworks"): (
        "langchain_fireworks",
        "chat_models",
@@ -530,16 +533,6 @@ SERIALIZABLE_MAPPING: dict[tuple[str, ...], tuple[str, ...]] = {
        "structured",
        "StructuredPrompt",
    ),
    ("langchain_sambanova", "chat_models", "ChatSambaNovaCloud"): (
        "langchain_sambanova",
        "chat_models",
        "ChatSambaNovaCloud",
    ),
    ("langchain_sambanova", "chat_models", "ChatSambaStudio"): (
        "langchain_sambanova",
        "chat_models",
        "ChatSambaStudio",
    ),
    ("langchain_core", "prompts", "message", "_DictMessagePromptTemplate"): (
        "langchain_core",
        "prompts",
@@ -111,7 +111,7 @@ class Serializable(BaseModel, ABC):

    # Remove default BaseModel init docstring.
    def __init__(self, *args: Any, **kwargs: Any) -> None:
        """"""  # noqa: D419
        """"""  # noqa: D419  # Intentional blank docstring
        super().__init__(*args, **kwargs)

    @classmethod
@@ -45,7 +45,6 @@ class InputTokenDetails(TypedDict, total=False):
    Does *not* need to sum to full input token count. Does *not* need to have all keys.

    Example:

        .. code-block:: python

            {
@@ -72,6 +71,7 @@ class InputTokenDetails(TypedDict, total=False):

    Since there was a cache hit, the tokens were read from the cache. More precisely,
    the model state given these tokens was read from the cache.

    """


@@ -81,7 +81,6 @@ class OutputTokenDetails(TypedDict, total=False):
    Does *not* need to sum to full output token count. Does *not* need to have all keys.

    Example:

        .. code-block:: python

            {
@@ -100,6 +99,7 @@ class OutputTokenDetails(TypedDict, total=False):

    Tokens generated by the model in a chain of thought process (i.e. by OpenAI's o1
    models) that are not returned as part of model output.

    """


@@ -109,7 +109,6 @@ class UsageMetadata(TypedDict):
    This is a standard representation of token usage that is consistent across models.

    Example:

        .. code-block:: python

            {
@@ -148,6 +147,7 @@ class UsageMetadata(TypedDict):
    """Breakdown of output token counts.

    Does *not* need to sum to full output token count. Does *not* need to have all keys.

    """


@@ -159,12 +159,14 @@ class AIMessage(BaseMessage):
    This message represents the output of the model and consists of both
    the raw output as returned by the model together with standardized fields
    (e.g., tool calls, usage metadata) added by the LangChain framework.

    """

    example: bool = False
    """Use to denote that a message is part of an example conversation.

    At the moment, this is ignored by most models. Usage is discouraged.

    """

    tool_calls: list[ToolCall] = []
@@ -175,15 +177,18 @@ class AIMessage(BaseMessage):
    """If provided, usage metadata for a message, such as token counts.

    This is a standard representation of token usage that is consistent across models.

    """

    type: Literal["ai"] = "ai"
    """The type of the message (used for deserialization). Defaults to "ai"."""
    """The type of the message (used for deserialization). Defaults to ``'ai'``."""

    def __init__(
        self, content: Union[str, list[Union[str, dict]]], **kwargs: Any
        self,
        content: Union[str, list[Union[str, dict]]],
        **kwargs: Any,
    ) -> None:
        """Pass in content as positional arg.
        """Initialize ``AIMessage``.

        Args:
            content: The content of the message.
@@ -254,6 +259,7 @@ class AIMessage(BaseMessage):

        Returns:
            A pretty representation of the message.

        """
        base = super().pretty_repr(html=html)
        lines = []
@@ -293,7 +299,10 @@ class AIMessageChunk(AIMessage, BaseMessageChunk):
    # non-chunk variant.
    type: Literal["AIMessageChunk"] = "AIMessageChunk"  # type: ignore[assignment]
    """The type of the message (used for deserialization).
    Defaults to "AIMessageChunk"."""

    Defaults to ``AIMessageChunk``.

    """

    tool_call_chunks: list[ToolCallChunk] = []
    """If provided, tool call chunks associated with the message."""
@@ -311,7 +320,10 @@ class AIMessageChunk(AIMessage, BaseMessageChunk):
        """Initialize tool calls from tool call chunks.

        Returns:
            This ``AIMessageChunk``.
            The values with tool calls initialized.

        Raises:
            ValueError: If the tool call chunks are malformed.
        """
        if not self.tool_call_chunks:
            if self.tool_calls:
@@ -522,9 +534,9 @@ def add_usage(
def subtract_usage(
    left: Optional[UsageMetadata], right: Optional[UsageMetadata]
) -> UsageMetadata:
    """Recursively subtract two UsageMetadata objects.
    """Recursively subtract two ``UsageMetadata`` objects.

    Token counts cannot be negative so the actual operation is max(left - right, 0).
    Token counts cannot be negative so the actual operation is ``max(left - right, 0)``.

    Example:
        .. code-block:: python

@@ -20,7 +20,7 @@ if TYPE_CHECKING:
class BaseMessage(Serializable):
    """Base abstract message class.

    Messages are the inputs and outputs of ChatModels.
    Messages are the inputs and outputs of ``ChatModel``s.
    """

    content: Union[str, list[Union[str, dict]]]
@@ -31,17 +31,18 @@ class BaseMessage(Serializable):

    For example, for a message from an AI, this could include tool calls as
    encoded by the model provider.

    """

    response_metadata: dict = Field(default_factory=dict)
    """Response metadata. For example: response headers, logprobs, token counts, model
    name."""
    """Examples: response headers, logprobs, token counts, model name."""

    type: str
    """The type of the message. Must be a string that is unique to the message type.

    The purpose of this field is to allow for easy identification of the message type
    when deserializing messages.

    """

    name: Optional[str] = None
@@ -51,20 +52,26 @@ class BaseMessage(Serializable):

    Usage of this field is optional, and whether it's used or not is up to the
    model implementation.

    """

    id: Optional[str] = Field(default=None, coerce_numbers_to_str=True)
    """An optional unique identifier for the message. This should ideally be
    provided by the provider/model which created the message."""
    """An optional unique identifier for the message.

    This should ideally be provided by the provider/model which created the message.

    """

    model_config = ConfigDict(
        extra="allow",
    )

    def __init__(
        self, content: Union[str, list[Union[str, dict]]], **kwargs: Any
        self,
        content: Union[str, list[Union[str, dict]]],
        **kwargs: Any,
    ) -> None:
        """Pass in content as positional arg.
        """Initialize ``BaseMessage``.

        Args:
            content: The string contents of the message.
@@ -73,7 +80,7 @@ class BaseMessage(Serializable):

    @classmethod
    def is_lc_serializable(cls) -> bool:
        """BaseMessage is serializable.
        """``BaseMessage`` is serializable.

        Returns:
            True
@@ -90,10 +97,11 @@ class BaseMessage(Serializable):
        return ["langchain", "schema", "messages"]

    def text(self) -> str:
        """Get the text content of the message.
        """Get the text ``content`` of the message.

        Returns:
            The text content of the message.

        """
        if isinstance(self.content, str):
            return self.content
@@ -136,6 +144,7 @@ class BaseMessage(Serializable):

        Returns:
            A pretty representation of the message.

        """
        title = get_msg_title_repr(self.type.title() + " Message", bold=html)
        # TODO: handle non-string content.
@@ -155,11 +164,12 @@ def merge_content(
    """Merge multiple message contents.

    Args:
        first_content: The first content. Can be a string or a list.
        contents: The other contents. Can be a string or a list.
        first_content: The first ``content``. Can be a string or a list.
        contents: The other ``content``s. Can be a string or a list.

    Returns:
        The merged content.

    """
    merged = first_content
    for content in contents:
@@ -207,9 +217,10 @@ class BaseMessageChunk(BaseMessage):

        For example,

        `AIMessageChunk(content="Hello") + AIMessageChunk(content=" World")`
        ``AIMessageChunk(content="Hello") + AIMessageChunk(content=" World")``

        will give ``AIMessageChunk(content="Hello World")``

        will give `AIMessageChunk(content="Hello World")`
        """
        if isinstance(other, BaseMessageChunk):
            # If both are (subclasses of) BaseMessageChunk,
@@ -257,8 +268,9 @@ def message_to_dict(message: BaseMessage) -> dict:
        message: Message to convert.

    Returns:
        Message as a dict. The dict will have a "type" key with the message type
        and a "data" key with the message data as a dict.
        Message as a dict. The dict will have a ``type`` key with the message type
        and a ``data`` key with the message data as a dict.

    """
    return {"type": message.type, "data": message.model_dump()}

@@ -267,10 +279,11 @@ def messages_to_dict(messages: Sequence[BaseMessage]) -> list[dict]:
    """Convert a sequence of Messages to a list of dictionaries.

    Args:
        messages: Sequence of messages (as BaseMessages) to convert.
        messages: Sequence of messages (as ``BaseMessage``s) to convert.

    Returns:
        List of messages as dicts.

    """
    return [message_to_dict(m) for m in messages]

@@ -284,6 +297,7 @@ def get_msg_title_repr(title: str, *, bold: bool = False) -> str:

    Returns:
        The title representation.

    """
    padded = " " + title + " "
    sep_len = (80 - len(padded)) // 2
@@ -30,7 +30,10 @@ class ChatMessageChunk(ChatMessage, BaseMessageChunk):
    # non-chunk variant.
    type: Literal["ChatMessageChunk"] = "ChatMessageChunk"  # type: ignore[assignment]
    """The type of the message (used during serialization).
    Defaults to "ChatMessageChunk"."""

    Defaults to ``'ChatMessageChunk'``.

    """

    @override
    def __add__(self, other: Any) -> BaseMessageChunk:  # type: ignore[override]

@@ -15,19 +15,20 @@ from langchain_core.utils._merge import merge_dicts
class FunctionMessage(BaseMessage):
    """Message for passing the result of executing a tool back to a model.

    FunctionMessage are an older version of the ToolMessage schema, and
    do not contain the tool_call_id field.
    ``FunctionMessage`` are an older version of the ``ToolMessage`` schema, and
    do not contain the ``tool_call_id`` field.

    The tool_call_id field is used to associate the tool call request with the
    The ``tool_call_id`` field is used to associate the tool call request with the
    tool call response. This is useful in situations where a chat model is able
    to request multiple tool calls in parallel.

    """

    name: str
    """The name of the function that was executed."""

    type: Literal["function"] = "function"
    """The type of the message (used for serialization). Defaults to "function"."""
    """The type of the message (used for serialization). Defaults to ``'function'``."""


class FunctionMessageChunk(FunctionMessage, BaseMessageChunk):
@@ -38,7 +39,10 @@ class FunctionMessageChunk(FunctionMessage, BaseMessageChunk):
    # non-chunk variant.
    type: Literal["FunctionMessageChunk"] = "FunctionMessageChunk"  # type: ignore[assignment]
    """The type of the message (used for serialization).
    Defaults to "FunctionMessageChunk"."""

    Defaults to ``'FunctionMessageChunk'``.

    """

    @override
    def __add__(self, other: Any) -> BaseMessageChunk:  # type: ignore[override]
@@ -8,7 +8,7 @@ from langchain_core.messages.base import BaseMessage, BaseMessageChunk
class HumanMessage(BaseMessage):
    """Message from a human.

    HumanMessages are messages that are passed in from a human to the model.
    ``HumanMessage``s are messages that are passed in from a human to the model.

    Example:

@@ -32,15 +32,22 @@ class HumanMessage(BaseMessage):

    At the moment, this is ignored by most models. Usage is discouraged.
    Defaults to False.

    """

    type: Literal["human"] = "human"
    """The type of the message (used for serialization). Defaults to "human"."""
    """The type of the message (used for serialization).

    Defaults to ``'human'``.

    """

    def __init__(
        self, content: Union[str, list[Union[str, dict]]], **kwargs: Any
        self,
        content: Union[str, list[Union[str, dict]]],
        **kwargs: Any,
    ) -> None:
        """Pass in content as positional arg.
        """Initialize ``HumanMessage``.

        Args:
            content: The string contents of the message.

@@ -24,6 +24,7 @@ class RemoveMessage(BaseMessage):

    Raises:
        ValueError: If the 'content' field is passed in kwargs.

    """
    if kwargs.pop("content", None):
        msg = "RemoveMessage does not support 'content' field."
@@ -28,7 +28,11 @@ class SystemMessage(BaseMessage):
    """

    type: Literal["system"] = "system"
    """The type of the message (used for serialization). Defaults to "system"."""
    """The type of the message (used for serialization).

    Defaults to ``'system'``.

    """

    def __init__(
        self, content: Union[str, list[Union[str, dict]]], **kwargs: Any
@@ -50,4 +54,7 @@ class SystemMessageChunk(SystemMessage, BaseMessageChunk):
    # non-chunk variant.
    type: Literal["SystemMessageChunk"] = "SystemMessageChunk"  # type: ignore[assignment]
    """The type of the message (used for serialization).
    Defaults to "SystemMessageChunk"."""

    Defaults to ``'SystemMessageChunk'``.

    """
@@ -14,19 +14,20 @@ from langchain_core.utils._merge import merge_dicts, merge_obj
class ToolOutputMixin:
    """Mixin for objects that tools can return directly.

    If a custom BaseTool is invoked with a ToolCall and the output of custom code is
    not an instance of ToolOutputMixin, the output will automatically be coerced to a
    string and wrapped in a ToolMessage.
    If a custom BaseTool is invoked with a ``ToolCall`` and the output of custom code is
    not an instance of ``ToolOutputMixin``, the output will automatically be coerced to
    a string and wrapped in a ``ToolMessage``.

    """


class ToolMessage(BaseMessage, ToolOutputMixin):
    """Message for passing the result of executing a tool back to a model.

    ToolMessages contain the result of a tool invocation. Typically, the result
    is encoded inside the `content` field.
    ``ToolMessage``s contain the result of a tool invocation. Typically, the result
    is encoded inside the ``content`` field.

    Example: A ToolMessage representing a result of 42 from a tool call with id
    Example: A ``ToolMessage`` representing a result of ``42`` from a tool call with id

        .. code-block:: python

@@ -35,7 +36,7 @@ class ToolMessage(BaseMessage, ToolOutputMixin):
            ToolMessage(content="42", tool_call_id="call_Jja7J89XsjrOLA5r!MEOW!SL")


    Example: A ToolMessage where only part of the tool output is sent to the model
    Example: A ``ToolMessage`` where only part of the tool output is sent to the model
        and the full output is passed in to artifact.

        .. versionadded:: 0.2.17
@@ -57,7 +58,7 @@ class ToolMessage(BaseMessage, ToolOutputMixin):
                tool_call_id="call_Jja7J89XsjrOLA5r!MEOW!SL",
            )

    The tool_call_id field is used to associate the tool call request with the
    The ``tool_call_id`` field is used to associate the tool call request with the
    tool call response. This is useful in situations where a chat model is able
    to request multiple tool calls in parallel.

@@ -67,7 +68,11 @@ class ToolMessage(BaseMessage, ToolOutputMixin):
    """Tool call that this message is responding to."""

    type: Literal["tool"] = "tool"
    """The type of the message (used for serialization). Defaults to "tool"."""
    """The type of the message (used for serialization).

    Defaults to ``'tool'``.

    """

    artifact: Any = None
    """Artifact of the Tool execution which is not meant to be sent to the model.
@@ -77,12 +82,14 @@ class ToolMessage(BaseMessage, ToolOutputMixin):
    output is needed in other parts of the code.

    .. versionadded:: 0.2.17

    """

    status: Literal["success", "error"] = "success"
    """Status of the tool invocation.

    .. versionadded:: 0.2.24

    """

    additional_kwargs: dict = Field(default_factory=dict, repr=False)
@@ -97,6 +104,7 @@ class ToolMessage(BaseMessage, ToolOutputMixin):

    Args:
        values: The model arguments.

    """
    content = values["content"]
    if isinstance(content, tuple):
@@ -135,9 +143,11 @@ class ToolMessage(BaseMessage, ToolOutputMixin):
        return values

    def __init__(
        self, content: Union[str, list[Union[str, dict]]], **kwargs: Any
        self,
        content: Union[str, list[Union[str, dict]]],
        **kwargs: Any,
    ) -> None:
        """Create a ToolMessage.
        """Initialize ``ToolMessage``.

        Args:
            content: The string contents of the message.
@@ -187,8 +197,8 @@ class ToolCall(TypedDict):

        {"name": "foo", "args": {"a": 1}, "id": "123"}

    This represents a request to call the tool named "foo" with arguments {"a": 1}
    and an identifier of "123".
    This represents a request to call the tool named ``'foo'`` with arguments
    ``{"a": 1}`` and an identifier of ``'123'``.

    """

@@ -201,6 +211,7 @@ class ToolCall(TypedDict):

    An identifier is needed to associate a tool call request with a tool
    call result in events when multiple concurrent tool calls are made.

    """
    type: NotRequired[Literal["tool_call"]]

@@ -227,9 +238,9 @@ def tool_call(
class ToolCallChunk(TypedDict):
    """A chunk of a tool call (e.g., as part of a stream).

    When merging ToolCallChunks (e.g., via AIMessageChunk.__add__),
    When merging ``ToolCallChunk``s (e.g., via ``AIMessageChunk.__add__``),
    all string attributes are concatenated. Chunks are only merged if their
    values of `index` are equal and not None.
    values of ``index`` are equal and not None.

    Example:

@@ -282,7 +293,7 @@ def tool_call_chunk(
class InvalidToolCall(TypedDict):
    """Allowance for errors made by LLM.

    Here we add an `error` key to surface errors made during generation
    Here we add an ``error`` key to surface errors made during generation
    (e.g., invalid JSON arguments.)
    """
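To make the `tool_call_id` pairing described above concrete, a minimal sketch (the id
value is arbitrary):

```python
from langchain_core.messages import AIMessage, ToolMessage
from langchain_core.messages.tool import tool_call

request = AIMessage(
    content="",
    tool_calls=[tool_call(name="add", args={"a": 1, "b": 2}, id="call_1")],
)
response = ToolMessage(content="3", tool_call_id="call_1")

# The response is matched to the request via the shared id.
assert request.tool_calls[0]["id"] == response.tool_call_id
```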
@@ -5,6 +5,7 @@ Some examples of what you can do with these functions include:
* Convert messages to strings (serialization)
* Convert messages from dicts to Message objects (deserialization)
* Filter messages from a list of messages based on name, type or id etc.

"""

from __future__ import annotations
@@ -91,13 +92,14 @@ AnyMessage = Annotated[
def get_buffer_string(
    messages: Sequence[BaseMessage], human_prefix: str = "Human", ai_prefix: str = "AI"
) -> str:
    r"""Convert a sequence of Messages to strings and concatenate them into one string.
    r"""Convert a sequence of messages to strings and concatenate them into one string.

    Args:
        messages: Messages to be converted to strings.
        human_prefix: The prefix to prepend to contents of HumanMessages.
            Default is "Human".
        ai_prefix: The prefix to prepend to contents of AIMessages. Default is "AI".
        human_prefix: The prefix to prepend to contents of ``HumanMessage``s.
            Default is ``'Human'``.
        ai_prefix: The prefix to prepend to contents of ``AIMessage``. Default is
            ``'AI'``.

    Returns:
        A single string concatenation of all input messages.
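For example, with the default prefixes:

```python
from langchain_core.messages import AIMessage, HumanMessage, get_buffer_string

history = [HumanMessage(content="hi"), AIMessage(content="hello")]
print(get_buffer_string(history))
# Human: hi
# AI: hello
```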
@@ -176,19 +178,20 @@ def _message_from_dict(message: dict) -> BaseMessage:


def messages_from_dict(messages: Sequence[dict]) -> list[BaseMessage]:
    """Convert a sequence of messages from dicts to Message objects.
    """Convert a sequence of messages from dicts to ``Message`` objects.

    Args:
        messages: Sequence of messages (as dicts) to convert.

    Returns:
        list of messages (BaseMessages).

    """
    return [_message_from_dict(m) for m in messages]


def message_chunk_to_message(chunk: BaseMessage) -> BaseMessage:
    """Convert a message chunk to a message.
    """Convert a message chunk to a ``Message``.

    Args:
        chunk: Message chunk to convert.
@@ -221,10 +224,10 @@ def _create_message_from_message_type(
    id: Optional[str] = None,
    **additional_kwargs: Any,
) -> BaseMessage:
    """Create a message from a message type and content string.
    """Create a message from a ``Message`` type and content string.

    Args:
        message_type: (str) the type of the message (e.g., "human", "ai", etc.).
        message_type: (str) the type of the message (e.g., ``'human'``, ``'ai'``, etc.).
        content: (str) the content string.
        name: (str) the name of the message. Default is None.
        tool_call_id: (str) the tool call id. Default is None.
@@ -236,8 +239,9 @@ def _create_message_from_message_type(
        a message of the appropriate type.

    Raises:
        ValueError: if the message type is not one of "human", "user", "ai",
            "assistant", "function", "tool", "system", or "developer".
        ValueError: if the message type is not one of ``'human'``, ``'user'``, ``'ai'``,
            ``'assistant'``, ``'function'``, ``'tool'``, ``'system'``, or
            ``'developer'``.
    """
    kwargs: dict[str, Any] = {}
    if name is not None:
@@ -303,15 +307,15 @@ def _create_message_from_message_type(


def _convert_to_message(message: MessageLikeRepresentation) -> BaseMessage:
    """Instantiate a message from a variety of message formats.
    """Instantiate a ``Message`` from a variety of message formats.

    The message format can be one of the following:

    - BaseMessagePromptTemplate
    - BaseMessage
    - 2-tuple of (role string, template); e.g., ("human", "{user_input}")
    - ``BaseMessagePromptTemplate``
    - ``BaseMessage``
    - 2-tuple of (role string, template); e.g., (``'human'``, ``'{user_input}'``)
    - dict: a message dict with role and content keys
    - string: shorthand for ("human", template); e.g., "{user_input}"
    - string: shorthand for (``'human'``, template); e.g., ``'{user_input}'``

    Args:
        message: a representation of a message in one of the supported formats.
@@ -322,6 +326,7 @@ def _convert_to_message(message: MessageLikeRepresentation) -> BaseMessage:
    Raises:
        NotImplementedError: if the message type is not supported.
        ValueError: if the message dict does not contain the required keys.

    """
    if isinstance(message, BaseMessage):
        message_ = message
@@ -367,6 +372,7 @@ def convert_to_messages(

    Returns:
        list of messages (BaseMessages).

    """
    # Import here to avoid circular imports
    from langchain_core.prompt_values import PromptValue  # noqa: PLC0415
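A sketch of the accepted input formats side by side:

```python
from langchain_core.messages import convert_to_messages

msgs = convert_to_messages(
    [
        "hello",  # shorthand for ("human", "hello")
        ("ai", "hi there"),  # 2-tuple of (role, content)
        {"role": "system", "content": "be brief"},  # message dict
    ]
)
print([m.type for m in msgs])  # expected: ['human', 'ai', 'system']
```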
@@ -417,36 +423,36 @@ def filter_messages(
    exclude_ids: Optional[Sequence[str]] = None,
    exclude_tool_calls: Optional[Sequence[str] | bool] = None,
) -> list[BaseMessage]:
    """Filter messages based on name, type or id.
    """Filter messages based on ``name``, ``type`` or ``id``.

    Args:
        messages: Sequence Message-like objects to filter.
        include_names: Message names to include. Default is None.
        exclude_names: Messages names to exclude. Default is None.
        include_types: Message types to include. Can be specified as string names (e.g.
            "system", "human", "ai", ...) or as BaseMessage classes (e.g.
            SystemMessage, HumanMessage, AIMessage, ...). Default is None.
        exclude_types: Message types to exclude. Can be specified as string names (e.g.
            "system", "human", "ai", ...) or as BaseMessage classes (e.g.
            SystemMessage, HumanMessage, AIMessage, ...). Default is None.
        include_types: Message types to include. Can be specified as string names
            (e.g. ``'system'``, ``'human'``, ``'ai'``, ...) or as ``BaseMessage``
            classes (e.g. ``SystemMessage``, ``HumanMessage``, ``AIMessage``, ...).
            Default is None.
        exclude_types: Message types to exclude. Can be specified as string names
            (e.g. ``'system'``, ``'human'``, ``'ai'``, ...) or as ``BaseMessage``
            classes (e.g. ``SystemMessage``, ``HumanMessage``, ``AIMessage``, ...).
            Default is None.
        include_ids: Message IDs to include. Default is None.
        exclude_ids: Message IDs to exclude. Default is None.
        exclude_tool_calls: Tool call IDs to exclude. Default is None.
            Can be one of the following:

            - ``True``: Each ``AIMessages`` with tool calls and all ``ToolMessages``
              will be excluded.
            - ``True``: all ``AIMessage``s with tool calls and all
              ``ToolMessage``s will be excluded.
            - a sequence of tool call IDs to exclude:

              - ToolMessages with the corresponding tool call ID will be excluded.
              - The ``tool_calls`` in the AIMessage will be updated to exclude matching
                tool calls.
                If all tool_calls are filtered from an AIMessage,
                the whole message is excluded.
              - ``ToolMessage``s with the corresponding tool call ID will be
                excluded.
              - The ``tool_calls`` in the AIMessage will be updated to exclude
                matching tool calls. If all ``tool_calls`` are filtered from an
                AIMessage, the whole message is excluded.

    Returns:
        A list of Messages that meets at least one of the incl_* conditions and none
        of the excl_* conditions. If not incl_* conditions are specified then
        A list of Messages that meets at least one of the ``incl_*`` conditions and none
        of the ``excl_*`` conditions. If no ``incl_*`` conditions are specified then
        anything that is not explicitly excluded will be included.

    Raises:
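For instance, keeping only the human turns of a short history:

```python
from langchain_core.messages import AIMessage, HumanMessage, SystemMessage, filter_messages

history = [
    SystemMessage(content="be brief", id="sys1"),
    HumanMessage(content="hi", id="h1"),
    AIMessage(content="hello", id="a1"),
]
print(filter_messages(history, include_types=["human"]))
# expected: [HumanMessage(content="hi", id="h1")]
```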
@@ -558,13 +564,14 @@ def merge_message_runs(
) -> list[BaseMessage]:
    r"""Merge consecutive Messages of the same type.

    **NOTE**: ToolMessages are not merged, as each has a distinct tool call id that
    can't be merged.
    .. note::
        ToolMessages are not merged, as each has a distinct tool call id that can't be
        merged.

    Args:
        messages: Sequence Message-like objects to merge.
        chunk_separator: Specify the string to be inserted between message chunks.
            Default is "\n".
            Default is ``'\n'``.

    Returns:
        list of BaseMessages with consecutive runs of message types merged into single
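A sketch of the merging behavior with the default separator:

```python
from langchain_core.messages import HumanMessage, merge_message_runs

msgs = [HumanMessage(content="hi"), HumanMessage(content="how are you?")]
merged = merge_message_runs(msgs)
print(merged[0].content)  # "hi\nhow are you?"
```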
@@ -705,8 +712,8 @@ def trim_messages(
|
||||
) -> list[BaseMessage]:
|
||||
r"""Trim messages to be below a token count.
|
||||
|
||||
trim_messages can be used to reduce the size of a chat history to a specified token
|
||||
count or specified message count.
|
||||
``trim_messages`` can be used to reduce the size of a chat history to a specified
|
||||
token count or specified message count.
|
||||
|
||||
In either case, if passing the trimmed chat history back into a chat model
|
||||
directly, the resulting chat history should usually satisfy the following
|
||||
@@ -714,13 +721,13 @@ def trim_messages(
|
||||
|
||||
1. The resulting chat history should be valid. Most chat models expect that chat
|
||||
history starts with either (1) a ``HumanMessage`` or (2) a ``SystemMessage``
|
||||
followed by a ``HumanMessage``. To achieve this, set ``start_on="human"``.
|
||||
followed by a ``HumanMessage``. To achieve this, set ``start_on='human'``.
|
||||
In addition, generally a ``ToolMessage`` can only appear after an ``AIMessage``
|
||||
that involved a tool call.
|
||||
Please see the following link for more information about messages:
|
||||
https://python.langchain.com/docs/concepts/#messages
|
||||
2. It includes recent messages and drops old messages in the chat history.
|
||||
To achieve this set the ``strategy="last"``.
|
||||
To achieve this set the ``strategy='last'``.
|
||||
3. Usually, the new chat history should include the ``SystemMessage`` if it
|
||||
was present in the original chat history since the ``SystemMessage`` includes
|
||||
special instructions to the chat model. The ``SystemMessage`` is almost always
|
||||
@@ -734,67 +741,67 @@ def trim_messages(
 Args:
     messages: Sequence of Message-like objects to trim.
     max_tokens: Max token count of trimmed messages.
-    token_counter: Function or llm for counting tokens in a BaseMessage or a list of
-        BaseMessage. If a BaseLanguageModel is passed in then
-        BaseLanguageModel.get_num_tokens_from_messages() will be used.
-        Set to `len` to count the number of **messages** in the chat history.
+    token_counter: Function or llm for counting tokens in a ``BaseMessage`` or a
+        list of ``BaseMessage``. If a ``BaseLanguageModel`` is passed in then
+        ``BaseLanguageModel.get_num_tokens_from_messages()`` will be used.
+        Set to ``len`` to count the number of **messages** in the chat history.

         .. note::
-            Use `count_tokens_approximately` to get fast, approximate token counts.
-            This is recommended for using `trim_messages` on the hot path, where
+            Use ``count_tokens_approximately`` to get fast, approximate token
+            counts.
+            This is recommended for using ``trim_messages`` on the hot path, where
             exact token counting is not necessary.

     strategy: Strategy for trimming.

-        - "first": Keep the first <= n_count tokens of the messages.
-        - "last": Keep the last <= n_count tokens of the messages.
+        - ``'first'``: Keep the first ``<= n_count`` tokens of the messages.
+        - ``'last'``: Keep the last ``<= n_count`` tokens of the messages.

         Default is ``'last'``.
     allow_partial: Whether to split a message if only part of the message can be
-        included. If ``strategy="last"`` then the last partial contents of a message
-        are included. If ``strategy="first"`` then the first partial contents of a
+        included. If ``strategy='last'`` then the last partial contents of a message
+        are included. If ``strategy='first'`` then the first partial contents of a
         message are included.
         Default is False.
     end_on: The message type to end on. If specified then every message after the
-        last occurrence of this type is ignored. If ``strategy=="last"`` then this
+        last occurrence of this type is ignored. If ``strategy='last'`` then this
         is done before we attempt to get the last ``max_tokens``. If
-        ``strategy=="first"`` then this is done after we get the first
-        ``max_tokens``. Can be specified as string names (e.g. "system", "human",
-        "ai", ...) or as BaseMessage classes (e.g. SystemMessage, HumanMessage,
-        AIMessage, ...). Can be a single type or a list of types.
+        ``strategy='first'`` then this is done after we get the first
+        ``max_tokens``. Can be specified as string names (e.g. ``'system'``,
+        ``'human'``, ``'ai'``, ...) or as ``BaseMessage`` classes (e.g.
+        ``SystemMessage``, ``HumanMessage``, ``AIMessage``, ...). Can be a single
+        type or a list of types.
         Default is None.
     start_on: The message type to start on. Should only be specified if
-        ``strategy="last"``. If specified then every message before
+        ``strategy='last'``. If specified then every message before
         the first occurrence of this type is ignored. This is done after we trim
         the initial messages to the last ``max_tokens``. Does not
-        apply to a SystemMessage at index 0 if ``include_system=True``. Can be
-        specified as string names (e.g. "system", "human", "ai", ...) or as
-        BaseMessage classes (e.g. SystemMessage, HumanMessage, AIMessage, ...). Can
-        be a single type or a list of types.
+        apply to a ``SystemMessage`` at index 0 if ``include_system=True``. Can be
+        specified as string names (e.g. ``'system'``, ``'human'``, ``'ai'``, ...) or
+        as ``BaseMessage`` classes (e.g. ``SystemMessage``, ``HumanMessage``,
+        ``AIMessage``, ...). Can be a single type or a list of types.
         Default is None.
     include_system: Whether to keep the SystemMessage if there is one at index 0.
         Should only be specified if ``strategy="last"``.
         Default is False.
     text_splitter: Function or ``langchain_text_splitters.TextSplitter`` for
         splitting the string contents of a message. Only used if
-        ``allow_partial=True``. If ``strategy="last"`` then the last split tokens
-        from a partial message will be included. if ``strategy=="first"`` then the
+        ``allow_partial=True``. If ``strategy='last'`` then the last split tokens
+        from a partial message will be included. if ``strategy='first'`` then the
         first split tokens from a partial message will be included. Token splitter
         assumes that separators are kept, so that split contents can be directly
         concatenated to recreate the original text. Defaults to splitting on
         newlines.

 Returns:
-    list of trimmed BaseMessages.
+    list of trimmed ``BaseMessage``.

 Raises:
     ValueError: if two incompatible arguments are specified or an unrecognized
         ``strategy`` is specified.

 Example:
-    Trim chat history based on token count, keeping the SystemMessage if
-    present, and ensuring that the chat history starts with a HumanMessage (
-    or a SystemMessage followed by a HumanMessage).
+    Trim chat history based on token count, keeping the ``SystemMessage`` if
+    present, and ensuring that the chat history starts with a ``HumanMessage`` (
+    or a ``SystemMessage`` followed by a ``HumanMessage``).

     .. code-block:: python
@@ -849,9 +856,9 @@ def trim_messages(
             HumanMessage(content="what do you call a speechless parrot"),
         ]

-    Trim chat history based on the message count, keeping the SystemMessage if
-    present, and ensuring that the chat history starts with a HumanMessage (
-    or a SystemMessage followed by a HumanMessage).
+    Trim chat history based on the message count, keeping the ``SystemMessage`` if
+    present, and ensuring that the chat history starts with a ``HumanMessage`` (
+    or a ``SystemMessage`` followed by a ``HumanMessage``).

         trim_messages(
             messages,
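For orientation, here is a minimal, self-contained sketch of how the options documented in these hunks fit together. It assumes ``langchain-core`` with ``trim_messages`` and ``count_tokens_approximately`` available; the messages mirror the docstring example and are illustrative, not part of the diff.

.. code-block:: python

    from langchain_core.messages import AIMessage, HumanMessage, SystemMessage
    from langchain_core.messages.utils import count_tokens_approximately, trim_messages

    messages = [
        SystemMessage(content="You are a helpful assistant."),
        HumanMessage(content="i wonder why it's called langchain"),
        AIMessage(content="Well, I guess they thought 'WordRope' just didn't have the same ring to it!"),
        HumanMessage(content="what do you call a speechless parrot"),
    ]

    trimmed = trim_messages(
        messages,
        max_tokens=45,
        strategy="last",  # keep the most recent messages
        token_counter=count_tokens_approximately,  # fast, approximate counts
        include_system=True,  # keep the SystemMessage at index 0
        start_on="human",  # trimmed history starts on a HumanMessage
    )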
@@ -1033,6 +1040,7 @@ def convert_to_openai_messages(
     messages: Union[MessageLikeRepresentation, Sequence[MessageLikeRepresentation]],
     *,
     text_format: Literal["string", "block"] = "string",
+    include_id: bool = False,
 ) -> Union[dict, list[dict]]:
     """Convert LangChain messages into OpenAI message dicts.
@@ -1040,17 +1048,18 @@ def convert_to_openai_messages(
     messages: Message-like object or iterable of objects whose contents are
         in OpenAI, Anthropic, Bedrock Converse, or VertexAI formats.
     text_format: How to format string or text block contents:

-        - ``'string'``:
-            If a message has a string content, this is left as a string. If
-            a message has content blocks that are all of type 'text', these are
-            joined with a newline to make a single string. If a message has
-            content blocks and at least one isn't of type 'text', then
-            all blocks are left as dicts.
-        - ``'block'``:
-            If a message has a string content, this is turned into a list
-            with a single content block of type 'text'. If a message has content
-            blocks these are left as is.
+        - ``'string'``:
+            If a message has a string content, this is left as a string. If
+            a message has content blocks that are all of type ``'text'``, these
+            are joined with a newline to make a single string. If a message has
+            content blocks and at least one isn't of type ``'text'``, then
+            all blocks are left as dicts.
+        - ``'block'``:
+            If a message has a string content, this is turned into a list
+            with a single content block of type ``'text'``. If a message has
+            content blocks these are left as is.
+    include_id: Whether to include message ids in the openai messages, if they
+        are present in the source messages.

 Raises:
     ValueError: if an unrecognized ``text_format`` is specified, or if a message
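The difference between the two ``text_format`` modes documented above, sketched with an illustrative message (the expected outputs follow from the docstring, not from running the diff):

.. code-block:: python

    from langchain_core.messages import HumanMessage
    from langchain_core.messages.utils import convert_to_openai_messages

    msg = HumanMessage(
        content=[{"type": "text", "text": "a"}, {"type": "text", "text": "b"}]
    )

    convert_to_openai_messages(msg, text_format="string")
    # {'role': 'user', 'content': 'a\nb'}  # text blocks joined with a newline

    convert_to_openai_messages(msg, text_format="block")
    # {'role': 'user', 'content': [{'type': 'text', 'text': 'a'},
    #                              {'type': 'text', 'text': 'b'}]}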
@@ -1139,6 +1148,8 @@ def convert_to_openai_messages(
         oai_msg["refusal"] = message.additional_kwargs["refusal"]
     if isinstance(message, ToolMessage):
         oai_msg["tool_call_id"] = message.tool_call_id
+    if include_id and message.id:
+        oai_msg["id"] = message.id

     if not message.content:
         content = "" if text_format == "string" else []
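A hedged usage sketch of the new ``include_id`` flag; the ids below are illustrative. Per the hunk above, an ``id`` on a source message is copied onto the resulting OpenAI-format dict.

.. code-block:: python

    from langchain_core.messages import AIMessage, HumanMessage
    from langchain_core.messages.utils import convert_to_openai_messages

    msgs = [
        HumanMessage(content="hello", id="msg-1"),
        AIMessage(content="hi there", id="msg-2"),
    ]

    oai = convert_to_openai_messages(msgs, include_id=True)
    # Expected shape:
    # [{'role': 'user', 'content': 'hello', 'id': 'msg-1'},
    #  {'role': 'assistant', 'content': 'hi there', 'id': 'msg-2'}]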
@@ -1379,7 +1390,7 @@ def convert_to_openai_messages(
                 },
             }
         )
-    elif block.get("type") == "thinking":
+    elif block.get("type") in ["thinking", "reasoning"]:
         content.append(block)
     else:
         err = (
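The branch above now passes ``'reasoning'`` blocks through unchanged, just as it already did for ``'thinking'`` blocks, instead of falling into the error path. A sketch with illustrative block contents, assuming the post-change behavior:

.. code-block:: python

    from langchain_core.messages import AIMessage
    from langchain_core.messages.utils import convert_to_openai_messages

    msg = AIMessage(
        content=[
            {"type": "reasoning", "summary": [{"type": "summary_text", "text": "..."}]},
            {"type": "text", "text": "The answer is 4."},
        ]
    )

    # With text_format='block', the reasoning block is kept as-is in the output
    # rather than raising an unrecognized-block error.
    oai = convert_to_openai_messages(msg, text_format="block")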
@@ -15,14 +15,14 @@ from langchain_core.utils._merge import merge_dicts
 class ChatGeneration(Generation):
     """A single chat generation output.

-    A subclass of Generation that represents the response from a chat model
+    A subclass of ``Generation`` that represents the response from a chat model
     that generates chat messages.

-    The `message` attribute is a structured representation of the chat message.
-    Most of the time, the message will be of type `AIMessage`.
+    The ``message`` attribute is a structured representation of the chat message.
+    Most of the time, the message will be of type ``AIMessage``.

     Users working with chat models will usually access information via either
-    `AIMessage` (returned from runnable interfaces) or `LLMResult` (available
+    ``AIMessage`` (returned from runnable interfaces) or ``LLMResult`` (available
     via callbacks).
     """
@@ -31,6 +31,7 @@ class ChatGeneration(Generation):

     .. warning::
         SHOULD NOT BE SET DIRECTLY!
+
     """

     message: BaseMessage
     """The message output by the chat model."""
@@ -47,6 +48,9 @@ class ChatGeneration(Generation):

         Returns:
             The values of the object with the text attribute set.
+
+        Raises:
+            ValueError: If the message is not a string or a list.
         """
         text = ""
         if isinstance(self.message.content, str):
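The validator above derives the ``text`` attribute from the message content; a minimal sketch of the resulting behavior, using the public ``langchain_core`` classes:

.. code-block:: python

    from langchain_core.messages import AIMessage
    from langchain_core.outputs import ChatGeneration

    gen = ChatGeneration(message=AIMessage(content="Hello!"))
    assert gen.text == "Hello!"  # set from the string content on construction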
@@ -66,9 +70,9 @@ class ChatGeneration(Generation):


 class ChatGenerationChunk(ChatGeneration):
-    """ChatGeneration chunk.
+    """``ChatGeneration`` chunk.

-    ChatGeneration chunks can be concatenated with other ChatGeneration chunks.
+    ``ChatGeneration`` chunks can be concatenated with other ``ChatGeneration`` chunks.
     """

     message: BaseMessageChunk
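As the docstring says, chunks concatenate; a minimal sketch with illustrative contents:

.. code-block:: python

    from langchain_core.messages import AIMessageChunk
    from langchain_core.outputs import ChatGenerationChunk

    a = ChatGenerationChunk(message=AIMessageChunk(content="Hello"))
    b = ChatGenerationChunk(message=AIMessageChunk(content=" world"))

    # '+' merges the underlying message chunks (and generation_info, if any).
    merged = a + b
    assert merged.message.content == "Hello world"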
@@ -113,8 +113,12 @@ class ImageURL(TypedDict, total=False):
     """Image URL."""

     detail: Literal["auto", "low", "high"]
-    """Specifies the detail level of the image. Defaults to "auto".
-    Can be "auto", "low", or "high"."""
+    """Specifies the detail level of the image. Defaults to ``'auto'``.
+    Can be ``'auto'``, ``'low'``, or ``'high'``.
+
+    This follows OpenAI's Chat Completion API's image URL format.
+
+    """

     url: str
     """Either a URL of the image or the base64 encoded image data."""
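For reference, a message content block matching this TypedDict in the OpenAI image URL format; the URL is a placeholder:

.. code-block:: python

    from langchain_core.messages import HumanMessage

    message = HumanMessage(
        content=[
            {"type": "text", "text": "Describe this image."},
            {
                "type": "image_url",
                # Matches the ImageURL TypedDict: 'url' plus optional 'detail'.
                "image_url": {"url": "https://example.com/cat.png", "detail": "auto"},
            },
        ]
    )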
@@ -262,7 +262,7 @@ class BaseStringMessagePromptTemplate(BaseMessagePromptTemplate, ABC):
     def from_template_file(
         cls,
         template_file: Union[str, Path],
-        input_variables: list[str],
+        input_variables: list[str],  # noqa: ARG003 # Deprecated
         **kwargs: Any,
     ) -> Self:
         """Create a class from a template file.
@@ -275,7 +275,7 @@ class BaseStringMessagePromptTemplate(BaseMessagePromptTemplate, ABC):
         Returns:
             A new instance of this class.
         """
-        prompt = PromptTemplate.from_file(template_file, input_variables)
+        prompt = PromptTemplate.from_file(template_file)
         return cls(prompt=prompt, **kwargs)

     @abstractmethod
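After this change ``input_variables`` is accepted but ignored, and ``PromptTemplate.from_file`` infers the variables from the template text. A sketch, where ``prompt.txt`` is a hypothetical file containing, say, ``Tell me a joke about {topic}.``:

.. code-block:: python

    from langchain_core.prompts import HumanMessagePromptTemplate

    # input_variables is still required positionally but no longer used.
    tmpl = HumanMessagePromptTemplate.from_template_file("prompt.txt", [])
    print(tmpl.prompt.input_variables)  # ['topic'], inferred from the file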
@@ -813,7 +813,10 @@ class ChatPromptTemplate(BaseChatPromptTemplate):
             )

             prompt_value = template.invoke(
-                {"name": "Bob", "user_input": "What is your name?"}
+                {
+                    "name": "Bob",
+                    "user_input": "What is your name?",
+                }
             )
             # Output:
             # ChatPromptValue(
@@ -281,7 +281,10 @@ class FewShotChatMessagePromptTemplate(
             ]

             example_prompt = ChatPromptTemplate.from_messages(
-                [("human", "What is {input}?"), ("ai", "{output}")]
+                [
+                    ("human", "What is {input}?"),
+                    ("ai", "{output}"),
+                ]
             )

             few_shot_prompt = FewShotChatMessagePromptTemplate(
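The hunk cuts off at the ``FewShotChatMessagePromptTemplate(`` call; for context, the docstring example typically continues along these lines (a hedged reconstruction, not part of the diff):

.. code-block:: python

    # Wire the per-example prompt and the example list together, then embed
    # the few-shot block inside a full chat prompt.
    few_shot_prompt = FewShotChatMessagePromptTemplate(
        example_prompt=example_prompt,
        examples=examples,
    )

    final_prompt = ChatPromptTemplate.from_messages(
        [
            ("system", "You are a helpful AI Assistant"),
            few_shot_prompt,
            ("human", "{input}"),
        ]
    )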
Some files were not shown because too many files have changed in this diff.