- **Description:** The root cause of this issue is that when a user
defines `model_config` with a custom `title` on a `BaseModel`, the
`{"type": <tool_name>}` value is derived from that title when the
results are parsed
[here](https://vscode.dev/github/keenborder786/langchain/blob/fix/tool_name_dict/libs/core/langchain_core/output_parsers/openai_tools.py#L199).
However,
[tool.__name__](https://vscode.dev/github/keenborder786/langchain/blob/fix/tool_name_dict/libs/core/langchain_core/output_parsers/openai_tools.py#L331)
uses the class name of the `BaseModel` itself, so the lookup raises a
`KeyError` whenever a custom title is provided in `model_config`.
The best solution is to use the title from `model_config` when one is
provided, since that is what `type` will be parsed to, and to fall back
to `tool.__name__` otherwise. Care must be taken that this behavior
applies only to Pydantic V2 models (a minimal sketch follows below).
- **Issue:** #27260
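A minimal sketch of the mismatch and of the proposed keying, assuming a Pydantic V2 model; the model, variable, and helper names below are illustrative and are not the parser's actual internals:

```python
from pydantic import BaseModel, ConfigDict


class WeatherQuery(BaseModel):
    """Hypothetical tool schema with a custom title in model_config."""

    model_config = ConfigDict(title="weather_query")

    city: str


# The tool-call "type" follows the JSON-schema title derived from model_config ...
emitted_type = WeatherQuery.model_json_schema()["title"]    # "weather_query"

# ... but keying the lookup by the class name misses it:
tools_by_class_name = {WeatherQuery.__name__: WeatherQuery}  # {"WeatherQuery": ...}
# tools_by_class_name[emitted_type]  -> KeyError: 'weather_query'


def _tool_name(tool: type[BaseModel]) -> str:
    """Prefer the Pydantic V2 model_config title; fall back to the class name."""
    return getattr(tool, "model_config", {}).get("title") or tool.__name__


tools_by_name = {_tool_name(WeatherQuery): WeatherQuery}     # {"weather_query": ...}
assert tools_by_name[emitted_type] is WeatherQuery
```

Because Pydantic V1 models expose a `Config` class rather than a `model_config` dict, the `getattr` fallback naturally restricts the new keying to Pydantic V2.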
---------
Co-authored-by: Mason Daugherty <mason@langchain.dev>
**Packages**

**Important:** This repository is structured as a monorepo, with various packages located in the `libs/` directory. Packages to note in this directory include:
- `core/`: Core primitives and abstractions for langchain
- `langchain/`: `langchain-classic`
- `langchain_v1/`: `langchain`
- `partners/`: Certain third-party provider integrations (see below)
- `standard-tests/`: Standardized tests for integrations
- `text-splitters/`: Text splitter utilities
Each package contains its own `README.md` file with specific details about that package.
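As a rough illustration of how this split shows up in code, here is a sketch that assumes only `langchain-core` is installed; the prompt text is a placeholder:

```python
# Core primitives and abstractions live in langchain-core (the core/ package);
# they can be used without installing any provider-specific integration.
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_messages([("human", "Translate to French: {text}")])
parser = StrOutputParser()

# Concrete chat models plug in from integration packages (see partners/ below):
# chain = prompt | chat_model | parser
```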
**Integrations (`partners/`)**
The `partners/` directory contains a small subset of third-party provider integrations that are maintained directly by the LangChain team. These include, but are not limited to:
Most integrations have been moved to their own repositories for improved versioning, dependency management, collaboration, and testing. This includes packages from popular providers such as Google and AWS. Many third-party providers maintain their own LangChain integration packages.
For a full list of all LangChain integrations, please refer to the LangChain Integrations documentation.
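For illustration, an integration package is installed and imported separately from core. The snippet below is a sketch that assumes the OpenAI integration package `langchain-openai` is installed and that `OPENAI_API_KEY` is set; the model name is just an example:

```python
# pip install langchain-core langchain-openai
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

chat_model = ChatOpenAI(model="gpt-4o-mini")  # reads OPENAI_API_KEY from the environment
prompt = ChatPromptTemplate.from_messages([("human", "Say hello in {language}.")])

chain = prompt | chat_model  # core primitives compose with the provider package
print(chain.invoke({"language": "French"}).content)
```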