Sydney Runkle 645b077c30 perf(core): cache _format_tool_to_openai_function per tool instance
Stash the OpenAI function description dict on the BaseTool instance under
`tool.__dict__["_openai_function_dict"]`. BaseTool.__setattr__ already pops
`tool_call_schema` and `args` when `args_schema`, `description`, or `name`
change; extend the invalidation set to include the new key so the cache
matches the schema caching lifecycle.

Previously, every call to `convert_to_openai_tool(tool)` re-ran
`schema.model_json_schema()` on the cached tool_call_schema pydantic model,
rebuilding the full JSON-schema tree on every model invocation. Summarization
middleware's `count_tokens_approximately` (called twice per model call) plus
the prompt-caching middleware's `bind_tools` meant three fresh schema
generations per model call; at ~15 tools and ~500 model calls in a 100-turn
agent run, that is roughly 22,500 schema generations — tens of seconds of
pydantic work producing identical output every time.

With this cache the first call pays the schema-gen cost once per tool; all
subsequent calls are a dict lookup.
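The mechanics described above can be sketched with a simplified stand-in for BaseTool (the real class is a pydantic model; `SimpleTool`, `_build_openai_function`, and the helper names here are illustrative, not the actual langchain-core implementation):

```python
_CACHE_KEY = "_openai_function_dict"

# Attributes whose mutation must drop the cached dict, mirroring the
# invalidation set described in the commit message.
_INVALIDATING_FIELDS = {"args_schema", "description", "name"}


class SimpleTool:
    # Stand-in for BaseTool.
    def __init__(self, name, description, args_schema):
        self.name = name
        self.description = description
        self.args_schema = args_schema

    def __setattr__(self, key, value):
        # Pop the cached dict whenever a schema-relevant field changes,
        # so the cache follows the same lifecycle as the cached schemas.
        if key in _INVALIDATING_FIELDS:
            self.__dict__.pop(_CACHE_KEY, None)
        super().__setattr__(key, value)


def _build_openai_function(tool):
    # Stand-in for the expensive model_json_schema() walk.
    return {
        "name": tool.name,
        "description": tool.description,
        "parameters": dict(tool.args_schema),
    }


def convert_to_openai_function(tool):
    # First call pays the generation cost; later calls are a dict lookup.
    cached = tool.__dict__.get(_CACHE_KEY)
    if cached is None:
        cached = _build_openai_function(tool)
        tool.__dict__[_CACHE_KEY] = cached
    return cached
```

Stashing the dict directly in `tool.__dict__` keeps the cache per-instance and lets `__setattr__` invalidate it with a single `pop`, with no global registry to leak tools.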

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-24 09:30:16 -04:00

🦜🍎 LangChain Core


Looking for the JS/TS version? Check out LangChain.js.

To help you ship LangChain apps to production faster, check out LangSmith. LangSmith is a unified developer platform for building, testing, and monitoring LLM applications.

Quick Install

pip install langchain-core

🤔 What is this?

LangChain Core contains the base abstractions that power the LangChain ecosystem.

These abstractions are designed to be as modular and simple as possible.

The benefit of having these abstractions is that any provider can implement the required interface and then easily be used in the rest of the LangChain ecosystem.
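As an illustration of that pattern (the class and method names below are hypothetical stand-ins, not the actual langchain-core interfaces), a provider only has to satisfy a shared abstract interface for downstream code to use it unchanged:

```python
from abc import ABC, abstractmethod


class ChatModelInterface(ABC):
    # Hypothetical sketch of the provider-interface pattern: any backend
    # implementing the abstract methods can be swapped in downstream.
    @abstractmethod
    def invoke(self, prompt: str) -> str:
        """Return the model's reply for a single prompt."""


class EchoProvider(ChatModelInterface):
    # A trivial provider used only to show the interface contract.
    def invoke(self, prompt: str) -> str:
        return f"echo: {prompt}"


def run_pipeline(model: ChatModelInterface, prompt: str) -> str:
    # Downstream code depends only on the interface, not on any
    # particular provider implementation.
    return model.invoke(prompt).upper()
```

Swapping `EchoProvider` for a real model provider requires no change to `run_pipeline` — that decoupling is the benefit the paragraph above describes.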

⛰️ Why build on top of LangChain Core?

The LangChain ecosystem is built on top of langchain-core. Some of the benefits:

  • Modularity: We've designed Core around abstractions that are independent of each other, and not tied to any specific model provider.
  • Stability: We are committed to a stable versioning scheme, and will communicate any breaking changes with advance notice and version bumps.
  • Battle-tested: Core components have the largest install base in the LLM ecosystem, and are used in production by many companies.

📖 Documentation

For full documentation, see the API reference. For conceptual guides, tutorials, and examples on using LangChain, see the LangChain Docs. You can also chat with the docs using Chat LangChain.

📕 Releases & Versioning

See our Releases and Versioning policies.

💁 Contributing

As an open-source project in a rapidly developing field, we are extremely open to contributions, whether it be in the form of a new feature, improved infrastructure, or better documentation.

For detailed information on how to contribute, see the Contributing Guide.