🦜🍎 LangChain Core

Quick Install

pip install langchain-core

What is it?

LangChain Core contains the base abstractions that power the LangChain ecosystem.

These abstractions are designed to be as modular and simple as possible.

The benefit of having these abstractions is that any provider can implement the required interface and then easily be used in the rest of the LangChain ecosystem.
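
For illustration, here is a minimal sketch of what implementing that interface can look like; the toy Shout class below is made up for this example and is not an official provider integration.

from typing import Optional

from langchain_core.runnables import Runnable, RunnableConfig

class Shout(Runnable[str, str]):
    """Toy component that upper-cases its input."""

    def invoke(self, input: str, config: Optional[RunnableConfig] = None, **kwargs) -> str:
        return input.upper()

shout = Shout()
shout.invoke("hello")    # "HELLO", via the shared interface
shout.batch(["a", "b"])  # ["A", "B"]; batch() is inherited from the base class

Because Shout implements the Runnable interface, it can be composed with other LangChain components without extra glue code.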

For full documentation see the API reference.

⛰️ Why build on top of LangChain Core?

The LangChain ecosystem is built on top of langchain-core. Some of the benefits:

  • Modularity: We've designed Core around abstractions that are independent of each other, and not tied to any specific model provider.
  • Stability: We are committed to a stable versioning scheme, and will communicate any breaking changes with advance notice and version bumps.
  • Battle-tested: Core components have the largest install base in the LLM ecosystem, and are used in production by many companies.

1️⃣ Core Interface: Runnables

The concept of a Runnable is central to LangChain Core: it is the interface that most LangChain Core components implement, giving them

  • A common invocation interface (invoke(), batch(), stream(), etc.)
  • Built-in utilities for retries, fallbacks, schemas and runtime configurability
  • Easy deployment with LangGraph

For more check out the Runnable docs. Examples of components that implement the interface include: Chat Models, Tools, Retrievers, and Output Parsers.
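
As a rough sketch of that shared interface (the add_one lambda below is illustrative, not something from the docs):

from langchain_core.runnables import RunnableLambda

add_one = RunnableLambda(lambda x: x + 1)

add_one.invoke(1)         # 2: run on a single input
add_one.batch([1, 2, 3])  # [2, 3, 4]: run over many inputs at once
for chunk in add_one.stream(1):
    print(chunk)          # yields output chunks (here a single chunk, 2)

# Built-in utilities compose declaratively:
resilient = add_one.with_retry(stop_after_attempt=2)
with_backup = resilient.with_fallbacks([RunnableLambda(lambda x: 0)])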

📕 Releases & Versioning

As langchain-core contains the base abstractions and runtime for the whole LangChain ecosystem, we will communicate any breaking changes with advance notice and version bumps. The exception to this is anything in langchain_core.beta: given the rate of change of the field, being able to move quickly is still a priority, and this module is our attempt to do so.

Minor version increases will occur for:

  • Breaking changes for any public interfaces NOT in langchain_core.beta

Patch version increases will occur for:

  • Bug fixes
  • New features
  • Any changes to private interfaces
  • Any changes to langchain_core.beta

💁 Contributing

As an open-source project in a rapidly developing field, we are extremely open to contributions, whether it be in the form of a new feature, improved infrastructure, or better documentation.

For detailed information on how to contribute, see the Contributing Guide.