Akim Tsvigun f345ae5a1d docs: Integration with Nebius AI Studio (#31293)
Thank you for contributing to LangChain!

[x] PR title: langchain_ollama: support custom headers for Ollama
partner APIs

Where "package" is whichever of langchain, core, etc. is being modified.
Use "docs: ..." for purely docs changes, "infra: ..." for CI changes.
Example: "core: add foobar LLM"
[x] PR message:

**Description:** This PR adds support for passing custom HTTP headers to
Ollama models when used as a LangChain integration. This is especially
useful for enterprise users or partners who need to send authentication
tokens, API keys, or custom tracking headers when querying secured
Ollama servers. (An illustrative usage sketch follows the checklist below.)
**Issue:** N/A (new enhancement)
**Dependencies:** No external dependencies introduced.
**Twitter handle:** @arunkumar_offl
[x] Add tests and docs:
1. Added a unit test in test_chat_models.py to validate that headers are
   passed correctly.
2. Added an example notebook,
   docs/docs/integrations/llms/ollama_custom_headers.ipynb, showing how to
   use custom headers.

[x] Lint and test: Ran `make format`, `make lint`, and `make test` to ensure
the code is clean and passing all checks.
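
A minimal, hedged sketch of the intended usage. It assumes the headers are
forwarded through `client_kwargs` to the underlying Ollama client; the exact
parameter names may differ in the final API.

```python
# Hypothetical sketch: sending custom HTTP headers to a secured Ollama server.
# Assumes client_kwargs is forwarded to the underlying ollama client, which
# accepts a `headers` mapping; adjust to the final API of this change.
from langchain_ollama import ChatOllama

llm = ChatOllama(
    model="llama3",
    base_url="https://ollama.internal.example.com",
    client_kwargs={
        "headers": {
            "Authorization": "Bearer <token>",
            "X-Request-Source": "langchain",
        }
    },
)

print(llm.invoke("Hello!").content)
```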


This PR is docs-only: it adds the Nebius AI Studio integration to the docs. The
integration package is available at
[https://github.com/nebius/langchain-nebius](https://github.com/nebius/langchain-nebius).
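
A minimal sketch of how the integration might be used, based on the linked
package; the class name `ChatNebius`, the environment variable, and the model
identifier are assumptions to be checked against the package's own docs.

```python
# pip install langchain-nebius
# Assumes langchain-nebius exports a ChatNebius chat model and reads the
# NEBIUS_API_KEY environment variable; the model name is illustrative.
import os

from langchain_nebius import ChatNebius

os.environ.setdefault("NEBIUS_API_KEY", "<your-api-key>")

llm = ChatNebius(model="meta-llama/Llama-3.3-70B-Instruct")
print(llm.invoke("What is Nebius AI Studio?").content)
```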

---------

Co-authored-by: Akim Tsvigun <aktsvigun@nebius.com>
Co-authored-by: Chester Curme <chester.curme@gmail.com>
2025-06-14 16:15:27 -04:00



Note

Looking for the JS/TS library? Check out LangChain.js.

LangChain is a framework for building LLM-powered applications. It helps you chain together interoperable components and third-party integrations to simplify AI application development — all while future-proofing decisions as the underlying technology evolves.

pip install -U langchain

To learn more about LangChain, check out the docs. If you're looking for more advanced customization or agent orchestration, check out LangGraph, our framework for building controllable agent workflows.

Why use LangChain?

LangChain helps developers build applications powered by LLMs through a standard interface for models, embeddings, vector stores, and more.
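
For example, a chat model, an embedding model, and a vector store expose the same core methods regardless of provider. A minimal sketch, using OpenAI models and the in-memory vector store purely for illustration:

```python
# Illustrative only: the provider and model names are assumptions, not requirements.
# pip install -U langchain langchain-openai
from langchain_core.vectorstores import InMemoryVectorStore
from langchain_openai import ChatOpenAI, OpenAIEmbeddings

llm = ChatOpenAI(model="gpt-4o-mini")
embeddings = OpenAIEmbeddings(model="text-embedding-3-small")

# Any vector store implementation exposes the same add/search interface.
store = InMemoryVectorStore(embeddings)
store.add_texts(["LangChain standardizes model, embedding, and vector store APIs."])
docs = store.similarity_search("What does LangChain standardize?", k=1)

print(llm.invoke(f"Answer using this context: {docs[0].page_content}").content)
```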

Use LangChain for:

  • Real-time data augmentation. Easily connect LLMs to diverse data sources and external / internal systems, drawing from LangChain's vast library of integrations with model providers, tools, vector stores, retrievers, and more.
  • Model interoperability. Swap models in and out as your engineering team experiments to find the best choice for your application's needs. As the industry frontier evolves, adapt quickly — LangChain's abstractions keep you moving without losing momentum. (See the sketch after this list.)
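
As referenced above, a minimal sketch of model interoperability: the same calling code targets different providers by changing a single identifier. The provider and model names are examples; each provider needs its own integration package and API key.

```python
# Sketch: swap providers without changing the calling code.
from langchain.chat_models import init_chat_model

prompt = "Explain retrieval-augmented generation in two sentences."

for spec in ("openai:gpt-4o-mini", "anthropic:claude-3-5-sonnet-latest"):
    llm = init_chat_model(spec)  # same interface, different provider
    print(spec, "->", llm.invoke(prompt).content)
```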

LangChain's ecosystem

While the LangChain framework can be used standalone, it also integrates seamlessly with any LangChain product, giving developers a full suite of tools when building LLM applications.

To improve your LLM application development, pair LangChain with:

  • LangSmith - Helpful for agent evals and observability. Debug poor-performing LLM app runs, evaluate agent trajectories, gain visibility in production, and improve performance over time.
  • LangGraph - Build agents that can reliably handle complex tasks with LangGraph, our low-level agent orchestration framework. LangGraph offers customizable architecture, long-term memory, and human-in-the-loop workflows — and is trusted in production by companies like LinkedIn, Uber, Klarna, and GitLab. (A minimal sketch follows this list.)
  • LangGraph Platform - Deploy and scale agents effortlessly with a purpose-built deployment platform for long-running, stateful workflows. Discover, reuse, configure, and share agents across teams — and iterate quickly with visual prototyping in LangGraph Studio.
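
As referenced above, a hedged sketch of a minimal LangGraph tool-calling agent; LangGraph is installed separately, and the model string and tool are illustrative.

```python
# pip install -U langgraph langchain-openai
from langchain_core.tools import tool
from langgraph.prebuilt import create_react_agent


@tool
def get_weather(city: str) -> str:
    """Return a canned weather report for a city."""
    return f"It is sunny in {city}."


# create_react_agent accepts a model identifier (or model instance) and tools.
agent = create_react_agent("openai:gpt-4o-mini", tools=[get_weather])
result = agent.invoke({"messages": [("user", "What's the weather in Paris?")]})
print(result["messages"][-1].content)
```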

Additional resources

  • Tutorials: Simple walkthroughs with guided examples on getting started with LangChain.
  • How-to Guides: Quick, actionable code snippets for topics such as tool calling, RAG use cases, and more.
  • Conceptual Guides: Explanations of key concepts behind the LangChain framework.
  • API Reference: Detailed reference on navigating base packages and integrations for LangChain.