Yi Liu 19ddd42891 fix(ollama): raise error when clients are not initialized (#35185)
## Summary
- When `self._client` is `None` in `_create_chat_stream()`, the method
silently produces an empty generator instead of failing.
- The error only surfaces later as a misleading `"No data received from
Ollama stream"` ValueError, making it difficult to diagnose the actual
root cause (uninitialized client).
- Changed to raise `RuntimeError` immediately with a clear message when
the sync client is not initialized.
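
The guard described above can be sketched as follows. This is a minimal, self-contained illustration of the pattern (the class and `chat` method here are hypothetical stand-ins, not the actual `langchain-ollama` internals): check the client before entering the streaming loop so the failure surfaces at its true cause.

```python
from typing import Iterator, Optional


class ChatClientWrapper:
    """Hypothetical wrapper illustrating the fail-fast guard from this PR."""

    def __init__(self, client: Optional[object] = None) -> None:
        self._client = client

    def _create_chat_stream(self, prompt: str) -> Iterator[str]:
        # Before the fix: a None client silently yielded nothing, and the
        # error only surfaced later as a misleading "No data received from
        # Ollama stream" ValueError.
        if self._client is None:
            raise RuntimeError(
                "Ollama sync client is not initialized; "
                "cannot create a chat stream."
            )
        # Hypothetical client API, for illustration only.
        yield from self._client.chat(prompt)
```

Note that because `_create_chat_stream` is a generator function, the `RuntimeError` is raised when the stream is first consumed rather than at call time, which is still far earlier and clearer than the downstream "no data" failure.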

## Why this matters
Users who hit this path see a confusing error message that points them
in the wrong direction. An explicit error at the point of failure makes
debugging straightforward.

## Test plan
- [x] Added `test_create_chat_stream_raises_when_client_none`
- [x] Existing tests still pass

> This PR was authored with the help of AI tools.

---------

Co-authored-by: Mason Daugherty <github@mdrxy.com>
2026-02-12 11:56:53 -05:00

langchain-ollama


Looking for the JS/TS version? Check out LangChain.js.

Quick Install

pip install langchain-ollama

🤔 What is this?

This package contains the LangChain integration with Ollama.
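
A minimal usage sketch, assuming a local Ollama server is running and the named model (here `llama3.1`, chosen only as an example) has already been pulled with `ollama pull`:

```python
from langchain_ollama import ChatOllama

# Connects to the default local Ollama endpoint (http://localhost:11434).
llm = ChatOllama(model="llama3.1")

response = llm.invoke("Explain what Ollama is in one sentence.")
print(response.content)
```

The package also exposes `OllamaLLM` and `OllamaEmbeddings` for completion-style and embedding use cases; see the API reference linked below for details.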

📖 Documentation

For full documentation, see the API reference. For conceptual guides, tutorials, and examples on using these classes, see the LangChain Docs.

📕 Releases & Versioning

See our Releases and Versioning policies.

💁 Contributing

As an open-source project in a rapidly developing field, we are extremely open to contributions, whether it be in the form of a new feature, improved infrastructure, or better documentation.

For detailed information on how to contribute, see the Contributing Guide.