Mirror of https://github.com/hwchase17/langchain.git, synced 2026-04-11 15:03:36 +00:00
Closes #34207

---

Expose log probabilities from the Ollama Python SDK through `ChatOllama`. The Ollama client already returns a `logprobs` field on chat responses for supported models, but `ChatOllama` had no way to request or surface it.

## Changes

- Add `logprobs` and `top_logprobs` fields to `ChatOllama`, forwarded to the client via `_build_chat_params`. Setting `top_logprobs` without `logprobs=True` auto-enables it with a warning; setting it with `logprobs=False` raises a `ValueError`
- Surface per-token logprobs on intermediate streaming chunks (both sync `_create_chat_stream` and async `_create_async_chat_stream`) via `response_metadata["logprobs"]`, accumulated into the final response on `invoke()`
- Bump minimum `ollama` SDK from `>=0.6.0` to `>=0.6.1`, the version that added logprobs support

---------

Co-authored-by: Mason Daugherty <mason@langchain.dev>
Co-authored-by: Mason Daugherty <github@mdrxy.com>
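The `logprobs`/`top_logprobs` interaction described above can be sketched as a small standalone validator. This is an illustration of the stated rules only, not the actual `ChatOllama` implementation; the function name `resolve_logprobs` is hypothetical.

```python
import warnings


def resolve_logprobs(logprobs, top_logprobs):
    """Hypothetical helper mirroring the validation rules described above.

    Returns the effective (logprobs, top_logprobs) pair that would be
    forwarded to the Ollama client.
    """
    if top_logprobs is not None:
        if logprobs is False:
            # Explicitly disabling logprobs contradicts requesting top_logprobs
            raise ValueError("top_logprobs requires logprobs=True")
        if logprobs is None:
            # top_logprobs without logprobs=True auto-enables it with a warning
            warnings.warn("top_logprobs set without logprobs=True; enabling logprobs")
            logprobs = True
    return logprobs, top_logprobs
```

For example, `resolve_logprobs(None, 3)` returns `(True, 3)` and emits a warning, while `resolve_logprobs(False, 3)` raises `ValueError`.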