
langchain-ollama

This package contains the LangChain integration with Ollama

Installation

pip install -U langchain-ollama

For the package to work, you will need to install and run the Ollama server locally; it is available for download from the Ollama website.

To run integration tests (make integration_tests), you will need the following models installed in your Ollama server:

  • llama3.1
  • deepseek-r1:1.5b

Install these models by running:

ollama pull <name-of-model>
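For the integration-test models listed above, that means:

```shell
# Pull the two models required by the integration tests
ollama pull llama3.1
ollama pull deepseek-r1:1.5b
```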

Chat Models

The ChatOllama class exposes chat models from Ollama.

from langchain_ollama import ChatOllama

llm = ChatOllama(model="llama3.1")
llm.invoke("Sing a ballad of LangChain.")
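ChatOllama also supports the standard LangChain streaming interface. A minimal sketch (the model name and temperature below are illustrative, and a local Ollama server with the model pulled is assumed):

```python
from langchain_ollama import ChatOllama

llm = ChatOllama(model="llama3.1", temperature=0.8)

# stream() yields message chunks as the model generates them
for chunk in llm.stream("Sing a ballad of LangChain."):
    print(chunk.content, end="", flush=True)
```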

Embeddings

The OllamaEmbeddings class exposes embeddings from Ollama.

from langchain_ollama import OllamaEmbeddings

embeddings = OllamaEmbeddings(model="llama3.1")
embeddings.embed_query("What is the meaning of life?")
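Multiple texts can be embedded in one call with embed_documents. A sketch, assuming a local Ollama server with the model pulled (the sample sentences are illustrative):

```python
from langchain_ollama import OllamaEmbeddings

embeddings = OllamaEmbeddings(model="llama3.1")

# Returns one embedding (a list of floats) per input text
vectors = embeddings.embed_documents([
    "Ollama runs models locally.",
    "LangChain composes LLM applications.",
])
```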

LLMs

The OllamaLLM class exposes traditional text-completion LLMs from Ollama.

from langchain_ollama import OllamaLLM

llm = OllamaLLM(model="llama3.1")
llm.invoke("The meaning of life is")
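Like the chat class, OllamaLLM supports streaming; here it yields plain string chunks rather than message objects. A sketch, again assuming a local server with the model pulled:

```python
from langchain_ollama import OllamaLLM

llm = OllamaLLM(model="llama3.1")

# stream() on an LLM yields str chunks
for chunk in llm.stream("The meaning of life is"):
    print(chunk, end="", flush=True)
```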