langchain-ollama

This package contains the LangChain integration with Ollama.

Installation

pip install -U langchain-ollama

For the package to work, you will need to install and run the Ollama server locally (see https://ollama.com/download).

To run integration tests (make integration_tests), you will need the following models installed in your Ollama server:

  • llama3.1
  • deepseek-r1:1.5b

Install these models by running:

ollama pull <name-of-model>
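
Before running the tests, you can sanity-check that the server is reachable and both models have been pulled. The snippet below is a minimal sketch (model names taken from the list above); it simply invokes each model once and fails fast if the server is not running or a model is missing.

from langchain_ollama import ChatOllama

# Fails fast if the Ollama server is not running locally or a model
# has not been pulled yet.
for model in ("llama3.1", "deepseek-r1:1.5b"):
    reply = ChatOllama(model=model).invoke("Reply with the word OK.")
    print(model, "->", reply.content)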

Chat Models

The ChatOllama class exposes chat models from Ollama.

from langchain_ollama import ChatOllama

llm = ChatOllama(model="llama3.1")
llm.invoke("Sing a ballad of LangChain.")
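
ChatOllama follows the standard LangChain Runnable interface, so you can also stream tokens as they are generated. A minimal sketch, assuming llama3.1 has been pulled (temperature shown here only as an example parameter):

from langchain_ollama import ChatOllama

llm = ChatOllama(model="llama3.1", temperature=0.2)

# Stream the response chunk by chunk instead of waiting for the full reply.
for chunk in llm.stream("Sing a ballad of LangChain."):
    print(chunk.content, end="", flush=True)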

Embeddings

The OllamaEmbeddings class exposes embeddings from Ollama.

from langchain_ollama import OllamaEmbeddings

embeddings = OllamaEmbeddings(model="llama3.1")
embeddings.embed_query("What is the meaning of life?")
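
For embedding several documents at once, the standard embed_documents method returns one vector per input text. A minimal sketch, again assuming llama3.1 is available locally:

from langchain_ollama import OllamaEmbeddings

embeddings = OllamaEmbeddings(model="llama3.1")

# One embedding vector is returned per input text.
vectors = embeddings.embed_documents([
    "LangChain integrates with Ollama.",
    "Embeddings map text to vectors.",
])
print(len(vectors), len(vectors[0]))  # number of texts, embedding dimension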

LLMs

The OllamaLLM class exposes traditional (text-completion) LLMs from Ollama.

from langchain_ollama import OllamaLLM

llm = OllamaLLM(model="llama3.1")
llm.invoke("The meaning of life is")
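
OllamaLLM implements the same Runnable interface, so streaming works here too; chunks are plain strings rather than message objects. A minimal sketch, assuming llama3.1 is pulled:

from langchain_ollama import OllamaLLM

llm = OllamaLLM(model="llama3.1")

# Each chunk is a plain string fragment of the completion.
for chunk in llm.stream("The meaning of life is"):
    print(chunk, end="", flush=True)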