langchain-ollama

This package contains the LangChain integration with Ollama.

Installation

pip install -U langchain-ollama

You will also need to run the Ollama server locally. You can download it from https://ollama.com/download. Make sure the model you want to use has been pulled, for example by running: ollama pull llama3

Chat Models

The ChatOllama class exposes chat models from Ollama.

from langchain_ollama import ChatOllama

llm = ChatOllama(model="llama3-groq-tool-use")
llm.invoke("Sing a ballad of LangChain.")

Embeddings

The OllamaEmbeddings class exposes embeddings from Ollama.

from langchain_ollama import OllamaEmbeddings

embeddings = OllamaEmbeddings(model="llama3")
embeddings.embed_query("What is the meaning of life?")
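
You can also embed several texts in one call with the standard embed_documents method. A minimal sketch, using the same model as above:

from langchain_ollama import OllamaEmbeddings

embeddings = OllamaEmbeddings(model="llama3")

# Returns one embedding vector (a list of floats) per input text.
vectors = embeddings.embed_documents(
    [
        "LangChain is a framework for building LLM applications.",
        "Ollama runs open models locally.",
    ]
)
print(len(vectors), len(vectors[0]))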

LLMs

The OllamaLLM class exposes LLMs from Ollama.

from langchain_ollama import OllamaLLM

llm = OllamaLLM(model="llama3")
llm.invoke("The meaning of life is")