langchain/libs/partners/ollama
Syed Baqar Abbas f175319303
[feat] Added backwards compatibility for OllamaEmbeddings initialization (migration from langchain_community.embeddings to langchain_ollama.embeddings) (#29296)
- [feat] **Added backwards compatibility for `OllamaEmbeddings`
initialization (migration from `langchain_community.embeddings` to
`langchain_ollama.embeddings`)**: "langchain_ollama"
- **Description:** Since `OllamaEmbeddings` from
`langchain_community.embeddings` is deprecated, code is being migrated to
`langchain_ollama.embeddings`. However, the new class does not offer
backward compatibility for the initialization parameters of the
`OllamaEmbeddings` object.
    - **Issue:** #29294 
    - **Dependencies:** None
    - **Twitter handle:** @BaqarAbbas2001


## Additional Information
Previously, `OllamaEmbeddings` from `langchain_community.embeddings`
supported the following options:

e9abe583b2/libs/community/langchain_community/embeddings/ollama.py (L125-L139)

However, the new package (`from langchain_ollama import
OllamaEmbeddings`) provides no way to set these options. I have added
these parameters to resolve this issue.
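The intent of the change can be sketched without the library: hypothetically, each restored parameter, when set, is forwarded in the `options` field of the request body sent to the Ollama API, and parameters left unset are omitted. This is a minimal stdlib illustration of that pattern only; the function name and exact mapping are assumptions, not the actual `langchain_ollama` code:

```python
# Illustrative sketch only (not the langchain_ollama implementation):
# shows how per-model sampling options such as those restored by this
# change can be collected into the "options" dict of an Ollama embed
# request, dropping any that were left unset.

def build_embed_payload(model: str, text: str, **options) -> dict:
    """Assemble a hypothetical Ollama embed request body."""
    set_options = {k: v for k, v in options.items() if v is not None}
    payload = {"model": model, "input": text}
    if set_options:
        payload["options"] = set_options
    return payload

payload = build_embed_payload(
    "llama3",
    "What is the meaning of life?",
    num_ctx=2048,      # context window size
    top_k=10,          # sampling cutoff
    temperature=None,  # unset -> omitted from the request
)
```

The key design point is that unset parameters are dropped rather than sent as `None`, so the server's own defaults still apply.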

This issue was also discussed in
https://github.com/langchain-ai/langchain/discussions/29113
2025-01-20 11:16:29 -05:00

# langchain-ollama

This package contains the LangChain integration with Ollama.

## Installation

```bash
pip install -U langchain-ollama
```

You will also need to run the Ollama server locally. You can download it here.

## Chat Models

`ChatOllama` class exposes chat models from Ollama.

```python
from langchain_ollama import ChatOllama

llm = ChatOllama(model="llama3-groq-tool-use")
llm.invoke("Sing a ballad of LangChain.")
```

## Embeddings

`OllamaEmbeddings` class exposes embeddings from Ollama.

```python
from langchain_ollama import OllamaEmbeddings

embeddings = OllamaEmbeddings(model="llama3")
embeddings.embed_query("What is the meaning of life?")
```

## LLMs

`OllamaLLM` class exposes LLMs from Ollama.

```python
from langchain_ollama import OllamaLLM

llm = OllamaLLM(model="llama3")
llm.invoke("The meaning of life is")
```