
langchain-mistralai

This package contains the LangChain integrations for MistralAI through their mistralai SDK.

Installation

pip install -U langchain-mistralai

Chat Models

This package contains the ChatMistralAI class, which is the recommended way to interface with MistralAI models.

To use, install the requirements and configure your environment:

export MISTRAL_API_KEY=your-api-key

Then initialize:

from langchain_core.messages import HumanMessage
from langchain_mistralai.chat_models import ChatMistralAI

chat = ChatMistralAI(model="mistral-small")
messages = [HumanMessage(content="say a brief hello")]
chat.invoke(messages)
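
The API key can also be passed directly at construction time instead of relying on the environment variable; a minimal sketch, assuming the api_key keyword argument accepted by ChatMistralAI:

# Pass the key explicitly (api_key parameter name is an assumption here)
chat = ChatMistralAI(model="mistral-small", api_key="your-api-key")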

ChatMistralAI also supports async and streaming functionality:

# For async...
await chat.ainvoke(messages)

# For streaming...
for chunk in chat.stream(messages):
    print(chunk.content, end="", flush=True)
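
Streaming is also available asynchronously via astream; a minimal sketch using the standard LangChain async iteration interface:

# For async streaming...
async for chunk in chat.astream(messages):
    print(chunk.content, end="", flush=True)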

Embeddings

With MistralAIEmbeddings, you can directly use the default model 'mistral-embed', or set a different one if available.

from langchain_mistralai import MistralAIEmbeddings

embedding = MistralAIEmbeddings()

Choose model

embedding.model = 'mistral-embed'

Simple query

res_query = embedding.embed_query("The test information")

Documents

res_document = embedding.embed_documents(["test1", "another test"])
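
The returned embeddings are plain lists of floats, so they can be compared directly; a minimal sketch scoring the query against each document with cosine similarity (numpy is used here for illustration and is an assumption, not a dependency of this package):

import numpy as np

def cosine_similarity(a, b):
    # Dot product of the two vectors divided by the product of their norms
    a, b = np.array(a), np.array(b)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Score the query embedding against each document embedding
scores = [cosine_similarity(res_query, doc) for doc in res_document]
print(scores)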