
langchain-mistralai

This package contains the LangChain integrations for MistralAI through their mistralai SDK.

Installation

pip install -U langchain-mistralai

Chat Models

This package contains the ChatMistralAI class, which is the recommended way to interface with MistralAI models.

To use the package, install it as shown above and set your Mistral API key in your environment:

export MISTRAL_API_KEY=your-api-key

Then initialize the chat model:

from langchain_core.messages import HumanMessage
from langchain_mistralai.chat_models import ChatMistralAI

chat = ChatMistralAI(model="mistral-small")
messages = [HumanMessage(content="say a brief hello")]
chat.invoke(messages)
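
invoke returns an AIMessage; the generated text is available on its content attribute (a minimal usage sketch):

response = chat.invoke(messages)
print(response.content)  # the model's reply as a string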

ChatMistralAI also supports async and streaming functionality:

# For async...
await chat.ainvoke(messages)

# For streaming...
for chunk in chat.stream(messages):
    print(chunk.content, end="", flush=True)
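
Streaming also works asynchronously via astream, the standard async streaming method on LangChain chat models (a minimal sketch):

# For async streaming...
async for chunk in chat.astream(messages):
    print(chunk.content, end="", flush=True)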

Embeddings

With MistralAIEmbeddings, you can use the default model 'mistral-embed' directly, or set a different one if available.

Choose model

from langchain_mistralai import MistralAIEmbeddings

embedding = MistralAIEmbeddings()
embedding.model = 'mistral-embed'  # or another available embedding model

Simple query

res_query = embedding.embed_query("The test information")

Documents

res_document = embedding.embed_documents(["test1", "another test"])
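
embed_query returns a single embedding vector (a list of floats), while embed_documents returns one vector per input document. A quick sanity check (assuming the default mistral-embed model, which produces 1024-dimensional embeddings):

print(len(res_query))     # embedding dimensionality (1024 for mistral-embed)
print(len(res_document))  # 2 -- one embedding per input document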