From dc207665131396dce0e51c86db25436156d30184 Mon Sep 17 00:00:00 2001
From: thehunmonkgroup
Date: Wed, 20 Dec 2023 00:22:43 -0500
Subject: [PATCH] docs: readme for langchain-mistralai (#14917)

- **Description:** Add README doc for MistralAI partner package.
- **Tag maintainer:** @baskaryan
---
 libs/partners/mistralai/README.md | 40 +++++++++++++++++++++++++++++++
 1 file changed, 40 insertions(+)

diff --git a/libs/partners/mistralai/README.md b/libs/partners/mistralai/README.md
index 7f803ce7746..9ec32ed4321 100644
--- a/libs/partners/mistralai/README.md
+++ b/libs/partners/mistralai/README.md
@@ -1 +1,41 @@
 # langchain-mistralai
+
+This package contains the LangChain integrations for [MistralAI](https://docs.mistral.ai) through their [mistralai](https://pypi.org/project/mistralai/) SDK.
+
+## Installation
+
+```bash
+pip install -U langchain-mistralai
+```
+
+## Chat Models
+
+This package contains the `ChatMistralAI` class, which is the recommended way to interface with MistralAI models.
+
+To use, install the requirements and configure your environment:
+
+```bash
+export MISTRAL_API_KEY=your-api-key
+```
+
+Then initialize the chat model:
+
+```python
+from langchain_core.messages import HumanMessage
+from langchain_mistralai.chat_models import ChatMistralAI
+
+chat = ChatMistralAI(model="mistral-small")
+messages = [HumanMessage(content="say a brief hello")]
+chat.invoke(messages)
+```
+
+`ChatMistralAI` also supports async and streaming functionality:
+
+```python
+# For async...
+await chat.ainvoke(messages)
+
+# For streaming...
+for chunk in chat.stream(messages):
+    print(chunk.content, end="", flush=True)
+```
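+
+Async streaming should also be available through the standard LangChain Runnable interface. The short sketch below reuses the `chat` and `messages` objects defined above and assumes an async context (e.g. a notebook or an `async` function):
+
+```python
+# For async streaming (reuses `chat` and `messages` from above)...
+async for chunk in chat.astream(messages):
+    print(chunk.content, end="", flush=True)
+```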