community[patch]: Fix MLX LLM Stream (#20575)

Closes #20561

This PR fixes an `AttributeError` raised when streaming from MLX LLMs.

Recently, `mlx-lm` changed its token decoding logic, which broke the
LangChain + MLX integration.
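
For context, recent `mlx-lm` versions stream text through a detokenizer object on the tokenizer wrapper rather than decoding each token independently. Below is a minimal sketch of that pattern; the model id is a placeholder, and the `detokenizer` attribute with `reset`/`add_token`/`finalize`/`last_segment` reflects my reading of `mlx-lm`, not this PR's exact code:

```python
# A sketch, not the PR's code: assumes mlx_lm's tokenizer wrapper exposes a
# streaming `detokenizer` with reset/add_token/finalize and `last_segment`.
import mlx.core as mx
from mlx_lm import load
from mlx_lm.utils import generate_step

model, tokenizer = load("mlx-community/quantized-gemma-2b-it")  # placeholder model id
prompt = mx.array(tokenizer.encode("Hello, MLX!"))

detokenizer = tokenizer.detokenizer
detokenizer.reset()

# generate_step yields (token, logprobs) pairs; zip with range caps the length.
for (token, _), _ in zip(generate_step(prompt, model), range(50)):
    if token == tokenizer.eos_token_id:
        break
    detokenizer.add_token(token)
    # last_segment holds only the text decoded since the previous token.
    print(detokenizer.last_segment, end="", flush=True)

detokenizer.finalize()
print(detokenizer.last_segment)
```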

Additionally, I made minor fixes: repaired a broken link in the docs
example and enforced the pipeline arguments (`max_tokens`, `temp`, etc.)
for `invoke`, as sketched below.
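
As an illustration of the argument-enforcement fix (a sketch, not the PR's test code; the model id is a placeholder), `pipeline_kwargs` supplied at construction should now apply to `invoke` as well as `stream`:

```python
from langchain_community.llms.mlx_pipeline import MLXPipeline

# Placeholder model id; any MLX-community model works the same way.
llm = MLXPipeline.from_model_id(
    "mlx-community/quantized-gemma-2b-it",
    pipeline_kwargs={"max_tokens": 64, "temp": 0.1},
)

# Before this PR, invoke() ignored max_tokens/temp; now they are applied.
print(llm.invoke("Once upon a time"))
```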
   
- **Issue:** #20561
    
- **Twitter handle:** @Prince_Canuma
Commit 3587c60396 (parent 96bd0b0844) by Prince Canuma, committed via GitHub on 2024-05-21 02:17:08 +02:00.
2 changed files with 55 additions and 12 deletions.


@@ -9,7 +9,7 @@
     "This notebook shows how to get started using `MLX` LLM's as chat models.\n",
     "\n",
     "In particular, we will:\n",
-    "1. Utilize the [MLXPipeline](https://github.com/langchain-ai/langchain/blob/master/libs/langchain/langchain/llms/mlx_pipelines.py), \n",
+    "1. Utilize the [MLXPipeline](https://github.com/langchain-ai/langchain/blob/master/libs/community/langchain_community/llms/mlx_pipeline.py), \n",
     "2. Utilize the `ChatMLX` class to enable any of these LLMs to interface with LangChain's [Chat Messages](https://python.langchain.com/docs/modules/model_io/chat/#messages) abstraction.\n",
     "3. Demonstrate how to use an open-source LLM to power an `ChatAgent` pipeline\n"
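
For reference, the notebook this hunk fixes wires `MLXPipeline` into `ChatMLX` along these lines (a sketch following the docs' pattern; the model id is an example):

```python
from langchain_community.chat_models.mlx import ChatMLX
from langchain_community.llms.mlx_pipeline import MLXPipeline
from langchain_core.messages import HumanMessage

# Example model id; pipeline_kwargs are forwarded to mlx-lm generation.
llm = MLXPipeline.from_model_id(
    "mlx-community/quantized-gemma-2b-it",
    pipeline_kwargs={"max_tokens": 10, "temp": 0.1},
)
chat = ChatMLX(llm=llm)

messages = [
    HumanMessage(
        content="What happens when an unstoppable force meets an immovable object?"
    )
]
response = chat.invoke(messages)
print(response.content)
```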