community: ChatSnowflakeCortex - Add streaming functionality (#27753)

Description:
snowflake.py
Added _stream and _stream_content methods to enable streaming
functionality (a usage sketch follows below).
Fixed pydantic issues and updated functionality for the overall langchain
version upgrade.
Added a bind_tools method to support agentic workflows through langgraph
(a tool-binding sketch appears after the diff).
Updated the _generate method to account for agentic workflow support
through langgraph.
Made cosmetic changes to comments and if conditions.
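
For context, a minimal sketch of how the new streaming support is expected to be consumed, assuming standard LangChain chat-model behavior; the credentials are placeholders and the remaining connection parameters are omitted:

```python
from langchain_community.chat_models import ChatSnowflakeCortex
from langchain_core.messages import HumanMessage, SystemMessage

# Placeholder credentials; additional connection parameters are omitted here.
chat = ChatSnowflakeCortex(
    model="mistral-large",
    cortex_function="complete",
    account="YOUR_SNOWFLAKE_ACCOUNT",
    username="YOUR_SNOWFLAKE_USERNAME",
    password="YOUR_SNOWFLAKE_PASSWORD",
)

messages = [
    SystemMessage(content="You are a friendly assistant."),
    HumanMessage(content="What are large language models?"),
]

# The public stream() method delegates to the new _stream implementation
# and yields message chunks as tokens arrive.
for chunk in chat.stream(messages):
    print(chunk.content, end="", flush=True)
```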

snowflake.ipynb
Added a _stream usage example.
Made cosmetic changes to comments.
Fixed lint errors.

check_pydantic.sh
Decreased the counter from 126 to 125, as suggested during formatting.

---------

Co-authored-by: Prathamesh Nimkar <prathamesh.nimkar@snowflake.com>
Co-authored-by: Erick Friis <erick@langchain.dev>
Author: Prathamesh Nimkar
Date: 2024-12-11 21:35:40 -05:00 (committed by GitHub)
Parent: d834c6b618
Commit: ca054ed1b1
3 changed files with 223 additions and 69 deletions


@@ -22,24 +22,16 @@
},
{
"cell_type": "code",
"execution_count": 1,
"execution_count": null,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Note: you may need to restart the kernel to use updated packages.\n"
]
}
],
"outputs": [],
"source": [
"%pip install --upgrade --quiet snowflake-snowpark-python"
]
},
{
"cell_type": "code",
"execution_count": 2,
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
@@ -73,14 +65,14 @@
},
{
"cell_type": "code",
"execution_count": 8,
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from langchain_community.chat_models import ChatSnowflakeCortex\n",
"from langchain_core.messages import HumanMessage, SystemMessage\n",
"\n",
"# By default, we'll be using the cortex provided model: `snowflake-arctic`, with function: `complete`\n",
"# By default, we'll be using the cortex provided model: `mistral-large`, with function: `complete`\n",
"chat = ChatSnowflakeCortex()"
]
},
@@ -92,16 +84,16 @@
"\n",
"```python\n",
"chat = ChatSnowflakeCortex(\n",
" # change default cortex model and function\n",
" model=\"snowflake-arctic\",\n",
" # Change the default cortex model and function\n",
" model=\"mistral-large\",\n",
" cortex_function=\"complete\",\n",
"\n",
" # change default generation parameters\n",
" # Change the default generation parameters\n",
" temperature=0,\n",
" max_tokens=10,\n",
" top_p=0.95,\n",
"\n",
" # specify snowflake credentials\n",
" # Specify your Snowflake Credentials\n",
" account=\"YOUR_SNOWFLAKE_ACCOUNT\",\n",
" username=\"YOUR_SNOWFLAKE_USERNAME\",\n",
" password=\"YOUR_SNOWFLAKE_PASSWORD\",\n",
@@ -117,28 +109,13 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"### Calling the model\n",
"We can now call the model using the `invoke` or `generate` method.\n",
"\n",
"#### Generation"
"### Calling the chat model\n",
"We can now call the chat model using the `invoke` or `stream` methods."
]
},
{
"cell_type": "code",
"execution_count": 9,
"cell_type": "markdown",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"AIMessage(content=\" Large language models are artificial intelligence systems designed to understand, generate, and manipulate human language. These models are typically based on deep learning techniques and are trained on vast amounts of text data to learn patterns and structures in language. They can perform a wide range of language-related tasks, such as language translation, text generation, sentiment analysis, and answering questions. Some well-known large language models include Google's BERT, OpenAI's GPT series, and Facebook's RoBERTa. These models have shown remarkable performance in various natural language processing tasks, and their applications continue to expand as research in AI progresses.\", response_metadata={'completion_tokens': 131, 'prompt_tokens': 29, 'total_tokens': 160}, id='run-5435bd0a-83fd-4295-b237-66cbd1b5c0f3-0')"
]
},
"execution_count": 9,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"messages = [\n",
" SystemMessage(content=\"You are a friendly assistant.\"),\n",
@@ -151,14 +128,31 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"### Streaming\n",
"`ChatSnowflakeCortex` doesn't support streaming as of now. Support for streaming will be coming in the later versions!"
"### Stream"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Sample input prompt\n",
"messages = [\n",
" SystemMessage(content=\"You are a friendly assistant.\"),\n",
" HumanMessage(content=\"What are large language models?\"),\n",
"]\n",
"\n",
"# Invoke the stream method and print each chunk as it arrives\n",
"print(\"Stream Method Response:\")\n",
"for chunk in chat._stream(messages):\n",
" print(chunk.message.content)"
]
}
],
"metadata": {
"kernelspec": {
"display_name": ".venv",
"display_name": "langchain",
"language": "python",
"name": "python3"
},
@@ -172,7 +166,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.9"
"version": "3.9.20"
}
},
"nbformat": 4,