{
 "cells": [
  {
   "cell_type": "markdown",
   "id": "6488fdaf",
   "metadata": {},
   "source": [
    "# Chat Prompt Template\n",
    "\n",
    "Chat models take a list of chat messages as input - this list is commonly referred to as a prompt.\n",
    "Typically this is not simply a hardcoded list of messages but rather a combination of a template, some examples, and user input.\n",
    "LangChain provides several classes and functions to make constructing and working with prompts easy.\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 7,
   "id": "7647a621",
   "metadata": {},
   "outputs": [],
   "source": [
    "from langchain.prompts import (\n",
    "    ChatPromptTemplate,\n",
    "    PromptTemplate,\n",
    "    SystemMessagePromptTemplate,\n",
    "    AIMessagePromptTemplate,\n",
    "    HumanMessagePromptTemplate,\n",
    ")\n",
    "from langchain.schema import (\n",
    "    AIMessage,\n",
    "    HumanMessage,\n",
    "    SystemMessage\n",
    ")"
   ]
  },
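  {
   "cell_type": "markdown",
   "id": "1a2b3c4d",
   "metadata": {},
   "source": [
    "Before any templating, it can help to see what such a prompt looks like when written out by hand. A minimal sketch (the wording of the messages below is purely illustrative) using the `SystemMessage` and `HumanMessage` classes imported above:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "2b3c4d5e",
   "metadata": {},
   "outputs": [],
   "source": [
    "# A hardcoded chat prompt is simply a list of message objects.\n",
    "messages = [\n",
    "    SystemMessage(content=\"You are a helpful assistant that translates English to French.\"),\n",
    "    HumanMessage(content=\"I love programming.\"),\n",
    "]\n",
    "messages"
   ]
  },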
  {
   "cell_type": "markdown",
   "id": "acb4a2f6",
   "metadata": {},
   "source": [
    "You can make use of templating by using a `MessagePromptTemplate`. You can build a `ChatPromptTemplate` from one or more `MessagePromptTemplate`s. You can use `ChatPromptTemplate`'s `format_prompt` method -- this returns a `PromptValue`, which you can convert to a string or to a list of `Message` objects, depending on whether you want to use the formatted value as input to an LLM or to a chat model.\n",
    "\n",
    "For convenience, there is a `from_template` method exposed on each message prompt template. If you were to use this method, this is what it would look like:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "id": "3124f5e9",
   "metadata": {},
   "outputs": [],
   "source": [
    "template = \"You are a helpful assistant that translates {input_language} to {output_language}.\"\n",
    "system_message_prompt = SystemMessagePromptTemplate.from_template(template)\n",
    "human_template = \"{text}\"\n",
    "human_message_prompt = HumanMessagePromptTemplate.from_template(human_template)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "id": "9c7e2e6f",
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "[SystemMessage(content='You are a helpful assistant that translates English to French.', additional_kwargs={}),\n",
       " HumanMessage(content='I love programming.', additional_kwargs={})]"
      ]
     },
     "execution_count": 3,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "chat_prompt = ChatPromptTemplate.from_messages([system_message_prompt, human_message_prompt])\n",
    "\n",
    "# Format the prompt into the list of messages to send to a chat model.\n",
    "chat_prompt.format_prompt(input_language=\"English\", output_language=\"French\", text=\"I love programming.\").to_messages()"
   ]
  },
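  {
   "cell_type": "markdown",
   "id": "3c4d5e6f",
   "metadata": {},
   "source": [
    "If you instead want the formatted value as a single string (for example, as input to a plain LLM rather than a chat model), the returned `PromptValue` also exposes a `to_string` method alongside `to_messages`. A minimal sketch of the same call rendered as a string (output not shown):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "4d5e6f7a",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Render the same formatted prompt as one string instead of a list of messages.\n",
    "chat_prompt.format_prompt(input_language=\"English\", output_language=\"French\", text=\"I love programming.\").to_string()"
   ]
  },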
  {
   "cell_type": "markdown",
   "id": "0dbdf94f",
   "metadata": {},
   "source": [
    "If you wanted to construct the `MessagePromptTemplate` more directly, you could create a `PromptTemplate` outside and then pass it in, e.g.:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "id": "5a8d249e",
   "metadata": {},
   "outputs": [],
   "source": [
    "prompt = PromptTemplate(\n",
    "    template=\"You are a helpful assistant that translates {input_language} to {output_language}.\",\n",
    "    input_variables=[\"input_language\", \"output_language\"],\n",
    ")\n",
    "system_message_prompt = SystemMessagePromptTemplate(prompt=prompt)"
   ]
  },
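  {
   "cell_type": "markdown",
   "id": "5e6f7a8b",
   "metadata": {},
   "source": [
    "As a quick, illustrative check (the language values below are chosen arbitrarily), the directly constructed `SystemMessagePromptTemplate` can be formatted into a `SystemMessage` with its `format` method, just like the one built via `from_template`:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "6f7a8b9c",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Formatting the message prompt template yields a SystemMessage.\n",
    "system_message_prompt.format(input_language=\"English\", output_language=\"French\")"
   ]
  }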
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3 (ipykernel)",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.9.1"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}