langchain[patch], experimental[minor]: Adds OllamaFunctions wrapper (#13330)

CC @baskaryan @hwchase17 @jmorganca 

Having a bit of trouble importing `langchain_experimental` from a
notebook, will figure it out tomorrow

~Ah and also is blocked by #13226~
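
Example usage (a minimal sketch, taken from the new notebook; assumes a local Ollama server with the `mistral` model pulled):

```python
from langchain_experimental.llms.ollama_functions import OllamaFunctions

# Wraps ChatOllama and exposes an OpenAI Functions-style interface
model = OllamaFunctions(model="mistral")

# Bind a function schema; `function_call` forces the model to call it
model = model.bind(
    functions=[
        {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                },
                "required": ["location"],
            },
        }
    ],
    function_call={"name": "get_current_weather"},
)

# Returns an AIMessage whose additional_kwargs carry the function call
print(model.invoke("what is the weather in Boston?"))
```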

---------

Co-authored-by: Lance Martin <lance@langchain.dev>
Co-authored-by: Bagatur <baskaryan@gmail.com>
Commit 3328507f11 (parent 4063bf144a) by Jacob Lee, committed by GitHub, 2023-11-30 16:13:57 -08:00
7 changed files with 529 additions and 6 deletions


@@ -119,6 +119,159 @@
"chat_model(messages)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Extraction\n",
" \n",
"Update your version of Ollama and supply the [`format`](https://github.com/jmorganca/ollama/blob/main/docs/api.md#json-mode) flag.\n",
"\n",
"We can enforce the model to produce JSON.\n",
"\n",
"**Note:** You can also try out the experimental [OllamaFunctions](https://python.langchain.com/docs/integrations/chat/ollama_functions) wrapper for convenience."
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [],
"source": [
"from langchain.callbacks.manager import CallbackManager\n",
"from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler\n",
"from langchain.chat_models import ChatOllama\n",
"\n",
"chat_model = ChatOllama(\n",
" model=\"llama2\",\n",
" format=\"json\",\n",
" callback_manager=CallbackManager([StreamingStdOutCallbackHandler()]),\n",
")"
]
},
{
"cell_type": "code",
"execution_count": 11,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
" Sure! Here's a JSON response with the colors of the sky at different times of the day:\n",
" Begriffe und Abkürzungen:\n",
"\n",
"* `time`: The time of day (in 24-hour format)\n",
"* `sky_color`: The color of the sky at that time (as a hex code)\n",
"\n",
"Here are the colors of the sky at different times of the day:\n",
"```json\n",
"[\n",
" {\n",
" \"time\": \"6am\",\n",
" \"sky_color\": \"#0080c0\"\n",
" },\n",
" {\n",
" \"time\": \"9am\",\n",
" \"sky_color\": \"#3498db\"\n",
" },\n",
" {\n",
" \"time\": \"12pm\",\n",
" \"sky_color\": \"#ef7c00\"\n",
" },\n",
" {\n",
" \"time\": \"3pm\",\n",
" \"sky_color\": \"#9564b6\"\n",
" },\n",
" {\n",
" \"time\": \"6pm\",\n",
" \"sky_color\": \"#e78ac3\"\n",
" },\n",
" {\n",
" \"time\": \"9pm\",\n",
" \"sky_color\": \"#5f006a\"\n",
" }\n",
"]\n",
"```\n",
"In this response, the `time` property is a string in 24-hour format, representing the time of day. The `sky_color` property is a hex code representing the color of the sky at that time. For example, at 6am, the sky is blue (#0080c0), while at 9pm, it's dark blue (#5f006a)."
]
}
],
"source": [
"from langchain.schema import HumanMessage\n",
"\n",
"messages = [\n",
" HumanMessage(\n",
" content=\"What color is the sky at different times of the day? Respond using JSON\"\n",
" )\n",
"]\n",
"\n",
"chat_model_response = chat_model(messages)"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
" Sure! Based on the JSON schema you provided, here's the information we can gather about a person named John who is 35 years old and loves pizza:\n",
"\n",
"**Name:** John\n",
"\n",
"**Age:** 35 (integer)\n",
"\n",
"**Favorite food:** Pizza (string)\n",
"\n",
"So, the JSON object for John would look like this:\n",
"```json\n",
"{\n",
" \"name\": \"John\",\n",
" \"age\": 35,\n",
" \"fav_food\": \"pizza\"\n",
"}\n",
"```\n",
"Note that we cannot provide additional information about John beyond what is specified in the schema. For example, we do not have any information about his gender, occupation, or address, as those fields are not included in the schema."
]
}
],
"source": [
"import json\n",
"\n",
"from langchain.schema import HumanMessage\n",
"\n",
"json_schema = {\n",
" \"title\": \"Person\",\n",
" \"description\": \"Identifying information about a person.\",\n",
" \"type\": \"object\",\n",
" \"properties\": {\n",
" \"name\": {\"title\": \"Name\", \"description\": \"The person's name\", \"type\": \"string\"},\n",
" \"age\": {\"title\": \"Age\", \"description\": \"The person's age\", \"type\": \"integer\"},\n",
" \"fav_food\": {\n",
" \"title\": \"Fav Food\",\n",
" \"description\": \"The person's favorite food\",\n",
" \"type\": \"string\",\n",
" },\n",
" },\n",
" \"required\": [\"name\", \"age\"],\n",
"}\n",
"\n",
"messages = [\n",
" HumanMessage(\n",
" content=\"Please tell me about a person using the following JSON schema:\"\n",
" ),\n",
" HumanMessage(content=json.dumps(json_schema, indent=2)),\n",
" HumanMessage(\n",
" content=\"Now, considering the schema, tell me about a person named John who is 35 years old and loves pizza.\"\n",
" ),\n",
"]\n",
"\n",
"chat_model_response = chat_model(messages)"
]
},
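{
"cell_type": "markdown",
"metadata": {},
"source": [
"If the model returns pure JSON (which the `format=\"json\"` flag is meant to enforce), you can parse the response content directly with the standard library. A minimal sketch:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import json\n",
"\n",
"# Raises ValueError if the content contains anything besides valid JSON\n",
"parsed = json.loads(chat_model_response.content)\n",
"parsed"
]
},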
{
"cell_type": "markdown",
"metadata": {},
@@ -375,5 +528,5 @@
}
},
"nbformat": 4,
"nbformat_minor": 2
"nbformat_minor": 4
}


@@ -0,0 +1,171 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Ollama Functions\n",
"\n",
"This notebook shows how to use an experimental wrapper around Ollama that gives it the same API as OpenAI Functions.\n",
"\n",
"Note that more powerful and capable models will perform better with complex schema and/or multiple functions. The examples below use Mistral.\n",
"For a complete list of supported models and model variants, see the [Ollama model library](https://ollama.ai/library).\n",
"\n",
"## Setup\n",
"\n",
"Follow [these instructions](https://github.com/jmorganca/ollama) to set up and run a local Ollama instance.\n",
"\n",
"## Usage\n",
"\n",
"You can initialize OllamaFunctions in a similar way to how you'd initialize a standard ChatOllama instance:"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
"from langchain_experimental.llms.ollama_functions import OllamaFunctions\n",
"\n",
"model = OllamaFunctions(model=\"mistral\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"You can then bind functions defined with JSON Schema parameters and a `function_call` parameter to force the model to call the given function:"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"model = model.bind(\n",
" functions=[\n",
" {\n",
" \"name\": \"get_current_weather\",\n",
" \"description\": \"Get the current weather in a given location\",\n",
" \"parameters\": {\n",
" \"type\": \"object\",\n",
" \"properties\": {\n",
" \"location\": {\n",
" \"type\": \"string\",\n",
" \"description\": \"The city and state, \" \"e.g. San Francisco, CA\",\n",
" },\n",
" \"unit\": {\n",
" \"type\": \"string\",\n",
" \"enum\": [\"celsius\", \"fahrenheit\"],\n",
" },\n",
" },\n",
" \"required\": [\"location\"],\n",
" },\n",
" }\n",
" ],\n",
" function_call={\"name\": \"get_current_weather\"},\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Calling a function with this model then results in JSON output matching the provided schema:"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"AIMessage(content='', additional_kwargs={'function_call': {'name': 'get_current_weather', 'arguments': '{\"location\": \"Boston, MA\", \"unit\": \"celsius\"}'}})"
]
},
"execution_count": 3,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"from langchain.schema import HumanMessage\n",
"\n",
"model.invoke(\"what is the weather in Boston?\")"
]
},
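{
"cell_type": "markdown",
"metadata": {},
"source": [
"The function arguments come back as a JSON-encoded string in `additional_kwargs`, so they can be parsed with the standard library. A minimal sketch, assuming the call above succeeded:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import json\n",
"\n",
"response = model.invoke(\"what is the weather in Boston?\")\n",
"\n",
"# `arguments` is a JSON-encoded string\n",
"args = json.loads(response.additional_kwargs[\"function_call\"][\"arguments\"])\n",
"args[\"location\"]"
]
},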
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Using for extraction\n",
"\n",
"One useful thing you can do with function calling here is extracting properties from a given input in a structured format:"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"[{'name': 'Alex', 'height': 5, 'hair_color': 'blonde'},\n",
" {'name': 'Claudia', 'height': 6, 'hair_color': 'brunette'}]"
]
},
"execution_count": 4,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"from langchain.chains import create_extraction_chain\n",
"\n",
"# Schema\n",
"schema = {\n",
" \"properties\": {\n",
" \"name\": {\"type\": \"string\"},\n",
" \"height\": {\"type\": \"integer\"},\n",
" \"hair_color\": {\"type\": \"string\"},\n",
" },\n",
" \"required\": [\"name\", \"height\"],\n",
"}\n",
"\n",
"# Input\n",
"input = \"\"\"Alex is 5 feet tall. Claudia is 1 feet taller than Alex and jumps higher than him. Claudia is a brunette and Alex is blonde.\"\"\"\n",
"\n",
"# Run chain\n",
"llm = OllamaFunctions(model=\"mistral\", temperature=0)\n",
"chain = create_extraction_chain(schema, llm)\n",
"chain.run(input)"
]
}
],
"metadata": {
"kernelspec": {
"display_name": ".venv",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.5"
}
},
"nbformat": 4,
"nbformat_minor": 2
}