Compare commits


3 Commits

Author             SHA1         Message           Date
isaac hershenson   305889b860   bagatur comments   2024-06-14 12:37:00 -07:00
isaac hershenson   5dda0812b1   fmt                2024-06-14 11:10:35 -07:00
isaac hershenson   86333dab0f   first draft        2024-06-14 11:02:57 -07:00
2 changed files with 41 additions and 8 deletions

File 1 of 2 (Jupyter notebook):

@@ -98,7 +98,7 @@
},
{
"cell_type": "code",
"execution_count": 2,
"execution_count": 5,
"metadata": {},
"outputs": [],
"source": [
@@ -153,7 +153,7 @@
"# | output: false\n",
"# | echo: false\n",
"\n",
"%pip install -qU langchain langchain_openai\n",
"%pip install -qU langchain langchain_openai openai\n",
"\n",
"import os\n",
"from getpass import getpass\n",
@@ -167,7 +167,7 @@
},
{
"cell_type": "code",
"execution_count": 5,
"execution_count": 6,
"metadata": {},
"outputs": [],
"source": [
@@ -178,6 +178,8 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"### Force model to select specific tools\n",
"\n",
"We can also use the `tool_choice` parameter to ensure certain behavior. For example, we can force our tool to call the multiply tool by using the following code:"
]
},
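The forcing behavior described above can be sketched without an API call. A minimal pure-Python sketch, assuming the `tool_choice` shapes the OpenAI chat completions API accepts; `normalize_tool_choice` is a hypothetical helper, not part of langchain:

```python
def normalize_tool_choice(choice):
    """Hypothetical helper: map the shorthand forms that bind_tools accepts
    onto the shapes the OpenAI chat completions API expects."""
    if choice in ("any", "required"):
        # force at least one tool to be called
        return "required"
    if isinstance(choice, str):
        # a bare tool name forces that specific tool
        return {"type": "function", "function": {"name": choice}}
    # dicts (and booleans) pass through unchanged in this sketch
    return choice


print(normalize_tool_choice("Multiply"))
# {'type': 'function', 'function': {'name': 'Multiply'}}
```
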
@@ -244,6 +246,38 @@
"As we can see, even though the prompt didn't really suggest a tool call, our LLM made one since it was forced to do so. You can look at the docs for [`bind_tool`](https://api.python.langchain.com/en/latest/chat_models/langchain_openai.chat_models.base.BaseChatOpenAI.html#langchain_openai.chat_models.base.BaseChatOpenAI.bind_tools) to learn about all the ways to customize how your LLM selects tools."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Disable parallel tool calling (OpenAI only)\n",
"\n",
"By default, OpenAI will run multiple tools (if selected) in parallel. For instance, if you ask it the weather of 3 cities and there is a ``get_weather`` function it will call the function 3 times. To ensure that only a single tool call is made, you can set the ``parallel_tool_calls`` parameter to False."
]
},
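Under the hood this boils down to one extra field in the request body. A sketch of the payload, assuming the chat completions field names; `build_request` is an illustrative helper, not langchain code:

```python
def build_request(messages, tools, parallel_tool_calls=True):
    """Illustrative sketch of the request body sent to the OpenAI
    chat completions API when tool calling is configured."""
    body = {"model": "gpt-4o", "messages": messages, "tools": tools}
    if not parallel_tool_calls:
        # supported by openai >= 1.32; caps the response at one tool call
        body["parallel_tool_calls"] = False
    return body


tools = [{"type": "function",
          "function": {"name": "get_weather",
                       "parameters": {"type": "object",
                                      "properties": {"city": {"type": "string"}}}}}]
body = build_request(
    [{"role": "user", "content": "What is the weather in LA, NY and SF?"}],
    tools,
    parallel_tool_calls=False,
)
print(body["parallel_tool_calls"])  # False
```
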
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"{'tool_calls': [{'id': 'call_6blH9pWNr9PfZEcbwBGzRI1M',\n",
" 'function': {'arguments': '{\"a\":2,\"b\":2}', 'name': 'Add'},\n",
" 'type': 'function'}]}"
]
},
"execution_count": 7,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"llm_with_tools = llm.bind_tools(tools, parallel_tool_calls=False)\n",
"llm_with_tools.invoke(\"Please call the first tool two times\").additional_kwargs"
]
},
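The `additional_kwargs` payload shown in that cell output can be unpacked with plain Python. A sketch assuming the raw OpenAI tool-call shape; `parse_tool_calls` is a hypothetical helper, not a langchain API:

```python
import json


def parse_tool_calls(additional_kwargs):
    """Extract (name, args) pairs from a raw OpenAI tool_calls payload
    like the one in the cell output above (hypothetical helper)."""
    calls = []
    for tc in additional_kwargs.get("tool_calls", []):
        fn = tc["function"]
        # arguments arrive as a JSON string and must be decoded
        calls.append((fn["name"], json.loads(fn["arguments"])))
    return calls


payload = {"tool_calls": [{"id": "call_6blH9pWNr9PfZEcbwBGzRI1M",
                           "function": {"arguments": '{"a":2,"b":2}', "name": "Add"},
                           "type": "function"}]}
print(parse_tool_calls(payload))  # [('Add', {'a': 2, 'b': 2})]
```
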
{
"cell_type": "markdown",
"metadata": {},

File 2 of 2 (Python source):

@@ -836,7 +836,6 @@ class BaseChatOpenAI(BaseChatModel):
"any" or "required": force at least one tool to be called;
True: forces tool call (requires `tools` be length 1);
False: no effect;
or a dict of the form:
{"type": "function", "function": {"name": <<tool_name>>}}.
**kwargs: Any additional parameters to pass to the
@@ -1080,7 +1079,7 @@ class BaseChatOpenAI(BaseChatModel):
"schema must be specified when method is 'function_calling'. "
"Received None."
)
- llm = self.bind_tools([schema], tool_choice=True)
+ llm = self.bind_tools([schema], tool_choice=True, parallel_tool_calls=False)
if is_pydantic_schema:
output_parser: OutputParserLike = PydanticToolsParser(
tools=[schema], first_tool_only=True
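The `first_tool_only=True` flag above is why a single forced tool call suffices for structured output. A minimal pure-Python sketch of that behavior; `parse_tool_args` is illustrative, not langchain's `PydanticToolsParser`:

```python
def parse_tool_args(tool_calls, first_tool_only=False):
    """Collect tool-call argument dicts; with first_tool_only, return
    just the first one (or None), mirroring the parser flag above."""
    parsed = [tc["args"] for tc in tool_calls]
    if first_tool_only:
        return parsed[0] if parsed else None
    return parsed


calls = [{"name": "Schema", "args": {"x": 1}},
         {"name": "Schema", "args": {"x": 2}}]
print(parse_tool_args(calls, first_tool_only=True))  # {'x': 1}
```

With `parallel_tool_calls=False`, the model emits at most one call, so keeping only the first loses nothing.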
@@ -1283,13 +1282,13 @@ class ChatOpenAI(BaseChatOpenAI):
Note that ``openai >= 1.32`` supports a ``parallel_tool_calls`` parameter
that defaults to ``True``. This parameter can be set to ``False`` to
- disable parallel tool calls:
+ disable parallel tool calls. You can pass this argument into .bind_tools:
.. code-block:: python
+ llm_with_tools = llm.bind_tools([GetWeather, GetPopulation],parallel_tool_calls=False)
ai_msg = llm_with_tools.invoke(
"What is the weather in LA and NY?",
parallel_tool_calls=False,
"What is the weather in LA and NY?"
)
ai_msg.tool_calls