bagatur comments

This commit is contained in:
isaac hershenson
2024-06-14 12:37:00 -07:00
parent 5dda0812b1
commit 305889b860
2 changed files with 38 additions and 9 deletions


@@ -98,7 +98,7 @@
},
{
"cell_type": "code",
"execution_count": 2,
"execution_count": 5,
"metadata": {},
"outputs": [],
"source": [
@@ -153,7 +153,7 @@
"# | output: false\n",
"# | echo: false\n",
"\n",
"%pip install -qU langchain langchain_openai\n",
"%pip install -qU langchain langchain_openai openai\n",
"\n",
"import os\n",
"from getpass import getpass\n",
@@ -167,7 +167,7 @@
},
{
"cell_type": "code",
"execution_count": 5,
"execution_count": 6,
"metadata": {},
"outputs": [],
"source": [
@@ -178,6 +178,8 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"### Force model to select specific tools\n",
"\n",
"We can also use the `tool_choice` parameter to ensure certain behavior. For example, we can force the model to call the multiply tool with the following code:"
]
},
@@ -244,6 +246,38 @@
"As we can see, even though the prompt didn't really suggest a tool call, our LLM made one since it was forced to do so. You can look at the docs for [`bind_tools`](https://api.python.langchain.com/en/latest/chat_models/langchain_openai.chat_models.base.BaseChatOpenAI.html#langchain_openai.chat_models.base.BaseChatOpenAI.bind_tools) to learn about all the ways to customize how your LLM selects tools."
]
},
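To illustrate what forcing a tool does at the wire level, here is a minimal stdlib sketch of the request fields that a forced `tool_choice` produces, assuming OpenAI's chat-completions tool format (the `force_tool_payload` helper and the `multiply` schema below are hypothetical, not LangChain code):

```python
# Hypothetical helper: builds the OpenAI-style request fields that
# forcing a specific tool with tool_choice produces. Pure-dict sketch
# of the assumed chat-completions wire format, not LangChain internals.
def force_tool_payload(tools: list[dict], tool_name: str) -> dict:
    names = [t["function"]["name"] for t in tools]
    if tool_name not in names:
        raise ValueError(f"unknown tool: {tool_name}")
    return {
        "tools": tools,
        # Forces the model to call exactly this function.
        "tool_choice": {"type": "function", "function": {"name": tool_name}},
    }

# Example tool schema in the OpenAI function format.
multiply = {
    "type": "function",
    "function": {
        "name": "multiply",
        "description": "Multiply two integers.",
        "parameters": {
            "type": "object",
            "properties": {"a": {"type": "integer"}, "b": {"type": "integer"}},
            "required": ["a", "b"],
        },
    },
}

payload = force_tool_payload([multiply], "multiply")
```

`bind_tools(tools, tool_choice="multiply")` is what actually sets these fields for you; the sketch only shows the shape of the result.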
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Disable parallel tool calling (OpenAI only)\n",
"\n",
"By default, OpenAI may make multiple tool calls (if selected) in parallel. For instance, if you ask for the weather in 3 cities and there is a ``get_weather`` tool, the model will call it 3 times in a single response. To ensure that at most one tool call is made, you can set the ``parallel_tool_calls`` parameter to ``False``."
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"{'tool_calls': [{'id': 'call_6blH9pWNr9PfZEcbwBGzRI1M',\n",
" 'function': {'arguments': '{\"a\":2,\"b\":2}', 'name': 'Add'},\n",
" 'type': 'function'}]}"
]
},
"execution_count": 7,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"llm_with_tools = llm.bind_tools(tools, parallel_tool_calls=False)\n",
"llm_with_tools.invoke(\"Please call the first tool two times\").additional_kwargs"
]
},
{
"cell_type": "markdown",
"metadata": {},


@@ -817,7 +817,6 @@ class BaseChatOpenAI(BaseChatModel):
tool_choice: Optional[
Union[dict, str, Literal["auto", "none", "required", "any"], bool]
] = None,
parallel_tool_calls: Optional[bool] = None,
**kwargs: Any,
) -> Runnable[LanguageModelInput, BaseMessage]:
"""Bind tool-like objects to this chat model.
@@ -839,15 +838,11 @@ class BaseChatOpenAI(BaseChatModel):
False: no effect;
or a dict of the form:
{"type": "function", "function": {"name": <<tool_name>>}}.
parallel_tool_calls: Whether to run tool calls in parallel.
    Can either be True or False.
**kwargs: Any additional parameters to pass to the
:class:`~langchain.runnable.Runnable` constructor.
"""
formatted_tools = [convert_to_openai_tool(tool) for tool in tools]
if parallel_tool_calls:
kwargs["parallel_tool_calls"] = parallel_tool_calls
if tool_choice:
if isinstance(tool_choice, str):
# tool_choice is a tool/function name
@@ -1084,7 +1079,7 @@ class BaseChatOpenAI(BaseChatModel):
"schema must be specified when method is 'function_calling'. "
"Received None."
)
llm = self.bind_tools([schema], tool_choice=True)
llm = self.bind_tools([schema], tool_choice=True, parallel_tool_calls=False)
if is_pydantic_schema:
output_parser: OutputParserLike = PydanticToolsParser(
tools=[schema], first_tool_only=True
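The hunk above wires structured output through a forced, single tool call whose arguments are then parsed. A rough stdlib sketch of that final parsing step, assuming the OpenAI response shape (`parse_first_tool_call` is a hypothetical stand-in for what `PydanticToolsParser(..., first_tool_only=True)` does, not its implementation):

```python
import json

def parse_first_tool_call(message: dict) -> dict:
    # Rough analogue of first_tool_only=True: take the first tool call
    # and decode its JSON arguments. Assumes the OpenAI chat-completions
    # response shape; not LangChain's actual parser.
    calls = message.get("tool_calls") or []
    if not calls:
        raise ValueError("model returned no tool call")
    return json.loads(calls[0]["function"]["arguments"])

# A response message with one forced tool call.
message = {
    "tool_calls": [
        {"id": "call_1", "type": "function",
         "function": {"name": "Person",
                      "arguments": '{"name": "Ada", "age": 36}'}},
    ]
}
args = parse_first_tool_call(message)
```

Setting `parallel_tool_calls=False` in `with_structured_output`'s `bind_tools` call, as this commit does, keeps the response down to the single tool call this kind of parser expects.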