Update compatibility table for ChatDatabricks (#27676)

`ChatDatabricks` added support for structured output and JSON mode in
the last release. This PR updates the feature table accordingly.

Signed-off-by: B-Step62 <yuki.watanabe@databricks.com>
Yuki Watanabe 2024-10-31 00:56:55 +09:00 committed by GitHub
parent bd5ea18a6c
commit e593e017d2
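
For context, a minimal sketch of the two features the table now marks as supported. This assumes the standard LangChain `with_structured_output` interface; the import path, endpoint name, and `method="json_mode"` argument are assumptions for illustration and may differ by release, so treat this as a sketch rather than the notebook's own code.

```python
# Minimal sketch only: endpoint name and import path are placeholders,
# not taken from the notebook changed in this PR.
from langchain_databricks import ChatDatabricks
from pydantic import BaseModel, Field


class Joke(BaseModel):
    """A joke with a setup and a punchline."""

    setup: str = Field(description="The setup of the joke")
    punchline: str = Field(description="The punchline of the joke")


llm = ChatDatabricks(endpoint="databricks-meta-llama-3-1-70b-instruct")

# Structured output: the response is parsed into a Joke instance.
structured_llm = llm.with_structured_output(Joke)
joke = structured_llm.invoke("Tell me a joke about cats")
print(joke.setup, "/", joke.punchline)

# JSON mode: the model is constrained to emit valid JSON that is parsed
# against the schema (assumes `method="json_mode"` is accepted, as in
# other LangChain chat models).
json_llm = llm.with_structured_output(Joke, method="json_mode")
result = json_llm.invoke(
    "Return a JSON object with keys 'setup' and 'punchline' for a joke about cats"
)
print(result)
```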

@@ -36,7 +36,7 @@
"### Model features\n",
"| [Tool calling](/docs/how_to/tool_calling/) | [Structured output](/docs/how_to/structured_output/) | JSON mode | [Image input](/docs/how_to/multimodal_inputs/) | Audio input | Video input | [Token-level streaming](/docs/how_to/chat_streaming/) | Native async | [Token usage](/docs/how_to/chat_token_usage_tracking/) | [Logprobs](/docs/how_to/logprobs/) |\n",
"| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |\n",
"| ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | ✅ | ✅ | ✅ | ❌ | \n",
"| ✅ | ✅ | ✅ | ❌ | ❌ | ❌ | ✅ | ✅ | ✅ | ❌ | \n",
"\n",
"### Supported Methods\n",
"\n",
@@ -289,6 +289,55 @@
"await asyncio.gather(*futures)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Tool calling\n",
"\n",
"ChatDatabricks supports OpenAI-compatible tool calling API that lets you describe tools and their arguments, and have the model return a JSON object with a tool to invoke and the inputs to that tool. tool-calling is extremely useful for building tool-using chains and agents, and for getting structured outputs from models more generally.\n",
"\n",
"With `ChatDatabricks.bind_tools`, we can easily pass in Pydantic classes, dict schemas, LangChain tools, or even functions as tools to the model. Under the hood these are converted to the OpenAI-compatible tool schemas, which looks like:\n",
"\n",
"```\n",
"{\n",
" \"name\": \"...\",\n",
" \"description\": \"...\",\n",
" \"parameters\": {...} # JSONSchema\n",
"}\n",
"```\n",
"\n",
"and passed in every model invocation."
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
"from pydantic import BaseModel, Field\n",
"\n",
"\n",
"class GetWeather(BaseModel):\n",
" \"\"\"Get the current weather in a given location\"\"\"\n",
"\n",
" location: str = Field(..., description=\"The city and state, e.g. San Francisco, CA\")\n",
"\n",
"\n",
"class GetPopulation(BaseModel):\n",
" \"\"\"Get the current population in a given location\"\"\"\n",
"\n",
" location: str = Field(..., description=\"The city and state, e.g. San Francisco, CA\")\n",
"\n",
"\n",
"llm_with_tools = chat_model.bind_tools([GetWeather, GetPopulation])\n",
"ai_msg = llm_with_tools.invoke(\n",
" \"Which city is hotter today and which is bigger: LA or NY?\"\n",
")\n",
"print(ai_msg.tool_calls)"
]
},
{
"cell_type": "markdown",
"metadata": {},
@@ -466,7 +515,7 @@
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"display_name": "Python 3",
"language": "python",
"name": "python3"
},