docs: update GenAI structured output section to include JSON mode details (#32992)

Mason Daugherty
2025-09-17 13:40:34 -04:00
committed by GitHub
parent 54a9556f5c
commit 8b3f74012c

@@ -31,7 +31,7 @@
"\n", "\n",
"| [Tool calling](/docs/how_to/tool_calling) | [Structured output](/docs/how_to/structured_output/) | JSON mode | [Image input](/docs/how_to/multimodal_inputs/) | Audio input | Video input | [Token-level streaming](/docs/how_to/chat_streaming/) | Native async | [Token usage](/docs/how_to/chat_token_usage_tracking/) | [Logprobs](/docs/how_to/logprobs/) |\n", "| [Tool calling](/docs/how_to/tool_calling) | [Structured output](/docs/how_to/structured_output/) | JSON mode | [Image input](/docs/how_to/multimodal_inputs/) | Audio input | Video input | [Token-level streaming](/docs/how_to/chat_streaming/) | Native async | [Token usage](/docs/how_to/chat_token_usage_tracking/) | [Logprobs](/docs/how_to/logprobs/) |\n",
"| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |\n", "| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |\n",
"| ✅ | ✅ | | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ❌ |\n", "| ✅ | ✅ | | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ❌ |\n",
"\n", "\n",
"### Setup\n", "### Setup\n",
"\n", "\n",
@@ -653,15 +653,35 @@
"\n", "\n",
"# Initialize the model\n", "# Initialize the model\n",
"llm = ChatGoogleGenerativeAI(model=\"gemini-2.0-flash\", temperature=0)\n", "llm = ChatGoogleGenerativeAI(model=\"gemini-2.0-flash\", temperature=0)\n",
"structured_llm = llm.with_structured_output(Person)\n", "\n",
"# Method 1: Default function calling approach\n",
"structured_llm_default = llm.with_structured_output(Person)\n",
"\n",
"# Method 2: Native JSON mode\n",
"structured_llm_json = llm.with_structured_output(Person, method=\"json_mode\")\n",
"\n", "\n",
"# Invoke the model with a query asking for structured information\n", "# Invoke the model with a query asking for structured information\n",
"result = structured_llm.invoke(\n", "result = structured_llm_json.invoke(\n",
" \"Who was the 16th president of the USA, and how tall was he in meters?\"\n", " \"Who was the 16th president of the USA, and how tall was he in meters?\"\n",
")\n", ")\n",
"print(result)" "print(result)"
] ]
}, },
+{
+"cell_type": "markdown",
+"id": "g9w06ld1ggq",
+"metadata": {},
+"source": [
+"### Structured Output Methods\n",
+"\n",
+"Two methods are supported for structured output:\n",
+"\n",
+"- **`method=\"function_calling\"` (default)**: Uses tool calling to extract structured data. Compatible with all Gemini models.\n",
+"- **`method=\"json_mode\"`**: Uses Gemini's native structured output with `responseSchema`. More reliable but requires Gemini 1.5+ models.\n",
+"\n",
+"The `json_mode` method is **recommended for better reliability** as it constrains the model's generation process directly rather than relying on post-processing tool calls."
+]
+},
 {
 "cell_type": "markdown",
 "id": "90d4725e",