Update examples to fix execution problems (#685)

On the [Getting Started
page](https://langchain.readthedocs.io/en/latest/modules/prompts/getting_started.html)
for prompt templates, I believe the very last example

```python
print(dynamic_prompt.format(adjective=long_string))
```

should actually be

```python
print(dynamic_prompt.format(input=long_string))
```

The existing example raises `KeyError: 'input'`, as one would expect, since the template's variable is named `input` rather than `adjective`.
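
The mismatch is easy to reproduce in isolation. The sketch below is not the docs' exact `dynamic_prompt` (which is a `FewShotPromptTemplate`); it is a minimal `PromptTemplate` whose only variable is named `input`, which is enough to show why the keyword passed to `format()` has to match:

```python
from langchain.prompts import PromptTemplate

# Minimal stand-in for the docs' dynamic_prompt: the template's only
# variable is called "input", not "adjective".
prompt = PromptTemplate(
    input_variables=["input"],
    template="Give the antonym of every input\n\nInput: {input}\nOutput:",
)

long_string = "big and huge and massive and large"
# prompt.format(adjective=long_string)  # raises KeyError: 'input'
print(prompt.format(input=long_string))  # works: keyword matches the variable name
```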

***

On the [Create a custom prompt
template](https://langchain.readthedocs.io/en/latest/modules/prompts/examples/custom_prompt_template.html#id1)
page, I believe the line

```python
Function Name: {kwargs["function_name"]}
```

should actually be

```python
Function Name: {kwargs["function_name"].__name__}
```

The existing example produces the prompt:

```
        Given the function name and source code, generate an English language explanation of the function.
        Function Name: <function get_source_code at 0x7f907bc0e0e0>
        Source Code:
        def get_source_code(function_name):
    # Get the source code of the function
    return inspect.getsource(function_name)

        Explanation:
```
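
The `<function get_source_code at 0x...>` in that output is simply how an f-string renders a function object; interpolating `.__name__` gives the readable name instead. A standalone, LangChain-free illustration:

```python
import inspect

def get_source_code(function_name):
    # Get the source code of the function
    return inspect.getsource(function_name)

# Interpolating the function object itself renders its repr:
print(f'Function Name: {get_source_code}')
# -> Function Name: <function get_source_code at 0x...>

# Interpolating __name__ renders the name the prompt actually wants:
print(f'Function Name: {get_source_code.__name__}')
# -> Function Name: get_source_code
```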

***

On the [Example
Selectors](https://langchain.readthedocs.io/en/latest/modules/prompts/examples/example_selectors.html)
page, the first example does not define `example_prompt`, which is also
subtly different from the example prompts used on earlier pages. For user
convenience, I suggest including

```python
example_prompt = PromptTemplate(
    input_variables=["input", "output"],
    template="Input: {input}\nOutput: {output}",
)
```

in the code block so readers can copy-paste it directly.
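
For context, here is roughly how that snippet slots into the rest of the page's first example. This is a sketch rather than the page's exact code: the `examples` list and `max_length` value below are illustrative placeholders.

```python
from langchain.prompts import PromptTemplate
from langchain.prompts.example_selector import LengthBasedExampleSelector

example_prompt = PromptTemplate(
    input_variables=["input", "output"],
    template="Input: {input}\nOutput: {output}",
)

# Illustrative examples; the page defines its own `examples` list.
examples = [
    {"input": "happy", "output": "sad"},
    {"input": "tall", "output": "short"},
]

example_selector = LengthBasedExampleSelector(
    # These are the examples it has available to choose from.
    examples=examples,
    # This is the PromptTemplate used to format each selected example.
    example_prompt=example_prompt,
    # Illustrative cap on the total length of the formatted examples.
    max_length=25,
)

# With example_prompt defined, the selector can be exercised directly:
print(example_selector.select_examples({"input": "big"}))
```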
Amos Ng 2023-01-23 05:49:25 +07:00 committed by GitHub
parent 86dbdb118b
commit 8baf6fb920
3 changed files with 7 additions and 2 deletions

View File

@@ -54,7 +54,7 @@ class FunctionExplainerPromptTemplate(BasePromptTemplate, BaseModel):
         # Generate the prompt to be sent to the language model
         prompt = f"""
         Given the function name and source code, generate an English language explanation of the function.
-        Function Name: {kwargs["function_name"]}
+        Function Name: {kwargs["function_name"].__name__}
         Source Code:
         {source_code}
         Explanation:

View File

@@ -48,6 +48,7 @@
    "metadata": {},
    "outputs": [],
    "source": [
+    "from langchain.prompts import PromptTemplate\n",
     "from langchain.prompts.example_selector import LengthBasedExampleSelector"
    ]
   },
@@ -75,6 +76,10 @@
    "metadata": {},
    "outputs": [],
    "source": [
+    "example_prompt = PromptTemplate(\n",
+    "    input_variables=[\"input\", \"output\"],\n",
+    "    template=\"Input: {input}\\nOutput: {output}\",\n",
+    ")\n",
     "example_selector = LengthBasedExampleSelector(\n",
     "    # These are the examples is has available to choose from.\n",
     "    examples=examples, \n",

View File

@@ -211,7 +211,7 @@ In contrast, if we provide a very long input, the `LengthBasedExampleSelector` w
 ```python
 long_string = "big and huge and massive and large and gigantic and tall and much much much much much bigger than everything else"
-print(dynamic_prompt.format(adjective=long_string))
+print(dynamic_prompt.format(input=long_string))
 # -> Give the antonym of every input
 # -> Word: happy