cleanup warnings (#8379)

Bagatur
2023-07-27 13:43:05 -07:00
committed by GitHub
parent 41524304bf
commit 55beab326c
10 changed files with 10 additions and 10 deletions


@@ -1,7 +1,7 @@
### Use Case
In this tutorial, we'll configure few-shot examples for self-ask with search.
<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! Instead, edit the notebook w/the location & name as this file. -->
## Using an example set
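
To give a sense of what that configuration looks like, here is a minimal sketch of an example set wired into a `FewShotPromptTemplate` (the example data and variable names are illustrative, not taken from the notebook):

```python
from langchain.prompts import FewShotPromptTemplate, PromptTemplate

# Illustrative self-ask-with-search-style example; the notebook defines its own set.
examples = [
    {
        "question": "Who lived longer, Muhammad Ali or Alan Turing?",
        "answer": "Are follow up questions needed here: Yes. ... So the final answer is: Muhammad Ali",
    },
]

# How each individual example is rendered into the prompt.
example_prompt = PromptTemplate(
    input_variables=["question", "answer"],
    template="Question: {question}\n{answer}",
)

# The few-shot template stitches the examples ahead of the live question.
prompt = FewShotPromptTemplate(
    examples=examples,
    example_prompt=example_prompt,
    suffix="Question: {input}",
    input_variables=["input"],
)

print(prompt.format(input="Who was the father of Mary Ball Washington?"))
```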


@@ -77,7 +77,7 @@ For example, in OpenAI [Chat Completion API](https://platform.openai.com/docs/gu
LangChain provides several prompt templates to make constructing and working with prompts easy. You are encouraged to use these chat-related prompt templates instead of `PromptTemplate` when querying chat models, to fully exploit the potential of the underlying chat model.
<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! Instead, edit the notebook w/the location & name as this file. -->
```python
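# Not the notebook's own code — a minimal sketch of the chat-style prompt
# templates described above, using the LangChain API of this era.
from langchain.prompts import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    SystemMessagePromptTemplate,
)

system_prompt = SystemMessagePromptTemplate.from_template(
    "You are a helpful assistant that translates {input_language} to {output_language}."
)
human_prompt = HumanMessagePromptTemplate.from_template("{text}")

chat_prompt = ChatPromptTemplate.from_messages([system_prompt, human_prompt])

# format_messages returns a list of chat messages ready to send to a chat model.
messages = chat_prompt.format_messages(
    input_language="English", output_language="French", text="I love programming."
)
```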


@@ -2,7 +2,7 @@
One common use case for wanting to partial a prompt template is when you get some of the variables before others. For example, suppose you have a prompt template that requires two variables, `foo` and `baz`. If you get the `foo` value early on in the chain, but the `baz` value only later, it can be annoying to wait until you have both variables in the same place to pass them to the prompt template. Instead, you can partial the prompt template with the `foo` value, pass the partialed prompt template along, and just use that. Below is an example of doing this:
<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! Instead, edit the notebook w/the location & name as this file. -->
```python
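# Not the notebook's own code — a minimal sketch of partialing a prompt
# template with `foo` first and supplying `baz` later, as described above.
from langchain.prompts import PromptTemplate

prompt = PromptTemplate(template="{foo}{baz}", input_variables=["foo", "baz"])

# Bind the value we already have...
partial_prompt = prompt.partial(foo="foo")

# ...and format with the remaining variable once it arrives.
print(partial_prompt.format(baz="baz"))  # -> "foobaz"
```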