mirror of
https://github.com/hwchase17/langchain.git
synced 2025-09-07 05:52:15 +00:00
docs: misc fixes (#9671)

Improve internal consistency in LangChain documentation:

- Change occurrences of "eg" and "eg." to "e.g."
- Fix headers containing unnecessary capital letters.
- Change instances of "few shot" to "few-shot".
- Add periods to end of sentences where missing.
- Minor spelling and grammar fixes.
This commit is contained in:
@@ -130,10 +130,10 @@ chain.run(number=2, callbacks=[handler])
The `callbacks` argument is available on most objects throughout the API (Chains, Models, Tools, Agents, etc.) in two different places:
-- **Constructor callbacks**: defined in the constructor, eg. `LLMChain(callbacks=[handler], tags=['a-tag'])`, which will be used for all calls made on that object, and will be scoped to that object only, eg. if you pass a handler to the `LLMChain` constructor, it will not be used by the Model attached to that chain.
-- **Request callbacks**: defined in the `run()`/`apply()` methods used for issuing a request, eg. `chain.run(input, callbacks=[handler])`, which will be used for that specific request only, and all sub-requests that it contains (eg. a call to an LLMChain triggers a call to a Model, which uses the same handler passed in the `call()` method).
+- **Constructor callbacks**: defined in the constructor, e.g. `LLMChain(callbacks=[handler], tags=['a-tag'])`, which will be used for all calls made on that object, and will be scoped to that object only, e.g. if you pass a handler to the `LLMChain` constructor, it will not be used by the Model attached to that chain.
+- **Request callbacks**: defined in the `run()`/`apply()` methods used for issuing a request, e.g. `chain.run(input, callbacks=[handler])`, which will be used for that specific request only, and all sub-requests that it contains (e.g. a call to an LLMChain triggers a call to a Model, which uses the same handler passed in the `call()` method).

-The `verbose` argument is available on most objects throughout the API (Chains, Models, Tools, Agents, etc.) as a constructor argument, eg. `LLMChain(verbose=True)`, and it is equivalent to passing a `ConsoleCallbackHandler` to the `callbacks` argument of that object and all child objects. This is useful for debugging, as it will log all events to the console.
+The `verbose` argument is available on most objects throughout the API (Chains, Models, Tools, Agents, etc.) as a constructor argument, e.g. `LLMChain(verbose=True)`, and it is equivalent to passing a `ConsoleCallbackHandler` to the `callbacks` argument of that object and all child objects. This is useful for debugging, as it will log all events to the console.
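The two callback scopes described above can be sketched in plain Python. This is a toy stand-in, not LangChain's actual `Chain` implementation; the class and handler wiring here are hypothetical, purely to illustrate the scoping difference.

```python
# Toy sketch of constructor vs. request callback scoping.
# Not LangChain's implementation; names are illustrative only.
class Chain:
    def __init__(self, callbacks=None, verbose=False):
        # Constructor callbacks: used for every call on this object.
        self.callbacks = list(callbacks or [])
        if verbose:
            # verbose=True behaves like adding a console handler.
            self.callbacks.append(print)

    def run(self, value, callbacks=None):
        # Request callbacks: apply to this single request only.
        for handler in self.callbacks + list(callbacks or []):
            handler(f"chain_start: {value}")
        return value * 2

events = []
constructor_chain = Chain(callbacks=[events.append])  # scoped to the object
constructor_chain.run(2)                              # handler fires
plain_chain = Chain()
plain_chain.run(3, callbacks=[events.append])         # scoped to this call only
plain_chain.run(4)                                    # no handler fires
```

Note that the handler passed at request time in `plain_chain.run(3, ...)` is gone again by the next call, while the constructor handler sticks to its object.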
### When do you want to use each of these?
@@ -628,7 +628,7 @@ local_chain("How many customers are there?")
</CodeOutputBlock>
-Even this relatively large model will most likely fail to generate more complicated SQL by itself. However, you can log its inputs and outputs so that you can hand-correct them and use the corrected examples for few shot prompt examples later. In practice, you could log any executions of your chain that raise exceptions (as shown in the example below) or get direct user feedback in cases where the results are incorrect (but did not raise an exception).
+Even this relatively large model will most likely fail to generate more complicated SQL by itself. However, you can log its inputs and outputs so that you can hand-correct them and use the corrected examples for few-shot prompt examples later. In practice, you could log any executions of your chain that raise exceptions (as shown in the example below) or get direct user feedback in cases where the results are incorrect (but did not raise an exception).
```bash
@@ -878,7 +878,7 @@ YAML_EXAMPLES = """
"""
```
-Now that you have some examples (with manually corrected output SQL), you can do few shot prompt seeding the usual way:
+Now that you have some examples (with manually corrected output SQL), you can do few-shot prompt seeding the usual way:
```python
@@ -925,7 +925,7 @@ few_shot_prompt = FewShotPromptTemplate(
</CodeOutputBlock>
-The model should do better now with this few shot prompt, especially for inputs similar to the examples you have seeded it with.
+The model should do better now with this few-shot prompt, especially for inputs similar to the examples you have seeded it with.
```python
@@ -4,7 +4,7 @@ In addition to controlling which characters you can split on, you can also contr
- `length_function`: how the length of chunks is calculated. Defaults to just counting number of characters, but it's pretty common to pass a token counter here.
- `chunk_size`: the maximum size of your chunks (as measured by the length function).
-- `chunk_overlap`: the maximum overlap between chunks. It can be nice to have some overlap to maintain some continuity between chunks (eg do a sliding window).
+- `chunk_overlap`: the maximum overlap between chunks. It can be nice to have some overlap to maintain some continuity between chunks (e.g. do a sliding window).
- `add_start_index`: whether to include the starting position of each chunk within the original document in the metadata.
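The sliding-window interaction of these parameters can be sketched in plain Python. This is a toy illustration of the idea, not LangChain's splitter; `split_text` is a hypothetical helper, and for simplicity it measures raw characters.

```python
# Toy sketch of chunk_size / chunk_overlap as a sliding window.
# Not LangChain's implementation; illustrative only.
def split_text(text, chunk_size, chunk_overlap, add_start_index=False):
    # Each chunk starts chunk_size - chunk_overlap characters after the
    # previous one, so consecutive chunks share chunk_overlap characters.
    step = chunk_size - chunk_overlap
    chunks = []
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if add_start_index:
            # Record where the chunk begins in the original document.
            chunks.append({"text": chunk, "start_index": start})
        else:
            chunks.append(chunk)
        if start + chunk_size >= len(text):
            break
    return chunks

print(split_text("abcdefghij", chunk_size=4, chunk_overlap=2))
# -> ['abcd', 'cdef', 'efgh', 'ghij']
```

With `chunk_size=4` and `chunk_overlap=2`, each chunk repeats the last two characters of its predecessor, which is the continuity the bullet above describes.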
@@ -34,7 +34,7 @@ chat(chat_prompt.format_prompt(input_language="English", output_language="French
</CodeOutputBlock>
-If you wanted to construct the MessagePromptTemplate more directly, you could create a PromptTemplate outside and then pass it in, eg:
+If you wanted to construct the MessagePromptTemplate more directly, you could create a PromptTemplate outside and then pass it in, e.g.:
```python
@@ -1,13 +1,13 @@
### Use Case
-In this tutorial, we'll configure few shot examples for self-ask with search.
+In this tutorial, we'll configure few-shot examples for self-ask with search.
## Using an example set
### Create the example set
-To get started, create a list of few shot examples. Each example should be a dictionary with the keys being the input variables and the values being the values for those input variables.
+To get started, create a list of few-shot examples. Each example should be a dictionary with the keys being the input variables and the values being the values for those input variables.
```python
from langchain.prompts.few_shot import FewShotPromptTemplate
@@ -69,9 +69,9 @@ So the final answer is: No
]
```
-### Create a formatter for the few shot examples
+### Create a formatter for the few-shot examples

-Configure a formatter that will format the few shot examples into a string. This formatter should be a `PromptTemplate` object.
+Configure a formatter that will format the few-shot examples into a string. This formatter should be a `PromptTemplate` object.
```python
@@ -98,7 +98,7 @@ print(example_prompt.format(**examples[0]))
### Feed examples and formatter to `FewShotPromptTemplate`
-Finally, create a `FewShotPromptTemplate` object. This object takes in the few shot examples and the formatter for the few shot examples.
+Finally, create a `FewShotPromptTemplate` object. This object takes in the few-shot examples and the formatter for the few-shot examples.
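The assembly the few-shot template performs can be sketched in plain Python: format each example dict with the example template, join the results, and append the new input via a suffix. The function and variable names here are illustrative stand-ins, not LangChain's API.

```python
# Toy sketch of few-shot prompt assembly; illustrative only.
example_template = "Question: {question}\n{answer}"
suffix = "Question: {input}"

def format_few_shot(examples, user_input):
    # Format each example, then append the new question at the end.
    parts = [example_template.format(**example) for example in examples]
    parts.append(suffix.format(input=user_input))
    return "\n\n".join(parts)

examples = [
    {"question": "Who lived longer, Muhammad Ali or Alan Turing?",
     "answer": "Muhammad Ali"},
]
print(format_few_shot(examples, "Who was the father of Mary Ball Washington?"))
```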
```python
@@ -171,7 +171,7 @@ print(prompt.format(input="Who was the father of Mary Ball Washington?"))
We will reuse the example set and the formatter from the previous section. However, instead of feeding the examples directly into the `FewShotPromptTemplate` object, we will feed them into an `ExampleSelector` object.
-In this tutorial, we will use the `SemanticSimilarityExampleSelector` class. This class selects few shot examples based on their similarity to the input. It uses an embedding model to compute the similarity between the input and the few shot examples, as well as a vector store to perform the nearest neighbor search.
+In this tutorial, we will use the `SemanticSimilarityExampleSelector` class. This class selects few-shot examples based on their similarity to the input. It uses an embedding model to compute the similarity between the input and the few-shot examples, as well as a vector store to perform the nearest neighbor search.
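The selection idea can be sketched in plain Python with a toy embedding and cosine similarity. This is not `SemanticSimilarityExampleSelector` itself; real usage needs an embedding model and a vector store, and `embed` here is a deliberately crude character-frequency stand-in.

```python
# Toy sketch of similarity-based example selection; illustrative only.
import math

def embed(text):
    # Hypothetical stand-in embedding: character-frequency vector.
    vec = [0.0] * 26
    for ch in text.lower():
        if ch.isalpha():
            vec[ord(ch) - ord("a")] += 1.0
    return vec

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def select_examples(examples, query, k=1):
    # Rank stored examples by similarity to the query; keep the top k.
    query_vec = embed(query)
    ranked = sorted(examples,
                    key=lambda ex: cosine(embed(ex["question"]), query_vec),
                    reverse=True)
    return ranked[:k]

examples = [
    {"question": "Who was the father of Mary Ball Washington?"},
    {"question": "How do I bake bread?"},
]
print(select_examples(examples, "Who was the mother of George Washington?"))
```

A nearest-neighbor query against a vector store plays the role of the `sorted(...)` call here, avoiding a full scan over all examples.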
```python
@@ -224,7 +224,7 @@ for example in selected_examples:
### Feed example selector into `FewShotPromptTemplate`
-Finally, create a `FewShotPromptTemplate` object. This object takes in the example selector and the formatter for the few shot examples.
+Finally, create a `FewShotPromptTemplate` object. This object takes in the example selector and the formatter for the few-shot examples.
```python
@@ -1,4 +1,4 @@
-## Partial With Strings
+## Partial with strings
One common use case for wanting to partial a prompt template is if you get some of the variables before others. For example, suppose you have a prompt template that requires two variables, `foo` and `baz`. If you get the `foo` value early on in the chain, but the `baz` value later, it can be annoying to wait until you have both variables in the same place to pass them to the prompt template. Instead, you can partial the prompt template with the `foo` value, and then pass the partialed prompt template along and just use that. Below is an example of doing this:
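The idea can be sketched with plain Python, using `functools.partial` as a stand-in for the prompt template's partialing (the variable names follow the paragraph above; this is not LangChain's API):

```python
# Toy sketch of partialing a prompt with a string value; illustrative only.
from functools import partial

template = "{foo}{baz}"

# Early in the chain we already know foo...
partial_prompt = partial(template.format, foo="foo")

# ...and later, once baz is available, we finish formatting.
result = partial_prompt(baz="baz")
print(result)  # -> foobaz
```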
@@ -40,7 +40,7 @@ print(prompt.format(bar="baz"))
</CodeOutputBlock>
-## Partial With Functions
+## Partial with functions
The other common use is to partial with a function. The use case for this is when you have a variable you know that you always want to fetch in a common way. A prime example of this is with date or time. Imagine you have a prompt which you always want to have the current date. You can't hard code it in the prompt, and passing it along with the other input variables is a bit annoying. In this case, it's very handy to be able to partial the prompt with a function that always returns the current date.
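The date case can be sketched in plain Python: a function supplies the variable at format time, so callers never pass it explicitly. The helper names here are hypothetical, not LangChain's API.

```python
# Toy sketch of partialing a prompt with a function; illustrative only.
from datetime import datetime

def _get_today():
    # Hypothetical helper: the one common way we fetch the date.
    return datetime.now().strftime("%Y-%m-%d")

template = "Tell me a joke about {topic}. Today's date is {date}."

def format_with_date(topic, get_date=_get_today):
    # The date variable is filled in by calling the function at format
    # time, so it is never hard-coded and never passed by the caller.
    return template.format(topic=topic, date=get_date())

print(format_with_date("bears"))
```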