docs:misc fixes (#9671)

Improve internal consistency in LangChain documentation
- Change occurrences of eg and eg. to e.g.
- Fix headers containing unnecessary capital letters.
- Change instances of "few shot" to "few-shot".
- Add periods to end of sentences where missing.
- Minor spelling and grammar fixes.
seamusp
2023-08-23 22:36:54 -07:00
committed by GitHub
parent 6283f3b63c
commit 25f2c82ae8
25 changed files with 85 additions and 106 deletions


@@ -107,7 +107,7 @@ import PromptTemplateChatModel from "@snippets/get_started/quickstart/prompt_tem
<PromptTemplateLLM/>
However, the advantages of using these over raw string formatting are several.
You can "partial" out variables - eg you can format only some of the variables at a time.
You can "partial" out variables - e.g. you can format only some of the variables at a time.
You can compose them together, easily combining different templates into a single prompt.
For explanations of these functionalities, see the [section on prompts](/docs/modules/model_io/prompts) for more detail.
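To make the composition advantage above concrete, here is a minimal sketch using the langchain Python package of this period; the translation example is illustrative and not part of this commit:

```python
# Compose two message templates into a single chat prompt.
from langchain.prompts import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    SystemMessagePromptTemplate,
)

system = SystemMessagePromptTemplate.from_template(
    "You are a helpful assistant that translates {input_language} to {output_language}."
)
human = HumanMessagePromptTemplate.from_template("{text}")

# Combine the different templates into a single prompt.
chat_prompt = ChatPromptTemplate.from_messages([system, human])
messages = chat_prompt.format_messages(
    input_language="English", output_language="French", text="I love programming."
)
```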
@@ -121,12 +121,12 @@ Let's take a look at this below:
ChatPromptTemplates can also include other things besides ChatMessageTemplates - see the [section on prompts](/docs/modules/model_io/prompts) for more detail.
-## Output Parsers
+## Output parsers
OutputParsers convert the raw output of an LLM into a format that can be used downstream.
There are a few main types of OutputParsers, including:
-- Convert text from LLM -> structured information (eg JSON)
+- Convert text from LLM -> structured information (e.g. JSON)
- Convert a ChatMessage into just a string
- Convert the extra information returned from a call besides the message (like OpenAI function invocation) into a string.
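As an illustration of the first bullet (text to structured information), one of the parsers built into langchain at this time turns comma-separated LLM text into a Python list:

```python
from langchain.output_parsers import CommaSeparatedListOutputParser

parser = CommaSeparatedListOutputParser()
# Instructions to embed in the prompt so the LLM replies in the right shape.
format_instructions = parser.get_format_instructions()
# Parse the raw LLM text into structured data.
parser.parse("red, green, blue")  # -> ['red', 'green', 'blue']
```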
@@ -149,7 +149,7 @@ import LLMChain from "@snippets/get_started/quickstart/llm_chain.mdx"
<LLMChain/>
-## Next Steps
+## Next steps
This is it!
We've now gone over how to create the core building block of LangChain applications - the LLMChains.
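For readers landing on this hunk, a minimal LLMChain sketch (it assumes an OpenAI key in the OPENAI_API_KEY environment variable; any LLM wrapper would work, and the product example is invented):

```python
from langchain.chains import LLMChain
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate

prompt = PromptTemplate.from_template(
    "What is a good name for a company that makes {product}?"
)
# The chain wires the prompt template and the LLM together.
chain = LLMChain(llm=OpenAI(), prompt=prompt)
print(chain.run(product="colorful socks"))
```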


@@ -1,6 +1,6 @@
# Few-shot prompt templates
-In this tutorial, we'll learn how to create a prompt template that uses few shot examples. A few shot prompt template can be constructed from either a set of examples, or from an Example Selector object.
+In this tutorial, we'll learn how to create a prompt template that uses few-shot examples. A few-shot prompt template can be constructed from either a set of examples, or from an Example Selector object.
import Example from "@snippets/modules/model_io/prompts/prompt_templates/few_shot_examples.mdx"
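A small sketch of the set-of-examples case described above (the antonym examples and variable names are illustrative, not taken from the snippet file):

```python
from langchain.prompts import FewShotPromptTemplate, PromptTemplate

examples = [
    {"word": "happy", "antonym": "sad"},
    {"word": "tall", "antonym": "short"},
]
# How each individual example is rendered.
example_prompt = PromptTemplate(
    input_variables=["word", "antonym"],
    template="Word: {word}\nAntonym: {antonym}",
)
few_shot_prompt = FewShotPromptTemplate(
    examples=examples,
    example_prompt=example_prompt,
    suffix="Word: {input}\nAntonym:",
    input_variables=["input"],
)
print(few_shot_prompt.format(input="big"))
```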


@@ -6,7 +6,7 @@ sidebar_position: 0
Prompt templates are pre-defined recipes for generating prompts for language models.
-A template may include instructions, few shot examples, and specific context and
+A template may include instructions, few-shot examples, and specific context and
questions appropriate for a given task.
LangChain provides tooling to create and work with prompt templates.
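As a rough sketch of such a recipe combining an instruction, context, and a question (the template text is invented for illustration):

```python
from langchain.prompts import PromptTemplate

template = """Answer the question using only the context below.

Context: {context}
Question: {question}
Answer:"""

prompt = PromptTemplate.from_template(template)
print(prompt.format(context="LangChain was released in 2022.", question="When was LangChain released?"))
```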


@@ -1,6 +1,6 @@
# Partial prompt templates
-Like other methods, it can make sense to "partial" a prompt template - eg pass in a subset of the required values, as to create a new prompt template which expects only the remaining subset of values.
+Like other methods, it can make sense to "partial" a prompt template - e.g. pass in a subset of the required values, as to create a new prompt template which expects only the remaining subset of values.
LangChain supports this in two ways:
1. Partial formatting with string values.
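A minimal sketch of the first way, partialing with a string value (the foo/bar variable names are illustrative):

```python
from langchain.prompts import PromptTemplate

prompt = PromptTemplate(template="{foo}{bar}", input_variables=["foo", "bar"])
# Fix one variable now; the new template only expects the remaining one.
partial_prompt = prompt.partial(foo="foo")
print(partial_prompt.format(bar="baz"))  # -> foobaz
```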


@@ -2,8 +2,8 @@
This notebook goes over how to compose multiple prompts together. This can be useful when you want to reuse parts of prompts. This can be done with a PipelinePrompt. A PipelinePrompt consists of two main parts:
-- Final prompt: This is the final prompt that is returned
-- Pipeline prompts: This is a list of tuples, consisting of a string name and a prompt template. Each prompt template will be formatted and then passed to future prompt templates as a variable with the same name.
+- Final prompt: The final prompt that is returned
+- Pipeline prompts: A list of tuples, consisting of a string name and a prompt template. Each prompt template will be formatted and then passed to future prompt templates as a variable with the same name.
import Example from "@snippets/modules/model_io/prompts/prompt_templates/prompt_composition.mdx"
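To make the two parts concrete, a sketch of a PipelinePrompt as it worked in langchain at this time (the persona example is illustrative):

```python
from langchain.prompts import PromptTemplate
from langchain.prompts.pipeline import PipelinePromptTemplate

# Final prompt: the prompt that is ultimately returned.
full_prompt = PromptTemplate.from_template("{introduction}\n\n{start}")

# Pipeline prompts: (name, template) tuples; each is formatted first, then
# passed into the final prompt as a variable with the same name.
introduction_prompt = PromptTemplate.from_template("You are impersonating {person}.")
start_prompt = PromptTemplate.from_template("Q: {question}\nA:")

pipeline_prompt = PipelinePromptTemplate(
    final_prompt=full_prompt,
    pipeline_prompts=[
        ("introduction", introduction_prompt),
        ("start", start_prompt),
    ],
)
print(pipeline_prompt.format(person="Ada Lovelace", question="What is your favorite machine?"))
```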