updates some spelling mistakes (#8537)
Just updating some spelling / grammar issues in the documentation. No code changes.

Co-authored-by: Harrison Chase <hw.chase.17@gmail.com>
```diff
@@ -17,7 +17,7 @@ class CommaSeparatedListOutputParser(BaseOutputParser):
         return text.strip().split(", ")
 
 template = """You are a helpful assistant who generates comma separated lists.
-A user will pass in a category, and you should generated 5 objects in that category in a comma separated list.
+A user will pass in a category, and you should generate 5 objects in that category in a comma separated list.
 ONLY return a comma separated list, and nothing more."""
 system_message_prompt = SystemMessagePromptTemplate.from_template(template)
 human_template = "{text}"
```
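For context, the lines touched by this hunk come from the quickstart's custom output parser example. A minimal sketch of how the pieces fit together, assuming the legacy `langchain` 0.0.x package layout that this documentation targets (the `human_message_prompt`, `chat_prompt`, and `chain` names are illustrative, not part of the diff):

```python
from langchain.chains import LLMChain
from langchain.chat_models import ChatOpenAI
from langchain.prompts.chat import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    SystemMessagePromptTemplate,
)
from langchain.schema import BaseOutputParser


class CommaSeparatedListOutputParser(BaseOutputParser):
    """Parse the output of an LLM call to a comma-separated list."""

    def parse(self, text: str):
        return text.strip().split(", ")


template = """You are a helpful assistant who generates comma separated lists.
A user will pass in a category, and you should generate 5 objects in that category in a comma separated list.
ONLY return a comma separated list, and nothing more."""
system_message_prompt = SystemMessagePromptTemplate.from_template(template)

human_template = "{text}"
human_message_prompt = HumanMessagePromptTemplate.from_template(human_template)

# Combine the system and human templates into a single chat prompt,
# then chain prompt -> chat model -> parser.
chat_prompt = ChatPromptTemplate.from_messages([system_message_prompt, human_message_prompt])
chain = LLMChain(
    llm=ChatOpenAI(temperature=0),
    prompt=chat_prompt,
    output_parser=CommaSeparatedListOutputParser(),
)
print(chain.run("colors"))  # e.g. ['red', 'blue', 'green', 'yellow', 'orange']
```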
```diff
@@ -74,7 +74,7 @@ These chat messages differ from raw string (which you would pass into a [LLM](/d
 
 
 For example, in OpenAI [Chat Completion API](https://platform.openai.com/docs/guides/chat/introduction), a chat message can be associated with the AI, human or system role. The model is supposed to follow instruction from system chat message more closely.
 
-LangChain provides several prompt templates to make constructing and working with prompts easily. You are encouraged to use these chat related prompt templates instead of `PromptTemplate` when querying chat models to fully exploit the potential of underlying chat model.
+LangChain provides several prompt templates to make constructing and working with prompts easy. You are encouraged to use these chat related prompt templates instead of `PromptTemplate` when querying chat models to fully utilize the potential of the underlying chat model.
 
 
```
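To make the revised sentence concrete, here is a sketch of what using a chat prompt template with a chat model looks like, again assuming the legacy `langchain` 0.0.x imports (the translation prompt and variable names are illustrative, not taken from the diff):

```python
from langchain.chat_models import ChatOpenAI
from langchain.prompts.chat import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    SystemMessagePromptTemplate,
)

system_message_prompt = SystemMessagePromptTemplate.from_template(
    "You are a helpful assistant that translates {input_language} to {output_language}."
)
human_message_prompt = HumanMessagePromptTemplate.from_template("{text}")
chat_prompt = ChatPromptTemplate.from_messages([system_message_prompt, human_message_prompt])

# format_messages() yields role-tagged messages (SystemMessage, HumanMessage)
# rather than the single flat string a PromptTemplate would produce.
messages = chat_prompt.format_messages(
    input_language="English", output_language="French", text="I love programming."
)

chat = ChatOpenAI(temperature=0)
chat(messages)  # returns an AIMessage with the model's reply
```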