diff --git a/docs/docs_skeleton/docs/get_started/quickstart.mdx b/docs/docs_skeleton/docs/get_started/quickstart.mdx
index 3bbbc331ea9..8250083adc4 100644
--- a/docs/docs_skeleton/docs/get_started/quickstart.mdx
+++ b/docs/docs_skeleton/docs/get_started/quickstart.mdx
@@ -47,7 +47,7 @@ import ChatModel from "@snippets/get_started/quickstart/chat_model.mdx"
 
 ## Prompt templates
 
-Most LLM applications do not pass user input directly into to an LLM. Usually they will add the user input to a larger piece of text, called a prompt template, that provides additional context on the specific task at hand.
+Most LLM applications do not pass user input directly into an LLM. Usually they will add the user input to a larger piece of text, called a prompt template, that provides additional context on the specific task at hand.
 
 In the previous example, the text we passed to the model contained instructions to generate a company name. For our application, it'd be great if the user only had to provide the description of a company/product, without having to worry about giving the model instructions.
 
@@ -155,4 +155,4 @@ You can use Memory with chains and agents initialized with chat models. The main
-
\ No newline at end of file
+
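
For context, the paragraph being corrected describes how a prompt template wraps user input with task instructions before it reaches the model. A minimal sketch of that idea, assuming the `PromptTemplate` API from `langchain.prompts` as used in the quickstart these docs accompany:

```python
# Minimal sketch of the prompt-template idea described in the edited paragraph.
# Assumes LangChain's PromptTemplate API from langchain.prompts, as in the
# quickstart docs this diff touches.
from langchain.prompts import PromptTemplate

# The user only supplies a product description; the template adds the
# surrounding instructions before the text is sent to the model.
prompt = PromptTemplate.from_template(
    "What is a good name for a company that makes {product}?"
)

print(prompt.format(product="colorful socks"))
# -> What is a good name for a company that makes colorful socks?
```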