Minor template cleaning (#12573)

commit 08103e6d48
parent b1e3843931
Author: Lance Martin
Date: 2023-10-30 11:27:44 -07:00
Committed by: GitHub

5 changed files with 28 additions and 140 deletions

@@ -9,32 +9,34 @@ model = Replicate(
 )
 # Prompt with output schema specification
-template = """A article will be passed to you. Extract from it all papers that are mentioned by this article.
-Do not extract the name of the article itself. If no papers are mentioned that's fine - you don't need to extract any! Just return an empty list.
-Do not make up or guess ANY extra information. Only extract what exactly is in the text.
-Respond with json that adheres to the following jsonschema:
+template = """You are an AI language model assistant. Your task is to generate 3 different versions of the given user /
+question to retrieve relevant documents from a vector database. By generating multiple perspectives on the user /
+question, your goal is to help the user overcome some of the limitations of distance-based similarity search. /
+Respond with json that adheres to the following jsonschema:
 {{
   "$schema": "http://json-schema.org/draft-07/schema#",
   "type": "object",
   "properties": {{
-    "author": {{
+    "question_1": {{
       "type": "string",
-      "description": "The author of the paper."
+      "description": "First version of the user question."
     }},
-    "title": {{
+    "question_2": {{
       "type": "string",
-      "description": "The title of the paper."
+      "description": "Second version of the user question."
+    }},
+    "question_3": {{
+      "type": "string",
+      "description": "Third version of the user question."
     }}
   }},
-  "required": ["author", "title"],
+  "required": ["question_1","question_2","question_3"],
   "additionalProperties": false
 }}""" # noqa: E501
-prompt = ChatPromptTemplate.from_messages([("system", template), ("human", "{input}")])
+prompt = ChatPromptTemplate.from_messages(
+    [("system", template), ("human", "{question}")]
+)
 # Chain
 chain = prompt | model
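
For readers trying the updated template, here is a minimal sketch of how the new multi-query prompt might be wired into a chain and its JSON output consumed. The Replicate model slug, the abridged prompt string, the REPLICATE_API_TOKEN handling, and the sample question are illustrative assumptions, not part of this commit.

import json
import os

from langchain.llms import Replicate
from langchain.prompts import ChatPromptTemplate

# Assumed auth setup; the Replicate wrapper reads the token from the environment.
os.environ.setdefault("REPLICATE_API_TOKEN", "<your-replicate-token>")

# Hypothetical model slug; substitute whatever model string the template actually pins.
model = Replicate(model="replicate/llama-2-13b-chat:<version-hash>")

# Abridged version of the prompt added in this diff; the real template also embeds
# the full JSON schema shown above.
template = """You are an AI language model assistant. Your task is to generate 3 different \
versions of the given user question to retrieve relevant documents from a vector database. \
Respond only with json containing the string keys question_1, question_2 and question_3."""

prompt = ChatPromptTemplate.from_messages(
    [("system", template), ("human", "{question}")]
)

# Replicate is a plain text-completion LLM, so the chain emits a string;
# json.loads recovers the three query variants, assuming the model follows the schema.
chain = prompt | model

queries = json.loads(chain.invoke({"question": "How do bats navigate in the dark?"}))
print(queries["question_1"], queries["question_2"], queries["question_3"], sep="\n")

Note the doubled braces in the schema block of the diff: ChatPromptTemplate treats single braces as input variables, so {{ and }} are how literal JSON braces are escaped, leaving {question} as the only real variable in the new prompt.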