experimental[patch]: Fix LLM graph transformer default prompt (#18856)

Some LLMs do not allow multiple user messages in sequence.
Tomaz Bratanic 2024-03-12 04:11:52 +01:00 committed by GitHub
parent 19721246f5
commit cda43c5a11


@@ -52,14 +52,12 @@ default_prompt = ChatPromptTemplate.from_messages(
         (
             "human",
             (
+                "Tip: Make sure to answer in the correct format and do "
+                "not include any explanations. "
                 "Use the given format to extract information from the "
                 "following input: {input}"
             ),
         ),
-        (
-            "human",
-            "Tip: Make sure to answer in the correct format and do not include any ",
-        ),
     ]
 )
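The underlying constraint — some chat providers reject two consecutive messages with the same role — can also be handled generically by coalescing adjacent same-role messages before sending them. A minimal sketch in plain Python (the `merge_consecutive_messages` helper is hypothetical and not part of LangChain; messages are modeled as `(role, content)` tuples):

```python
def merge_consecutive_messages(messages):
    """Coalesce adjacent messages that share a role into one message,
    since some chat APIs reject two user messages in a row."""
    merged = []
    for role, content in messages:
        if merged and merged[-1][0] == role:
            # Same role as the previous message: extend it instead of
            # starting a new message.
            merged[-1] = (role, merged[-1][1] + " " + content)
        else:
            merged.append((role, content))
    return merged


prompt = [
    ("system", "You extract structured data."),
    ("human", "Use the given format to extract information from the "
              "following input: {input}"),
    ("human", "Tip: Make sure to answer in the correct format and do "
              "not include any explanations."),
]

# The two consecutive human messages collapse into a single one.
print(merge_consecutive_messages(prompt))
```

This is the same idea as the patch above, just applied as a post-processing step rather than by rewriting the prompt template itself.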