backend: fix a crash on inputs greater than n_ctx (#2498)

This fixes a regression in commit 4fc4d94b ("fix chat-style prompt
templates (#1970)"), which moved some return statements into a new
function (LLModel::decodePrompt) without making them return from the
parent as well.

Signed-off-by: Jared Van Bortel <jared@nomic.ai>
commit bd307abfe6
parent 146428fa0a
Author: Jared Van Bortel
Date: 2024-07-01 11:33:46 -04:00

2 changed files, 12 insertions(+), 7 deletions(-)
@@ -248,7 +248,7 @@ protected:
         return true;
     }
-    void decodePrompt(std::function<bool(int32_t)> promptCallback,
+    bool decodePrompt(std::function<bool(int32_t)> promptCallback,
                       std::function<bool(int32_t, const std::string&)> responseCallback,
                       std::function<bool(bool)> recalculateCallback,
                       PromptContext &promptCtx,