llmodel: change tokenToString to not use string_view (#968)
fixes a definite use-after-free and likely avoids some other potential ones - a std::string will convert to a std::string_view automatically, but as soon as the std::string in question goes out of scope it is freed and the string_view is left pointing at freed memory - this is *mostly* fine when the callee returns a reference to the tokenizer's internal vocab table, but it's, imo, too easy to return a reference to a dynamically constructed string this way, as replit is doing (and unfortunately needs to do to convert the internal whitespace replacement symbol back to a space)
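The hazard described above can be reproduced in isolation. The following is a minimal sketch, not the actual gpt4all code; the function names danglingTokenToString and owningTokenToString are hypothetical and only illustrate the difference between the two return types:

#include <iostream>
#include <string>
#include <string_view>

// Hypothetical lookup that must build its result dynamically. Because the
// return type is std::string_view, the local std::string converts implicitly
// and is destroyed at the end of the function, leaving the view dangling.
std::string_view danglingTokenToString(int token) {
    std::string s = "token-" + std::to_string(token);
    return s; // use-after-free: the buffer behind 's' is freed before the caller reads the view
}

// Returning std::string by value instead hands the caller an owning copy.
std::string owningTokenToString(int token) {
    return "token-" + std::to_string(token);
}

int main() {
    std::string_view bad = danglingTokenToString(42); // reading 'bad' is undefined behaviour
    std::string good = owningTokenToString(42);       // safe: 'good' owns its storage
    std::cout << good << '\n';
    (void)bad;
}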
@@ -121,7 +121,7 @@ void LLModel::prompt(const std::string &prompt,
         if (id == token) return;
     }

-    const std::string_view str = tokenToString(id);
+    const std::string str = tokenToString(id);

     // Check if the provided str is part of our reverse prompts
     bool foundPartialReversePrompt = false;
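For the replit case the commit message mentions, a sketch of why the by-value return matters. This is hypothetical code, assuming a vocab that stores a whitespace placeholder symbol in place of spaces (here assumed to be U+2581, the common sentencepiece convention); the class and member names are illustrative, not the library's API:

#include <string>
#include <vector>

// Hypothetical model whose vocab stores "\xe2\x96\x81" (U+2581) instead of a
// real space, so the token text has to be rebuilt on every lookup.
class ReplitLikeVocab {
    std::vector<std::string> vocab_{"\xe2\x96\x81hello", "\xe2\x96\x81world"};
public:
    // Safe: the dynamically constructed string is returned by value, so the
    // caller owns the memory. Declaring this to return std::string_view would
    // hand back a view into a temporary that dies at the end of the function.
    std::string tokenToString(int id) const {
        std::string piece = vocab_.at(id);
        const std::string ws = "\xe2\x96\x81";
        if (piece.compare(0, ws.size(), ws) == 0)
            piece.replace(0, ws.size(), " ");
        return piece;
    }
};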