Fixes https://github.com/nomic-ai/gpt4all/issues/1760 (LLModel ERROR: Could not find CPU LLaMA implementation).

Inspired by the Microsoft docs for LoadLibraryExA (https://learn.microsoft.com/en-us/windows/win32/api/libloaderapi/nf-libloaderapi-loadlibraryexa).
When using LOAD_LIBRARY_SEARCH_DLL_LOAD_DIR, the lpFileName parameter must specify a fully qualified path, and it must use backslashes (\), not forward slashes (/).
ThiloteE
2023-12-29 23:57:46 +01:00
committed by AT
parent 3e99b90c0b
commit 38d81c14d0
2 changed files with 5 additions and 2 deletions


@@ -104,7 +104,7 @@ const std::vector<LLModel::Implementation> &LLModel::Implementation::implementat
 // Add to list if model implementation
 try {
-    Dlhandle dl(p.string());
+    Dlhandle dl(std::filesystem::absolute(p).string());
     if (!Implementation::isImplementation(dl)) {
         continue;
     }