langchain/docs
Bob Lin 0866a984fe Update `n_gpu_layers`'s description (#16685)
The `n_gpu_layers` parameter in `llama.cpp` accepts `-1`, which offloads all
model layers to the GPU, so the documentation has been updated to reflect
this.

Ref:
35918873b4/llama_cpp/server/settings.py (L29C22-L29C117)
35918873b4/llama_cpp/llama.py (L125)
2024-01-28 16:46:50 -08:00
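
For illustration, here is a minimal sketch (not part of the PR) of passing `n_gpu_layers=-1` through LangChain's `LlamaCpp` wrapper; the model path is a placeholder, and `llama-cpp-python` is assumed to be installed with GPU support.

```python
# Minimal sketch: n_gpu_layers=-1 asks llama.cpp to offload every layer to the GPU.
# Assumes llama-cpp-python is built with GPU support; the model path is a placeholder.
from langchain_community.llms import LlamaCpp

llm = LlamaCpp(
    model_path="/path/to/model.gguf",  # placeholder path to a local GGUF model
    n_gpu_layers=-1,                   # -1 means offload all layers to the GPU
    n_ctx=2048,                        # context window size
    verbose=False,
)

print(llm.invoke("Name one advantage of GPU offloading."))
```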

LangChain Documentation

For more information on contributing to our documentation, see the Documentation Contributing Guide.