feat(llm): adds several settings for llamacpp and ollama (#1703)

icsy7867
2024-03-11 17:51:05 -04:00
committed by GitHub
parent 410bf7a71f
commit 02dc83e8e9
10 changed files with 91 additions and 8 deletions
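The diff excerpt below only shows a small test-file hunk; the new llamacpp/ollama settings themselves are not reproduced in this section. As a rough, hypothetical sketch of the kind of sampling settings a change like this introduces, here is a Pydantic-style settings model in the spirit of how this project declares configuration. The class name, field names (top_k, top_p, repeat_penalty, etc.) and defaults are assumptions for illustration, not values taken from the diff.

from pydantic import BaseModel, Field


class OllamaSettings(BaseModel):
    """Hypothetical sketch of sampling settings for an Ollama-backed LLM."""

    api_base: str = Field(
        "http://localhost:11434",
        description="Base URL where the Ollama server is reachable.",
    )
    llm_model: str = Field(
        "llama2",
        description="Name of the Ollama model to use.",
    )
    # Common llama.cpp / Ollama sampling knobs (assumed names, not from the diff).
    temperature: float = Field(
        0.1,
        description="Sampling temperature; higher values give more varied output.",
    )
    top_k: int = Field(
        40,
        description="Sample only from the k most likely next tokens.",
    )
    top_p: float = Field(
        0.9,
        description="Nucleus sampling: keep tokens within this cumulative probability.",
    )
    repeat_penalty: float = Field(
        1.1,
        description="Penalty applied to recently generated tokens to reduce repetition.",
    )


if __name__ == "__main__":
    # In a real project these values would be loaded from a settings file;
    # here we simply print the assumed defaults.
    print(OllamaSettings())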


@@ -5,6 +5,7 @@ NOTE: We are not testing the switch based on the config in
is currently architected (it is hard to patch the `settings` and the app while
the tests are directly importing them).
"""
from typing import Annotated
import pytest