Mirror of https://github.com/imartinez/privateGPT.git, synced 2025-09-17 23:57:58 +00:00
feat(llm): drop default_system_prompt (#1385)
As discussed on Discord, the decision has been made to remove the default system prompt, to better segregate the API and UI usages. A concurrent PR (#1353) enables setting a system prompt dynamically in the UI, so UI users who want a custom system prompt can specify one directly there. API users who want a custom prompt can pass it directly in the messages they send to the API. In light of these two use cases, it becomes clear that a default system_prompt does not need to exist.
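A minimal sketch of what this means for API callers: with no server-side default, the caller supplies any desired system prompt as the first message of the request. The payload shape below assumes privateGPT's OpenAI-style chat messages format; the prompt text and question are illustrative, not from the project.

```python
import json

# With the default system prompt removed, API users inject their own
# system prompt as the first message in the list they send to the API.
payload = {
    "messages": [
        {"role": "system", "content": "Answer only from the ingested documents."},
        {"role": "user", "content": "Summarize the latest report."},
    ],
}

# Serialized request body as it would be sent to the chat endpoint.
body = json.dumps(payload)
```

If the first message is not a system message, no system prompt is applied at all, which is exactly the behaviour this change makes explicit.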
@@ -108,15 +108,6 @@ class LocalSettings(BaseModel):
             "`llama2` is the historic behaviour. `default` might work better with your custom models."
         ),
     )
-    default_system_prompt: str | None = Field(
-        None,
-        description=(
-            "The default system prompt to use for the chat engine. "
-            "If none is given - use the default system prompt (from the llama_index). "
-            "Please note that the default prompt might not be the same for all prompt styles. "
-            "Also note that this is only used if the first message is not a system message. "
-        ),
-    )
 
 
 class EmbeddingSettings(BaseModel):
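For context, the removed field's description says the default was "only used if the first message is not a system message". That guard can be sketched as below; the function name is illustrative and not taken from the privateGPT codebase.

```python
def apply_default_system_prompt(messages, default_prompt):
    """Prepend a default system prompt only when one is configured and the
    caller did not already supply a system message first (the behaviour the
    removed `default_system_prompt` field controlled)."""
    if default_prompt and not (messages and messages[0]["role"] == "system"):
        return [{"role": "system", "content": default_prompt}] + list(messages)
    # No default configured, or the caller already leads with a system
    # message: leave the conversation untouched.
    return list(messages)
```

Dropping the field is equivalent to always taking the second branch, so the API and the UI each become fully responsible for their own system prompt.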