community[patch]: BedrockChat -> Support Titan express as chat model (#15408)

The Titan Express model was not supported as a chat model because LangChain
messages were not translated into a single text prompt before the model was invoked.
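
For illustration, here is a minimal sketch of what such a translation can look like for a completion-style model. The helper name and the "User:"/"Bot:" role labels are assumptions made for this example, not the exact code introduced by this PR:

```python
# Illustrative only: flatten chat messages into a single text prompt for a
# completion-style model such as Titan Express. The helper name and the
# "User:"/"Bot:" labels are assumptions, not the implementation in this PR.
from typing import List

from langchain_core.messages import AIMessage, BaseMessage, HumanMessage, SystemMessage


def messages_to_text_prompt(messages: List[BaseMessage]) -> str:
    parts = []
    for message in messages:
        if isinstance(message, SystemMessage):
            parts.append(str(message.content))
        elif isinstance(message, HumanMessage):
            parts.append(f"User: {message.content}")
        elif isinstance(message, AIMessage):
            parts.append(f"Bot: {message.content}")
        else:
            raise ValueError(f"Unsupported message type: {type(message)}")
    # A trailing "Bot:" cues the model to answer as the assistant.
    parts.append("Bot:")
    return "\n\n".join(parts)


prompt = messages_to_text_prompt(
    [
        SystemMessage(content="You are a helpful assistant."),
        HumanMessage(content="Summarize LangChain in one sentence."),
    ]
)
```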

Co-authored-by: Guillem Orellana Trullols <guillem.orellana_trullols@siemens.com>
Author: Guillem Orellana Trullols
Committed: 2024-01-22 20:37:23 +01:00 (committed by GitHub)
Parent commit: 1b9001db47
Commit: aad2aa7188
3 changed files with 26 additions and 11 deletions


@@ -272,10 +272,12 @@ class BedrockBase(BaseModel, ABC):
         try:
             response = self.client.invoke_model(
-                body=body, modelId=self.model_id, accept=accept, contentType=contentType
+                body=body,
+                modelId=self.model_id,
+                accept=accept,
+                contentType=contentType,
             )
             text = LLMInputOutputAdapter.prepare_output(provider, response)
         except Exception as e:
             raise ValueError(f"Error raised by bedrock service: {e}").with_traceback(
                 e.__traceback__