groq[patch]: update model for integration tests (#31440)

Llama-3.1 started failing consistently with

> groq.BadRequestError: Error code: 400 - {'error': {'message':
> "Failed to call a function. Please adjust your prompt. See
> 'failed_generation' for more details.", 'type': 'invalid_request_error',
> 'code': 'tool_use_failed', 'failed_generation':
> '<function=brave_search>{"query": "Hello!"}</function>'}}
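Reconstructed as a plain Python dict, the 400 response body above looks like the following (field names and values are copied from the error message; this is an illustration, not the Groq SDK's actual object):

```python
# Sketch of the 400 error body Groq returns for this failure, rebuilt
# from the traceback above. Integration tests can key off the 'code'
# field to distinguish this model-side tool-use failure from other 400s.
error_body = {
    "error": {
        "message": (
            "Failed to call a function. Please adjust your prompt. "
            "See 'failed_generation' for more details."
        ),
        "type": "invalid_request_error",
        "code": "tool_use_failed",
        # The raw text the model emitted instead of a valid tool call:
        "failed_generation": '<function=brave_search>{"query": "Hello!"}</function>',
    }
}

print(error_body["error"]["code"])  # tool_use_failed
```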
ccurme 2025-05-30 13:27:12 -04:00 committed by GitHub
parent 5b9394319b
commit 5bf89628bf

```diff
@@ -29,14 +29,10 @@ class BaseTestGroq(ChatModelIntegrationTests):
         return True


-class TestGroqLlama(BaseTestGroq):
+class TestGroqGemma(BaseTestGroq):
     @property
     def chat_model_params(self) -> dict:
-        return {
-            "model": "llama-3.1-8b-instant",
-            "temperature": 0,
-            "rate_limiter": rate_limiter,
-        }
+        return {"model": "gemma2-9b-it", "rate_limiter": rate_limiter}

     @property
     def supports_json_mode(self) -> bool:
```
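For reference, a minimal stand-in sketch of the updated test class after this change. `BaseTestGroq` and `rate_limiter` are stubbed here; in the real file they come from the langchain standard-tests suite (`ChatModelIntegrationTests`) and a module-level rate limiter:

```python
# Stand-in sketch only: BaseTestGroq and rate_limiter are stubs, not the
# real langchain test fixtures.
rate_limiter = None  # stub for the module-level rate limiter


class BaseTestGroq:
    """Stub for the real ChatModelIntegrationTests subclass."""


class TestGroqGemma(BaseTestGroq):
    @property
    def chat_model_params(self) -> dict:
        # gemma2-9b-it replaces llama-3.1-8b-instant; the explicit
        # temperature=0 setting was dropped along with the old model.
        return {"model": "gemma2-9b-it", "rate_limiter": rate_limiter}


params = TestGroqGemma().chat_model_params
print(params["model"])  # gemma2-9b-it
```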