community[patch]: Updating default PPLX model to supported llama-3.1 model. (#25643)

# Issue

As of late July 2024, Perplexity [no longer supports Llama 3
models](https://docs.perplexity.ai/changelog/introducing-new-and-improved-sonar-models).

# Description

This PR updates the default model and documentation examples to
Perplexity's latest supported model. (It mostly touches the same places
changed by #23723.)

# Twitter handle

`@acompa_` on behalf of the team at Not Diamond. Check us out
[here](https://notdiamond.ai).

---------

Co-authored-by: Harrison Chase <hw.chase.17@gmail.com>
Commit: bcd5842b5d (parent 163ef35dd1)
Author: Alejandro Companioni
Date: 2024-08-23 04:33:30 -04:00
Committed by: GitHub
2 changed files with 12 additions and 6 deletions


@@ -55,20 +55,20 @@ class ChatPerplexity(BaseChatModel):
             from langchain_community.chat_models import ChatPerplexity
             chat = ChatPerplexity(
-                model="llama-3-sonar-small-32k-online",
+                model="llama-3.1-sonar-small-128k-online",
                 temperature=0.7,
             )
     """
     client: Any  #: :meta private:
-    model: str = "llama-3-sonar-small-32k-online"
+    model: str = "llama-3.1-sonar-small-128k-online"
     """Model name."""
     temperature: float = 0.7
     """What sampling temperature to use."""
     model_kwargs: Dict[str, Any] = Field(default_factory=dict)
     """Holds any model parameters valid for `create` call not explicitly specified."""
     pplx_api_key: Optional[str] = Field(None, alias="api_key")
     """Base URL path for API requests,
     leave blank if not using a proxy or service emulator."""
     request_timeout: Optional[Union[float, Tuple[float, float]]] = Field(
         None, alias="timeout"
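Downstream code that pins one of the retired model names will break once Perplexity removes them, so callers may want a small shim when constructing the client. A minimal sketch (the mapping covers only the pair shown in this diff; `resolve_model` is a hypothetical helper, not part of this PR or of LangChain):

```python
# Hypothetical migration helper: upgrades a retired llama-3 Sonar model
# name to its llama-3.1 replacement, and passes any other name through.
DEPRECATED_MODEL_MAP = {
    "llama-3-sonar-small-32k-online": "llama-3.1-sonar-small-128k-online",
}


def resolve_model(name: str) -> str:
    """Return a currently supported model name, upgrading retired ones."""
    return DEPRECATED_MODEL_MAP.get(name, name)
```

The resolved name can then be passed straight to `ChatPerplexity(model=resolve_model(...))`, keeping older configs working without edits.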