docs: replace deprecated llama 3 model with sonar in ChatPerplexity example (#31716)

**Description:** Updates the ChatPerplexity documentation to replace deprecated Llama 3 model references with the current sonar model in the example code blocks, including the API key example.

**Issue:** N/A (maintenance update for deprecated model)

**Dependencies:** No new dependencies required
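
For reference, a minimal sketch of the updated usage shown in the notebook (the import path is an assumption; `ChatPerplexity` has lived in `langchain_community.chat_models`, and newer setups may pull it from the `langchain-perplexity` package instead):

```python
from langchain_community.chat_models import ChatPerplexity  # import path is an assumption

# Updated example: the deprecated "llama-3-sonar-small-32k-online" /
# "llama-3.1-sonar-small-128k-online" models are replaced with "sonar".
chat = ChatPerplexity(temperature=0, pplx_api_key="YOUR_API_KEY", model="sonar")

response = chat.invoke("Tell me a joke about cats")
print(response.content)
```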

---------

Co-authored-by: Mason Daugherty <github@mdrxy.com>

commit cc4f5269b1 (parent 8878a7b143)
Author: Daniel Fjeldstad
Date: 2025-06-24 19:35:18 +02:00 (committed by GitHub)

@@ -106,9 +106,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"chat = ChatPerplexity(\n",
-"    temperature=0, pplx_api_key=\"YOUR_API_KEY\", model=\"llama-3-sonar-small-32k-online\"\n",
-")"
+"chat = ChatPerplexity(temperature=0, pplx_api_key=\"YOUR_API_KEY\", model=\"sonar\")"
 ]
 },
 {
@@ -132,7 +130,7 @@
 },
 "outputs": [],
 "source": [
-"chat = ChatPerplexity(temperature=0, model=\"llama-3.1-sonar-small-128k-online\")"
+"chat = ChatPerplexity(temperature=0, model=\"sonar\")"
 ]
 },
 {
@@ -200,7 +198,7 @@
 }
 ],
 "source": [
-"chat = ChatPerplexity(temperature=0, model=\"llama-3.1-sonar-small-128k-online\")\n",
+"chat = ChatPerplexity(temperature=0, model=\"sonar\")\n",
 "prompt = ChatPromptTemplate.from_messages([(\"human\", \"Tell me a joke about {topic}\")])\n",
 "chain = prompt | chat\n",
 "response = chain.invoke({\"topic\": \"cats\"})\n",
@@ -235,7 +233,7 @@
 }
 ],
 "source": [
-"chat = ChatPerplexity(temperature=0.7, model=\"llama-3.1-sonar-small-128k-online\")\n",
+"chat = ChatPerplexity(temperature=0.7, model=\"sonar\")\n",
 "response = chat.invoke(\n",
 "    \"Tell me a joke about cats\", extra_body={\"search_recency_filter\": \"week\"}\n",
 ")\n",
@@ -284,7 +282,7 @@
 }
 ],
 "source": [
-"chat = ChatPerplexity(temperature=0.7, model=\"llama-3.1-sonar-small-128k-online\")\n",
+"chat = ChatPerplexity(temperature=0.7, model=\"sonar\")\n",
 "\n",
 "for chunk in chat.stream(\"Give me a list of famous tourist attractions in Pakistan\"):\n",
 "    print(chunk.content, end=\"\", flush=True)"