docs: updated PPLX model (#23723)

Description: updated the pplx docs to reference a currently [supported
model](https://docs.perplexity.ai/docs/model-cards): pplx-70b-online
-> llama-3-sonar-small-32k-online
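The change is mechanical: every occurrence of the retired model id in the notebook snippets is swapped for the supported one. A minimal sketch of that substitution (the `update_model_refs` helper is hypothetical, not part of this PR):

```python
# Retired and replacement Perplexity model ids, per the model cards page.
OLD_MODEL = "pplx-70b-online"
NEW_MODEL = "llama-3-sonar-small-32k-online"

def update_model_refs(text: str) -> str:
    """Replace the retired Perplexity model id with a supported one."""
    return text.replace(OLD_MODEL, NEW_MODEL)

# Example: one of the doc snippets this commit touches.
snippet = 'chat = ChatPerplexity(temperature=0, model="pplx-70b-online")'
print(update_model_refs(snippet))
# → chat = ChatPerplexity(temperature=0, model="llama-3-sonar-small-32k-online")
```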

---------

Co-authored-by: Chester Curme <chester.curme@gmail.com>
Authored by mattthomps1 on 2024-07-02 08:48:49 -04:00, committed via GitHub
parent aa165539f6
commit cc55823486
2 changed files with 9 additions and 6 deletions


@@ -45,7 +45,7 @@
"The code provided assumes that your PPLX_API_KEY is set in your environment variables. If you would like to manually specify your API key and also choose a different model, you can use the following code:\n",
"\n",
"```python\n",
-"chat = ChatPerplexity(temperature=0, pplx_api_key=\"YOUR_API_KEY\", model=\"pplx-70b-online\")\n",
+"chat = ChatPerplexity(temperature=0, pplx_api_key=\"YOUR_API_KEY\", model=\"llama-3-sonar-small-32k-online\")\n",
"```\n",
"\n",
"You can check a list of available models [here](https://docs.perplexity.ai/docs/model-cards). For reproducibility, we can set the API key dynamically by taking it as an input in this notebook."
@@ -78,7 +78,7 @@
},
"outputs": [],
"source": [
-"chat = ChatPerplexity(temperature=0, model=\"pplx-70b-online\")"
+"chat = ChatPerplexity(temperature=0, model=\"llama-3-sonar-small-32k-online\")"
]
},
{
@@ -146,7 +146,7 @@
}
],
"source": [
-"chat = ChatPerplexity(temperature=0, model=\"pplx-70b-online\")\n",
+"chat = ChatPerplexity(temperature=0, model=\"llama-3-sonar-small-32k-online\")\n",
"prompt = ChatPromptTemplate.from_messages([(\"human\", \"Tell me a joke about {topic}\")])\n",
"chain = prompt | chat\n",
"response = chain.invoke({\"topic\": \"cats\"})\n",
@@ -195,7 +195,7 @@
}
],
"source": [
-"chat = ChatPerplexity(temperature=0.7, model=\"pplx-70b-online\")\n",
+"chat = ChatPerplexity(temperature=0.7, model=\"llama-3-sonar-small-32k-online\")\n",
"prompt = ChatPromptTemplate.from_messages(\n",
" [(\"human\", \"Give me a list of famous tourist attractions in Pakistan\")]\n",
")\n",