docs: updated PPLX model (#23723)

Description: updated the pplx docs to reference a currently [supported
model](https://docs.perplexity.ai/docs/model-cards): `pplx-70b-online` ->
`llama-3-sonar-small-32k-online`.
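
For reference, a minimal sketch of the updated usage (assumes `langchain-community` is installed and `PPLX_API_KEY` is exported; the model name comes from the linked model cards):

```python
from langchain_community.chat_models import ChatPerplexity

# Uses the currently supported model name from the Perplexity model cards.
chat = ChatPerplexity(temperature=0, model="llama-3-sonar-small-32k-online")
response = chat.invoke("Tell me a joke about cats")
print(response.content)
```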

---------

Co-authored-by: Chester Curme <chester.curme@gmail.com>
mattthomps1 authored 2024-07-02 08:48:49 -04:00, committed by GitHub
parent aa165539f6
commit cc55823486
2 changed files with 9 additions and 6 deletions

docs/docs/integrations/chat/perplexity.ipynb

@@ -45,7 +45,7 @@
     "The code provided assumes that your PPLX_API_KEY is set in your environment variables. If you would like to manually specify your API key and also choose a different model, you can use the following code:\n",
     "\n",
     "```python\n",
-    "chat = ChatPerplexity(temperature=0, pplx_api_key=\"YOUR_API_KEY\", model=\"pplx-70b-online\")\n",
+    "chat = ChatPerplexity(temperature=0, pplx_api_key=\"YOUR_API_KEY\", model=\"llama-3-sonar-small-32k-online\")\n",
     "```\n",
     "\n",
     "You can check a list of available models [here](https://docs.perplexity.ai/docs/model-cards). For reproducibility, we can set the API key dynamically by taking it as an input in this notebook."
@@ -78,7 +78,7 @@
    },
    "outputs": [],
    "source": [
-    "chat = ChatPerplexity(temperature=0, model=\"pplx-70b-online\")"
+    "chat = ChatPerplexity(temperature=0, model=\"llama-3-sonar-small-32k-online\")"
    ]
   },
   {
@@ -146,7 +146,7 @@
    }
   ],
   "source": [
-    "chat = ChatPerplexity(temperature=0, model=\"pplx-70b-online\")\n",
+    "chat = ChatPerplexity(temperature=0, model=\"llama-3-sonar-small-32k-online\")\n",
    "prompt = ChatPromptTemplate.from_messages([(\"human\", \"Tell me a joke about {topic}\")])\n",
    "chain = prompt | chat\n",
    "response = chain.invoke({\"topic\": \"cats\"})\n",
@@ -195,7 +195,7 @@
    }
   ],
   "source": [
-    "chat = ChatPerplexity(temperature=0.7, model=\"pplx-70b-online\")\n",
+    "chat = ChatPerplexity(temperature=0.7, model=\"llama-3-sonar-small-32k-online\")\n",
    "prompt = ChatPromptTemplate.from_messages(\n",
    "    [(\"human\", \"Give me a list of famous tourist attractions in Pakistan\")]\n",
    ")\n",

libs/community/langchain_community/chat_models/perplexity.py

@@ -54,11 +54,14 @@ class ChatPerplexity(BaseChatModel):

             from langchain_community.chat_models import ChatPerplexity

-            chat = ChatPerplexity(model="pplx-70b-online", temperature=0.7)
+            chat = ChatPerplexity(
+                model="llama-3-sonar-small-32k-online",
+                temperature=0.7,
+            )
     """

     client: Any  #: :meta private:
-    model: str = "pplx-70b-online"
+    model: str = "llama-3-sonar-small-32k-online"
     """Model name."""
     temperature: float = 0.7
     """What sampling temperature to use."""