mirror of
https://github.com/hwchase17/langchain.git
synced 2025-07-02 03:15:11 +00:00
docs[patch]: promptlayer pages update (#14416)
Updated the provider page by adding LLM and chat model references; removed content that duplicated text from the referenced LLM page. Updated the callback page.
This commit is contained in:
parent
18aba7fdef
commit
a05230a4ba
@@ -7,12 +7,13 @@
 "source": [
 "# PromptLayer\n",
 "\n",
+">[PromptLayer](https://docs.promptlayer.com/introduction) is a platform for prompt engineering. It also helps with the LLM observability to visualize requests, version prompts, and track usage.\n",
+">\n",
+">While `PromptLayer` does have LLMs that integrate directly with LangChain (e.g. [`PromptLayerOpenAI`](https://python.langchain.com/docs/integrations/llms/promptlayer_openai)), using a callback is the recommended way to integrate `PromptLayer` with LangChain.\n",
 "\n",
-">[PromptLayer](https://promptlayer.com) is a an LLM observability platform that lets you visualize requests, version prompts, and track usage. In this guide we will go over how to setup the `PromptLayerCallbackHandler`. \n",
+"In this guide, we will go over how to setup the `PromptLayerCallbackHandler`. \n",
 "\n",
-"While `PromptLayer` does have LLMs that integrate directly with LangChain (e.g. [`PromptLayerOpenAI`](https://python.langchain.com/docs/integrations/llms/promptlayer_openai)), this callback is the recommended way to integrate PromptLayer with LangChain.\n",
-"\n",
-"See [our docs](https://docs.promptlayer.com/languages/langchain) for more information."
+"See [PromptLayer docs](https://docs.promptlayer.com/languages/langchain) for more information."
 ]
 },
 {
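The callback integration described in the hunk above can be pictured with a small, dependency-free sketch of the handler pattern. Nothing below is the real LangChain or PromptLayer API; `RecordingHandler`, `FakeLLM`, and the hook names are hypothetical stand-ins illustrating how a handler like `PromptLayerCallbackHandler` observes each request:

```python
# Hypothetical sketch of the callback-handler pattern; these classes
# are illustrative stand-ins, not the real LangChain/PromptLayer API.

class RecordingHandler:
    """Records start/end events for every LLM call, the way an
    observability handler would forward them to a tracking service."""

    def __init__(self):
        self.events = []

    def on_llm_start(self, prompt):
        self.events.append(("start", prompt))

    def on_llm_end(self, prompt, response):
        self.events.append(("end", prompt, response))


class FakeLLM:
    """Toy LLM that invokes its callbacks around each request."""

    def __init__(self, callbacks=None):
        self.callbacks = list(callbacks or [])

    def invoke(self, prompt):
        for cb in self.callbacks:
            cb.on_llm_start(prompt)
        response = prompt.upper()  # stand-in for a real model call
        for cb in self.callbacks:
            cb.on_llm_end(prompt, response)
        return response


handler = RecordingHandler()
llm = FakeLLM(callbacks=[handler])
result = llm.invoke("hello")
print(result)                # HELLO
print(handler.events[0][0])  # start
```

Because the handler is passed in rather than baked into the LLM wrapper, the same observability hook works with any model class, which is why the docs recommend the callback over the dedicated LLM wrappers.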
@@ -1,49 +1,49 @@
 # PromptLayer
 
-This page covers how to use [PromptLayer](https://www.promptlayer.com) within LangChain.
-It is broken into two parts: installation and setup, and then references to specific PromptLayer wrappers.
+>[PromptLayer](https://docs.promptlayer.com/introduction) is a platform for prompt engineering.
+> It also helps with the LLM observability to visualize requests, version prompts, and track usage.
+>
+>While `PromptLayer` does have LLMs that integrate directly with LangChain (e.g.
+> [`PromptLayerOpenAI`](https://docs.promptlayer.com/languages/langchain)),
+> using a callback is the recommended way to integrate `PromptLayer` with LangChain.
 
 ## Installation and Setup
 
-If you want to work with PromptLayer:
-- Install the promptlayer python library `pip install promptlayer`
-- Create a PromptLayer account
+To work with `PromptLayer`, we have to:
+- Create a `PromptLayer` account
 - Create an api token and set it as an environment variable (`PROMPTLAYER_API_KEY`)
 
-## Wrappers
+Install a Python package:
 
-### LLM
+```bash
+pip install promptlayer
+```
 
-There exists an PromptLayer OpenAI LLM wrapper, which you can access with
+## Callback
+
+See a [usage example](/docs/integrations/callbacks/promptlayer).
+
+```python
+import promptlayer  # Don't forget this import!
+from langchain.callbacks import PromptLayerCallbackHandler
+```
+
+## LLM
+
+See a [usage example](/docs/integrations/llms/promptlayer_openai).
+
 ```python
 from langchain.llms import PromptLayerOpenAI
 ```
 
-To tag your requests, use the argument `pl_tags` when initializing the LLM
+## Chat Models
+
+See a [usage example](/docs/integrations/chat/promptlayer_chatopenai).
+
 ```python
-from langchain.llms import PromptLayerOpenAI
-llm = PromptLayerOpenAI(pl_tags=["langchain-requests", "chatbot"])
+from langchain.chat_models import PromptLayerChatOpenAI
 ```
-
-To get the PromptLayer request id, use the argument `return_pl_id` when initializing the LLM
-```python
-from langchain.llms import PromptLayerOpenAI
-llm = PromptLayerOpenAI(return_pl_id=True)
-```
-This will add the PromptLayer request ID in the `generation_info` field of the `Generation` returned when using `.generate` or `.agenerate`
-
-For example:
-```python
-llm_results = llm.generate(["hello world"])
-for res in llm_results.generations:
-    print("pl request id: ", res[0].generation_info["pl_request_id"])
-```
-You can use the PromptLayer request ID to add a prompt, score, or other metadata to your request. [Read more about it here](https://magniv.notion.site/Track-4deee1b1f7a34c1680d085f82567dab9).
-
-This LLM is identical to the [OpenAI](/docs/ecosystem/integrations/openai) LLM, except that
-- all your requests will be logged to your PromptLayer account
-- you can add `pl_tags` when instantiating to tag your requests on PromptLayer
-- you can add `return_pl_id` when instantiating to return a PromptLayer request id to use [while tracking requests](https://magniv.notion.site/Track-4deee1b1f7a34c1680d085f82567dab9).
-
-PromptLayer also provides native wrappers for [`PromptLayerChatOpenAI`](/docs/integrations/chat/promptlayer_chatopenai) and `PromptLayerOpenAIChat`
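The `return_pl_id` flow documented in the removed lines of the provider page can be summarized with a dependency-free sketch. All names below are hypothetical stand-ins, not the real LangChain classes: the point is only that each generation carries its request id in `generation_info`, which callers read back to attach scores or metadata.

```python
import itertools

# Hypothetical stand-ins for LangChain's Generation/LLMResult classes;
# this only illustrates the shape of the removed `return_pl_id` docs.
_request_ids = itertools.count(1)


class Generation:
    def __init__(self, text, generation_info):
        self.text = text
        self.generation_info = generation_info


class FakePromptLayerLLM:
    def __init__(self, return_pl_id=False):
        self.return_pl_id = return_pl_id

    def generate(self, prompts):
        # One list of Generations per prompt, mirroring the shape the
        # removed docs iterated over with `llm_results.generations`.
        results = []
        for prompt in prompts:
            info = {"pl_request_id": next(_request_ids)} if self.return_pl_id else {}
            results.append([Generation(text=prompt.upper(), generation_info=info)])
        return results


llm = FakePromptLayerLLM(return_pl_id=True)
for res in llm.generate(["hello world"]):
    print("pl request id: ", res[0].generation_info["pl_request_id"])
```

The id is attached per generation rather than per batch, so a caller can score or annotate each individual response after the fact.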