Mirror of https://github.com/hwchase17/langchain.git, synced 2026-04-23 20:23:59 +00:00
Update Tracing Docs
@@ -1,57 +1,116 @@

# LangChain Tracing

LangChain Plus helps you visualize, monitor, and evaluate your LangChain components. To get started, use the local quickstart below or install using one of the following guides.

- [Locally Hosted Setup](../tracing/local_installation.md)
- [Cloud Hosted Setup](../tracing/hosted_installation.md)

_Our hosted alpha is currently invite-only. To sign up for the wait list, please fill out the form [here](https://forms.gle/tRCEMSeopZf6TE3b6)._

## Local QuickStart

![Home Page](https://github.com/hwchase17/langchain/raw/master/docs/tracing/home.png)

```bash
pip install -U "langchain[openai]"
langchain plus start
LANGCHAIN_TRACING_V2=true python -c "from langchain.chat_models import ChatOpenAI; print(ChatOpenAI().predict('Hello, world!'))"
```

## Saving Traces

Once you've either launched the local tracing server or made an account and retrieved an API key to the hosted solution, the easiest way to start logging traces is:

```bash
export LANGCHAIN_TRACING_V2="true"
# export LANGCHAIN_ENDPOINT="https://api.langchain.plus"  # Uncomment if logging traces to a hosted server
# export LANGCHAIN_API_KEY="my api key"  # Uncomment and add the API key generated from the settings page if logging traces to a hosted server
```
## Changing Sessions

1. To initially record traces to a session other than `"default"`, you can set the `LANGCHAIN_SESSION` environment variable to the name of the session you want to record to:

```python
import os
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_SESSION"] = "my_session"  # Make sure this session actually exists. You can create a new session in the UI.
```

2. To switch sessions mid-script or mid-notebook, you have a few options:

a. Explicitly pass in a new `LangChainTracer` callback:

```python
from langchain.callbacks.tracers import LangChainTracer

tracer = LangChainTracer(session_name="My new session")
agent.run("How many people live in canada as of 2023?", callbacks=[tracer])  # `agent` is any agent or chain you have constructed
```

b. Use the `tracing_v2_enabled` context manager:

```python
import os
from langchain.callbacks.manager import tracing_v2_enabled

os.environ["LANGCHAIN_SESSION"] = "my_session"
# ... traces logged to "my_session" ...
with tracing_v2_enabled("My Scoped Session Name"):
    ...  # traces logged to "My Scoped Session Name"
```

c. Update the `LANGCHAIN_SESSION` environment variable (not thread safe):

```python
import os
os.environ["LANGCHAIN_SESSION"] = "my_session"
# ... traces logged to "my_session" ...
os.environ["LANGCHAIN_SESSION"] = "My New Session"
# ... traces logged to "My New Session" ...
```

## Tracing UI Walkthrough

When you first access the LangChain Plus UI (and after signing in, if you are using the hosted version), you should be greeted by the home screen with more instructions on how to get started.

Traces from your LangChain runs can be found in the `Sessions` page. A "default" session should already be created for you.

A session is just a way to group traces together. If you click on a session, it will take you to a page with no recorded traces.
You can create a new session with the `Create Session` form, or by specifying a new session when capturing traces in LangChain.

<!-- TODO Add screenshots when the UI settles down a bit -->
<!--  -->

If we click on the `default` session, we can see that to start we have no traces stored.

<!-- TODO Add screenshots when the UI settles down a bit -->
<!--  -->

If we now start running chains and agents with tracing enabled, we will see data show up here.

<!-- To do so, we can run [this notebook](../tracing/agent_with_tracing.ipynb) as an example. After running it, we will see an initial trace show up. -->

<!-- TODO Add screenshots when the UI settles down a bit -->

From here we can explore the trace at a high level by clicking on the arrow to show nested runs.
We can keep on clicking further and further down to explore deeper and deeper.

<!-- TODO Add screenshots when the UI settles down a bit -->
<!--  -->

We can also click on the "Explore" button of the top level run to dive even deeper.
Here, we can see the inputs and outputs in full, as well as all the nested traces.

<!-- TODO Add screenshots when the UI settles down a bit -->
<!--  -->

We can keep on exploring each of these nested traces in more detail.
For example, here is the lowest level trace with the exact inputs/outputs to the LLM.

<!-- TODO Add screenshots when the UI settles down a bit -->
<!--  -->

## Using the LangChainPlus Client
docs/tracing/datasets.md (new file, 24 lines)
@@ -0,0 +1,24 @@

# Tracing Datasets

This guide provides instructions for how to use datasets generated from LangChain traces.

Some of the things datasets are useful for include:

- Comparing the results of different models or prompts to pick the most appropriate configuration.
- Testing for regressions in LLM or agent behavior over known use cases.
- Running a model N times to measure the stability of its predictions and infer the reliability of its performance.
- Running an evaluation chain over your agents' outputs to quantify your agents' performance.
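For instance, the "run a model N times" use case can be sketched in plain Python. Here `predict` is a hypothetical stand-in for an LLM or chain call, not a LangChain API:

```python
from collections import Counter

def predict(prompt: str, seed: int) -> str:
    # Hypothetical stand-in for a model call; a real LLM may answer
    # differently from run to run.
    return "Ottawa" if "capital" in prompt else "unknown"

def stability(prompt: str, n: int = 5) -> float:
    # Fraction of the n runs that agree with the most common answer;
    # 1.0 means the model answered identically every time.
    answers = [predict(prompt, seed=i) for i in range(n)]
    most_common = Counter(answers).most_common(1)[0][1]
    return most_common / n

score = stability("What is the capital of Canada?")
```

With a stochastic model, a score well below 1.0 flags prompts whose answers you cannot rely on.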

## Creating a Dataset

Datasets store the inputs and outputs of LLM, chat model, chain, or agent runs.

### Using the LangChainPlusClient

### Using the UI

## Running LangChain objects on datasets
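The idea behind running LangChain objects on datasets can be sketched library-agnostically: iterate over the dataset's examples, call your chain on each example's inputs, and score predictions against the stored outputs. `my_chain` and the field names below are hypothetical stand-ins, not the actual client schema:

```python
def my_chain(inputs: dict) -> str:
    # Hypothetical stand-in for an LLM chain or agent.
    return "Roughly 39 million."

# Each dataset example pairs the inputs of a traced run with its outputs.
dataset = [
    {
        "inputs": {"question": "How many people live in canada as of 2023?"},
        "outputs": {"answer": "Roughly 39 million."},
    },
]

results = []
for example in dataset:
    prediction = my_chain(example["inputs"])
    results.append((prediction, example["outputs"]["answer"]))

# A simple exact-match score; a real evaluation chain could grade more loosely.
accuracy = sum(pred == ref for pred, ref in results) / len(results)
```

Swapping in a different `my_chain` (a new model or prompt) and re-running the same loop is what makes dataset-based comparisons repeatable.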

@@ -1,36 +1,72 @@

# Cloud Hosted Tracing Setup

This guide provides instructions for setting up your environment to use the cloud-hosted version of the LangChain Plus tracing server. For instructions on locally hosted tracing, please reference the [Locally Hosted Tracing Setup](./local_installation.md) guide.

We offer a hosted version of tracing at the [LangChain Plus website](https://www.langchain.plus/). You can use this to interact with your traces and evaluation datasets without having to install the local server.

**Note**: We are currently only offering this to a limited number of users. The hosted platform is in the alpha stage, actively under development, and data might be dropped at any time. Do not depend on data being persisted in the system long term and refrain from logging traces that may contain sensitive information. If you're interested in using the hosted platform, please fill out the form [here](https://forms.gle/tRCEMSeopZf6TE3b6).

## Setup

Follow these steps to set up your environment to use the cloud-hosted tracing server:

1. Log in to the system and click "API Key" in the top right corner. Generate a new key and assign it to the `LANGCHAIN_API_KEY` environment variable:

```bash
export LANGCHAIN_API_KEY="your api key"
```

## Environment Configuration

Once you've set up your account, configure your LangChain application's environment to use tracing. This can be done by setting an environment variable in your terminal:

```bash
export LANGCHAIN_TRACING_V2=true
```

You can also add the following snippet to the top of every script:

```python
import os
os.environ["LANGCHAIN_TRACING_V2"] = "true"
```

Additionally, you need to set an environment variable to specify the endpoint:

```bash
export LANGCHAIN_ENDPOINT="https://api.langchain.plus"
```

Here's an example of adding all relevant environment variables:

```bash
export LANGCHAIN_TRACING_V2="true"
export LANGCHAIN_ENDPOINT="https://api.langchain.plus"
export LANGCHAIN_API_KEY="my api key"
# export LANGCHAIN_SESSION="My Session Name"  # Optional; otherwise, traces are logged to the "default" session
```

Or in Python:

```python
import os
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_ENDPOINT"] = "https://api.langchain.plus"
os.environ["LANGCHAIN_API_KEY"] = "my_api_key"  # Don't commit this to your repo! Set it in your terminal instead.
# os.environ["LANGCHAIN_SESSION"] = "My Session Name"  # Optional; otherwise, traces are logged to the "default" session
```

## Tracing Context Manager

Although using environment variables is recommended for most tracing use cases, you can also configure runs to be sent to a specific session using the context manager:

```python
from langchain.callbacks.manager import tracing_v2_enabled

with tracing_v2_enabled("My Session Name"):
    ...
```

## Navigating the LangChainPlus UI

You can check out an overview of the LangChainPlus UI in the [LangChain Tracing](../additional_resources/tracing.md) guide.

@@ -1,35 +1,113 @@

# Locally Hosted Tracing Setup

This guide provides instructions for installing and setting up your environment to use the locally hosted version of the LangChain Plus tracing server. For instructions on a hosted tracing solution, please reference the [Hosted Tracing Setup](./hosted_installation.md) guide.

## Installation

1. Ensure Docker is installed and running on your system. To install Docker, refer to the [Get Docker](https://docs.docker.com/get-docker/) documentation.
2. Install the latest version of `langchain` by running the following command:
```bash
pip install -U langchain
```
3. Start the LangChain Plus tracing server by executing the following command in your terminal:
```bash
langchain plus start
```
_Note: The `langchain` command was installed when you installed the LangChain library (`pip install langchain`)._

4. After the server has started, it will open the [Local UI](http://localhost). In the terminal, it will also display environment variables that you can configure to send your traces to the server. For more details on this, refer to the Environment Configuration section below.

5. To stop the server, run the following command in your terminal:
```bash
langchain plus stop
```

## Environment Configuration

With the LangChain Plus tracing server running, you can begin sending traces by setting the `LANGCHAIN_TRACING_V2` environment variable:

```bash
export LANGCHAIN_TRACING_V2=true
```

Or at the top of every Python script:

```python
import os
os.environ["LANGCHAIN_TRACING_V2"] = "true"
```

Here's an example of adding all relevant environment variables:

```bash
export LANGCHAIN_TRACING_V2="true"
# export LANGCHAIN_SESSION="My Session Name"  # Optional; otherwise, traces are logged to the "default" session
```

Or in Python:

```python
import os
os.environ["LANGCHAIN_TRACING_V2"] = "true"
# os.environ["LANGCHAIN_SESSION"] = "My Session Name"  # Optional; otherwise, traces are logged to the "default" session
```

## Tracing Context Manager

Although using environment variables is recommended for most tracing use cases, you can configure runs to be sent to a specific session using the context manager:

```python
from langchain.callbacks.manager import tracing_v2_enabled

with tracing_v2_enabled("My Session Name"):
    ...
```

## Connecting from a Remote Server

To connect to LangChainPlus when running applications on a remote server, such as a [Google Colab notebook](https://colab.research.google.com/) or a [HuggingFace Space](https://huggingface.co/docs/hub/spaces), we offer two simple options:

1. Use our [hosted tracing](./hosted_installation.md) server.
2. Expose a public URL to your local tracing service.

Below are the full instructions to start a local LangChainPlus server, expose it, and connect from a remote server:

1. Ensure Docker is installed and running on your system. To install Docker, refer to the [Get Docker](https://docs.docker.com/get-docker/) documentation.
2. Install the latest version of `langchain` by running the following command:
```bash
pip install -U langchain
```
3. Start the LangChain Plus tracing server and expose it by executing the following command in your terminal:
```bash
langchain plus start --expose
```
Note: The `--expose` flag is required to expose your local server to the internet. By default, ngrok permits tunneling for up to 2 hours at a time. For longer sessions, you can make an [ngrok account](https://ngrok.com/) and use your auth token:

```bash
langchain plus start --expose --ngrok-authtoken "your auth token"
```

4. After the server has started, it will open the [Local LangChainPlus UI](http://localhost) as well as the [ngrok dashboard](http://0.0.0.0:4040/inspect/http). In the terminal, it will also display the environment variables needed to send traces to the server via the tunnel URL. These will look something like the following:

```bash
LANGCHAIN_TRACING_V2=true
LANGCHAIN_ENDPOINT=https://1234-01-23-45-678.ngrok.io
```

5. In your remote LangChain application, set the environment variables using the output from your terminal in the previous step:

```python
import os
os.environ["LANGCHAIN_TRACING_V2"] = "true"  # Environment variables must be strings
os.environ["LANGCHAIN_ENDPOINT"] = "https://1234-01-23-45-678.ngrok.io"  # Replace with your ngrok tunnel URL
```

6. Run your LangChain code and visualize the traces in the [LangChainPlus UI](http://localhost/sessions).

7. To stop the server, run the following command in your terminal:
```bash
langchain plus stop
```

## Navigating the LangChainPlus UI

You can check out an overview of the LangChainPlus UI in the [LangChain Tracing](../additional_resources/tracing.md) guide.

@@ -41,7 +41,7 @@ def get_docker_compose_command() -> List[str]:
            "Neither 'docker compose' nor 'docker-compose'"
            " commands are available. Please install the Docker"
            " server following the instructions for your operating"
            " system at https://docs.docker.com/get-docker/"
        )

@@ -127,12 +127,13 @@ class PlusCommand:
            ]
        )
        logger.info(
            "The LangChain Plus server is running at http://localhost. To connect"
            " locally, set the following environment variables"
            " before running your LangChain application:\n"
        )
        logger.info("\tLANGCHAIN_TRACING_V2=true")
        logger.info("\tLANGCHAIN_ENDPOINT=http://localhost:8000")
        self._open_browser("http://localhost")

    def _start_and_expose(self, auth_token: Optional[str]) -> None:
@@ -158,9 +159,10 @@ class PlusCommand:
        )
        ngrok_url = get_ngrok_url(auth_token)
        logger.info(
            "The LangChain Plus server is running at http://localhost and"
            f" exposed at URL {ngrok_url}. To connect remotely,"
            " set the following environment variables"
            " before running your LangChain application:\n"
        )
        logger.info("\tLANGCHAIN_TRACING_V2=true")
        logger.info(f"\tLANGCHAIN_ENDPOINT={ngrok_url}")