mirror of
https://github.com/hwchase17/langchain.git
synced 2026-04-23 20:23:59 +00:00
words
@@ -1,6 +1,6 @@
 # LangChain Tracing
 
-LangChain Plus helps you visualize, monitor, and evaluate your LangChain components. To get started, use the local quickstart below or install using one of the following guides.
+LangChain Plus helps you visualize, monitor, and evaluate LLM applications. To get started, use the local quickstart below or install using one of the following guides.
 
 - [Locally Hosted Tracing](../tracing/local_installation.md)
 - [Cloud Hosted Tracing](../tracing/hosted_installation.md)
@@ -10,6 +10,7 @@ _Our hosted alpha is currently invite-only. To sign up for the wait list, please
 
 ## Local QuickStart
 
 Ensure [Docker](https://docs.docker.com/get-docker/) is installed and running on your system, then run the following:
 
 ```bash
 pip install -U "langchain[openai]"
@@ -19,18 +20,64 @@ LANGCHAIN_TRACING_V2=true python -c "from langchain.chat_models import ChatOpenA
 
 ## Saving Traces
 
-Once you've either launched the local tracing server or made an account and retrieved an API key to the hosted solution, the easiest way to start logging traces is:
+Once you've launched the local tracing server or made an account and retrieved an API key for the hosted solution, your LangChain application will automatically log traces as long as
+you set the following environment variables:
 
 ```bash
 export LANGCHAIN_TRACING_V2="true"
-# export LANGCHAIN_ENDPOINT="https://api.langchain.plus" # Uncomment if using hosted if logging traces to a hosted server
-# export LANGCHAIN_API_KEY="my api key" # Uncomment add add your API key generated from the settings page if logging traces to a hosted server
-# export LANGCHAIN_SESSION="my session name" # Otherwise, traces are stored in the "default" session
+# export LANGCHAIN_ENDPOINT="https://api.langchain.plus" # Uncomment if using hosted server
+# export LANGCHAIN_API_KEY="my api key" # Uncomment and add your API key generated from the settings page if using a hosted server
 ```
 
+As long as these variables are correctly set, and the server is online, all your LangChain runs will be saved. You can interact with these traces in the UI or using the `LangChainPlus` client.
+
+## Tracing UI Walkthrough
+
+When you first access the LangChain Plus UI (and after signing in, if you are using the hosted version), you should be greeted by the home screen with more instructions on how to get started.
+From here, you can navigate to the `Sessions` and `Datasets` pages. For more information on using datasets in LangChain Plus, check out the [Datasets](../tracing/datasets.md) guide.
+
+Traces from your LangChain runs can be found in the `Sessions` page. A "default" session should already be created for you.
+
+A session is just a way to group traces together. If you click on a session, it will take you to a page with no recorded runs.
+You can create and save traces to new sessions by specifying the `LANGCHAIN_SESSION` environment variable in your LangChain application. You can check out the [Changing Sessions](#changing-sessions) section below for more configuration options.
+
+<!-- TODO Add screenshots when the UI settles down a bit -->
+<!--  -->
+
+If we click on the `default` session, we can see that to start we have no traces stored.
+
+<!-- TODO Add screenshots when the UI settles down a bit -->
+<!--  -->
+
+If we now start running chains and agents with tracing enabled, we will see data show up here.
+
+<!-- To do so, we can run [this notebook](../tracing/agent_with_tracing.ipynb) as an example. After running it, we will see an initial trace show up. -->
+
+<!-- TODO Add screenshots when the UI settles down a bit -->
+
+From here we can explore the trace at a high level by clicking on the arrow to show nested runs.
+We can keep on clicking further and further down to explore deeper and deeper.
+
+<!-- TODO Add screenshots when the UI settles down a bit -->
+<!--  -->
+
+We can also click on the "Explore" button of the top level run to dive even deeper.
+Here, we can see the inputs and outputs in full, as well as all the nested traces.
+
+<!-- TODO Add screenshots when the UI settles down a bit -->
+<!--  -->
+
+We can keep on exploring each of these nested traces in more detail.
+For example, here is the lowest level trace with the exact inputs/outputs to the LLM.
+
+<!-- TODO Add screenshots when the UI settles down a bit -->
+<!--  -->
+
 
 ## Changing Sessions
 
-1. To initially record traces to a session other than `"default"`, you can set the `LANGCHAIN_SESSION` environment variable to the name of the session you want to record to:
+1. To record traces to a session other than `"default"`, you can set the `LANGCHAIN_SESSION` environment variable to the name of the session you want to record to:
 
 ```python
 import os
@@ -68,49 +115,3 @@ export LANGCHAIN_TRACING_V2="true"
 os.environ["LANGCHAIN_SESSION"] = "My New Session"
 # ... traces logged to "My New Session" ...
 ```
-
-## Tracing UI Walkthrough
-
-When you first access the LangChain Plus UI (and after signing in, if you are using the hosted version), you should be greeted by the home screen with more instructions on how to get started.
-
-Traces from your LangChain runs can be found in the `Sessions` page. A "default" session should already be created for you.
-
-A session is just a way to group traces together. If you click on a session, it will take you to a page with no recorded traces.
-You can create a new session with the `Create Session` form, or by specifying a new session in when capturing traces in LangChain.
-
-<!-- TODO Add screenshots when the UI settles down a bit -->
-<!--  -->
-
-If we click on the `default` session, we can see that to start we have no traces stored.
-
-<!-- TODO Add screenshots when the UI settles down a bit -->
-<!--  -->
-
-If we now start running chains and agents with tracing enabled, we will see data show up here.
-
-<!-- To do so, we can run [this notebook](../tracing/agent_with_tracing.ipynb) as an example. After running it, we will see an initial trace show up. -->
-
-<!-- TODO Add screenshots when the UI settles down a bit -->
-
-From here we can explore the trace at a high level by clicking on the arrow to show nested runs.
-We can keep on clicking further and further down to explore deeper and deeper.
-
-<!-- TODO Add screenshots when the UI settles down a bit -->
-<!--  -->
-
-We can also click on the "Explore" button of the top level run to dive even deeper.
-Here, we can see the inputs and outputs in full, as well as all the nested traces.
-
-<!-- TODO Add screenshots when the UI settles down a bit -->
-<!--  -->
-
-We can keep on exploring each of these nested traces in more detail.
-For example, here is the lowest level trace with the exact inputs/outputs to the LLM.
-
-<!-- TODO Add screenshots when the UI settles down a bit -->
-<!--  -->
-
-
-## Using the LangChainPlus Client
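The tracing environment variables changed in the hunks above can also be set from Python before any chains run. A minimal sketch: setting the variables needs no server, though traces are only recorded once a tracing server is online and LangChain code actually runs.

```python
import os

# Enable v2 tracing; without LANGCHAIN_ENDPOINT set, traces go to the local server.
os.environ["LANGCHAIN_TRACING_V2"] = "true"
# Group traces under a named session instead of the "default" one.
os.environ["LANGCHAIN_SESSION"] = "My New Session"
# For the hosted server, uncomment and fill in:
# os.environ["LANGCHAIN_ENDPOINT"] = "https://api.langchain.plus"
# os.environ["LANGCHAIN_API_KEY"] = "<YOUR-LANGCHAINPLUS-API-KEY>"

print(os.environ["LANGCHAIN_SESSION"])  # My New Session
```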
@@ -1,4 +1,4 @@
-# Tracing Datasets
+# Datasets
 
 This guide provides instructions for how to use Datasets generated from LangChain traces.
 
@@ -12,10 +12,58 @@ Some things datasets are useful for include:
 
 ## Creating a Dataset
 
-Datasets store the inputs and outputs of LLM, chat model, or chain or agent runs
+Datasets store the inputs and outputs of LLM, chat model, chain, or agent runs.
+
+### Using the LangChainPlusClient
+
+To create the client:
+
+```python
+import os
+from langchain.client import LangChainPlusClient
+
+os.environ["LANGCHAIN_TRACING_V2"] = "true"
+os.environ["LANGCHAIN_SESSION"] = "Tracing Walkthrough"
+# os.environ["LANGCHAIN_ENDPOINT"] = "https://api.langchain.plus" # Uncomment this line if you want to use the hosted version
+# os.environ["LANGCHAIN_API_KEY"] = "<YOUR-LANGCHAINPLUS-API-KEY>" # Uncomment this line if you want to use the hosted version.
+
+client = LangChainPlusClient()
+```
+
+There are two main ways to create datasets:
+
+1. From a CSV or pandas DataFrame
+
+```python
+csv_path = "path/to/data.csv"
+input_keys = ["input"]  # column names that will be input to the Chain or LLM
+output_keys = ["output"]  # column names that are the output of the Chain or LLM
+description = "My dataset for evaluation"
+dataset = client.upload_csv(
+    csv_path,
+    description=description,
+    input_keys=input_keys,
+    output_keys=output_keys,
+)
+# Or as a DataFrame
+import pandas as pd
+df = pd.read_csv(csv_path)
+dataset = client.upload_dataframe(
+    df,
+    "My Dataset",
+    description=description,
+    input_keys=input_keys,
+    output_keys=output_keys,
+)
+```
+
+2. From traced runs. Assuming you've already captured runs in a session called "My Agent Session":
+
+```python
+runs = client.list_runs(session_name="My Agent Session")
+dataset = client.create_dataset("My Dataset", "Examples from My Agent")
+for run in runs:
+    client.create_example(inputs=run.inputs, outputs=run.outputs, dataset_id=dataset.id)
+```
+
+
 ### Using the UI
 
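To illustrate the DataFrame path added in this hunk with an offline sketch: the `input`/`output` column names and the toy rows below are invented for illustration, and the actual `upload_dataframe` call (commented out) requires a client connected to a running LangChain Plus server.

```python
import pandas as pd

# Toy dataset in the shape the upload expects: some columns are inputs to the
# chain or LLM, others are the reference outputs to evaluate against.
df = pd.DataFrame(
    {
        "input": ["What is 2 + 2?", "What is the capital of France?"],
        "output": ["4", "Paris"],
    }
)
input_keys = ["input"]    # fed to the Chain or LLM
output_keys = ["output"]  # compared against the Chain or LLM output
# With a connected client, one would then call (not executed here):
# client.upload_dataframe(df, "My Dataset", description="toy example",
#                         input_keys=input_keys, output_keys=output_keys)
print(df.shape)  # (2, 2)
```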
@@ -4,11 +4,11 @@ This guide provides instructions for installing and setting up your environment
 
 ## Installation
 
-1. Ensure Docker is installed and running on your system. To install Docker, refer to the [Get Docker](https://docs.docker.com/get-docker/) documentation.
-2. Install the latest version of `langchain` by running the following command:
+1. Install the latest version of `langchain` by running the following command:
 ```bash
 pip install -U langchain
 ```
+2. Ensure Docker is installed and running on your system. To install Docker, refer to the [Get Docker](https://docs.docker.com/get-docker/) documentation.
 3. Start the LangChain Plus tracing server by executing the following command in your terminal:
 ```bash
 langchain plus start
@@ -16,8 +16,7 @@ This guide provides instructions for installing and setting up your environment
 _Note: The `langchain` command was installed when you installed the LangChain library using `pip install langchain`._
 
 4. After the server has started, it will open the [Local UI](http://localhost). In the terminal, it will also display environment variables that you can configure to send your traces to the server. For more details on this, refer to the Environment Setup section below.
 
-5. To stop the server, run the following command in your terminal:
+5. To stop the server, you can run the following command in your terminal:
 ```bash
 langchain plus stop
 ```
 
@@ -127,7 +127,7 @@ class PlusCommand:
             ]
         )
         logger.info(
-            "TheLangChain Plus server is running at http://localhost. To connect"
+            "The LangChain Plus server is running at http://localhost. To connect"
             " locally, set the following environment variables"
             " before running your LangChain application:\n"
         )
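The bug fixed in these two hunks is a missing space inside the first string literal; the surrounding pattern works because Python concatenates adjacent string literals with no separator, so each continuation fragment must carry its own leading space. A standalone sketch of that behavior:

```python
# Adjacent string literals are joined at compile time with no separator,
# so the continuation fragments below start with a space on purpose.
message = (
    "The LangChain Plus server is running at http://localhost. To connect"
    " locally, set the following environment variables"
    " before running your LangChain application:\n"
)
print("To connect locally," in message)  # True
```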
@@ -159,7 +159,7 @@ class PlusCommand:
         )
         ngrok_url = get_ngrok_url(auth_token)
         logger.info(
-            "TheLangChain Plus server is running at http://localhost and"
+            "The LangChain Plus server is running at http://localhost and"
            f" exposed at URL {ngrok_url}. To connect remotely,"
             " set the following environment variables"
             " before running your LangChain application:\n"
@@ -229,6 +229,7 @@ class LangChainPlusClient(BaseSettings):
         *,
         session_id: Optional[str] = None,
         session_name: Optional[str] = None,
+        execution_order: Optional[int] = 1,
         run_type: Optional[str] = None,
         **kwargs: Any,
     ) -> List[Run]:
@@ -238,7 +239,10 @@ class LangChainPlusClient(BaseSettings):
                raise ValueError("Only one of session_id or session_name may be given")
            session_id = self.read_session(session_name=session_name).id
        query_params = ListRunsQueryParams(
-            session_id=session_id, run_type=run_type, **kwargs
+            session_id=session_id,
+            run_type=run_type,
+            execution_order=execution_order,
+            **kwargs,
        )
        filtered_params = {
            k: v for k, v in query_params.dict().items() if v is not None
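The `filtered_params` comprehension at the end of this hunk drops unset (`None`) values so only explicitly provided filters are sent as query parameters; its behavior in isolation:

```python
# Mirror the query-parameter filtering from the diff: None values are dropped,
# so optional filters like run_type are omitted from the request when unset.
query_params = {
    "session_id": "abc123",
    "run_type": None,
    "execution_order": 1,
}
filtered_params = {k: v for k, v in query_params.items() if v is not None}
print(filtered_params)  # {'session_id': 'abc123', 'execution_order': 1}
```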