Locally Hosted Tracing Setup
This guide provides instructions for installing and setting up your environment to use the locally hosted version of the LangChain Plus tracing server. For instructions on a hosted tracing solution, please reference the Hosted Tracing Setup guide.
If you have Docker running, the snippet below is all you need to run a tracing "Hello, World!". Otherwise, continue with the Installation instructions.
pip install -U "langchain[openai]"
langchain plus start
LANGCHAIN_TRACING_V2=true python -c "from langchain.chat_models import ChatOpenAI; print(ChatOpenAI().predict('Hello, world!'))"
Installation
- Install the latest version of langchain by running the following command:
  pip install -U langchain
- Ensure Docker is installed and running on your system. To install Docker, refer to the Get Docker documentation.
- Start the LangChain Plus tracing server by executing the following command in your terminal:
  langchain plus start
  Note: The langchain command was installed when you installed the LangChain library (pip install langchain).
- After the server has started, it will open the local UI. In the terminal, it will also display environment variables that you can configure to send your traces to the server. For more details, refer to the Environment Configuration section below.
- To stop the server, run the following command in your terminal:
  langchain plus stop
Environment Configuration
With the LangChain Plus tracing server running, you can begin sending traces by setting the LANGCHAIN_TRACING_V2 environment variable. Here's an example of setting all relevant environment variables in your shell:
export LANGCHAIN_TRACING_V2="true"
# export LANGCHAIN_SESSION="My Session Name" # Optional, otherwise, traces are logged to the "default" session
Or in Python:
import os
os.environ["LANGCHAIN_TRACING_V2"] = "true"
# os.environ["LANGCHAIN_SESSION"] = "My Session Name" # Optional, otherwise, traces are logged to the "default" session
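If you set these variables from Python, do it in-process before running your chains. As a small convenience, you could wrap the two assignments above in a helper like the following (enable_tracing is a hypothetical name for this sketch, not part of LangChain):

```python
import os
from typing import Optional


def enable_tracing(session: Optional[str] = None) -> None:
    """Set the documented tracing environment variables for this process.

    Call this before running your chains so traces are sent to the
    locally hosted server.
    """
    # Values in os.environ must be strings, so use "true", not True.
    os.environ["LANGCHAIN_TRACING_V2"] = "true"
    if session is not None:
        # Optional: route traces to a named session instead of "default".
        os.environ["LANGCHAIN_SESSION"] = session


enable_tracing("My Session Name")
```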
Tracing Context Manager
Although using environment variables is recommended for most tracing use cases, you can configure runs to be sent to a specific session using the context manager:
from langchain.callbacks.manager import tracing_v2_enabled
with tracing_v2_enabled("My Session Name"):
...
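If you're curious how a session-scoped context manager like this typically behaves, here is a simplified, illustrative sketch of the pattern using only the standard library. It is not LangChain's actual implementation (which manages callback state rather than an environment variable); it only shows the save-on-entry, restore-on-exit discipline that keeps nested or sequential blocks from leaking state:

```python
import os
from contextlib import contextmanager


@contextmanager
def tracing_session(session_name: str):
    """Illustrative stand-in: scope a session name to a `with` block.

    Saves any previous value on entry and restores it on exit, so code
    outside the block sees its original configuration.
    """
    previous = os.environ.get("LANGCHAIN_SESSION")
    os.environ["LANGCHAIN_SESSION"] = session_name
    try:
        yield
    finally:
        if previous is None:
            os.environ.pop("LANGCHAIN_SESSION", None)
        else:
            os.environ["LANGCHAIN_SESSION"] = previous
```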
Connecting from a Remote Server
To connect to LangChainPlus when running applications on a remote server, such as a Google Colab notebook, we offer two simple options:
- Use our hosted tracing server.
- Expose a public URL to your local tracing service.
Below are the full instructions to start and expose a local LangChainPlus server and connect to it from a remote environment:
- Ensure Docker is installed and running on your system. To install Docker, refer to the Get Docker documentation.
- Install the latest version of langchain by running the following command:
  pip install -U langchain
- Start the LangChain Plus tracing server and expose it by executing the following command in your terminal:
  langchain plus start --expose
  Note: The --expose flag is required to expose your local server to the internet. By default, ngrok permits tunneling for up to 2 hours at a time. For longer sessions, you can create an ngrok account and use your auth token:
  langchain plus start --expose --ngrok-authtoken "your auth token"
- After the server has started, it will open the local LangChainPlus UI as well as the ngrok dashboard. In the terminal, it will also display the environment variables needed to send traces to the server via the tunnel URL. These will look something like the following:
  LANGCHAIN_TRACING_V2=true
  LANGCHAIN_ENDPOINT=https://1234-01-23-45-678.ngrok.io
- In your remote LangChain application, set the environment variables using the output from your terminal in the previous step:
  import os
  os.environ["LANGCHAIN_TRACING_V2"] = "true"
  os.environ["LANGCHAIN_ENDPOINT"] = "https://1234-01-23-45-678.ngrok.io"  # Replace with your ngrok tunnel URL
- Run your LangChain code and visualize the traces in the LangChainPlus UI.
- To stop the server, run the following command in your terminal:
  langchain plus stop
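Copying the tunnel URL by hand is error-prone. A small helper like the hypothetical one below (not part of LangChain) validates the URL shape before setting the variables from the steps above, so a bad paste fails fast instead of silently dropping traces:

```python
import os
from urllib.parse import urlparse


def connect_remote_tracing(endpoint: str) -> None:
    """Point this process's tracing at a remote (e.g. ngrok) endpoint.

    Raises ValueError for an obviously malformed URL.
    """
    parsed = urlparse(endpoint)
    if parsed.scheme not in ("http", "https") or not parsed.netloc:
        raise ValueError(f"Expected an http(s) URL, got: {endpoint!r}")
    # Environment values must be strings, so use "true", not True.
    os.environ["LANGCHAIN_TRACING_V2"] = "true"
    os.environ["LANGCHAIN_ENDPOINT"] = endpoint


# Replace with the tunnel URL printed by `langchain plus start --expose`.
connect_remote_tracing("https://1234-01-23-45-678.ngrok.io")
```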
Congratulations!
Now that you've set up the tracing server, you can use it to debug, monitor, and evaluate your LangChain applications. What's next?
- For an overview of the LangChain Plus UI, check out the LangChain Tracing guide.
- For information on how to use your traces as datasets for testing and evaluation, check out the Datasets guide.