Harrison/tracing docs (#804)

Co-authored-by: Ankush Gola <9536492+agola11@users.noreply.github.com>
Harrison Chase authored 2023-01-29 20:24:22 -08:00, committed by GitHub
parent ae1b589f60
commit f3da4dc6ba
11 changed files with 235 additions and 0 deletions


@@ -0,0 +1,108 @@
{
"cells": [
{
"cell_type": "code",
"execution_count": 1,
"id": "17c04cc6-c93d-4b6c-a033-e897577f4ed1",
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"os.environ[\"LANGCHAIN_HANDLER\"] = \"langchain\"\n",
"\n",
"## Uncomment this if using hosted setup.\n",
"\n",
"# os.environ[\"LANGCHAIN_ENDPOINT\"] = \"https://langchain-api-gateway-57eoxz8z.uc.gateway.dev\" \n",
"\n",
"## Uncomment this if you want traces to be recorded to \"my_session\" instead of default.\n",
"\n",
"# os.environ[\"LANGCHAIN_SESSION\"] = \"my_session\" \n",
"\n",
"## Better to set this environment variable in the terminal\n",
"# Uncomment this if using hosted version. Replace \"my_api_key\" with your actual API Key.\n",
"\n",
"# os.environ[\"LANGCHAIN_API_KEY\"] = \"my_api_key\" \n",
"\n",
"import langchain\n",
"from langchain.agents import Tool, initialize_agent, load_tools\n",
"from langchain.llms import OpenAI"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "bfa16b79-aa4b-4d41-a067-70d1f593f667",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"\n",
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
"\u001b[32;1m\u001b[1;3m I need to use a calculator to solve this.\n",
"Action: Calculator\n",
"Action Input: 2^.123243\u001b[0m\n",
"Observation: \u001b[36;1m\u001b[1;3mAnswer: 1.0891804557407723\n",
"\u001b[0m\n",
"Thought:\u001b[32;1m\u001b[1;3m I now know the final answer.\n",
"Final Answer: 1.0891804557407723\u001b[0m\n",
"\n",
"\u001b[1m> Finished chain.\u001b[0m\n"
]
},
{
"data": {
"text/plain": [
"'1.0891804557407723'"
]
},
"execution_count": 2,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# Agent run with tracing. Ensure that OPENAI_API_KEY is set appropriately to run this example.\n",
"\n",
"llm = OpenAI(temperature=0)\n",
"tools = load_tools([\"llm-math\"], llm=llm)\n",
"agent = initialize_agent(\n",
" tools, llm, agent=\"zero-shot-react-description\", verbose=True\n",
")\n",
"\n",
"agent.run(\"What is 2 raised to .123243 power?\")"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "0bf0304c",
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.9"
}
},
"nbformat": 4,
"nbformat_minor": 5
}

Six new binary image files (including docs/tracing/explore.png and docs/tracing/homepage.png); previews not shown.


@@ -0,0 +1,36 @@
# Cloud Hosted Setup
We offer a hosted version of tracing at https://langchainplus.vercel.app/. You can use this to view traces from your runs without having to run the server locally.
Note: we are currently offering this only to a limited number of users. The hosted platform is VERY alpha, in active development, and data might be dropped at any time. Don't depend on data being persisted in the system long term, and don't log traces that may contain sensitive information. If you're interested in using the hosted platform, please fill out the form [here](https://forms.gle/tRCEMSeopZf6TE3b6).
## Installation
1. Log in to the system and click "API Key" in the top right corner. Generate a new key and keep it safe. You will need it to authenticate with the system.
## Environment Setup
After installation, you must set up your environment to use tracing.
You can do this by setting an environment variable in your terminal: `export LANGCHAIN_HANDLER=langchain`.
You can also do it by adding the snippet below to the top of every script. **IMPORTANT:** this must go at the VERY TOP of your script, before you import anything from `langchain`.
```python
import os
os.environ["LANGCHAIN_HANDLER"] = "langchain"
```
You will also need to set environment variables to specify the endpoint and your API key:
1. `LANGCHAIN_ENDPOINT` - set this to "https://langchain-api-gateway-57eoxz8z.uc.gateway.dev".
2. `LANGCHAIN_API_KEY` - set this to the API key you generated during installation.
An example of adding all relevant environment variables is below:
```python
import os
os.environ["LANGCHAIN_HANDLER"] = "langchain"
os.environ["LANGCHAIN_ENDPOINT"] = "https://langchain-api-gateway-57eoxz8z.uc.gateway.dev"
os.environ["LANGCHAIN_API_KEY"] = "my_api_key" # Don't commit this to your repo! Better to set it in your terminal.
```
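
Once these variables are in place, any chain or agent you run afterwards is traced and shows up in the hosted UI. Below is a minimal end-to-end sketch mirroring the agent example from the tracing notebook; it assumes `OPENAI_API_KEY` is set in your environment and that `"my_api_key"` is replaced with the key you generated.
```python
import os

# These must be set before importing anything from langchain.
os.environ["LANGCHAIN_HANDLER"] = "langchain"
os.environ["LANGCHAIN_ENDPOINT"] = "https://langchain-api-gateway-57eoxz8z.uc.gateway.dev"
os.environ["LANGCHAIN_API_KEY"] = "my_api_key"  # Replace with your real key; better to set it in your terminal.
# os.environ["LANGCHAIN_SESSION"] = "my_session"  # Optional: record traces to "my_session" instead of "default".

from langchain.agents import initialize_agent, load_tools
from langchain.llms import OpenAI

# Any chain or agent run from here on is traced.
llm = OpenAI(temperature=0)
tools = load_tools(["llm-math"], llm=llm)
agent = initialize_agent(tools, llm, agent="zero-shot-react-description", verbose=True)
agent.run("What is 2 raised to .123243 power?")
```
The resulting trace can then be explored in the hosted UI.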


@@ -0,0 +1,35 @@
# Locally Hosted Setup
This page contains instructions for installing and then setting up the environment to use the locally hosted version of tracing.
## Installation
1. Ensure you have Docker installed (see [Get Docker](https://docs.docker.com/get-docker/)) and that it is running.
2. Install the latest version of `langchain`: `pip install langchain`, or `pip install langchain -U` to upgrade an existing installation.
3. Run `langchain-server`.
   1. This will spin up the server in the terminal.
   2. Once you see the terminal output `langchain-langchain-frontend-1 | ➜ Local: http://localhost:4173/`, navigate to [http://localhost:4173/](http://localhost:4173/).
4. You should see a page with your tracing sessions. See the overview page for a walkthrough of the UI.
5. Currently, trace data is not guaranteed to be persisted between runs of `langchain-server`. If you want to
persist your data, you can mount a volume to the Docker container. See the [Docker docs](https://docs.docker.com/storage/volumes/) for more info.
6. To stop the server, press `Ctrl+C` in the terminal where you ran `langchain-server`.
## Environment Setup
After installation, you must set up your environment to use tracing.
You can do this by setting an environment variable in your terminal: `export LANGCHAIN_HANDLER=langchain`.
You can also do it by adding the snippet below to the top of every script. **IMPORTANT:** this must go at the VERY TOP of your script, before you import anything from `langchain`.
```python
import os
os.environ["LANGCHAIN_HANDLER"] = "langchain"
```
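
With the server running and `LANGCHAIN_HANDLER` set, any chain or agent run afterwards is traced and can be explored at [http://localhost:4173/](http://localhost:4173/). A minimal sketch mirroring the agent example from the tracing notebook; it assumes `OPENAI_API_KEY` is set in your environment.
```python
import os

# Must be set before importing anything from langchain.
os.environ["LANGCHAIN_HANDLER"] = "langchain"
# os.environ["LANGCHAIN_SESSION"] = "my_session"  # Optional: record traces to "my_session" instead of "default".

from langchain.agents import initialize_agent, load_tools
from langchain.llms import OpenAI

# Any chain or agent run from here on is traced and appears in the local UI.
llm = OpenAI(temperature=0)
tools = load_tools(["llm-math"], llm=llm)
agent = initialize_agent(tools, llm, agent="zero-shot-react-description", verbose=True)
agent.run("What is 2 raised to .123243 power?")
```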