{
"cells": [
{
"cell_type": "markdown",
"id": "40dab0fa-e56c-4958-959e-bd6d6f829724",
"metadata": {
"tags": []
},
"source": [
"# Trubrics\n",
"\n",
"\n",
">[Trubrics](https://trubrics.com) is an LLM user analytics platform that lets you collect, analyse and manage user\n",
"prompts & feedback on AI models.\n",
">\n",
">Check out [Trubrics repo](https://github.com/trubrics/trubrics-sdk) for more information on `Trubrics`.\n",
"\n",
"In this guide, we will go over how to set up the `TrubricsCallbackHandler`. \n"
]
},
{
"cell_type": "markdown",
"id": "c0d060d5-133b-496e-b76e-43284d5545b8",
"metadata": {
"tags": []
},
"source": [
"## Installation and Setup"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "ce799e10-5433-4b29-8fa1-c1352f761918",
"metadata": {},
"outputs": [],
"source": [
"%pip install --upgrade --quiet trubrics"
]
},
{
"cell_type": "markdown",
"id": "44666917-85f2-4695-897d-54504e343604",
"metadata": {},
"source": [
"### Getting Trubrics Credentials\n",
"\n",
"If you do not have a Trubrics account, create one on [here](https://trubrics.streamlit.app/). In this tutorial, we will use the `default` project that is built upon account creation.\n",
"\n",
"Now set your credentials as environment variables:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "cd696d03-bea8-42bd-914b-2290fcafb5c9",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"import os\n",
"\n",
"os.environ[\"TRUBRICS_EMAIL\"] = \"***@***\"\n",
"os.environ[\"TRUBRICS_PASSWORD\"] = \"***\""
]
},
{
"cell_type": "markdown",
"id": "cd7177b0-a9e8-45ae-adb0-ea779376511b",
"metadata": {
"tags": []
},
"source": [
"### Usage"
]
},
{
"cell_type": "markdown",
"id": "6ec1bcd4-3824-43de-84a4-3102a2f6d26d",
"metadata": {},
"source": [
"The `TrubricsCallbackHandler` can receive various optional arguments. See [here](https://trubrics.github.io/trubrics-sdk/platform/user_prompts/#saving-prompts-to-trubrics) for kwargs that can be passed to Trubrics prompts.\n",
"\n",
"```python\n",
"class TrubricsCallbackHandler(BaseCallbackHandler):\n",
"\n",
"    \"\"\"\n",
"    Callback handler for Trubrics.\n",
"    \n",
"    Args:\n",
"        project: a trubrics project, default project is \"default\"\n",
"        email: a trubrics account email, can equally be set in env variables\n",
"        password: a trubrics account password, can equally be set in env variables\n",
"        **kwargs: all other kwargs are parsed and set to trubrics prompt variables, or added to the `metadata` dict\n",
"    \"\"\"\n",
"```"
]
},
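{
"cell_type": "markdown",
"id": "b3a7c9e1-5d2f-4e6a-9b8c-trubrics-kwargs",
"metadata": {},
"source": [
"For example, extra keyword arguments can be passed straight to the handler. This is a minimal sketch: `session_id` is an arbitrary, illustrative kwarg (not a required argument); per the docstring above, any such kwarg is saved as a Trubrics prompt variable or added to the prompt's `metadata` dict:\n",
"\n",
"```python\n",
"from langchain.callbacks import TrubricsCallbackHandler\n",
"\n",
"# \"session_id\" is a hypothetical example kwarg; it is stored with each logged prompt\n",
"handler = TrubricsCallbackHandler(project=\"default\", session_id=\"session-1\")\n",
"```"
]
},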
{
"cell_type": "markdown",
"id": "44d60d9f-b2bd-4ed4-b624-54cce8313815",
"metadata": {
"tags": []
},
"source": [
"## Examples"
]
},
{
"cell_type": "markdown",
"id": "d38e80f0-7254-4180-82ec-ebd5ee232906",
"metadata": {
"tags": []
},
"source": [
"Here are two examples of how to use the `TrubricsCallbackHandler` with Langchain [LLMs](https://python.langchain.com/docs/modules/model_io/llms/) or [Chat Models](https://python.langchain.com/docs/modules/model_io/chat/). We will use OpenAI models, so set your `OPENAI_API_KEY` key here:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "9d394b7f-45eb-44ec-b721-17d2402de805",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"os.environ[\"OPENAI_API_KEY\"] = \"sk-***\""
]
},
{
"cell_type": "markdown",
"id": "33be2663-1518-4064-a6a9-4f1ae24ba9d1",
"metadata": {
"tags": []
},
"source": [
"### 1. With an LLM"
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "6933f7b7-262b-4acf-8c7c-785d1f32b49f",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"from langchain.callbacks import TrubricsCallbackHandler\n",
"from langchain_openai import OpenAI"
]
},
{
"cell_type": "code",
"execution_count": 5,
"id": "eabfa598-0562-46bf-8d64-e751d4d91963",
"metadata": {
"tags": []
},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"\u001b[32m2023-09-26 11:30:02.149\u001b[0m | \u001b[1mINFO \u001b[0m | \u001b[36mtrubrics.platform.auth\u001b[0m:\u001b[36mget_trubrics_auth_token\u001b[0m:\u001b[36m61\u001b[0m - \u001b[1mUser jeff.kayne@trubrics.com has been authenticated.\u001b[0m\n"
]
}
],
"source": [
"llm = OpenAI(callbacks=[TrubricsCallbackHandler()])"
]
},
{
"cell_type": "code",
"execution_count": 6,
"id": "a65f9f5d-5ec5-4b1b-a1d8-9520cbadab39",
"metadata": {
"tags": []
},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"\u001b[32m2023-09-26 11:30:07.760\u001b[0m | \u001b[1mINFO \u001b[0m | \u001b[36mtrubrics.platform\u001b[0m:\u001b[36mlog_prompt\u001b[0m:\u001b[36m102\u001b[0m - \u001b[1mUser prompt saved to Trubrics.\u001b[0m\n",
"\u001b[32m2023-09-26 11:30:08.042\u001b[0m | \u001b[1mINFO \u001b[0m | \u001b[36mtrubrics.platform\u001b[0m:\u001b[36mlog_prompt\u001b[0m:\u001b[36m102\u001b[0m - \u001b[1mUser prompt saved to Trubrics.\u001b[0m\n"
]
}
],
"source": [
"res = llm.generate([\"Tell me a joke\", \"Write me a poem\"])"
]
},
{
"cell_type": "code",
"execution_count": 7,
"id": "68b60b98-01da-47be-b513-b71e68f97940",
"metadata": {
"tags": []
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"--> GPT's joke: \n",
"\n",
"Q: What did the fish say when it hit the wall?\n",
"A: Dam!\n",
"\n",
"--> GPT's poem: \n",
"\n",
"A Poem of Reflection\n",
"\n",
"I stand here in the night,\n",
"The stars above me filling my sight.\n",
"I feel such a deep connection,\n",
"To the world and all its perfection.\n",
"\n",
"A moment of clarity,\n",
"The calmness in the air so serene.\n",
"My mind is filled with peace,\n",
"And I am released.\n",
"\n",
"The past and the present,\n",
"My thoughts create a pleasant sentiment.\n",
"My heart is full of joy,\n",
"My soul soars like a toy.\n",
"\n",
"I reflect on my life,\n",
"And the choices I have made.\n",
"My struggles and my strife,\n",
"The lessons I have paid.\n",
"\n",
"The future is a mystery,\n",
"But I am ready to take the leap.\n",
"I am ready to take the lead,\n",
"And to create my own destiny.\n"
]
}
],
"source": [
"print(\"--> GPT's joke: \", res.generations[0][0].text)\n",
"print()\n",
"print(\"--> GPT's poem: \", res.generations[1][0].text)"
]
},
{
"cell_type": "markdown",
"id": "8c767458-c9b8-4d4d-a48c-996e9be00257",
"metadata": {
"tags": []
},
"source": [
"### 2. With a chat model"
]
},
{
"cell_type": "code",
"execution_count": 8,
"id": "8a61cb5e-bed9-4618-b547-fc21b6e319c4",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"from langchain.callbacks import TrubricsCallbackHandler\n",
"from langchain_core.messages import HumanMessage, SystemMessage\n",
"from langchain_openai import ChatOpenAI"
]
},
{
"cell_type": "code",
"execution_count": 9,
"id": "a1ff1efb-305b-4e82-aea2-264b78350f14",
"metadata": {},
"outputs": [],
"source": [
"chat_llm = ChatOpenAI(\n",
"    callbacks=[\n",
"        TrubricsCallbackHandler(\n",
"            project=\"default\",\n",
"            tags=[\"chat model\"],\n",
"            user_id=\"user-id-1234\",\n",
"            some_metadata={\"hello\": [1, 2]},\n",
"        )\n",
"    ]\n",
")"
]
},
{
"cell_type": "code",
"execution_count": 10,
"id": "c83d3956-99ab-4b6f-8515-0def83a1698c",
"metadata": {
"tags": []
},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"\u001b[32m2023-09-26 11:30:10.550\u001b[0m | \u001b[1mINFO \u001b[0m | \u001b[36mtrubrics.platform\u001b[0m:\u001b[36mlog_prompt\u001b[0m:\u001b[36m102\u001b[0m - \u001b[1mUser prompt saved to Trubrics.\u001b[0m\n"
]
}
],
"source": [
"chat_res = chat_llm(\n",
"    [\n",
"        SystemMessage(content=\"Every answer of yours must be about OpenAI.\"),\n",
"        HumanMessage(content=\"Tell me a joke\"),\n",
"    ]\n",
")"
]
},
{
"cell_type": "code",
"execution_count": 11,
"id": "40b10314-1727-4dcd-993e-37a52e2349c6",
"metadata": {
"tags": []
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Why did the OpenAI computer go to the party?\n",
"\n",
"Because it wanted to meet its AI friends and have a byte of fun!\n"
]
}
],
"source": [
"print(chat_res.content)"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "f66f438d-12e0-4bdd-b004-601495f84c73",
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.12"
}
},
"nbformat": 4,
"nbformat_minor": 5
}