Compare commits

..

3 Commits

Author | SHA1 | Message | Date
Harrison Chase | 80bb3206da | cr | 2023-04-10 21:22:16 -07:00
Harrison Chase | a0cd0175a8 | cr | 2023-04-10 21:19:05 -07:00
mofahad | fde13c9e95 | AWS Textract Text Extraction (#2283): Added an AWS Textract extraction python file to the document loader module. This will help in extracting text from images with the AWS Textract service. | 2023-04-10 21:17:23 -07:00
68 changed files with 1535 additions and 203898 deletions


@@ -1,87 +0,0 @@
{
"cells": [
{
"cell_type": "markdown",
"id": "66a7777e",
"metadata": {},
"source": [
"# Bilibili\n",
"\n",
"This loader utilizes the `bilibili-api` to fetch the text transcript from Bilibili, one of the most beloved long-form video sites in China.\n",
"\n",
"With this BiliBiliLoader, users can easily obtain the transcript of their desired video content on the platform."
]
},
{
"cell_type": "code",
"execution_count": 11,
"id": "9ec8a3b3",
"metadata": {},
"outputs": [],
"source": [
"from langchain.document_loaders.bilibili import BiliBiliLoader"
]
},
{
"cell_type": "code",
"execution_count": 12,
"id": "43128d8d",
"metadata": {},
"outputs": [],
"source": [
"#!pip install bilibili-api"
]
},
{
"cell_type": "code",
"execution_count": 13,
"id": "35d6809a",
"metadata": {
"pycharm": {
"name": "#%%\n"
}
},
"outputs": [],
"source": [
"loader = BiliBiliLoader(\n",
" [\"https://www.bilibili.com/video/BV1xt411o7Xu/\"]\n",
")"
]
},
{
"cell_type": "code",
"execution_count": null,
"outputs": [],
"source": [
"loader.load()"
],
"metadata": {
"collapsed": false,
"pycharm": {
"name": "#%%\n"
}
}
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.9"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
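The deleted notebook above demonstrates the `BiliBiliLoader` interface: construct a loader from a list of video URLs, then call `load()` to get documents back. The general loader pattern it follows can be sketched in dependency-free Python — this is an illustrative stand-in, not LangChain's actual implementation, and `TranscriptLoader` is a hypothetical name:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Document:
    # Mirrors the shape LangChain loaders return: text plus metadata.
    page_content: str
    metadata: dict = field(default_factory=dict)

class TranscriptLoader:
    """Hypothetical stand-in for a loader like BiliBiliLoader:
    takes a list of video URLs, returns one Document per video."""

    def __init__(self, urls: List[str]):
        self.urls = urls

    def _fetch_transcript(self, url: str) -> str:
        # A real loader would call the site's API here (e.g. via bilibili-api).
        return f"transcript for {url}"

    def load(self) -> List[Document]:
        return [
            Document(page_content=self._fetch_transcript(u), metadata={"source": u})
            for u in self.urls
        ]

docs = TranscriptLoader(["https://www.bilibili.com/video/BV1xt411o7Xu/"]).load()
```

Keeping the network fetch behind a single private method is what makes loaders like this easy to swap or mock in tests.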


@@ -139,7 +139,7 @@
}
],
"source": [
"llm_chain.predict(human_input=\"Not too bad - how are you?\")"
"llm_chain.predict(human_input=\"Not to bad - how are you?\")"
]
},
{


@@ -1,62 +0,0 @@
{
"cells": [
{
"attachments": {},
"cell_type": "markdown",
"id": "91c6a7ef",
"metadata": {},
"source": [
"# Postgres Chat Message History\n",
"\n",
"This notebook goes over how to use Postgres to store chat message history."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "d15e3302",
"metadata": {},
"outputs": [],
"source": [
"from langchain.memory import PostgresChatMessageHistory\n",
"\n",
"history = PostgresChatMessageHistory(connection_string=\"postgresql://postgres:mypassword@localhost/chat_history\", session_id=\"foo\")\n",
"\n",
"history.add_user_message(\"hi!\")\n",
"\n",
"history.add_ai_message(\"whats up?\")"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "64fc465e",
"metadata": {},
"outputs": [],
"source": [
"history.messages"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.2"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
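The Postgres notebook above stores chat history keyed by a `session_id`. The same surface (`add_user_message`, `add_ai_message`, `.messages`) can be sketched with an in-memory stand-in — an assumption-laden sketch for illustration, not LangChain's `PostgresChatMessageHistory` itself:

```python
from collections import defaultdict
from typing import Dict, List, Tuple

class InMemoryChatMessageHistory:
    """Hypothetical in-memory stand-in for PostgresChatMessageHistory:
    same add/read surface, but backed by a dict instead of a Postgres table."""

    # Class-level dict plays the role of the external database:
    # it outlives any single instance, keyed by session_id.
    _store: Dict[str, List[Tuple[str, str]]] = defaultdict(list)

    def __init__(self, session_id: str):
        self.session_id = session_id

    def add_user_message(self, text: str) -> None:
        self._store[self.session_id].append(("human", text))

    def add_ai_message(self, text: str) -> None:
        self._store[self.session_id].append(("ai", text))

    @property
    def messages(self) -> List[Tuple[str, str]]:
        # Return a copy so callers cannot mutate the stored history.
        return list(self._store[self.session_id])

history = InMemoryChatMessageHistory(session_id="foo")
history.add_user_message("hi!")
history.add_ai_message("whats up?")
```

Swapping the dict for a real connection string is the only conceptual difference from the Postgres-backed version shown in the notebook.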


@@ -314,7 +314,7 @@
"source": [
"## Saving Message History\n",
"\n",
"You may often have to save messages, and then load them to use again. This can be done easily by first converting the messages to normal python dictionaries, saving those (as json or something) and then loading those. Here is an example of doing that."
"You may often to save messages, and then load them to use again. This can be done easily by first converting the messages to normal python dictionaries, saving those (as json or something) and then loading those. Here is an example of doing that."
]
},
{
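The save/load flow described in this hunk (messages → plain dicts → JSON → dicts) can be sketched with the standard library alone. The dict shape used here is an assumption for illustration, not LangChain's exact message schema:

```python
import json

# Hypothetical message dicts in the spirit described above: chat
# messages converted to plain dicts so they survive serialization.
messages = [
    {"type": "human", "data": {"content": "hi!"}},
    {"type": "ai", "data": {"content": "whats up?"}},
]

# Save (to a string here; a real script would write a file instead).
serialized = json.dumps(messages)

# Load them back — plain dicts round-trip through JSON unchanged.
loaded = json.loads(serialized)
```

Because everything is plain dicts, lists, and strings, no custom encoder is needed in either direction.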


@@ -5,7 +5,7 @@
"id": "517a9fd4",
"metadata": {},
"source": [
"# BabyAGI with Tools\n",
"# BabyAGI User Guide\n",
"\n",
"This notebook builds on top of [baby agi](baby_agi.ipynb), but shows how you can swap out the execution chain. The previous execution chain was just an LLM which made stuff up. By swapping it out with an agent that has access to tools, we can hopefully get real, reliable information."
]
@@ -54,7 +54,9 @@
"metadata": {},
"outputs": [],
"source": [
"%pip install faiss-cpu > /dev/null%pip install google-search-results > /dev/nullfrom langchain.vectorstores import FAISS\n",
"%pip install faiss-cpu > /dev/null\n",
"%pip install google-search-results > /dev/null\n",
"from langchain.vectorstores import FAISS\n",
"from langchain.docstore import InMemoryDocstore"
]
},


@@ -1,693 +0,0 @@
{
"cells": [
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"# CAMEL Role-Playing Autonomous Cooperative Agents\n",
"\n",
"This is a langchain implementation of paper: \"CAMEL: Communicative Agents for “Mind” Exploration of Large Scale Language Model Society\".\n",
"\n",
"Overview:\n",
"\n",
"The rapid advancement of conversational and chat-based language models has led to remarkable progress in complex task-solving. However, their success heavily relies on human input to guide the conversation, which can be challenging and time-consuming. This paper explores the potential of building scalable techniques to facilitate autonomous cooperation among communicative agents and provide insight into their \"cognitive\" processes. To address the challenges of achieving autonomous cooperation, we propose a novel communicative agent framework named role-playing. Our approach involves using inception prompting to guide chat agents toward task completion while maintaining consistency with human intentions. We showcase how role-playing can be used to generate conversational data for studying the behaviors and capabilities of chat agents, providing a valuable resource for investigating conversational language models. Our contributions include introducing a novel communicative agent framework, offering a scalable approach for studying the cooperative behaviors and capabilities of multi-agent systems, and open-sourcing our library to support research on communicative agents and beyond.\n",
"\n",
"The original implementation: https://github.com/lightaime/camel\n",
"\n",
"Project website: https://www.camel-ai.org/\n",
"\n",
"Arxiv paper: https://arxiv.org/abs/2303.17760\n"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"## Import LangChain related modules "
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
"from typing import List\n",
"from langchain.chat_models import ChatOpenAI\n",
"from langchain.prompts.chat import (\n",
" SystemMessagePromptTemplate,\n",
" HumanMessagePromptTemplate,\n",
")\n",
"from langchain.schema import (\n",
" AIMessage,\n",
" HumanMessage,\n",
" SystemMessage,\n",
" BaseMessage,\n",
")"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"## Define a CAMEL agent helper class"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"class CAMELAgent:\n",
"\n",
" def __init__(\n",
" self,\n",
" system_message: SystemMessage,\n",
" model: ChatOpenAI,\n",
" ) -> None:\n",
" self.system_message = system_message\n",
" self.model = model\n",
" self.init_messages()\n",
"\n",
" def reset(self) -> None:\n",
" self.init_messages()\n",
" return self.stored_messages\n",
"\n",
" def init_messages(self) -> None:\n",
" self.stored_messages = [self.system_message]\n",
"\n",
" def update_messages(self, message: BaseMessage) -> List[BaseMessage]:\n",
" self.stored_messages.append(message)\n",
" return self.stored_messages\n",
"\n",
" def step(\n",
" self,\n",
" input_message: HumanMessage,\n",
" ) -> AIMessage:\n",
" messages = self.update_messages(input_message)\n",
"\n",
" output_message = self.model(messages)\n",
" self.update_messages(output_message)\n",
"\n",
" return output_message\n"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"## Setup OpenAI API key and roles and task for role-playing"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"\n",
"os.environ[\"OPENAI_API_KEY\"] = \"\"\n",
"\n",
"assistant_role_name = \"Python Programmer\"\n",
"user_role_name = \"Stock Trader\"\n",
"task = \"Develop a trading bot for the stock market\"\n",
"word_limit = 50 # word limit for task brainstorming"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"## Create a task specify agent for brainstorming and get the specified task"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Specified task: Develop a Python-based swing trading bot that scans market trends, monitors stocks, and generates trading signals to help a stock trader to place optimal buy and sell orders with defined stop losses and profit targets.\n"
]
}
],
"source": [
"task_specifier_sys_msg = SystemMessage(content=\"You can make a task more specific.\")\n",
"task_specifier_prompt = (\n",
"\"\"\"Here is a task that {assistant_role_name} will help {user_role_name} to complete: {task}.\n",
"Please make it more specific. Be creative and imaginative.\n",
"Please reply with the specified task in {word_limit} words or less. Do not add anything else.\"\"\"\n",
")\n",
"task_specifier_template = HumanMessagePromptTemplate.from_template(template=task_specifier_prompt)\n",
"task_specify_agent = CAMELAgent(task_specifier_sys_msg, ChatOpenAI(temperature=1.0))\n",
"task_specifier_msg = task_specifier_template.format_messages(assistant_role_name=assistant_role_name,\n",
" user_role_name=user_role_name,\n",
" task=task, word_limit=word_limit)[0]\n",
"specified_task_msg = task_specify_agent.step(task_specifier_msg)\n",
"print(f\"Specified task: {specified_task_msg.content}\")\n",
"specified_task = specified_task_msg.content"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"## Create inception prompts for AI assistant and AI user for role-playing"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [],
"source": [
"assistant_inception_prompt = (\n",
"\"\"\"Never forget you are a {assistant_role_name} and I am a {user_role_name}. Never flip roles! Never instruct me!\n",
"We share a common interest in collaborating to successfully complete a task.\n",
"You must help me to complete the task.\n",
"Here is the task: {task}. Never forget our task!\n",
"I must instruct you based on your expertise and my needs to complete the task.\n",
"\n",
"I must give you one instruction at a time.\n",
"You must write a specific solution that appropriately completes the requested instruction.\n",
"You must decline my instruction honestly if you cannot perform the instruction due to physical, moral, legal reasons or your capability and explain the reasons.\n",
"Do not add anything else other than your solution to my instruction.\n",
"You are never supposed to ask me any questions you only answer questions.\n",
"You are never supposed to reply with a flaky solution. Explain your solutions.\n",
"Your solution must be declarative sentences and simple present tense.\n",
"Unless I say the task is completed, you should always start with:\n",
"\n",
"Solution: <YOUR_SOLUTION>\n",
"\n",
"<YOUR_SOLUTION> should be specific and provide preferable implementations and examples for task-solving.\n",
"Always end <YOUR_SOLUTION> with: Next request.\"\"\"\n",
")\n",
"\n",
"user_inception_prompt = (\n",
"\"\"\"Never forget you are a {user_role_name} and I am a {assistant_role_name}. Never flip roles! You will always instruct me.\n",
"We share a common interest in collaborating to successfully complete a task.\n",
"I must help you to complete the task.\n",
"Here is the task: {task}. Never forget our task!\n",
"You must instruct me based on my expertise and your needs to complete the task ONLY in the following two ways:\n",
"\n",
"1. Instruct with a necessary input:\n",
"Instruction: <YOUR_INSTRUCTION>\n",
"Input: <YOUR_INPUT>\n",
"\n",
"2. Instruct without any input:\n",
"Instruction: <YOUR_INSTRUCTION>\n",
"Input: None\n",
"\n",
"The \"Instruction\" describes a task or question. The paired \"Input\" provides further context or information for the requested \"Instruction\".\n",
"\n",
"You must give me one instruction at a time.\n",
"I must write a response that appropriately completes the requested instruction.\n",
"I must decline your instruction honestly if I cannot perform the instruction due to physical, moral, legal reasons or my capability and explain the reasons.\n",
"You should instruct me, not ask me questions.\n",
"Now you must start to instruct me using the two ways described above.\n",
"Do not add anything else other than your instruction and the optional corresponding input!\n",
"Keep giving me instructions and necessary inputs until you think the task is completed.\n",
"When the task is completed, you must only reply with a single word <CAMEL_TASK_DONE>.\n",
"Never say <CAMEL_TASK_DONE> unless my responses have solved your task.\"\"\"\n",
")"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"## Create a helper function to get system messages for AI assistant and AI user from role names and the task"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {},
"outputs": [],
"source": [
"def get_sys_msgs(assistant_role_name: str, user_role_name: str, task: str):\n",
" \n",
" assistant_sys_template = SystemMessagePromptTemplate.from_template(template=assistant_inception_prompt)\n",
" assistant_sys_msg = assistant_sys_template.format_messages(assistant_role_name=assistant_role_name, user_role_name=user_role_name, task=task)[0]\n",
" \n",
" user_sys_template = SystemMessagePromptTemplate.from_template(template=user_inception_prompt)\n",
" user_sys_msg = user_sys_template.format_messages(assistant_role_name=assistant_role_name, user_role_name=user_role_name, task=task)[0]\n",
" \n",
" return assistant_sys_msg, user_sys_msg"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"## Create AI assistant agent and AI user agent from obtained system messages"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [],
"source": [
"assistant_sys_msg, user_sys_msg = get_sys_msgs(assistant_role_name, user_role_name, specified_task)\n",
"assistant_agent = CAMELAgent(assistant_sys_msg, ChatOpenAI(temperature=0.2))\n",
"user_agent = CAMELAgent(user_sys_msg, ChatOpenAI(temperature=0.2))\n",
"\n",
"# Reset agents\n",
"assistant_agent.reset()\n",
"user_agent.reset()\n",
"\n",
"# Initialize chats \n",
"assistant_msg = HumanMessage(\n",
" content=(f\"{user_sys_msg.content}. \"\n",
" \"Now start to give me instructions one by one. \"\n",
" \"Only reply with Instruction and Input.\"))\n",
"\n",
"user_msg = HumanMessage(content=f\"{assistant_sys_msg.content}\")\n",
"user_msg = assistant_agent.step(user_msg)"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"## Start role-playing session to solve the task!"
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Original task prompt:\n",
"Develop a trading bot for the stock market\n",
"\n",
"Specified task prompt:\n",
"Develop a Python-based swing trading bot that scans market trends, monitors stocks, and generates trading signals to help a stock trader to place optimal buy and sell orders with defined stop losses and profit targets.\n",
"\n",
"AI User (Stock Trader):\n",
"\n",
"Instruction: Install the necessary Python libraries for data analysis and trading.\n",
"Input: None\n",
"\n",
"\n",
"AI Assistant (Python Programmer):\n",
"\n",
"Solution: We can install the necessary Python libraries using pip, a package installer for Python. We can install pandas, numpy, matplotlib, and ta-lib for data analysis and trading. We can use the following command to install these libraries:\n",
"\n",
"```\n",
"pip install pandas numpy matplotlib ta-lib\n",
"```\n",
"\n",
"Next request.\n",
"\n",
"\n",
"AI User (Stock Trader):\n",
"\n",
"Instruction: Import the necessary libraries in the Python script.\n",
"Input: None\n",
"\n",
"\n",
"AI Assistant (Python Programmer):\n",
"\n",
"Solution: We can import the necessary libraries in the Python script using the import statement. We need to import pandas, numpy, matplotlib, and ta-lib for data analysis and trading. We can use the following code to import these libraries:\n",
"\n",
"```\n",
"import pandas as pd\n",
"import numpy as np\n",
"import matplotlib.pyplot as plt\n",
"import talib as ta\n",
"```\n",
"\n",
"Next request.\n",
"\n",
"\n",
"AI User (Stock Trader):\n",
"\n",
"Instruction: Load historical stock data into a pandas DataFrame.\n",
"Input: The path to the CSV file containing the historical stock data.\n",
"\n",
"\n",
"AI Assistant (Python Programmer):\n",
"\n",
"Solution: We can load historical stock data into a pandas DataFrame using the `read_csv()` function from pandas. We need to pass the path to the CSV file containing the historical stock data as an argument to this function. We can use the following code to load the historical stock data:\n",
"\n",
"```\n",
"df = pd.read_csv('path/to/csv/file.csv')\n",
"```\n",
"\n",
"This will load the historical stock data into a pandas DataFrame called `df`. Next request.\n",
"\n",
"\n",
"AI User (Stock Trader):\n",
"\n",
"Instruction: Preprocess the historical stock data by setting the date column as the index and sorting the DataFrame in ascending order by date.\n",
"Input: None.\n",
"\n",
"\n",
"AI Assistant (Python Programmer):\n",
"\n",
"Solution: We can preprocess the historical stock data by setting the date column as the index and sorting the DataFrame in ascending order by date using the `set_index()` and `sort_index()` functions from pandas. We can use the following code to preprocess the historical stock data:\n",
"\n",
"```\n",
"df = df.set_index('date')\n",
"df = df.sort_index(ascending=True)\n",
"```\n",
"\n",
"This will set the date column as the index and sort the DataFrame in ascending order by date. Next request.\n",
"\n",
"\n",
"AI User (Stock Trader):\n",
"\n",
"Instruction: Calculate the short-term and long-term moving averages for the stock data using the `ta.SMA()` function from ta-lib.\n",
"Input: The period for the short-term moving average and the period for the long-term moving average.\n",
"\n",
"\n",
"AI Assistant (Python Programmer):\n",
"\n",
"Solution: We can calculate the short-term and long-term moving averages for the stock data using the `ta.SMA()` function from ta-lib. We need to pass the period for the short-term moving average and the period for the long-term moving average as arguments to this function. We can use the following code to calculate the short-term and long-term moving averages:\n",
"\n",
"```\n",
"short_ma = ta.SMA(df['close'], timeperiod=short_period)\n",
"long_ma = ta.SMA(df['close'], timeperiod=long_period)\n",
"```\n",
"\n",
"This will calculate the short-term and long-term moving averages for the stock data and store them in the `short_ma` and `long_ma` variables, respectively. Next request.\n",
"\n",
"\n",
"AI User (Stock Trader):\n",
"\n",
"Instruction: Create a new DataFrame that combines the historical stock data with the short-term and long-term moving averages.\n",
"Input: None.\n",
"\n",
"\n",
"AI Assistant (Python Programmer):\n",
"\n",
"Solution: We can create a new DataFrame that combines the historical stock data with the short-term and long-term moving averages using the `concat()` function from pandas. We need to pass the historical stock data, the short-term moving average, and the long-term moving average as arguments to this function. We can use the following code to create the new DataFrame:\n",
"\n",
"```\n",
"new_df = pd.concat([df, short_ma, long_ma], axis=1)\n",
"new_df.columns = ['open', 'high', 'low', 'close', 'volume', 'short_ma', 'long_ma']\n",
"```\n",
"\n",
"This will create a new DataFrame called `new_df` that combines the historical stock data with the short-term and long-term moving averages. The columns of the new DataFrame are named 'open', 'high', 'low', 'close', 'volume', 'short_ma', and 'long_ma'. Next request.\n",
"\n",
"\n",
"AI User (Stock Trader):\n",
"\n",
"Instruction: Create a new column in the DataFrame that indicates when to buy or sell the stock based on the crossover of the short-term and long-term moving averages.\n",
"Input: None.\n",
"\n",
"\n",
"AI Assistant (Python Programmer):\n",
"\n",
"Solution: We can create a new column in the DataFrame that indicates when to buy or sell the stock based on the crossover of the short-term and long-term moving averages. We can use the following code to create the new column:\n",
"\n",
"```\n",
"new_df['signal'] = np.where(new_df['short_ma'] > new_df['long_ma'], 1, -1)\n",
"```\n",
"\n",
"This will create a new column called 'signal' in the DataFrame that indicates when to buy or sell the stock based on the crossover of the short-term and long-term moving averages. If the short-term moving average is greater than the long-term moving average, the signal is 1 (buy), otherwise the signal is -1 (sell). Next request.\n",
"\n",
"\n",
"AI User (Stock Trader):\n",
"\n",
"Instruction: Create a new column in the DataFrame that indicates the profit or loss for each trade based on the buy and sell signals and the defined stop loss and profit target.\n",
"Input: The stop loss and profit target as percentages.\n",
"\n",
"\n",
"AI Assistant (Python Programmer):\n",
"\n",
"Solution: We can create a new column in the DataFrame that indicates the profit or loss for each trade based on the buy and sell signals and the defined stop loss and profit target. We need to pass the stop loss and profit target as percentages as arguments to this function. We can use the following code to create the new column:\n",
"\n",
"```\n",
"stop_loss = stop_loss_percent / 100\n",
"profit_target = profit_target_percent / 100\n",
"\n",
"new_df['pnl'] = 0.0\n",
"buy_price = 0.0\n",
"for i in range(1, len(new_df)):\n",
" if new_df['signal'][i] == 1 and new_df['signal'][i-1] == -1:\n",
" buy_price = new_df['close'][i]\n",
" elif new_df['signal'][i] == -1 and new_df['signal'][i-1] == 1:\n",
" sell_price = new_df['close'][i]\n",
" if sell_price <= buy_price * (1 - stop_loss):\n",
" new_df['pnl'][i] = -stop_loss\n",
" elif sell_price >= buy_price * (1 + profit_target):\n",
" new_df['pnl'][i] = profit_target\n",
" else:\n",
" new_df['pnl'][i] = (sell_price - buy_price) / buy_price\n",
"```\n",
"\n",
"This will create a new column called 'pnl' in the DataFrame that indicates the profit or loss for each trade based on the buy and sell signals and the defined stop loss and profit target. The stop loss and profit target are calculated based on the stop_loss_percent and profit_target_percent variables, respectively. The buy and sell prices are stored in the buy_price and sell_price variables, respectively. If the sell price is less than or equal to the stop loss, the profit or loss is set to -stop_loss. If the sell price is greater than or equal to the profit target, the profit or loss is set to profit_target. Otherwise, the profit or loss is calculated as (sell_price - buy_price) / buy_price. Next request.\n",
"\n",
"\n",
"AI User (Stock Trader):\n",
"\n",
"Instruction: Calculate the total profit or loss for all trades.\n",
"Input: None.\n",
"\n",
"\n",
"AI Assistant (Python Programmer):\n",
"\n",
"Solution: We can calculate the total profit or loss for all trades by summing the values in the 'pnl' column of the DataFrame. We can use the following code to calculate the total profit or loss:\n",
"\n",
"```\n",
"total_pnl = new_df['pnl'].sum()\n",
"```\n",
"\n",
"This will calculate the total profit or loss for all trades and store it in the total_pnl variable. Next request.\n",
"\n",
"\n",
"AI User (Stock Trader):\n",
"\n",
"Instruction: Visualize the stock data, short-term moving average, and long-term moving average using a line chart.\n",
"Input: None.\n",
"\n",
"\n",
"AI Assistant (Python Programmer):\n",
"\n",
"Solution: We can visualize the stock data, short-term moving average, and long-term moving average using a line chart using the `plot()` function from pandas. We can use the following code to visualize the data:\n",
"\n",
"```\n",
"plt.figure(figsize=(12,6))\n",
"plt.plot(new_df.index, new_df['close'], label='Close')\n",
"plt.plot(new_df.index, new_df['short_ma'], label='Short MA')\n",
"plt.plot(new_df.index, new_df['long_ma'], label='Long MA')\n",
"plt.xlabel('Date')\n",
"plt.ylabel('Price')\n",
"plt.title('Stock Data with Moving Averages')\n",
"plt.legend()\n",
"plt.show()\n",
"```\n",
"\n",
"This will create a line chart that visualizes the stock data, short-term moving average, and long-term moving average. The x-axis represents the date and the y-axis represents the price. The chart also includes a legend that labels each line. Next request.\n",
"\n",
"\n",
"AI User (Stock Trader):\n",
"\n",
"Instruction: Visualize the buy and sell signals using a scatter plot.\n",
"Input: None.\n",
"\n",
"\n",
"AI Assistant (Python Programmer):\n",
"\n",
"Solution: We can visualize the buy and sell signals using a scatter plot using the `scatter()` function from matplotlib. We can use the following code to visualize the signals:\n",
"\n",
"```\n",
"buy_signals = new_df[new_df['signal'] == 1]\n",
"sell_signals = new_df[new_df['signal'] == -1]\n",
"\n",
"plt.figure(figsize=(12,6))\n",
"plt.scatter(buy_signals.index, buy_signals['close'], label='Buy', marker='^', color='green')\n",
"plt.scatter(sell_signals.index, sell_signals['close'], label='Sell', marker='v', color='red')\n",
"plt.plot(new_df.index, new_df['close'], label='Close')\n",
"plt.xlabel('Date')\n",
"plt.ylabel('Price')\n",
"plt.title('Buy and Sell Signals')\n",
"plt.legend()\n",
"plt.show()\n",
"```\n",
"\n",
"This will create a scatter plot that visualizes the buy and sell signals. The x-axis represents the date and the y-axis represents the price. The chart also includes a legend that labels each point. Next request.\n",
"\n",
"\n",
"AI User (Stock Trader):\n",
"\n",
"Instruction: Print the total profit or loss for all trades.\n",
"Input: None.\n",
"\n",
"\n",
"AI Assistant (Python Programmer):\n",
"\n",
"Solution: We can print the total profit or loss for all trades using the `print()` function. We can use the following code to print the total profit or loss:\n",
"\n",
"```\n",
"print('Total Profit/Loss: {:.2%}'.format(total_pnl))\n",
"```\n",
"\n",
"This will print the total profit or loss for all trades in percentage format with two decimal places. Next request.\n",
"\n",
"\n",
"AI User (Stock Trader):\n",
"\n",
"Instruction: Finalize the code by putting all the code snippets together in a single Python script.\n",
"Input: The necessary inputs for each code snippet.\n",
"\n",
"\n",
"AI Assistant (Python Programmer):\n",
"\n",
"Solution: Here is the complete Python script that puts all the code snippets together:\n",
"\n",
"```\n",
"import pandas as pd\n",
"import numpy as np\n",
"import matplotlib.pyplot as plt\n",
"import talib as ta\n",
"\n",
"# Load historical stock data into a pandas DataFrame\n",
"df = pd.read_csv('path/to/csv/file.csv')\n",
"\n",
"# Preprocess the historical stock data\n",
"df = df.set_index('date')\n",
"df = df.sort_index(ascending=True)\n",
"\n",
"# Calculate the short-term and long-term moving averages\n",
"short_period = 10\n",
"long_period = 30\n",
"short_ma = ta.SMA(df['close'], timeperiod=short_period)\n",
"long_ma = ta.SMA(df['close'], timeperiod=long_period)\n",
"\n",
"# Create a new DataFrame that combines the historical stock data with the short-term and long-term moving averages\n",
"new_df = pd.concat([df, short_ma, long_ma], axis=1)\n",
"new_df.columns = ['open', 'high', 'low', 'close', 'volume', 'short_ma', 'long_ma']\n",
"\n",
"# Create a new column in the DataFrame that indicates when to buy or sell the stock based on the crossover of the short-term and long-term moving averages\n",
"new_df['signal'] = np.where(new_df['short_ma'] > new_df['long_ma'], 1, -1)\n",
"\n",
"# Create a new column in the DataFrame that indicates the profit or loss for each trade based on the buy and sell signals and the defined stop loss and profit target\n",
"stop_loss_percent = 5\n",
"profit_target_percent = 10\n",
"stop_loss = stop_loss_percent / 100\n",
"profit_target = profit_target_percent / 100\n",
"new_df['pnl'] = 0.0\n",
"buy_price = 0.0\n",
"for i in range(1, len(new_df)):\n",
" if new_df['signal'][i] == 1 and new_df['signal'][i-1] == -1:\n",
" buy_price = new_df['close'][i]\n",
" elif new_df['signal'][i] == -1 and new_df['signal'][i-1] == 1:\n",
" sell_price = new_df['close'][i]\n",
" if sell_price <= buy_price * (1 - stop_loss):\n",
" new_df['pnl'][i] = -stop_loss\n",
" elif sell_price >= buy_price * (1 + profit_target):\n",
" new_df['pnl'][i] = profit_target\n",
" else:\n",
" new_df['pnl'][i] = (sell_price - buy_price) / buy_price\n",
"\n",
"# Calculate the total profit or loss for all trades\n",
"total_pnl = new_df['pnl'].sum()\n",
"\n",
"# Visualize the stock data, short-term moving average, and long-term moving average using a line chart\n",
"plt.figure(figsize=(12,6))\n",
"plt.plot(new_df.index, new_df['close'], label='Close')\n",
"plt.plot(new_df.index, new_df['short_ma'], label='Short MA')\n",
"plt.plot(new_df.index, new_df['long_ma'], label='Long MA')\n",
"plt.xlabel('Date')\n",
"plt.ylabel('Price')\n",
"plt.title('Stock Data with Moving Averages')\n",
"plt.legend()\n",
"plt.show()\n",
"\n",
"# Visualize the buy and sell signals using a scatter plot\n",
"buy_signals = new_df[new_df['signal'] == 1]\n",
"sell_signals = new_df[new_df['signal'] == -1]\n",
"plt.figure(figsize=(12,6))\n",
"plt.scatter(buy_signals.index, buy_signals['close'], label='Buy', marker='^', color='green')\n",
"plt.scatter(sell_signals.index, sell_signals['close'], label='Sell', marker='v', color='red')\n",
"plt.plot(new_df.index, new_df['close'], label='Close')\n",
"plt.xlabel('Date')\n",
"plt.ylabel('Price')\n",
"plt.title('Buy and Sell Signals')\n",
"plt.legend()\n",
"plt.show()\n",
"\n",
"# Print the total profit or loss for all trades\n",
"print('Total Profit/Loss: {:.2%}'.format(total_pnl))\n",
"```\n",
"\n",
"You need to replace the path/to/csv/file.csv with the actual path to the CSV file containing the historical stock data. You can also adjust the short_period, long_period, stop_loss_percent, and profit_target_percent variables to suit your needs.\n",
"\n",
"\n",
"AI User (Stock Trader):\n",
"\n",
"<CAMEL_TASK_DONE>\n",
"\n",
"\n",
"AI Assistant (Python Programmer):\n",
"\n",
"Great! Let me know if you need any further assistance.\n",
"\n",
"\n"
]
}
],
"source": [
"print(f\"Original task prompt:\\n{task}\\n\")\n",
"print(f\"Specified task prompt:\\n{specified_task}\\n\")\n",
"\n",
"chat_turn_limit, n = 30, 0\n",
"while n < chat_turn_limit:\n",
" n += 1\n",
" user_ai_msg = user_agent.step(assistant_msg)\n",
" user_msg = HumanMessage(content=user_ai_msg.content)\n",
" print(f\"AI User ({user_role_name}):\\n\\n{user_msg.content}\\n\\n\")\n",
" \n",
" assistant_ai_msg = assistant_agent.step(user_msg)\n",
" assistant_msg = HumanMessage(content=assistant_ai_msg.content)\n",
" print(f\"AI Assistant ({assistant_role_name}):\\n\\n{assistant_msg.content}\\n\\n\")\n",
" if \"<CAMEL_TASK_DONE>\" in user_msg.content:\n",
" break"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "camel",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.9"
},
"orig_nbformat": 4
},
"nbformat": 4,
"nbformat_minor": 2
}
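The trading strategy the assistant sketches in the transcript above is a plain moving-average crossover. The signal logic can be reproduced standalone; the function and window names below are illustrative, not taken from the notebook:

```python
def sma(values, window):
    """Simple moving average; None until enough data points are available."""
    out = []
    for i in range(len(values)):
        if i + 1 < window:
            out.append(None)
        else:
            out.append(sum(values[i + 1 - window : i + 1]) / window)
    return out


def crossover_signals(closes, short_period=2, long_period=3):
    """Mark 'buy' when the short SMA crosses above the long SMA,
    'sell' when it crosses below, and None otherwise."""
    short, long_ = sma(closes, short_period), sma(closes, long_period)
    signals = [None] * len(closes)
    for i in range(1, len(closes)):
        if None in (short[i - 1], long_[i - 1], short[i], long_[i]):
            continue
        if short[i - 1] <= long_[i - 1] and short[i] > long_[i]:
            signals[i] = "buy"
        elif short[i - 1] >= long_[i - 1] and short[i] < long_[i]:
            signals[i] = "sell"
    return signals


signals = crossover_signals([10, 9, 8, 9, 11, 12, 11, 9, 8])
```

With real data, `closes` would come from the CSV the transcript mentions, and the buy/sell indices would drive the scatter plot and PnL calculation.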

View File

@@ -20,5 +20,3 @@ Highlighting specific parts:
Specific examples of this include:
- [Baby AGI](agents/baby_agi.ipynb): a notebook implementing [BabyAGI](https://github.com/yoheinakajima/babyagi) by Yohei Nakajima as LLM Chains
- [Baby AGI with Tools](agents/baby_agi_with_agent.ipynb): building off the above notebook, this example substitutes in an agent with tools as the execution tools, allowing it to actually take actions.
- [CAMEL](agents/camel_role_playing.ipynb): an implementation of the CAMEL (Communicative Agents for “Mind” Exploration of Large Scale Language Model Society) paper, where two agents communicate with each other.

View File

@@ -14,13 +14,9 @@ from langchain.agents.agent_toolkits.openapi.planner_prompt import (
API_PLANNER_PROMPT,
API_PLANNER_TOOL_DESCRIPTION,
API_PLANNER_TOOL_NAME,
PARSING_DELETE_PROMPT,
PARSING_GET_PROMPT,
PARSING_PATCH_PROMPT,
PARSING_POST_PROMPT,
REQUESTS_DELETE_TOOL_DESCRIPTION,
REQUESTS_GET_TOOL_DESCRIPTION,
REQUESTS_PATCH_TOOL_DESCRIPTION,
REQUESTS_POST_TOOL_DESCRIPTION,
)
from langchain.agents.agent_toolkits.openapi.spec import ReducedOpenAPISpec
@@ -28,7 +24,6 @@ from langchain.agents.mrkl.base import ZeroShotAgent
from langchain.agents.tools import Tool
from langchain.chains.llm import LLMChain
from langchain.llms.openai import OpenAI
from langchain.memory import ReadOnlySharedMemory
from langchain.prompts import PromptTemplate
from langchain.requests import RequestsWrapper
from langchain.schema import BaseLanguageModel
@@ -58,8 +53,7 @@ class RequestsGetToolWithParsing(BaseRequestsTool, BaseTool):
data = json.loads(text)
except json.JSONDecodeError as e:
raise e
data_params = data.get("params")
response = self.requests_wrapper.get(data["url"], params=data_params)
response = self.requests_wrapper.get(data["url"])
response = response[: self.response_length]
return self.llm_chain.predict(
response=response, instructions=data["output_instructions"]
@@ -94,56 +88,6 @@ class RequestsPostToolWithParsing(BaseRequestsTool, BaseTool):
raise NotImplementedError()
class RequestsPatchToolWithParsing(BaseRequestsTool, BaseTool):
name = "requests_patch"
description = REQUESTS_PATCH_TOOL_DESCRIPTION
response_length: Optional[int] = MAX_RESPONSE_LENGTH
llm_chain = LLMChain(
llm=OpenAI(),
prompt=PARSING_PATCH_PROMPT,
)
def _run(self, text: str) -> str:
try:
data = json.loads(text)
except json.JSONDecodeError as e:
raise e
response = self.requests_wrapper.patch(data["url"], data["data"])
response = response[: self.response_length]
return self.llm_chain.predict(
response=response, instructions=data["output_instructions"]
).strip()
async def _arun(self, text: str) -> str:
raise NotImplementedError()
class RequestsDeleteToolWithParsing(BaseRequestsTool, BaseTool):
name = "requests_delete"
description = REQUESTS_DELETE_TOOL_DESCRIPTION
response_length: Optional[int] = MAX_RESPONSE_LENGTH
llm_chain = LLMChain(
llm=OpenAI(),
prompt=PARSING_DELETE_PROMPT,
)
def _run(self, text: str) -> str:
try:
data = json.loads(text)
except json.JSONDecodeError as e:
raise e
response = self.requests_wrapper.delete(data["url"])
response = response[: self.response_length]
return self.llm_chain.predict(
response=response, instructions=data["output_instructions"]
).strip()
async def _arun(self, text: str) -> str:
raise NotImplementedError()
#
# Orchestrator, planner, controller.
#
@@ -211,7 +155,7 @@ def _create_api_controller_tool(
base_url = api_spec.servers[0]["url"] # TODO: do better.
def _create_and_run_api_controller_agent(plan_str: str) -> str:
pattern = r"\b(GET|POST|PATCH|DELETE)\s+(/\S+)*"
pattern = r"\b(GET|POST)\s+(/\S+)*"
matches = re.findall(pattern, plan_str)
endpoint_names = [
"{method} {route}".format(method=method, route=route.split("?")[0])
@@ -239,8 +183,6 @@ def create_openapi_agent(
api_spec: ReducedOpenAPISpec,
requests_wrapper: RequestsWrapper,
llm: BaseLanguageModel,
shared_memory: Optional[ReadOnlySharedMemory] = None,
verbose: bool = True,
) -> AgentExecutor:
"""Instantiate API planner and controller for a given spec.
@@ -265,7 +207,7 @@ def create_openapi_agent(
},
)
agent = ZeroShotAgent(
llm_chain=LLMChain(llm=llm, prompt=prompt, memory=shared_memory),
llm_chain=LLMChain(llm=llm, prompt=prompt),
allowed_tools=[tool.name for tool in tools],
)
return AgentExecutor.from_agent_and_tools(agent=agent, tools=tools, verbose=verbose)
return AgentExecutor.from_agent_and_tools(agent=agent, tools=tools, verbose=True)
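The GET tool in the hunk above expects its input as a JSON string with `url`, optional `params`, and `output_instructions` keys. That parsing step can be sketched on its own; the HTTP call and LLM summarization are omitted here, since the requests wrapper and chain are defined elsewhere:

```python
import json


def parse_get_tool_input(text):
    """Parse the JSON tool input used by RequestsGetToolWithParsing.

    Expected keys: "url" (required), "params" (optional dict of query
    parameters), and "output_instructions" (what to extract from the response).
    """
    data = json.loads(text)  # raises json.JSONDecodeError on malformed input
    return data["url"], data.get("params"), data["output_instructions"]


url, params, instructions = parse_get_tool_input(
    '{"url": "https://example.com/items", "params": {"trending": "True"}, '
    '"output_instructions": "extract the item ids"}'
)
```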

View File

@@ -2,16 +2,13 @@
from langchain.prompts.prompt import PromptTemplate
API_PLANNER_PROMPT = """You are a planner that plans a sequence of API calls to assist with user queries against an API.
You should:
1) evaluate whether the user query can be solved by the API documentated below. If no, say why.
2) if yes, generate a plan of API calls and say what they are doing step by step.
3) If the plan includes a DELETE call, you should always return an ask from the User for authorization first unless the User has specifically asked to delete something.
You should only use API endpoints documented below ("Endpoints you can use:").
You can only use the DELETE tool if the User has specifically asked to delete something. Otherwise, you should return a request authorization from the User first.
Some user queries can be resolved in a single API call, but some will require several API calls.
The plan will be passed to an API controller that can format it into web requests and return the responses.
@@ -23,31 +20,15 @@ Fake endpoints for examples:
GET /user to get information about the current user
GET /products/search search across products
POST /users/{{id}}/cart to add products to a user's cart
PATCH /users/{{id}}/cart to update a user's cart
DELETE /users/{{id}}/cart to delete a user's cart
User query: tell me a joke
Plan: Sorry, this API's domain is shopping, not comedy.
User query: I want to buy a couch
Plan: 1. GET /products with a query param to search for couches
Plan: 1. GET /products/search to search for couches
2. GET /user to find the user's id
3. POST /users/{{id}}/cart to add a couch to the user's cart
User query: I want to add a lamp to my cart
Plan: 1. GET /products with a query param to search for lamps
2. GET /user to find the user's id
3. PATCH /users/{{id}}/cart to add a lamp to the user's cart
User query: I want to delete my cart
Plan: 1. GET /user to find the user's id
2. DELETE required. Did user specify DELETE or previously authorize? Yes, proceed.
3. DELETE /users/{{id}}/cart to delete the user's cart
User query: I want to start a new cart
Plan: 1. GET /user to find the user's id
2. DELETE required. Did user specify DELETE or previously authorize? No, ask for authorization.
3. Are you sure you want to delete your cart?
----
Here are endpoints you can use. Do not reference any of the endpoints above.
@@ -59,7 +40,7 @@ Here are endpoints you can use. Do not reference any of the endpoints above.
User query: {query}
Plan:"""
API_PLANNER_TOOL_NAME = "api_planner"
API_PLANNER_TOOL_DESCRIPTION = f"Can be used to generate the right API calls to assist with a user query, like {API_PLANNER_TOOL_NAME}(query). Should always be called before trying to call the API controller."
API_PLANNER_TOOL_DESCRIPTION = f"Can be used to generate the right API calls to assist with a user query, like {API_PLANNER_TOOL_NAME}(query). Should always be called before trying to calling the API controller."
# Execution.
API_CONTROLLER_PROMPT = """You are an agent that gets a sequence of API calls and given their documentation, should execute them and return the final response.
@@ -100,9 +81,8 @@ API_CONTROLLER_TOOL_DESCRIPTION = f"Can be used to execute a plan of API calls,
# The goal is to have an agent at the top-level (e.g. so it can recover from errors and re-plan) while
# keeping planning (and specifically the planning prompt) simple.
API_ORCHESTRATOR_PROMPT = """You are an agent that assists with user queries against API, things like querying information or creating resources.
Some user queries can be resolved in a single API call, particularly if you can find appropriate params from the OpenAPI spec; though some require several API call.
Some user queries can be resolved in a single API call though some require several API call.
You should always plan your API calls first, and then execute the plan second.
If the plan includes a DELETE call, be sure to ask the User for authorization first unless the User has specifically asked to delete something.
You should never return information without executing the api_controller tool.
@@ -126,12 +106,12 @@ User query: can you add some trendy stuff to my shopping cart.
Thought: I should plan API calls first.
Action: api_planner
Action Input: I need to find the right API calls to add trendy items to the users shopping cart
Observation: 1) GET /items with params 'trending' is 'True' to get trending item ids
Observation: 1) GET /items/trending to get trending item ids
2) GET /user to get user
3) POST /cart to post the trending items to the user's cart
Thought: I'm ready to execute the API calls.
Action: api_controller
Action Input: 1) GET /items params 'trending' is 'True' to get trending item ids
Action Input: 1) GET /items/trending to get trending item ids
2) GET /user to get user
3) POST /cart to post the trending items to the user's cart
...
@@ -143,12 +123,8 @@ Thought: I should generate a plan to help with this query and then copy that pla
{agent_scratchpad}"""
REQUESTS_GET_TOOL_DESCRIPTION = """Use this to GET content from a website.
Input to the tool should be a json string with 3 keys: "url", "params" and "output_instructions".
The value of "url" should be a string.
The value of "params" should be a dict of the needed and available parameters from the OpenAPI spec related to the endpoint.
If parameters are not needed, or not available, leave it empty.
The value of "output_instructions" should be instructions on what information to extract from the response,
for example the id(s) for a resource(s) that the GET request fetches.
Input to the tool should be a json string with 2 keys: "url" and "output_instructions".
The value of "url" should be a string. The value of "output_instructions" should be instructions on what information to extract from the response, for example the id(s) for a resource(s) that the GET request fetches.
"""
PARSING_GET_PROMPT = PromptTemplate(
@@ -165,7 +141,7 @@ REQUESTS_POST_TOOL_DESCRIPTION = """Use this when you want to POST to a website.
Input to the tool should be a json string with 3 keys: "url", "data", and "output_instructions".
The value of "url" should be a string.
The value of "data" should be a dictionary of key-value pairs you want to POST to the url.
The value of "output_instructions" should be instructions on what information to extract from the response, for example the id(s) for a resource(s) that the POST request creates.
The value of "summary_instructions" should be instructions on what information to extract from the response, for example the id(s) for a resource(s) that the POST request creates.
Always use double quotes for strings in the json string."""
PARSING_POST_PROMPT = PromptTemplate(
@@ -177,37 +153,3 @@ If the response indicates an error, you should instead output a summary of the e
Output:""",
input_variables=["response", "instructions"],
)
REQUESTS_PATCH_TOOL_DESCRIPTION = """Use this when you want to PATCH content on a website.
Input to the tool should be a json string with 3 keys: "url", "data", and "output_instructions".
The value of "url" should be a string.
The value of "data" should be a dictionary of key-value pairs of the body params available in the OpenAPI spec you want to PATCH the content with at the url.
The value of "output_instructions" should be instructions on what information to extract from the response, for example the id(s) for a resource(s) that the PATCH request creates.
Always use double quotes for strings in the json string."""
PARSING_PATCH_PROMPT = PromptTemplate(
template="""Here is an API response:\n\n{response}\n\n====
Your task is to extract some information according to these instructions: {instructions}
When working with API objects, you should usually use ids over names. Do not return any ids or names that are not in the response.
If the response indicates an error, you should instead output a summary of the error.
Output:""",
input_variables=["response", "instructions"],
)
REQUESTS_DELETE_TOOL_DESCRIPTION = """ONLY USE THIS TOOL WHEN THE USER HAS SPECIFICALLY REQUESTED TO DELETE CONTENT FROM A WEBSITE.
Input to the tool should be a json string with 2 keys: "url", and "output_instructions".
The value of "url" should be a string.
The value of "output_instructions" should be instructions on what information to extract from the response, for example the id(s) for a resource(s) that the DELETE request creates.
Always use double quotes for strings in the json string.
ONLY USE THIS TOOL IF THE USER HAS SPECIFICALLY REQUESTED TO DELETE SOMETHING."""
PARSING_DELETE_PROMPT = PromptTemplate(
template="""Here is an API response:\n\n{response}\n\n====
Your task is to extract some information according to these instructions: {instructions}
When working with API objects, you should usually use ids over names. Do not return any ids or names that are not in the response.
If the response indicates an error, you should instead output a summary of the error.
Output:""",
input_variables=["response", "instructions"],
)
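The doubled braces in the planner prompt above (e.g. `/users/{{id}}/cart`) exist so that `str.format`-style substitution leaves a literal `{id}` in the rendered prompt while still filling `{query}`. A quick illustration on an abbreviated snippet:

```python
# Abbreviated stand-in for the planner prompt; {{id}} escapes the braces,
# {query} is a real substitution slot.
API_PLANNER_PROMPT_SNIPPET = (
    "POST /users/{{id}}/cart to add products to a user's cart\n"
    "User query: {query}\n"
    "Plan:"
)

rendered = API_PLANNER_PROMPT_SNIPPET.format(query="I want to buy a couch")
```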

View File

@@ -57,7 +57,7 @@ class LLMRequestsChain(Chain):
except ImportError:
raise ValueError(
"Could not import bs4 python package. "
"Please install it with `pip install bs4`."
"Please it install it with `pip install bs4`."
)
return values

View File

@@ -55,7 +55,7 @@ class OpenAIModerationChain(Chain):
except ImportError:
raise ValueError(
"Could not import openai python package. "
"Please install it with `pip install openai`."
"Please it install it with `pip install openai`."
)
return values

View File

@@ -53,7 +53,7 @@ class Crawler:
except ImportError:
raise ValueError(
"Could not import playwright python package. "
"Please install it with `pip install playwright`."
"Please it install it with `pip install playwright`."
)
self.browser: Browser = (
sync_playwright().start().chromium.launch(headless=False)

View File

@@ -87,7 +87,7 @@ class AzureChatOpenAI(ChatOpenAI):
except ImportError:
raise ValueError(
"Could not import openai python package. "
"Please install it with `pip install openai`."
"Please it install it with `pip install openai`."
)
try:
values["client"] = openai.ChatCompletion

View File

@@ -167,7 +167,7 @@ class ChatOpenAI(BaseChatModel):
except ImportError:
raise ValueError(
"Could not import openai python package. "
"Please install it with `pip install openai`."
"Please it install it with `pip install openai`."
)
try:
values["client"] = openai.ChatCompletion
@@ -327,8 +327,8 @@ class ChatOpenAI(BaseChatModel):
def get_num_tokens(self, text: str) -> int:
"""Calculate num tokens with tiktoken package."""
# tiktoken NOT supported for Python 3.7 or below
if sys.version_info[1] <= 7:
# tiktoken NOT supported for Python 3.8 or below
if sys.version_info[1] <= 8:
return super().get_num_tokens(text)
try:
import tiktoken
@@ -336,7 +336,7 @@ class ChatOpenAI(BaseChatModel):
raise ValueError(
"Could not import tiktoken python package. "
"This is needed in order to calculate get_num_tokens. "
"Please install it with `pip install tiktoken`."
"Please it install it with `pip install tiktoken`."
)
# create a GPT-3.5-Turbo encoder instance
enc = tiktoken.encoding_for_model(self.model_name)
@@ -358,7 +358,7 @@ class ChatOpenAI(BaseChatModel):
raise ValueError(
"Could not import tiktoken python package. "
"This is needed in order to calculate get_num_tokens. "
"Please install it with `pip install tiktoken`."
"Please it install it with `pip install tiktoken`."
)
model = self.model_name

View File

@@ -0,0 +1,58 @@
from io import BytesIO
from typing import List
import boto3
from PIL import Image
from langchain.docstore.document import Document
class AwsTextractExtraction:
def __init__(
self,
aws_region_name: str,
aws_secret_key: str,
aws_access_key: str,
aws_session_token: str,
file_path: str,
):
self.aws_region_name = aws_region_name
self.aws_secret_key = aws_secret_key
self.aws_access_key = aws_access_key
self.aws_session_token = aws_session_token
self.file_path = file_path
try:
import boto3 # noqa: F401
except ImportError:
raise ValueError(
"Could not import aws boto3 package. "
"Please install it with `pip install boto3`."
)
def get_text_from_image(self) -> List[Document]:
output = []
page_no = 0
textract_client = boto3.client(
"textract",
region_name=self.aws_region_name,
aws_access_key_id=self.aws_access_key,
aws_secret_access_key=self.aws_secret_key,
aws_session_token=self.aws_session_token,
)
pil_image_obj = Image.open(self.file_path)
buf = BytesIO()
pil_image_obj.save(buf, format="PNG")
image_bytes = buf.getvalue()
response = textract_client.detect_document_text(Document={"Bytes": image_bytes})
detected_txt = ""
for item in response["Blocks"]:
if item["BlockType"] == "LINE":
detected_txt += item["Text"] + "\n"
metadata = {"source": self.file_path, "page": page_no}
output.append(Document(page_content=detected_txt, metadata=metadata))
return output
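The LINE-block loop in the new loader can be exercised without calling AWS by feeding it a dict shaped like a `detect_document_text` response; the sample response below is fabricated for illustration:

```python
def lines_from_textract_response(response):
    """Join the text of all LINE blocks in a Textract
    detect_document_text response, one line per block."""
    detected = ""
    for item in response.get("Blocks", []):
        if item["BlockType"] == "LINE":
            detected += item["Text"] + "\n"
    return detected


# A made-up response in the shape Textract returns: a list of Blocks
# mixing PAGE, LINE, and WORD items.
sample = {
    "Blocks": [
        {"BlockType": "PAGE"},
        {"BlockType": "LINE", "Text": "Invoice #123"},
        {"BlockType": "WORD", "Text": "Invoice"},
        {"BlockType": "LINE", "Text": "Total: $9.99"},
    ]
}
text = lines_from_textract_response(sample)
```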

View File

@@ -24,7 +24,7 @@ class AzureBlobStorageContainerLoader(BaseLoader):
except ImportError as exc:
raise ValueError(
"Could not import azure storage blob python package. "
"Please install it with `pip install azure-storage-blob`."
"Please it install it with `pip install azure-storage-blob`."
) from exc
container = ContainerClient.from_connection_string(

View File

@@ -24,7 +24,7 @@ class AzureBlobStorageFileLoader(BaseLoader):
except ImportError as exc:
raise ValueError(
"Could not import azure storage blob python package. "
"Please install it with `pip install azure-storage-blob`."
"Please it install it with `pip install azure-storage-blob`."
) from exc
client = BlobClient.from_connection_string(

View File

@@ -1,77 +0,0 @@
import json
import re
import warnings
from typing import List, Tuple
import requests
from langchain.docstore.document import Document
from langchain.document_loaders.base import BaseLoader
class BiliBiliLoader(BaseLoader):
"""Loader that loads bilibili transcripts."""
def __init__(self, video_urls: List[str]):
"""Initialize with bilibili url."""
self.video_urls = video_urls
def load(self) -> List[Document]:
"""Load from bilibili url."""
results = []
for url in self.video_urls:
transcript, video_info = self._get_bilibili_subs_and_info(url)
doc = Document(page_content=transcript, metadata=video_info)
results.append(doc)
return results
def _get_bilibili_subs_and_info(self, url: str) -> Tuple[str, dict]:
try:
from bilibili_api import sync, video
except ImportError:
raise ValueError(
"requests package not found, please install it with "
"`pip install bilibili-api`"
)
bvid = re.search(r"BV\w+", url)
if bvid is not None:
v = video.Video(bvid=bvid.group())
else:
aid = re.search(r"av[0-9]+", url)
if aid is not None:
try:
v = video.Video(aid=int(aid.group()[2:]))
except AttributeError:
raise ValueError(f"{url} is not bilibili url.")
else:
raise ValueError(f"{url} is not bilibili url.")
video_info = sync(v.get_info())
video_info.update({"url": url})
# Get subtitle url
subtitle = video_info.pop("subtitle")
sub_list = subtitle["list"]
if sub_list:
sub_url = sub_list[0]["subtitle_url"]
result = requests.get(sub_url)
raw_sub_titles = json.loads(result.content)["body"]
raw_transcript = " ".join([c["content"] for c in raw_sub_titles])
raw_transcript_with_meta_info = f"""
Video Title: {video_info['title']},
description: {video_info['desc']}\n
Transcript: {raw_transcript}
"""
return raw_transcript_with_meta_info, video_info
else:
raw_transcript = ""
warnings.warn(
f"""
No subtitles found for video: {url}.
Return Empty transcript.
"""
)
return raw_transcript, video_info
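The id-extraction logic in the removed loader — a `BV...` id if present, otherwise a numeric `av...` id — can be sketched standalone, mirroring its regexes:

```python
import re


def extract_bilibili_id(url):
    """Return ("bvid", id) or ("aid", numeric id) from a Bilibili video URL,
    mirroring the regexes in the removed loader; raise ValueError otherwise."""
    bvid = re.search(r"BV\w+", url)
    if bvid is not None:
        return "bvid", bvid.group()
    aid = re.search(r"av[0-9]+", url)
    if aid is not None:
        return "aid", int(aid.group()[2:])
    raise ValueError(f"{url} is not a bilibili url.")


kind, vid = extract_bilibili_id("https://www.bilibili.com/video/BV1xt411o7Xu/")
```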

View File

@@ -35,7 +35,7 @@ class DuckDBLoader(BaseLoader):
except ImportError:
raise ValueError(
"Could not import duckdb python package. "
"Please install it with `pip install duckdb`."
"Please it install it with `pip install duckdb`."
)
docs = []

View File

@@ -22,7 +22,7 @@ class GCSDirectoryLoader(BaseLoader):
except ImportError:
raise ValueError(
"Could not import google-cloud-storage python package. "
"Please install it with `pip install google-cloud-storage`."
"Please it install it with `pip install google-cloud-storage`."
)
client = storage.Client(project=self.project_name)
docs = []

View File

@@ -23,7 +23,7 @@ class GCSFileLoader(BaseLoader):
except ImportError:
raise ValueError(
"Could not import google-cloud-storage python package. "
"Please install it with `pip install google-cloud-storage`."
"Please it install it with `pip install google-cloud-storage`."
)
# Initialise a client

View File

@@ -21,7 +21,7 @@ class S3DirectoryLoader(BaseLoader):
except ImportError:
raise ValueError(
"Could not import boto3 python package. "
"Please install it with `pip install boto3`."
"Please it install it with `pip install boto3`."
)
s3 = boto3.resource("s3")
bucket = s3.Bucket(self.bucket)

View File

@@ -23,7 +23,7 @@ class S3FileLoader(BaseLoader):
except ImportError:
raise ValueError(
"Could not import boto3 python package. "
"Please install it with `pip install boto3`."
"Please it install it with `pip install boto3`."
)
s3 = boto3.client("s3")
with tempfile.TemporaryDirectory() as temp_dir:

View File

@@ -118,7 +118,7 @@ class YoutubeLoader(BaseLoader):
except ImportError:
raise ImportError(
"Could not import youtube_transcript_api python package. "
"Please install it with `pip install youtube-transcript-api`."
"Please it install it with `pip install youtube-transcript-api`."
)
metadata = {"source": self.video_id}
@@ -159,7 +159,7 @@ class YoutubeLoader(BaseLoader):
except ImportError:
raise ImportError(
"Could not import pytube python package. "
"Please install it with `pip install pytube`."
"Please it install it with `pip install pytube`."
)
yt = YouTube(f"https://www.youtube.com/watch?v={self.video_id}")
video_info = {

View File

@@ -58,7 +58,7 @@ class AlephAlphaAsymmetricSemanticEmbedding(BaseModel, Embeddings):
except ImportError:
raise ValueError(
"Could not import aleph_alpha_client python package. "
"Please install it with `pip install aleph_alpha_client`."
"Please it install it with `pip install aleph_alpha_client`."
)
values["client"] = Client(token=aleph_alpha_api_key)
return values
@@ -81,7 +81,7 @@ class AlephAlphaAsymmetricSemanticEmbedding(BaseModel, Embeddings):
except ImportError:
raise ValueError(
"Could not import aleph_alpha_client python package. "
"Please install it with `pip install aleph_alpha_client`."
"Please it install it with `pip install aleph_alpha_client`."
)
document_embeddings = []
@@ -121,7 +121,7 @@ class AlephAlphaAsymmetricSemanticEmbedding(BaseModel, Embeddings):
except ImportError:
raise ValueError(
"Could not import aleph_alpha_client python package. "
"Please install it with `pip install aleph_alpha_client`."
"Please it install it with `pip install aleph_alpha_client`."
)
symmetric_params = {
"prompt": Prompt.from_text(text),
@@ -166,7 +166,7 @@ class AlephAlphaSymmetricSemanticEmbedding(AlephAlphaAsymmetricSemanticEmbedding
except ImportError:
raise ValueError(
"Could not import aleph_alpha_client python package. "
"Please install it with `pip install aleph_alpha_client`."
"Please it install it with `pip install aleph_alpha_client`."
)
query_params = {
"prompt": Prompt.from_text(text),

View File

@@ -48,7 +48,7 @@ class CohereEmbeddings(BaseModel, Embeddings):
except ImportError:
raise ValueError(
"Could not import cohere python package. "
"Please install it with `pip install cohere`."
"Please it install it with `pip install cohere`."
)
return values

View File

@@ -73,7 +73,7 @@ class HuggingFaceHubEmbeddings(BaseModel, Embeddings):
except ImportError:
raise ValueError(
"Could not import huggingface_hub python package. "
"Please install it with `pip install huggingface_hub`."
"Please it install it with `pip install huggingface_hub`."
)
return values

View File

@@ -36,7 +36,7 @@ class JinaEmbeddings(BaseModel, Embeddings):
except ImportError:
raise ValueError(
"Could not import `jina` python package. "
"Please install it with `pip install jina`."
"Please it install it with `pip install jina`."
)
# Setup client

View File

@@ -188,7 +188,7 @@ class OpenAIEmbeddings(BaseModel, Embeddings):
except ImportError:
raise ValueError(
"Could not import openai python package. "
"Please install it with `pip install openai`."
"Please it install it with `pip install openai`."
)
return values
@@ -242,7 +242,7 @@ class OpenAIEmbeddings(BaseModel, Embeddings):
raise ValueError(
"Could not import tiktoken python package. "
"This is needed in order for OpenAIEmbeddings. "
"Please install it with `pip install tiktoken`."
"Please it install it with `pip install tiktoken`."
)
def _embedding_func(self, text: str, *, engine: str) -> List[float]:

View File

@@ -131,7 +131,7 @@ class SagemakerEndpointEmbeddings(BaseModel, Embeddings):
except ImportError:
raise ValueError(
"Could not import boto3 python package. "
"Please install it with `pip install boto3`."
"Please it install it with `pip install boto3`."
)
return values

View File

@@ -56,7 +56,7 @@ class NetworkxEntityGraph:
except ImportError:
raise ValueError(
"Could not import networkx python package. "
"Please install it with `pip install networkx`."
"Please it install it with `pip install networkx`."
)
if graph is not None:
if not isinstance(graph, nx.DiGraph):
@@ -72,7 +72,7 @@ class NetworkxEntityGraph:
except ImportError:
raise ValueError(
"Could not import networkx python package. "
"Please install it with `pip install networkx`."
"Please it install it with `pip install networkx`."
)
graph = nx.read_gml(gml_path)
return cls(graph)

View File

@@ -149,7 +149,7 @@ class AlephAlpha(LLM):
except ImportError:
raise ValueError(
"Could not import aleph_alpha_client python package. "
"Please install it with `pip install aleph_alpha_client`."
"Please it install it with `pip install aleph_alpha_client`."
)
return values

View File

@@ -76,7 +76,7 @@ class Anthropic(LLM):
except ImportError:
raise ValueError(
"Could not import anthropic python package. "
"Please install it with `pip install anthropic`."
"Please it install it with `pip install anthropic`."
)
return values

View File

@@ -73,7 +73,7 @@ class Cohere(LLM):
except ImportError:
raise ValueError(
"Could not import cohere python package. "
"Please install it with `pip install cohere`."
"Please it install it with `pip install cohere`."
)
return values

View File

@@ -70,7 +70,7 @@ class HuggingFaceEndpoint(LLM):
except ImportError:
raise ValueError(
"Could not import huggingface_hub python package. "
"Please install it with `pip install huggingface_hub`."
"Please it install it with `pip install huggingface_hub`."
)
return values

View File

@@ -66,7 +66,7 @@ class HuggingFaceHub(LLM):
except ImportError:
raise ValueError(
"Could not import huggingface_hub python package. "
"Please install it with `pip install huggingface_hub`."
"Please it install it with `pip install huggingface_hub`."
)
return values

View File

@@ -177,8 +177,6 @@ class LlamaCpp(LLM):
raise ValueError("`stop` found in both the input and default params.")
elif self.stop:
params["stop_sequences"] = self.stop
elif stop:
params["stop_sequences"] = stop
else:
params["stop_sequences"] = []
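The stop-sequence precedence in the LlamaCpp hunk above — error if both sources supply `stop`, otherwise prefer the class-level default, then the call-site value, then none — can be sketched as a plain function (hedged, since the surrounding class is not shown here):

```python
def resolve_stop_sequences(default_stop, call_stop):
    """Resolve stop sequences: reject conflicting sources, prefer the
    class-level default, fall back to the call-site value, else none."""
    if default_stop and call_stop:
        raise ValueError("`stop` found in both the input and default params.")
    if default_stop:
        return default_stop
    if call_stop:
        return call_stop
    return []


stops = resolve_stop_sequences(None, ["\n\n"])
```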

View File

@@ -28,7 +28,7 @@ class ManifestWrapper(LLM):
except ImportError:
raise ValueError(
"Could not import manifest python package. "
"Please install it with `pip install manifest-ml`."
"Please it install it with `pip install manifest-ml`."
)
return values

View File

@@ -76,7 +76,7 @@ class NLPCloud(LLM):
except ImportError:
raise ValueError(
"Could not import nlpcloud python package. "
"Please install it with `pip install nlpcloud`."
"Please it install it with `pip install nlpcloud`."
)
return values

View File

@@ -446,7 +446,7 @@ class BaseOpenAI(BaseLLM):
raise ValueError(
"Could not import tiktoken python package. "
"This is needed in order to calculate get_num_tokens. "
"Please install it with `pip install tiktoken`."
"Please it install it with `pip install tiktoken`."
)
encoder = "gpt2"
if self.model_name in ("text-davinci-003", "text-davinci-002"):
@@ -611,7 +611,7 @@ class OpenAIChat(BaseLLM):
except ImportError:
raise ValueError(
"Could not import openai python package. "
"Please install it with `pip install openai`."
"Please it install it with `pip install openai`."
)
try:
values["client"] = openai.ChatCompletion
@@ -742,7 +742,7 @@ class OpenAIChat(BaseLLM):
raise ValueError(
"Could not import tiktoken python package. "
"This is needed in order to calculate get_num_tokens. "
"Please install it with `pip install tiktoken`."
"Please it install it with `pip install tiktoken`."
)
# create a GPT-3.5-Turbo encoder instance
enc = tiktoken.encoding_for_model("gpt-3.5-turbo")

View File

@@ -176,7 +176,7 @@ class SagemakerEndpoint(LLM):
except ImportError:
raise ValueError(
"Could not import boto3 python package. "
"Please install it with `pip install boto3`."
"Please it install it with `pip install boto3`."
)
return values

View File

@@ -5,7 +5,6 @@ from langchain.memory.buffer import (
from langchain.memory.buffer_window import ConversationBufferWindowMemory
from langchain.memory.chat_message_histories.dynamodb import DynamoDBChatMessageHistory
from langchain.memory.chat_message_histories.in_memory import ChatMessageHistory
from langchain.memory.chat_message_histories.postgres import PostgresChatMessageHistory
from langchain.memory.chat_message_histories.redis import RedisChatMessageHistory
from langchain.memory.combined import CombinedMemory
from langchain.memory.entity import (
@@ -37,5 +36,4 @@ __all__ = [
"ConversationTokenBufferMemory",
"RedisChatMessageHistory",
"DynamoDBChatMessageHistory",
"PostgresChatMessageHistory",
]

View File

@@ -1,9 +1,7 @@
from langchain.memory.chat_message_histories.dynamodb import DynamoDBChatMessageHistory
from langchain.memory.chat_message_histories.postgres import PostgresChatMessageHistory
from langchain.memory.chat_message_histories.redis import RedisChatMessageHistory
__all__ = [
"DynamoDBChatMessageHistory",
"RedisChatMessageHistory",
"PostgresChatMessageHistory",
]

View File

@@ -1,86 +0,0 @@
import json
import logging
from typing import List
from langchain.schema import (
AIMessage,
BaseChatMessageHistory,
BaseMessage,
HumanMessage,
_message_to_dict,
messages_from_dict,
)
logger = logging.getLogger(__name__)
DEFAULT_CONNECTION_STRING = "postgresql://postgres:mypassword@localhost/chat_history"
class PostgresChatMessageHistory(BaseChatMessageHistory):
def __init__(
self,
session_id: str,
connection_string: str = DEFAULT_CONNECTION_STRING,
table_name: str = "message_store",
):
import psycopg
from psycopg.rows import dict_row
try:
self.connection = psycopg.connect(connection_string)
self.cursor = self.connection.cursor(row_factory=dict_row)
except psycopg.OperationalError as error:
logger.error(error)
self.session_id = session_id
self.table_name = table_name
self._create_table_if_not_exists()
def _create_table_if_not_exists(self) -> None:
create_table_query = f"""CREATE TABLE IF NOT EXISTS {self.table_name} (
id SERIAL PRIMARY KEY,
session_id TEXT NOT NULL,
message JSONB NOT NULL
);"""
self.cursor.execute(create_table_query)
self.connection.commit()
@property
def messages(self) -> List[BaseMessage]: # type: ignore
"""Retrieve the messages from PostgreSQL"""
query = f"SELECT message FROM {self.table_name} WHERE session_id = %s;"
self.cursor.execute(query, (self.session_id,))
items = [record["message"] for record in self.cursor.fetchall()]
messages = messages_from_dict(items)
return messages
def add_user_message(self, message: str) -> None:
self.append(HumanMessage(content=message))
def add_ai_message(self, message: str) -> None:
self.append(AIMessage(content=message))
def append(self, message: BaseMessage) -> None:
"""Append the message to the record in PostgreSQL"""
from psycopg import sql
query = sql.SQL("INSERT INTO {} (session_id, message) VALUES (%s, %s);").format(
sql.Identifier(self.table_name)
)
self.cursor.execute(
query, (self.session_id, json.dumps(_message_to_dict(message)))
)
self.connection.commit()
def clear(self) -> None:
"""Clear session memory from PostgreSQL"""
query = f"DELETE FROM {self.table_name} WHERE session_id = %s;"
self.cursor.execute(query, (self.session_id,))
self.connection.commit()
def __del__(self) -> None:
if self.cursor:
self.cursor.close()
if self.connection:
self.connection.close()
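The deleted class above persisted each chat message as one JSONB row keyed by `session_id`. The serialize/deserialize round trip it relied on can be sketched with stdlib `json`; `message_to_dict` and `messages_from_dict` below are simplified stand-ins for the `langchain.schema` helpers, not the real implementations:

```python
import json

# Simplified stand-ins for langchain.schema's _message_to_dict /
# messages_from_dict (assumptions, not the real implementations).
def message_to_dict(role: str, content: str) -> dict:
    return {"type": role, "data": {"content": content}}

def messages_from_dict(items: list) -> list:
    return [(d["type"], d["data"]["content"]) for d in items]

# Each append() stores json.dumps(message_to_dict(...)) in a JSONB
# column; the messages property reads the rows back in insert order
# and rebuilds the message objects.
rows = [json.dumps(message_to_dict("human", "hi")),
        json.dumps(message_to_dict("ai", "hello"))]
history = messages_from_dict([json.loads(r) for r in rows])
print(history)
```

Storing one JSON document per message keeps the schema trivial and lets `clear()` be a single `DELETE ... WHERE session_id = %s`.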


@@ -1,6 +1,5 @@
"""Lightweight wrapper around requests library, with async support."""
from contextlib import asynccontextmanager
from typing import Any, AsyncGenerator, Dict, Optional
from typing import Any, Dict, Optional
import aiohttp
import requests
@@ -43,62 +42,47 @@ class Requests(BaseModel):
"""DELETE the URL and return the text."""
return requests.delete(url, headers=self.headers, **kwargs)
@asynccontextmanager
async def _arequest(
self, method: str, url: str, **kwargs: Any
) -> AsyncGenerator[aiohttp.ClientResponse, None]:
) -> aiohttp.ClientResponse:
"""Make an async request."""
if not self.aiosession:
async with aiohttp.ClientSession() as session:
async with session.request(
method, url, headers=self.headers, **kwargs
) as response:
yield response
return response
else:
async with self.aiosession.request(
method, url, headers=self.headers, **kwargs
) as response:
yield response
return response
@asynccontextmanager
async def aget(
self, url: str, **kwargs: Any
) -> AsyncGenerator[aiohttp.ClientResponse, None]:
async def aget(self, url: str, **kwargs: Any) -> aiohttp.ClientResponse:
"""GET the URL and return the text asynchronously."""
async with self._arequest("GET", url, **kwargs) as response:
yield response
return await self._arequest("GET", url, **kwargs)
@asynccontextmanager
async def apost(
self, url: str, data: Dict[str, Any], **kwargs: Any
) -> AsyncGenerator[aiohttp.ClientResponse, None]:
) -> aiohttp.ClientResponse:
"""POST to the URL and return the text asynchronously."""
async with self._arequest("POST", url, **kwargs) as response:
yield response
return await self._arequest("POST", url, json=data, **kwargs)
@asynccontextmanager
async def apatch(
self, url: str, data: Dict[str, Any], **kwargs: Any
) -> AsyncGenerator[aiohttp.ClientResponse, None]:
) -> aiohttp.ClientResponse:
"""PATCH the URL and return the text asynchronously."""
async with self._arequest("PATCH", url, **kwargs) as response:
yield response
return await self._arequest("PATCH", url, json=data, **kwargs)
@asynccontextmanager
async def aput(
self, url: str, data: Dict[str, Any], **kwargs: Any
) -> AsyncGenerator[aiohttp.ClientResponse, None]:
) -> aiohttp.ClientResponse:
"""PUT the URL and return the text asynchronously."""
async with self._arequest("PUT", url, **kwargs) as response:
yield response
return await self._arequest("PUT", url, json=data, **kwargs)
@asynccontextmanager
async def adelete(
self, url: str, **kwargs: Any
) -> AsyncGenerator[aiohttp.ClientResponse, None]:
async def adelete(self, url: str, **kwargs: Any) -> aiohttp.ClientResponse:
"""DELETE the URL and return the text asynchronously."""
async with self._arequest("DELETE", url, **kwargs) as response:
yield response
return await self._arequest("DELETE", url, **kwargs)
class TextRequestsWrapper(BaseModel):
@@ -142,28 +126,28 @@ class TextRequestsWrapper(BaseModel):
async def aget(self, url: str, **kwargs: Any) -> str:
"""GET the URL and return the text asynchronously."""
async with self.requests.aget(url, **kwargs) as response:
return await response.text()
response = await self.requests.aget(url, **kwargs)
return await response.text()
async def apost(self, url: str, data: Dict[str, Any], **kwargs: Any) -> str:
"""POST to the URL and return the text asynchronously."""
async with self.requests.apost(url, **kwargs) as response:
return await response.text()
response = await self.requests.apost(url, data, **kwargs)
return await response.text()
async def apatch(self, url: str, data: Dict[str, Any], **kwargs: Any) -> str:
"""PATCH the URL and return the text asynchronously."""
async with self.requests.apatch(url, **kwargs) as response:
return await response.text()
response = await self.requests.apatch(url, data, **kwargs)
return await response.text()
async def aput(self, url: str, data: Dict[str, Any], **kwargs: Any) -> str:
"""PUT the URL and return the text asynchronously."""
async with self.requests.aput(url, **kwargs) as response:
return await response.text()
response = await self.requests.aput(url, data, **kwargs)
return await response.text()
async def adelete(self, url: str, **kwargs: Any) -> str:
"""DELETE the URL and return the text asynchronously."""
async with self.requests.adelete(url, **kwargs) as response:
return await response.text()
response = await self.requests.adelete(url, **kwargs)
return await response.text()
# For backwards compatibility
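The hunks above revert the async request helpers from `@asynccontextmanager` generators back to plain coroutines. The context-manager variant exists so the (aiohttp) session stays open while the caller reads the body; a minimal stdlib-only sketch of that pattern, where `FakeResponse` and `arequest` are stand-ins rather than the real aiohttp API:

```python
import asyncio
from contextlib import asynccontextmanager

class FakeResponse:
    """Stand-in for aiohttp.ClientResponse (assumption, not the real class)."""

    def __init__(self, body: str) -> None:
        self._body = body

    async def text(self) -> str:
        return self._body

@asynccontextmanager
async def arequest(url: str):
    # The context manager keeps the (here: imaginary) session open
    # while the caller reads the response, then cleans up on exit.
    response = FakeResponse(f"ok:{url}")
    try:
        yield response
    finally:
        pass  # a real ClientSession would be closed here

async def aget(url: str) -> str:
    async with arequest(url) as response:
        return await response.text()

print(asyncio.run(aget("http://example.com")))
```

Returning the response object directly (as the reverted code does) is simpler, but with a real aiohttp session the response body may no longer be readable once the enclosing `async with session` block has exited.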


@@ -194,7 +194,7 @@ class BaseLanguageModel(BaseModel, ABC):
raise ValueError(
"Could not import transformers python package. "
"This is needed in order to calculate get_num_tokens. "
"Please install it with `pip install transformers`."
"Please it install it with `pip install transformers`."
)
# create a GPT-3 tokenizer instance
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")


@@ -131,7 +131,7 @@ class TextSplitter(ABC):
except ImportError:
raise ValueError(
"Could not import transformers python package. "
"Please install it with `pip install transformers`."
"Please it install it with `pip install transformers`."
)
return cls(length_function=_huggingface_tokenizer_length, **kwargs)
@@ -150,7 +150,7 @@ class TextSplitter(ABC):
raise ValueError(
"Could not import tiktoken python package. "
"This is needed in order to calculate max_tokens_for_prompt. "
"Please install it with `pip install tiktoken`."
"Please it install it with `pip install tiktoken`."
)
# create a GPT-3 encoder instance
@@ -205,7 +205,7 @@ class TokenTextSplitter(TextSplitter):
raise ValueError(
"Could not import tiktoken python package. "
"This is needed in order to for TokenTextSplitter. "
"Please install it with `pip install tiktoken`."
"Please it install it with `pip install tiktoken`."
)
# create a GPT-3 encoder instance
self._tokenizer = tiktoken.get_encoding(encoding_name)


@@ -50,7 +50,6 @@ class PythonAstREPLTool(BaseTool):
)
globals: Optional[Dict] = Field(default_factory=dict)
locals: Optional[Dict] = Field(default_factory=dict)
sanitize_input: bool = True
@root_validator(pre=True)
def validate_python_version(cls, values: Dict) -> Dict:
@@ -66,9 +65,6 @@ class PythonAstREPLTool(BaseTool):
def _run(self, query: str) -> str:
"""Use the tool."""
try:
if self.sanitize_input:
# Remove the triple backticks from the query.
query = query.strip().strip("```")
tree = ast.parse(query)
module = ast.Module(tree.body[:-1], type_ignores=[])
exec(ast.unparse(module), self.globals, self.locals) # type: ignore


@@ -31,7 +31,7 @@ class WikipediaAPIWrapper(BaseModel):
except ImportError:
raise ValueError(
"Could not import wikipedia python package. "
"Please install it with `pip install wikipedia`."
"Please it install it with `pip install wikipedia`."
)
return values


@@ -3,7 +3,7 @@ from __future__ import annotations
import logging
import uuid
from typing import Any, Iterable, List, Optional, Type
from typing import Any, Iterable, List, Optional
import numpy as np
@@ -210,7 +210,7 @@ class AtlasDB(VectorStore):
@classmethod
def from_texts(
cls: Type[AtlasDB],
cls,
texts: List[str],
embedding: Optional[Embeddings] = None,
metadatas: Optional[List[dict]] = None,
@@ -270,7 +270,7 @@ class AtlasDB(VectorStore):
@classmethod
def from_documents(
cls: Type[AtlasDB],
cls,
documents: List[Document],
embedding: Optional[Embeddings] = None,
ids: Optional[List[str]] = None,


@@ -1,10 +1,8 @@
"""Interface for vector stores."""
from __future__ import annotations
import asyncio
from abc import ABC, abstractmethod
from functools import partial
from typing import Any, Dict, Iterable, List, Optional, Type, TypeVar
from typing import Any, Dict, Iterable, List, Optional
from pydantic import BaseModel, Field, root_validator
@@ -12,8 +10,6 @@ from langchain.docstore.document import Document
from langchain.embeddings.base import Embeddings
from langchain.schema import BaseRetriever
VST = TypeVar("VST", bound="VectorStore")
class VectorStore(ABC):
"""Interface for vector stores."""
@@ -85,12 +81,7 @@ class VectorStore(ABC):
self, query: str, k: int = 4, **kwargs: Any
) -> List[Document]:
"""Return docs most similar to query."""
# This is a temporary workaround to make the similarity search
# asynchronous. The proper solution is to make the similarity search
# asynchronous in the vector store implementations.
func = partial(self.similarity_search, query, k, **kwargs)
return await asyncio.get_event_loop().run_in_executor(None, func)
raise NotImplementedError
def similarity_search_by_vector(
self, embedding: List[float], k: int = 4, **kwargs: Any
@@ -110,12 +101,7 @@ class VectorStore(ABC):
self, embedding: List[float], k: int = 4, **kwargs: Any
) -> List[Document]:
"""Return docs most similar to embedding vector."""
# This is a temporary workaround to make the similarity search
# asynchronous. The proper solution is to make the similarity search
# asynchronous in the vector store implementations.
func = partial(self.similarity_search_by_vector, embedding, k, **kwargs)
return await asyncio.get_event_loop().run_in_executor(None, func)
raise NotImplementedError
def max_marginal_relevance_search(
self, query: str, k: int = 4, fetch_k: int = 20
@@ -139,12 +125,7 @@ class VectorStore(ABC):
self, query: str, k: int = 4, fetch_k: int = 20
) -> List[Document]:
"""Return docs selected using the maximal marginal relevance."""
# This is a temporary workaround to make the similarity search
# asynchronous. The proper solution is to make the similarity search
# asynchronous in the vector store implementations.
func = partial(self.max_marginal_relevance_search, query, k, fetch_k)
return await asyncio.get_event_loop().run_in_executor(None, func)
raise NotImplementedError
def max_marginal_relevance_search_by_vector(
self, embedding: List[float], k: int = 4, fetch_k: int = 20
@@ -172,11 +153,11 @@ class VectorStore(ABC):
@classmethod
def from_documents(
cls: Type[VST],
cls,
documents: List[Document],
embedding: Embeddings,
**kwargs: Any,
) -> VST:
) -> VectorStore:
"""Return VectorStore initialized from documents and embeddings."""
texts = [d.page_content for d in documents]
metadatas = [d.metadata for d in documents]
@@ -184,11 +165,11 @@ class VectorStore(ABC):
@classmethod
async def afrom_documents(
cls: Type[VST],
cls,
documents: List[Document],
embedding: Embeddings,
**kwargs: Any,
) -> VST:
) -> VectorStore:
"""Return VectorStore initialized from documents and embeddings."""
texts = [d.page_content for d in documents]
metadatas = [d.metadata for d in documents]
@@ -197,22 +178,22 @@ class VectorStore(ABC):
@classmethod
@abstractmethod
def from_texts(
cls: Type[VST],
cls,
texts: List[str],
embedding: Embeddings,
metadatas: Optional[List[dict]] = None,
**kwargs: Any,
) -> VST:
) -> VectorStore:
"""Return VectorStore initialized from texts and embeddings."""
@classmethod
async def afrom_texts(
cls: Type[VST],
cls,
texts: List[str],
embedding: Embeddings,
metadatas: Optional[List[dict]] = None,
**kwargs: Any,
) -> VST:
) -> VectorStore:
"""Return VectorStore initialized from texts and embeddings."""
raise NotImplementedError
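The removed "temporary workaround" in the `VectorStore` hunks wraps the synchronous search in the event loop's default thread-pool executor, so `await` callers get a working async method even though the underlying store only implements a blocking one. The pattern in isolation, with `SyncStore` as a hypothetical stand-in:

```python
import asyncio
from functools import partial

class SyncStore:
    """Stand-in for a vector store with only a blocking search method."""

    def similarity_search(self, query: str, k: int = 4) -> list:
        # In a real store this would hit a database or an index.
        return [f"{query}-{i}" for i in range(k)]

    async def asimilarity_search(self, query: str, k: int = 4) -> list:
        # Workaround: run the blocking call in the default executor
        # instead of blocking the event loop, mirroring the diff.
        func = partial(self.similarity_search, query, k)
        return await asyncio.get_event_loop().run_in_executor(None, func)

print(asyncio.run(SyncStore().asimilarity_search("doc", k=2)))
```

`partial` is needed because `run_in_executor` only accepts positional callables; the proper fix, as the removed comment notes, is a natively async implementation in each store.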


@@ -3,7 +3,7 @@ from __future__ import annotations
import logging
import uuid
from typing import TYPE_CHECKING, Any, Dict, Iterable, List, Optional, Tuple, Type
from typing import TYPE_CHECKING, Any, Dict, Iterable, List, Optional, Tuple
import numpy as np
@@ -269,7 +269,7 @@ class Chroma(VectorStore):
@classmethod
def from_texts(
cls: Type[Chroma],
cls,
texts: List[str],
embedding: Optional[Embeddings] = None,
metadatas: Optional[List[dict]] = None,
@@ -307,7 +307,7 @@ class Chroma(VectorStore):
@classmethod
def from_documents(
cls: Type[Chroma],
cls,
documents: List[Document],
embedding: Optional[Embeddings] = None,
ids: Optional[List[str]] = None,


@@ -1,10 +1,7 @@
"""VectorStore wrapper around a Postgres/PGVector database."""
from __future__ import annotations
import enum
import logging
import uuid
from typing import Any, Dict, Iterable, List, Optional, Tuple, Type
from typing import Any, Dict, Iterable, List, Optional, Tuple
import sqlalchemy
from pgvector.sqlalchemy import Vector
@@ -349,7 +346,7 @@ class PGVector(VectorStore):
@classmethod
def from_texts(
cls: Type[PGVector],
cls,
texts: List[str],
embedding: Embeddings,
metadatas: Optional[List[dict]] = None,
@@ -358,7 +355,7 @@ class PGVector(VectorStore):
ids: Optional[List[str]] = None,
pre_delete_collection: bool = False,
**kwargs: Any,
) -> PGVector:
) -> "PGVector":
"""
Return VectorStore initialized from texts and embeddings.
Postgres connection string is required
@@ -398,7 +395,7 @@ class PGVector(VectorStore):
@classmethod
def from_documents(
cls: Type[PGVector],
cls,
documents: List[Document],
embedding: Embeddings,
collection_name: str = _LANGCHAIN_DEFAULT_COLLECTION_NAME,
@@ -406,7 +403,7 @@ class PGVector(VectorStore):
ids: Optional[List[str]] = None,
pre_delete_collection: bool = False,
**kwargs: Any,
) -> PGVector:
) -> "PGVector":
"""
Return VectorStore initialized from documents and embeddings.
Postgres connection string is required


@@ -2,8 +2,7 @@
from __future__ import annotations
import uuid
import warnings
from typing import Any, Callable, Iterable, List, Optional, Tuple, Union
from typing import Any, Callable, Iterable, List, Optional, Tuple
from langchain.docstore.document import Document
from langchain.embeddings.base import Embeddings
@@ -27,13 +26,13 @@ class Pinecone(VectorStore):
pinecone.init(api_key="***", environment="...")
index = pinecone.Index("langchain-demo")
embeddings = OpenAIEmbeddings()
vectorstore = Pinecone(index, embeddings, "text")
vectorstore = Pinecone(index, embeddings.embed_query, "text")
"""
def __init__(
self,
index: Any,
embeddings: Union[Embeddings, Callable],
embedding_function: Callable,
text_key: str,
namespace: Optional[str] = None,
):
@@ -51,17 +50,7 @@ class Pinecone(VectorStore):
f"got {type(index)}"
)
self._index = index
if isinstance(embeddings, Embeddings):
self._embeddings = embeddings
else:
# This is for backwards compatibility issues. Previously,
# embeddings.embed_query was passed in, not the whole class
warnings.warn(
"passing a function as embeddings is deprecated, "
"you should pass an Embedding object directly. "
"If this throws an error, that is why."
)
self._embeddings = embeddings.__self__ # type: ignore
self._embedding_function = embedding_function
self._text_key = text_key
self._namespace = namespace
@@ -89,14 +78,13 @@ class Pinecone(VectorStore):
if namespace is None:
namespace = self._namespace
# Embed and create the documents
_texts = list(texts)
ids = ids or [str(uuid.uuid4()) for _ in _texts]
embeddings = self._embeddings.embed_documents(_texts)
metadatas = metadatas or [{}] * len(_texts)
for metadata, text in zip(metadatas, _texts):
docs = []
ids = ids or [str(uuid.uuid4()) for _ in texts]
for i, text in enumerate(texts):
embedding = self._embedding_function(text)
metadata = metadatas[i] if metadatas else {}
metadata[self._text_key] = text
docs = list(zip(ids, embeddings, metadatas))
docs.append((ids[i], embedding, metadata))
# upsert to Pinecone
self._index.upsert(vectors=docs, namespace=namespace, batch_size=batch_size)
return ids
@@ -121,7 +109,7 @@ class Pinecone(VectorStore):
"""
if namespace is None:
namespace = self._namespace
query_obj = self._embeddings.embed_query(query)
query_obj = self._embedding_function(query)
docs = []
results = self._index.query(
[query_obj],
@@ -157,7 +145,7 @@ class Pinecone(VectorStore):
"""
if namespace is None:
namespace = self._namespace
query_obj = self._embeddings.embed_query(query)
query_obj = self._embedding_function(query)
docs = []
results = self._index.query(
[query_obj],
@@ -256,7 +244,7 @@ class Pinecone(VectorStore):
# upsert to Pinecone
index.upsert(vectors=list(to_upsert), namespace=namespace)
return cls(index, embedding, text_key, namespace)
return cls(index, embedding.embed_query, text_key, namespace)
@classmethod
def from_existing_index(
@@ -275,4 +263,6 @@ class Pinecone(VectorStore):
"Please install it with `pip install pinecone-client`."
)
return cls(pinecone.Index(index_name), embedding, text_key, namespace)
return cls(
pinecone.Index(index_name), embedding.embed_query, text_key, namespace
)
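The Pinecone hunks revert a backwards-compatibility shim: the constructor accepted either an `Embeddings` object or, for older callers, the bound `embed_query` method, recovering the owning object via `__self__`. That dispatch can be sketched generically (`Embeddings` here is a minimal stand-in, not the langchain class):

```python
import warnings

class Embeddings:
    """Stand-in embeddings interface (assumption, not the real class)."""

    def embed_query(self, text: str) -> list:
        return [float(len(text))]

def resolve_embeddings(embeddings) -> Embeddings:
    if isinstance(embeddings, Embeddings):
        return embeddings
    # Backwards compatibility: callers used to pass embed_query itself,
    # so recover the owning instance from the bound method.
    warnings.warn(
        "passing a function as embeddings is deprecated; "
        "pass an Embeddings object directly"
    )
    return embeddings.__self__

emb = Embeddings()
print(resolve_embeddings(emb) is emb, resolve_embeddings(emb.embed_query) is emb)
```

Holding the whole `Embeddings` object (rather than just a callable) is what allows the batched `embed_documents` call in `add_texts`, instead of embedding one text at a time.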


@@ -1,9 +1,7 @@
"""Wrapper around Qdrant vector database."""
from __future__ import annotations
import uuid
from operator import itemgetter
from typing import Any, Callable, Dict, Iterable, List, Optional, Tuple, Type, Union
from typing import Any, Callable, Dict, Iterable, List, Optional, Tuple, Union, cast
from langchain.docstore.document import Document
from langchain.embeddings.base import Embeddings
@@ -178,9 +176,55 @@ class Qdrant(VectorStore):
for i in mmr_selected
]
@classmethod
def from_documents(
cls,
documents: List[Document],
embedding: Embeddings,
location: Optional[str] = None,
url: Optional[str] = None,
port: Optional[int] = 6333,
grpc_port: int = 6334,
prefer_grpc: bool = False,
https: Optional[bool] = None,
api_key: Optional[str] = None,
prefix: Optional[str] = None,
timeout: Optional[float] = None,
host: Optional[str] = None,
path: Optional[str] = None,
collection_name: Optional[str] = None,
distance_func: str = "Cosine",
content_payload_key: str = CONTENT_KEY,
metadata_payload_key: str = METADATA_KEY,
**kwargs: Any,
) -> "Qdrant":
return cast(
Qdrant,
super().from_documents(
documents,
embedding,
location=location,
url=url,
port=port,
grpc_port=grpc_port,
prefer_grpc=prefer_grpc,
https=https,
api_key=api_key,
prefix=prefix,
timeout=timeout,
host=host,
path=path,
collection_name=collection_name,
distance_func=distance_func,
content_payload_key=content_payload_key,
metadata_payload_key=metadata_payload_key,
**kwargs,
),
)
@classmethod
def from_texts(
cls: Type[Qdrant],
cls,
texts: List[str],
embedding: Embeddings,
metadatas: Optional[List[dict]] = None,
@@ -200,7 +244,7 @@ class Qdrant(VectorStore):
content_payload_key: str = CONTENT_KEY,
metadata_payload_key: str = METADATA_KEY,
**kwargs: Any,
) -> Qdrant:
) -> "Qdrant":
"""Construct Qdrant wrapper from raw documents.
Args:


@@ -4,7 +4,7 @@ from __future__ import annotations
import json
import logging
import uuid
from typing import Any, Callable, Dict, Iterable, List, Mapping, Optional, Tuple, Type
from typing import Any, Callable, Dict, Iterable, List, Mapping, Optional, Tuple
import numpy as np
from pydantic import BaseModel, root_validator
@@ -227,7 +227,7 @@ class Redis(VectorStore):
@classmethod
def from_texts(
cls: Type[Redis],
cls,
texts: List[str],
embedding: Embeddings,
metadatas: Optional[List[dict]] = None,


@@ -1,7 +1,7 @@
"""Wrapper around weaviate vector database."""
from __future__ import annotations
from typing import Any, Dict, Iterable, List, Optional, Type
from typing import Any, Dict, Iterable, List, Optional
from uuid import uuid4
from langchain.docstore.document import Document
@@ -104,11 +104,11 @@ class Weaviate(VectorStore):
@classmethod
def from_texts(
cls: Type[Weaviate],
cls,
texts: List[str],
embedding: Embeddings,
metadatas: Optional[List[dict]] = None,
**kwargs: Any,
) -> Weaviate:
) -> VectorStore:
"""Not implemented for Weaviate yet."""
raise NotImplementedError("weaviate does not currently support `from_texts`.")

poetry.lock generated

@@ -562,7 +562,7 @@ name = "backoff"
version = "2.2.1"
description = "Function decoration for backoff and retry"
category = "main"
optional = false
optional = true
python-versions = ">=3.7,<4.0"
files = [
{file = "backoff-2.2.1-py3-none-any.whl", hash = "sha256:63579f9a0628e06278f7e47b7d7d5b6ce20dc65c5e96a6f3ca99a6adca0396e8"},
@@ -936,31 +936,6 @@ files = [
{file = "charset_normalizer-3.1.0-py3-none-any.whl", hash = "sha256:3d9098b479e78c85080c98e1e35ff40b4a31d8953102bb0fd7d1b6f8a2111a3d"},
]
[[package]]
name = "chromadb"
version = "0.3.21"
description = "Chroma."
category = "dev"
optional = false
python-versions = ">=3.7"
files = [
{file = "chromadb-0.3.21-py3-none-any.whl", hash = "sha256:b497516ef403d357944742b2363eb729019d68ec0d1a7062a6abe8e127ccf28f"},
{file = "chromadb-0.3.21.tar.gz", hash = "sha256:7b3417892666dc90df10eafae719ee189037c448c1c96e6c7964daa870483c3a"},
]
[package.dependencies]
clickhouse-connect = ">=0.5.7"
duckdb = ">=0.7.1"
fastapi = ">=0.85.1"
hnswlib = ">=0.7"
numpy = ">=1.21.6"
pandas = ">=1.3"
posthog = ">=2.4.0"
pydantic = ">=1.9"
requests = ">=2.28"
sentence-transformers = ">=2.2.2"
uvicorn = {version = ">=0.18.3", extras = ["standard"]}
[[package]]
name = "click"
version = "8.1.3"
@@ -976,126 +951,6 @@ files = [
[package.dependencies]
colorama = {version = "*", markers = "platform_system == \"Windows\""}
[[package]]
name = "clickhouse-connect"
version = "0.5.20"
description = "ClickHouse core driver, SqlAlchemy, and Superset libraries"
category = "dev"
optional = false
python-versions = "~=3.7"
files = [
{file = "clickhouse-connect-0.5.20.tar.gz", hash = "sha256:5fc9a84849f3c3b6f6928b45a0df17fa63ebcf4e518b3a48ec70720957e18683"},
{file = "clickhouse_connect-0.5.20-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:c29cf8b2c90eed6b83366c13ab5ad471ff6ef2e334f35818729330854b9747ac"},
{file = "clickhouse_connect-0.5.20-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:5c03ded1b006fa2cf8f7d823f0ff9c6d294e442a123c96ca2a9ebc4b293bfb7f"},
{file = "clickhouse_connect-0.5.20-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9eb0024160412d9c6079fa6982cb29abda4db8412b4f63918de7a1bde1dcb7aa"},
{file = "clickhouse_connect-0.5.20-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:170bd258d21bc828557f8a55f23affe22cc4e671c93f645a6316ef874e359f8e"},
{file = "clickhouse_connect-0.5.20-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:dc70fee875fdba42c0a6f519fa376659a08253fd36d188b8b304f4ccda572177"},
{file = "clickhouse_connect-0.5.20-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:18837e06846797db475b6aee13f03928fb169f64d0efb268e2bb04e015990b5b"},
{file = "clickhouse_connect-0.5.20-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:76f7a7d2d41377e6f382a7ada825be594c2d316481f3194bfffd025727633258"},
{file = "clickhouse_connect-0.5.20-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:3bac453f1199af29ec7292d2fd2a8cb0cc0e6692bec9c9da50ce5aec10ff0339"},
{file = "clickhouse_connect-0.5.20-cp310-cp310-win32.whl", hash = "sha256:14983562d2687b18d03a35f27b4e7f28cf013c280ff4fee726501e03bae7528d"},
{file = "clickhouse_connect-0.5.20-cp310-cp310-win_amd64.whl", hash = "sha256:3d618a9c15ee4d2facc7a79e59a646262da64e6ec39d2a1ac6a68167d52266bf"},
{file = "clickhouse_connect-0.5.20-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6bdfb74ba2bf5157230f576e16c7d708f20ffa7e4b19c54288d7db2b55ebcd17"},
{file = "clickhouse_connect-0.5.20-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:fce7e54ad14b732479c5630948324f7088c3092a74a2442bf015a7cab4bc0a41"},
{file = "clickhouse_connect-0.5.20-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1e6a2b6d123f5de362d49f079c509a0a43cfbaecae0130c860706ef738af12b7"},
{file = "clickhouse_connect-0.5.20-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7a9391128387013755de8e420bb7e17c6c809f77ca3233fdc966a1df023fa85d"},
{file = "clickhouse_connect-0.5.20-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1df976816913675b46134e8dd9dee2cf315cc4bf42e258211f8036099b8fc280"},
{file = "clickhouse_connect-0.5.20-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:f1ddeb651bc75b87ec5fa1fbe17fe3a589d00f42cad76d6e64918067f5025798"},
{file = "clickhouse_connect-0.5.20-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:caf60b4bfb7214d80455137eee45ca0943a370885d65f4298fafde0d431e837a"},
{file = "clickhouse_connect-0.5.20-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5c0bdcb72607244dc920f543ee6363a6094e836770aaac07f20556936af85813"},
{file = "clickhouse_connect-0.5.20-cp311-cp311-win32.whl", hash = "sha256:cc3f77df2b1cab2aa99b59f529aead2cc96beac1639ed18f7fd8dba392957623"},
{file = "clickhouse_connect-0.5.20-cp311-cp311-win_amd64.whl", hash = "sha256:e44c3b7e40402ce0650f69cbc31f2f503073e2bec9f2b31befbd823150f2431d"},
{file = "clickhouse_connect-0.5.20-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:ba78e7d270d78f9559e4a836c6c4f55ab54d9f2b6505c0d05db6260e8e2a4f6a"},
{file = "clickhouse_connect-0.5.20-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3e8924824cd19b739cc920d867bf291a31a5da406637e0c575f6eb961cfb0557"},
{file = "clickhouse_connect-0.5.20-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:672c260c471fd18a87a4f5130e6d72590cd4f57289669c58feff5be934810d28"},
{file = "clickhouse_connect-0.5.20-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:69887898f8f5ea6e70c30aa51c756f8a752ef0eb1df747d4aec7b7d10de5e103"},
{file = "clickhouse_connect-0.5.20-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:c4da55465a52e0e440772e289e6959cc6acbb2efa0561a7ea4f9a7108159958d"},
{file = "clickhouse_connect-0.5.20-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:2087b64ab47969e603cd9735e7c0433bdf15c6d83025abd00c50ca9a617ed39b"},
{file = "clickhouse_connect-0.5.20-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:28b72cabb1d4fc3f04392ed1f654bd925b6c950305869971186f73b2d13d835a"},
{file = "clickhouse_connect-0.5.20-cp37-cp37m-win32.whl", hash = "sha256:a481e13216de227aa624449f5f6ead9e51fe7c8f18bbd783c41e4b396919fa08"},
{file = "clickhouse_connect-0.5.20-cp37-cp37m-win_amd64.whl", hash = "sha256:c1dc77bdc15240d6d4d375e098c77403aeabbc6f8b1c2ce524f4389a5d8c6d74"},
{file = "clickhouse_connect-0.5.20-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:4fe527b6b4306cad58dde934493d5f018166f78f5914f6abf6ed93750ca7ecbd"},
{file = "clickhouse_connect-0.5.20-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:c07b9ca21d302e843aa8c031ef15f85c86280c5730858edfe4eeb952d3991d1d"},
{file = "clickhouse_connect-0.5.20-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:71e427b3cd1f611bcb8315ea9bc17f0329329ca21043f1a5ef068e2903457b9b"},
{file = "clickhouse_connect-0.5.20-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9319037b437c8d1297b00d8bc3f92239cc2296db409b5bfc2ff22b05c5f3a26f"},
{file = "clickhouse_connect-0.5.20-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d8c3c533fd2baff653dc40e7b88ca86ce9b8d0923c34fb33ce5ce1d1b7370fe6"},
{file = "clickhouse_connect-0.5.20-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:c850bc0cf5a00bd144202a6926b646baa60fb4e6c449b62d46c230c548ec760a"},
{file = "clickhouse_connect-0.5.20-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:632922c90cd71fcb8e1b7e6e2a9b4487dee2e67b91846dc1778cfd9d5198d047"},
{file = "clickhouse_connect-0.5.20-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:a6c7733b5754ea048bd7928b0cce6625d71c709570c97f1819ba36054850d915"},
{file = "clickhouse_connect-0.5.20-cp38-cp38-win32.whl", hash = "sha256:738b35e061a3c665e9a099a3b5cb50338bed89a6eee3ce29190cd525a1bc1892"},
{file = "clickhouse_connect-0.5.20-cp38-cp38-win_amd64.whl", hash = "sha256:58da16eac95126d441f106d27c8e3ae931fcc784f263d7d916b5a8086bdcf757"},
{file = "clickhouse_connect-0.5.20-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:b9c57f6958021ec0b22eabaa02e567df3ff5f85fdfd9d052e3aface655bdf3d1"},
{file = "clickhouse_connect-0.5.20-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:e9c9a2de183a85fc32ef70973cfad5c9af2a8d73733aa30b9523c1400b813c13"},
{file = "clickhouse_connect-0.5.20-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:50fd663b132c4edc1fc5dae33c5cbd2538dd2e0c94bd9fff5e98ca3ca12059a2"},
{file = "clickhouse_connect-0.5.20-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:26a98b165fa2c8420e5219db244f0790b13f401a0932c6a7d5e5c1a959a26b80"},
{file = "clickhouse_connect-0.5.20-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9686bd02a16e3b6cbf976b2476e54bc7caaf1a95fd129fd44b2692d082dfcef6"},
{file = "clickhouse_connect-0.5.20-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:d01a51871dde0cd0d24efafd61ab27c57293a0456a26ec7e8a5a585623239ab1"},
{file = "clickhouse_connect-0.5.20-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:2c1096ebad10964fcdd646f41228accf182d24b066cefd18d9b33f021e3017cd"},
{file = "clickhouse_connect-0.5.20-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:1f0407cc9ea9d2cf51edfe59993c536c256ae54c40c6b36fb7f738edd48f51b5"},
{file = "clickhouse_connect-0.5.20-cp39-cp39-win32.whl", hash = "sha256:184f7c119c9725b25ecaa3011420de8dc06530999653508a983b27c90894146c"},
{file = "clickhouse_connect-0.5.20-cp39-cp39-win_amd64.whl", hash = "sha256:f7d2cbde4543cccddef8465afed221f81095eec3d3b763d7570c22ae99819ab4"},
{file = "clickhouse_connect-0.5.20-pp37-pypy37_pp73-macosx_10_9_x86_64.whl", hash = "sha256:7f83a6e61b9832fc9184bf67e3f7bc041f3b940c066b8162bfadf02aa484b1c4"},
{file = "clickhouse_connect-0.5.20-pp37-pypy37_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c61b22a7038553813a8f5432cd3b1e57b6d94c629d599d775f57c64c4700a5df"},
{file = "clickhouse_connect-0.5.20-pp37-pypy37_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fbae752fadbd9fa9390f2246c5ce6e75a91225d03adb3451beb49bd3f1ea48f0"},
{file = "clickhouse_connect-0.5.20-pp37-pypy37_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b9da5c94be2255d6e07e255899411a5e009723f331d90359e5b21c66e8007630"},
{file = "clickhouse_connect-0.5.20-pp37-pypy37_pp73-win_amd64.whl", hash = "sha256:205a3dc992548891150d42856e418398d135d9dfa5f30f53bb7c3633d6b449d0"},
{file = "clickhouse_connect-0.5.20-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:5e0c42adc692f2fb285f5f898d166cf4ed9b5779e5f3effab8f612cd3362f004"},
{file = "clickhouse_connect-0.5.20-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2e8a2d9dfbfd7c3075f5d1c7011e32b5b62853000d16f93684fa69d8b8979a04"},
{file = "clickhouse_connect-0.5.20-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f2f8bb09db27aba694193073137bd69f8404e53c2ee80f2dbf41c829c081175a"},
{file = "clickhouse_connect-0.5.20-pp38-pypy38_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:52e07d91e3bcaf3989d698a4d9ad9b36f1dcf357673cc4c44a6663ab78581066"},
{file = "clickhouse_connect-0.5.20-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:7832b2c4c4c4b316258bd078b54a82c84aeccd62c917eb986059de738b13b56b"},
{file = "clickhouse_connect-0.5.20-pp39-pypy39_pp73-macosx_10_9_x86_64.whl", hash = "sha256:2e7dad00ce8df847f896c50aa9644c685259a995a15823fec788348e736fb893"},
{file = "clickhouse_connect-0.5.20-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:34b6c4f16d8b4c5c458504da64e87fb2ec1390640ed7345bf051cfbba18526f4"},
{file = "clickhouse_connect-0.5.20-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:85ce3896158cbac451253bc3632140920a57bb775a82d68370de9ace97ce96a8"},
{file = "clickhouse_connect-0.5.20-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:65f1e552c4efdab1937ff824f062561fe0b6901044ea06b373a35c8a1a679cea"},
{file = "clickhouse_connect-0.5.20-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:05d1cfd70fd90b5d7cdb4e93d603d34f74d34327811e8f573fbbd87838cfd4a3"},
]
[package.dependencies]
certifi = "*"
lz4 = "*"
pytz = "*"
urllib3 = ">=1.26"
zstandard = "*"
[package.extras]
arrow = ["pyarrow"]
numpy = ["numpy"]
orjson = ["orjson"]
pandas = ["pandas"]
sqlalchemy = ["sqlalchemy (>1.3.21,<1.4)"]
superset = ["apache-superset (>=1.4.1)"]
[[package]]
name = "cmake"
version = "3.26.1"
description = "CMake is an open-source, cross-platform family of tools designed to build, test and package software"
category = "main"
optional = false
python-versions = "*"
files = [
{file = "cmake-3.26.1-py2.py3-none-macosx_10_10_universal2.macosx_10_10_x86_64.macosx_11_0_arm64.macosx_11_0_universal2.whl", hash = "sha256:d8a7e0cc8677677a732aff3e3fd0ad64eeff43cac772614b03c436912247d0d8"},
{file = "cmake-3.26.1-py2.py3-none-manylinux2010_i686.manylinux_2_12_i686.whl", hash = "sha256:f2f721f5aebe304c281ee4b1d2dfbf7f4a52fca003834b2b4a3ba838aeded63c"},
{file = "cmake-3.26.1-py2.py3-none-manylinux2010_x86_64.manylinux_2_12_x86_64.whl", hash = "sha256:63a012b72836702eadfe4fba9642aeb17337f26861f4768e837053f40e98cb46"},
{file = "cmake-3.26.1-py2.py3-none-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:2b72be88b7bfaa6ae59566cbb9d6a5553f19b2a8d14efa6ac0cf019a29860a1b"},
{file = "cmake-3.26.1-py2.py3-none-manylinux2014_i686.manylinux_2_17_i686.whl", hash = "sha256:1278354f7210e22458aa9137d46a56da1f115a7b76ad2733f0bf6041fb40f1dc"},
{file = "cmake-3.26.1-py2.py3-none-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:de96a5522917fba0ab0da2d01d9dd9462fa80f365218bf27162d539c2335758f"},
{file = "cmake-3.26.1-py2.py3-none-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:449928ad7dfcd41e4dcff64c7d44f86557883c70577666a19e79e22d783bbbd0"},
{file = "cmake-3.26.1-py2.py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:19fa3e457afecf2803265f71652ef17c3f1d317173c330ba46767a0853d38fa0"},
{file = "cmake-3.26.1-py2.py3-none-musllinux_1_1_aarch64.whl", hash = "sha256:43360650d60d177d979e4ad0a5f31afa286e6d88f5350f7a38c29d94514900eb"},
{file = "cmake-3.26.1-py2.py3-none-musllinux_1_1_i686.whl", hash = "sha256:16aac10363bc926da5109a59ef8fe46ddcd7e3d421de61f871b35524eef2f1ae"},
{file = "cmake-3.26.1-py2.py3-none-musllinux_1_1_ppc64le.whl", hash = "sha256:e460ba5070be4dcac9613cb526a46db4e5fa19d8b909a8d8d5244c6cc3c777e1"},
{file = "cmake-3.26.1-py2.py3-none-musllinux_1_1_s390x.whl", hash = "sha256:fd2ecc0899f7939a014bd906df85e8681bd63ce457de3ab0b5d9e369fa3bdf79"},
{file = "cmake-3.26.1-py2.py3-none-musllinux_1_1_x86_64.whl", hash = "sha256:22781a23e274ba9bf380b970649654851c1b4b9d83b65fec12ee2e2e03b6ffc4"},
{file = "cmake-3.26.1-py2.py3-none-win32.whl", hash = "sha256:7b4e81de30ac1fb2f1eb5287063e140b53f376fd9ed7e2060c1c7b5917bd5f83"},
{file = "cmake-3.26.1-py2.py3-none-win_amd64.whl", hash = "sha256:90845b6c87a25be07e9220f67dd7f6c891c6ec14d764d37335218d97f9ea4520"},
{file = "cmake-3.26.1-py2.py3-none-win_arm64.whl", hash = "sha256:43bd96327e2631183bb4829ba20cb810e20b4b0c68f852fcd7082fbb5359d57c"},
{file = "cmake-3.26.1.tar.gz", hash = "sha256:4e0eb3c03dcf2d459f78d96cc85f7482476aeb1ae5ada65150b1db35c0f70cc7"},
]
[package.extras]
test = ["codecov (>=2.0.5)", "coverage (>=4.2)", "flake8 (>=3.0.4)", "path.py (>=11.5.0)", "pytest (>=3.0.3)", "pytest-cov (>=2.4.0)", "pytest-runner (>=2.9)", "pytest-virtualenv (>=1.7.0)", "scikit-build (>=0.10.0)", "setuptools (>=28.0.0)", "virtualenv (>=15.0.3)", "wheel"]
[[package]]
name = "cohere"
version = "3.10.0"
@@ -1746,7 +1601,7 @@ name = "fastapi"
version = "0.95.0"
description = "FastAPI framework, high performance, easy to learn, fast to code, ready for production"
category = "main"
-optional = false
+optional = true
python-versions = ">=3.7"
files = [
{file = "fastapi-0.95.0-py3-none-any.whl", hash = "sha256:daf73bbe844180200be7966f68e8ec9fd8be57079dff1bacb366db32729e6eb5"},
@@ -2326,7 +2181,7 @@ name = "h11"
version = "0.14.0"
description = "A pure-Python, bring-your-own-I/O implementation of HTTP/1.1"
category = "main"
-optional = false
+optional = true
python-versions = ">=3.7"
files = [
{file = "h11-0.14.0-py3-none-any.whl", hash = "sha256:e3fe4ac4b851c468cc8363d500db52c2ead036020723024a109d37346efaa761"},
@@ -2387,20 +2242,6 @@ files = [
[package.dependencies]
numpy = ">=1.14.5"
[[package]]
name = "hnswlib"
version = "0.7.0"
description = "hnswlib"
category = "dev"
optional = false
python-versions = "*"
files = [
{file = "hnswlib-0.7.0.tar.gz", hash = "sha256:bc459668e7e44bb7454b256b90c98c5af750653919d9a91698dafcf416cf64c4"},
]
[package.dependencies]
numpy = "*"
[[package]]
name = "hpack"
version = "4.0.0"
@@ -2455,7 +2296,7 @@ name = "httptools"
version = "0.5.0"
description = "A collection of framework independent HTTP protocol utils."
category = "main"
-optional = false
+optional = true
python-versions = ">=3.5.0"
files = [
{file = "httptools-0.5.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:8f470c79061599a126d74385623ff4744c4e0f4a0997a353a44923c0b561ee51"},
@@ -3026,7 +2867,7 @@ name = "joblib"
version = "1.2.0"
description = "Lightweight pipelining with Python functions"
category = "main"
-optional = false
+optional = true
python-versions = ">=3.7"
files = [
{file = "joblib-1.2.0-py3-none-any.whl", hash = "sha256:091138ed78f800342968c523bdde947e7a305b8594b910a0fea2ab83c3c6d385"},
@@ -3375,17 +3216,6 @@ beautifulsoup4 = ">=4.8.1"
dnspython = ">=2.0"
requests = ">=2.20"
[[package]]
name = "lit"
version = "16.0.0"
description = "A Software Testing Tool"
category = "main"
optional = false
python-versions = "*"
files = [
{file = "lit-16.0.0.tar.gz", hash = "sha256:3c4ac372122a1de4a88deb277b956f91b7209420a0bef683b1ab2d2b16dabe11"},
]
[[package]]
name = "livereload"
version = "2.6.3"
@@ -3421,56 +3251,6 @@ win32-setctime = {version = ">=1.0.0", markers = "sys_platform == \"win32\""}
[package.extras]
dev = ["Sphinx (>=4.1.1)", "black (>=19.10b0)", "colorama (>=0.3.4)", "docutils (==0.16)", "flake8 (>=3.7.7)", "isort (>=5.1.1)", "pytest (>=4.6.2)", "pytest-cov (>=2.7.1)", "sphinx-autobuild (>=0.7.1)", "sphinx-rtd-theme (>=0.4.3)", "tox (>=3.9.0)"]
[[package]]
name = "lz4"
version = "4.3.2"
description = "LZ4 Bindings for Python"
category = "dev"
optional = false
python-versions = ">=3.7"
files = [
{file = "lz4-4.3.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:1c4c100d99eed7c08d4e8852dd11e7d1ec47a3340f49e3a96f8dfbba17ffb300"},
{file = "lz4-4.3.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:edd8987d8415b5dad25e797043936d91535017237f72fa456601be1479386c92"},
{file = "lz4-4.3.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f7c50542b4ddceb74ab4f8b3435327a0861f06257ca501d59067a6a482535a77"},
{file = "lz4-4.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0f5614d8229b33d4a97cb527db2a1ac81308c6e796e7bdb5d1309127289f69d5"},
{file = "lz4-4.3.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8f00a9ba98f6364cadda366ae6469b7b3568c0cced27e16a47ddf6b774169270"},
{file = "lz4-4.3.2-cp310-cp310-win32.whl", hash = "sha256:b10b77dc2e6b1daa2f11e241141ab8285c42b4ed13a8642495620416279cc5b2"},
{file = "lz4-4.3.2-cp310-cp310-win_amd64.whl", hash = "sha256:86480f14a188c37cb1416cdabacfb4e42f7a5eab20a737dac9c4b1c227f3b822"},
{file = "lz4-4.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:7c2df117def1589fba1327dceee51c5c2176a2b5a7040b45e84185ce0c08b6a3"},
{file = "lz4-4.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1f25eb322eeb24068bb7647cae2b0732b71e5c639e4e4026db57618dcd8279f0"},
{file = "lz4-4.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8df16c9a2377bdc01e01e6de5a6e4bbc66ddf007a6b045688e285d7d9d61d1c9"},
{file = "lz4-4.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f571eab7fec554d3b1db0d666bdc2ad85c81f4b8cb08906c4c59a8cad75e6e22"},
{file = "lz4-4.3.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7211dc8f636ca625abc3d4fb9ab74e5444b92df4f8d58ec83c8868a2b0ff643d"},
{file = "lz4-4.3.2-cp311-cp311-win32.whl", hash = "sha256:867664d9ca9bdfce840ac96d46cd8838c9ae891e859eb98ce82fcdf0e103a947"},
{file = "lz4-4.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:a6a46889325fd60b8a6b62ffc61588ec500a1883db32cddee9903edfba0b7584"},
{file = "lz4-4.3.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:3a85b430138882f82f354135b98c320dafb96fc8fe4656573d95ab05de9eb092"},
{file = "lz4-4.3.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:65d5c93f8badacfa0456b660285e394e65023ef8071142e0dcbd4762166e1be0"},
{file = "lz4-4.3.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6b50f096a6a25f3b2edca05aa626ce39979d63c3b160687c8c6d50ac3943d0ba"},
{file = "lz4-4.3.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:200d05777d61ba1ff8d29cb51c534a162ea0b4fe6d3c28be3571a0a48ff36080"},
{file = "lz4-4.3.2-cp37-cp37m-win32.whl", hash = "sha256:edc2fb3463d5d9338ccf13eb512aab61937be50aa70734bcf873f2f493801d3b"},
{file = "lz4-4.3.2-cp37-cp37m-win_amd64.whl", hash = "sha256:83acfacab3a1a7ab9694333bcb7950fbeb0be21660d236fd09c8337a50817897"},
{file = "lz4-4.3.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:7a9eec24ec7d8c99aab54de91b4a5a149559ed5b3097cf30249b665689b3d402"},
{file = "lz4-4.3.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:31d72731c4ac6ebdce57cd9a5cabe0aecba229c4f31ba3e2c64ae52eee3fdb1c"},
{file = "lz4-4.3.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:83903fe6db92db0be101acedc677aa41a490b561567fe1b3fe68695b2110326c"},
{file = "lz4-4.3.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:926b26db87ec8822cf1870efc3d04d06062730ec3279bbbd33ba47a6c0a5c673"},
{file = "lz4-4.3.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e05afefc4529e97c08e65ef92432e5f5225c0bb21ad89dee1e06a882f91d7f5e"},
{file = "lz4-4.3.2-cp38-cp38-win32.whl", hash = "sha256:ad38dc6a7eea6f6b8b642aaa0683253288b0460b70cab3216838747163fb774d"},
{file = "lz4-4.3.2-cp38-cp38-win_amd64.whl", hash = "sha256:7e2dc1bd88b60fa09b9b37f08553f45dc2b770c52a5996ea52b2b40f25445676"},
{file = "lz4-4.3.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:edda4fb109439b7f3f58ed6bede59694bc631c4b69c041112b1b7dc727fffb23"},
{file = "lz4-4.3.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:0ca83a623c449295bafad745dcd399cea4c55b16b13ed8cfea30963b004016c9"},
{file = "lz4-4.3.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d5ea0e788dc7e2311989b78cae7accf75a580827b4d96bbaf06c7e5a03989bd5"},
{file = "lz4-4.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a98b61e504fb69f99117b188e60b71e3c94469295571492a6468c1acd63c37ba"},
{file = "lz4-4.3.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4931ab28a0d1c133104613e74eec1b8bb1f52403faabe4f47f93008785c0b929"},
{file = "lz4-4.3.2-cp39-cp39-win32.whl", hash = "sha256:ec6755cacf83f0c5588d28abb40a1ac1643f2ff2115481089264c7630236618a"},
{file = "lz4-4.3.2-cp39-cp39-win_amd64.whl", hash = "sha256:4caedeb19e3ede6c7a178968b800f910db6503cb4cb1e9cc9221157572139b49"},
{file = "lz4-4.3.2.tar.gz", hash = "sha256:e1431d84a9cfb23e6773e72078ce8e65cad6745816d4cbf9ae67da5ea419acda"},
]
[package.extras]
docs = ["sphinx (>=1.6.0)", "sphinx-bootstrap-theme"]
flake8 = ["flake8"]
tests = ["psutil", "pytest (!=3.3.0)", "pytest-cov"]
[[package]]
name = "manifest-ml"
version = "0.0.1"
@@ -3690,18 +3470,6 @@ files = [
{file = "mistune-2.0.5.tar.gz", hash = "sha256:0246113cb2492db875c6be56974a7c893333bf26cd92891c85f63151cee09d34"},
]
[[package]]
name = "monotonic"
version = "1.6"
description = "An implementation of time.monotonic() for Python 2 & < 3.3"
category = "dev"
optional = false
python-versions = "*"
files = [
{file = "monotonic-1.6-py2.py3-none-any.whl", hash = "sha256:68687e19a14f11f26d140dd5c86f3dba4bf5df58003000ed467e0e2a69bca96c"},
{file = "monotonic-1.6.tar.gz", hash = "sha256:3a55207bcfed53ddd5c5bae174524062935efed17792e9de2ad0205ce9ad63f7"},
]
[[package]]
name = "more-itertools"
version = "9.1.0"
@@ -3714,24 +3482,6 @@ files = [
{file = "more_itertools-9.1.0-py3-none-any.whl", hash = "sha256:d2bc7f02446e86a68911e58ded76d6561eea00cddfb2a91e7019bbb586c799f3"},
]
[[package]]
name = "mpmath"
version = "1.3.0"
description = "Python library for arbitrary-precision floating-point arithmetic"
category = "main"
optional = false
python-versions = "*"
files = [
{file = "mpmath-1.3.0-py3-none-any.whl", hash = "sha256:a0b2b9fe80bbcd81a6647ff13108738cfb482d481d826cc0e02f5b35e5c88d2c"},
{file = "mpmath-1.3.0.tar.gz", hash = "sha256:7a28eb2a9774d00c7bc92411c19a89209d5da7c4c9a9e227be8330a23a25b91f"},
]
[package.extras]
develop = ["codecov", "pycodestyle", "pytest (>=4.6)", "pytest-cov", "wheel"]
docs = ["sphinx"]
gmpy = ["gmpy2 (>=2.1.0a4)"]
tests = ["pytest (>=4.6)"]
[[package]]
name = "multidict"
version = "6.0.4"
@@ -4156,7 +3906,7 @@ name = "networkx"
version = "2.8.8"
description = "Python package for creating and manipulating graphs and networks"
category = "main"
-optional = false
+optional = true
python-versions = ">=3.8"
files = [
{file = "networkx-2.8.8-py3-none-any.whl", hash = "sha256:e435dfa75b1d7195c7b8378c3859f0445cd88c6b0375c181ed66823a9ceb7524"},
@@ -4190,7 +3940,7 @@ name = "nltk"
version = "3.8.1"
description = "Natural Language Toolkit"
category = "main"
-optional = false
+optional = true
python-versions = ">=3.7"
files = [
{file = "nltk-3.8.1-py3-none-any.whl", hash = "sha256:fd5c9109f976fa86bcadba8f91e47f5e9293bd034474752e92a520f81c93dda5"},
@@ -4368,7 +4118,7 @@ name = "nvidia-cublas-cu11"
version = "11.10.3.66"
description = "CUBLAS native runtime libraries"
category = "main"
-optional = false
+optional = true
python-versions = ">=3"
files = [
{file = "nvidia_cublas_cu11-11.10.3.66-py3-none-manylinux1_x86_64.whl", hash = "sha256:d32e4d75f94ddfb93ea0a5dda08389bcc65d8916a25cb9f37ac89edaeed3bded"},
@@ -4379,28 +4129,12 @@ files = [
setuptools = "*"
wheel = "*"
[[package]]
name = "nvidia-cuda-cupti-cu11"
version = "11.7.101"
description = "CUDA profiling tools runtime libs."
category = "main"
optional = false
python-versions = ">=3"
files = [
{file = "nvidia_cuda_cupti_cu11-11.7.101-py3-none-manylinux1_x86_64.whl", hash = "sha256:e0cfd9854e1f2edaa36ca20d21cd0bdd5dcfca4e3b9e130a082e05b33b6c5895"},
{file = "nvidia_cuda_cupti_cu11-11.7.101-py3-none-win_amd64.whl", hash = "sha256:7cc5b8f91ae5e1389c3c0ad8866b3b016a175e827ea8f162a672990a402ab2b0"},
]
[package.dependencies]
setuptools = "*"
wheel = "*"
[[package]]
name = "nvidia-cuda-nvrtc-cu11"
version = "11.7.99"
description = "NVRTC native runtime libraries"
category = "main"
-optional = false
+optional = true
python-versions = ">=3"
files = [
{file = "nvidia_cuda_nvrtc_cu11-11.7.99-2-py3-none-manylinux1_x86_64.whl", hash = "sha256:9f1562822ea264b7e34ed5930567e89242d266448e936b85bc97a3370feabb03"},
@@ -4417,7 +4151,7 @@ name = "nvidia-cuda-runtime-cu11"
version = "11.7.99"
description = "CUDA Runtime native Libraries"
category = "main"
-optional = false
+optional = true
python-versions = ">=3"
files = [
{file = "nvidia_cuda_runtime_cu11-11.7.99-py3-none-manylinux1_x86_64.whl", hash = "sha256:cc768314ae58d2641f07eac350f40f99dcb35719c4faff4bc458a7cd2b119e31"},
@@ -4433,7 +4167,7 @@ name = "nvidia-cudnn-cu11"
version = "8.5.0.96"
description = "cuDNN runtime libraries"
category = "main"
-optional = false
+optional = true
python-versions = ">=3"
files = [
{file = "nvidia_cudnn_cu11-8.5.0.96-2-py3-none-manylinux1_x86_64.whl", hash = "sha256:402f40adfc6f418f9dae9ab402e773cfed9beae52333f6d86ae3107a1b9527e7"},
@@ -4444,94 +4178,6 @@ files = [
setuptools = "*"
wheel = "*"
[[package]]
name = "nvidia-cufft-cu11"
version = "10.9.0.58"
description = "CUFFT native runtime libraries"
category = "main"
optional = false
python-versions = ">=3"
files = [
{file = "nvidia_cufft_cu11-10.9.0.58-py3-none-manylinux1_x86_64.whl", hash = "sha256:222f9da70c80384632fd6035e4c3f16762d64ea7a843829cb278f98b3cb7dd81"},
{file = "nvidia_cufft_cu11-10.9.0.58-py3-none-win_amd64.whl", hash = "sha256:c4d316f17c745ec9c728e30409612eaf77a8404c3733cdf6c9c1569634d1ca03"},
]
[[package]]
name = "nvidia-curand-cu11"
version = "10.2.10.91"
description = "CURAND native runtime libraries"
category = "main"
optional = false
python-versions = ">=3"
files = [
{file = "nvidia_curand_cu11-10.2.10.91-py3-none-manylinux1_x86_64.whl", hash = "sha256:eecb269c970fa599a2660c9232fa46aaccbf90d9170b96c462e13bcb4d129e2c"},
{file = "nvidia_curand_cu11-10.2.10.91-py3-none-win_amd64.whl", hash = "sha256:f742052af0e1e75523bde18895a9ed016ecf1e5aa0ecddfcc3658fd11a1ff417"},
]
[package.dependencies]
setuptools = "*"
wheel = "*"
[[package]]
name = "nvidia-cusolver-cu11"
version = "11.4.0.1"
description = "CUDA solver native runtime libraries"
category = "main"
optional = false
python-versions = ">=3"
files = [
{file = "nvidia_cusolver_cu11-11.4.0.1-2-py3-none-manylinux1_x86_64.whl", hash = "sha256:72fa7261d755ed55c0074960df5904b65e2326f7adce364cbe4945063c1be412"},
{file = "nvidia_cusolver_cu11-11.4.0.1-py3-none-manylinux1_x86_64.whl", hash = "sha256:700b781bfefd57d161443aff9ace1878584b93e0b2cfef3d6e9296d96febbf99"},
{file = "nvidia_cusolver_cu11-11.4.0.1-py3-none-win_amd64.whl", hash = "sha256:00f70b256add65f8c1eb3b6a65308795a93e7740f6df9e273eccbba770d370c4"},
]
[package.dependencies]
setuptools = "*"
wheel = "*"
[[package]]
name = "nvidia-cusparse-cu11"
version = "11.7.4.91"
description = "CUSPARSE native runtime libraries"
category = "main"
optional = false
python-versions = ">=3"
files = [
{file = "nvidia_cusparse_cu11-11.7.4.91-py3-none-manylinux1_x86_64.whl", hash = "sha256:a3389de714db63321aa11fbec3919271f415ef19fda58aed7f2ede488c32733d"},
{file = "nvidia_cusparse_cu11-11.7.4.91-py3-none-win_amd64.whl", hash = "sha256:304a01599534f5186a8ed1c3756879282c72c118bc77dd890dc1ff868cad25b9"},
]
[package.dependencies]
setuptools = "*"
wheel = "*"
[[package]]
name = "nvidia-nccl-cu11"
version = "2.14.3"
description = "NVIDIA Collective Communication Library (NCCL) Runtime"
category = "main"
optional = false
python-versions = ">=3"
files = [
{file = "nvidia_nccl_cu11-2.14.3-py3-none-manylinux1_x86_64.whl", hash = "sha256:5e5534257d1284b8e825bc3a182c6f06acd6eb405e9f89d49340e98cd8f136eb"},
]
[[package]]
name = "nvidia-nvtx-cu11"
version = "11.7.91"
description = "NVIDIA Tools Extension"
category = "main"
optional = false
python-versions = ">=3"
files = [
{file = "nvidia_nvtx_cu11-11.7.91-py3-none-manylinux1_x86_64.whl", hash = "sha256:b22c64eee426a62fc00952b507d6d29cf62b4c9df7a480fcc417e540e05fd5ac"},
{file = "nvidia_nvtx_cu11-11.7.91-py3-none-win_amd64.whl", hash = "sha256:dfd7fcb2a91742513027d63a26b757f38dd8b07fecac282c4d132a9d373ff064"},
]
[package.dependencies]
setuptools = "*"
wheel = "*"
[[package]]
name = "oauthlib"
version = "3.2.2"
@@ -5267,30 +4913,6 @@ files = [
dev = ["pre-commit", "tox"]
testing = ["pytest", "pytest-benchmark"]
[[package]]
name = "posthog"
version = "2.4.2"
description = "Integrate PostHog into any python application."
category = "dev"
optional = false
python-versions = "*"
files = [
{file = "posthog-2.4.2-py2.py3-none-any.whl", hash = "sha256:8c7c37de997d955aea61bf0aa0069970e71f0d9d79c9e6b3a134e6593d5aa3d6"},
{file = "posthog-2.4.2.tar.gz", hash = "sha256:652a628623aab26597e8421a7ddf9caaf19dd93cc1288901a6b23db9693d34e5"},
]
[package.dependencies]
backoff = ">=1.10.0"
monotonic = ">=1.5"
python-dateutil = ">2.1"
requests = ">=2.7,<3.0"
six = ">=1.5"
[package.extras]
dev = ["black", "flake8", "flake8-print", "isort", "pre-commit"]
sentry = ["django", "sentry-sdk"]
test = ["coverage", "flake8", "freezegun (==0.3.15)", "mock (>=2.0.0)", "pylint", "pytest"]
[[package]]
name = "pox"
version = "0.3.2"
@@ -5930,17 +5552,18 @@ files = [
[[package]]
name = "pytest"
-version = "7.3.0"
+version = "7.2.2"
description = "pytest: simple powerful testing with Python"
category = "dev"
optional = false
python-versions = ">=3.7"
files = [
-{file = "pytest-7.3.0-py3-none-any.whl", hash = "sha256:933051fa1bfbd38a21e73c3960cebdad4cf59483ddba7696c48509727e17f201"},
-{file = "pytest-7.3.0.tar.gz", hash = "sha256:58ecc27ebf0ea643ebfdf7fb1249335da761a00c9f955bcd922349bcb68ee57d"},
+{file = "pytest-7.2.2-py3-none-any.whl", hash = "sha256:130328f552dcfac0b1cec75c12e3f005619dc5f874f0a06e8ff7263f0ee6225e"},
+{file = "pytest-7.2.2.tar.gz", hash = "sha256:c99ab0c73aceb050f68929bc93af19ab6db0558791c6a0715723abe9d0ade9d4"},
]
[package.dependencies]
+attrs = ">=19.2.0"
colorama = {version = "*", markers = "sys_platform == \"win32\""}
exceptiongroup = {version = ">=1.0.0rc8", markers = "python_version < \"3.11\""}
iniconfig = "*"
@@ -5949,7 +5572,7 @@ pluggy = ">=0.12,<2.0"
tomli = {version = ">=1.0.0", markers = "python_version < \"3.11\""}
[package.extras]
-testing = ["argcomplete", "attrs (>=19.2.0)", "hypothesis (>=3.56)", "mock", "nose", "pygments (>=2.7.2)", "requests", "xmlschema"]
+testing = ["argcomplete", "hypothesis (>=3.56)", "mock", "nose", "pygments (>=2.7.2)", "requests", "xmlschema"]
[[package]]
name = "pytest-asyncio"
@@ -6309,14 +5932,14 @@ cffi = {version = "*", markers = "implementation_name == \"pypy\""}
[[package]]
name = "qdrant-client"
-version = "1.1.3"
+version = "1.1.2"
description = "Client library for the Qdrant vector search engine"
category = "main"
optional = true
python-versions = ">=3.7,<3.12"
files = [
-{file = "qdrant_client-1.1.3-py3-none-any.whl", hash = "sha256:c95f59fb9e3e89d163517f8992ee4557eccb45c252147e11e45c608ef1c7dd29"},
-{file = "qdrant_client-1.1.3.tar.gz", hash = "sha256:2b7de2b987fc456c643a06878a4150947c3d3d6c6515f6c29f6e707788daa6e7"},
+{file = "qdrant_client-1.1.2-py3-none-any.whl", hash = "sha256:e722aa76af3d4db1f52bc49857f1e2398cbb89afafac1a7b7f21eda424c72faf"},
+{file = "qdrant_client-1.1.2.tar.gz", hash = "sha256:708b7a6291dfeeeaa8c5ac2e61a0f73b61fa66b45e31a568e3f406ab08393100"},
]
[package.dependencies]
@@ -6655,7 +6278,7 @@ name = "scikit-learn"
version = "1.2.2"
description = "A set of python modules for machine learning and data mining"
category = "main"
-optional = false
+optional = true
python-versions = ">=3.8"
files = [
{file = "scikit-learn-1.2.2.tar.gz", hash = "sha256:8429aea30ec24e7a8c7ed8a3fa6213adf3814a6efbea09e16e0a0c71e1a1a3d7"},
@@ -6698,7 +6321,7 @@ name = "scipy"
version = "1.9.3"
description = "Fundamental algorithms for scientific computing in Python"
category = "main"
-optional = false
+optional = true
python-versions = ">=3.8"
files = [
{file = "scipy-1.9.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:1884b66a54887e21addf9c16fb588720a8309a57b2e258ae1c7986d4444d3bc0"},
@@ -6754,7 +6377,7 @@ name = "sentence-transformers"
version = "2.2.2"
description = "Multilingual text embeddings"
category = "main"
-optional = false
+optional = true
python-versions = ">=3.6.0"
files = [
{file = "sentence-transformers-2.2.2.tar.gz", hash = "sha256:dbc60163b27de21076c9a30d24b5b7b6fa05141d68cf2553fa9a77bf79a29136"},
@@ -6777,7 +6400,7 @@ name = "sentencepiece"
version = "0.1.97"
description = "SentencePiece python wrapper"
category = "main"
-optional = false
+optional = true
python-versions = "*"
files = [
{file = "sentencepiece-0.1.97-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:6f249c8f1852893be86eae66b19d522c5fb30bbad4fe2d1b07f06fdc86e1907e"},
@@ -7433,7 +7056,7 @@ name = "starlette"
version = "0.26.1"
description = "The little ASGI library that shines."
category = "main"
-optional = false
+optional = true
python-versions = ">=3.7"
files = [
{file = "starlette-0.26.1-py3-none-any.whl", hash = "sha256:e87fce5d7cbdde34b76f0ac69013fd9d190d581d80681493016666e6f96c6d5e"},
@@ -7447,21 +7070,6 @@ typing-extensions = {version = ">=3.10.0", markers = "python_version < \"3.10\""
[package.extras]
full = ["httpx (>=0.22.0)", "itsdangerous", "jinja2", "python-multipart", "pyyaml"]
[[package]]
name = "sympy"
version = "1.11.1"
description = "Computer algebra system (CAS) in Python"
category = "main"
optional = false
python-versions = ">=3.8"
files = [
{file = "sympy-1.11.1-py3-none-any.whl", hash = "sha256:938f984ee2b1e8eae8a07b884c8b7a1146010040fccddc6539c54f401c8f6fcf"},
{file = "sympy-1.11.1.tar.gz", hash = "sha256:e32380dce63cb7c0108ed525570092fd45168bdae2faa17e528221ef72e88658"},
]
[package.dependencies]
mpmath = ">=0.19"
[[package]]
name = "tabulate"
version = "0.9.0"
@@ -7830,7 +7438,7 @@ name = "threadpoolctl"
version = "3.1.0"
description = "threadpoolctl"
category = "main"
-optional = false
+optional = true
python-versions = ">=3.6"
files = [
{file = "threadpoolctl-3.1.0-py3-none-any.whl", hash = "sha256:8b99adda265feb6773280df41eece7b2e6561b772d21ffd52e372f999024907b"},
@@ -7842,7 +7450,7 @@ name = "tiktoken"
version = "0.3.3"
description = "tiktoken is a fast BPE tokeniser for use with OpenAI's models"
category = "main"
-optional = false
+optional = true
python-versions = ">=3.8"
files = [
{file = "tiktoken-0.3.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d1f37fa75ba70c1bc7806641e8ccea1fba667d23e6341a1591ea333914c226a9"},
@@ -7983,55 +7591,40 @@ files = [
[[package]]
name = "torch"
-version = "2.0.0"
+version = "1.13.1"
description = "Tensors and Dynamic neural networks in Python with strong GPU acceleration"
category = "main"
-optional = false
-python-versions = ">=3.8.0"
+optional = true
+python-versions = ">=3.7.0"
files = [
{file = "torch-2.0.0-1-cp310-cp310-manylinux2014_aarch64.whl", hash = "sha256:c9090bda7d2eeeecd74f51b721420dbeb44f838d4536cc1b284e879417e3064a"},
{file = "torch-2.0.0-1-cp311-cp311-manylinux2014_aarch64.whl", hash = "sha256:bd42db2a48a20574d2c33489e120e9f32789c4dc13c514b0c44272972d14a2d7"},
{file = "torch-2.0.0-1-cp38-cp38-manylinux2014_aarch64.whl", hash = "sha256:8969aa8375bcbc0c2993e7ede0a7f889df9515f18b9b548433f412affed478d9"},
{file = "torch-2.0.0-1-cp39-cp39-manylinux2014_aarch64.whl", hash = "sha256:ab2da16567cb55b67ae39e32d520d68ec736191d88ac79526ca5874754c32203"},
{file = "torch-2.0.0-cp310-cp310-manylinux1_x86_64.whl", hash = "sha256:7a9319a67294ef02459a19738bbfa8727bb5307b822dadd708bc2ccf6c901aca"},
{file = "torch-2.0.0-cp310-cp310-manylinux2014_aarch64.whl", hash = "sha256:9f01fe1f6263f31bd04e1757946fd63ad531ae37f28bb2dbf66f5c826ee089f4"},
{file = "torch-2.0.0-cp310-cp310-win_amd64.whl", hash = "sha256:527f4ae68df7b8301ee6b1158ca56350282ea633686537b30dbb5d7b4a52622a"},
{file = "torch-2.0.0-cp310-none-macosx_10_9_x86_64.whl", hash = "sha256:ce9b5a49bd513dff7950a5a07d6e26594dd51989cee05ba388b03e8e366fd5d5"},
{file = "torch-2.0.0-cp310-none-macosx_11_0_arm64.whl", hash = "sha256:53e1c33c6896583cdb9a583693e22e99266444c4a43392dddc562640d39e542b"},
{file = "torch-2.0.0-cp311-cp311-manylinux1_x86_64.whl", hash = "sha256:09651bff72e439d004c991f15add0c397c66f98ab36fe60d5514b44e4da722e8"},
{file = "torch-2.0.0-cp311-cp311-manylinux2014_aarch64.whl", hash = "sha256:d439aec349c98f12819e8564b8c54008e4613dd4428582af0e6e14c24ca85870"},
{file = "torch-2.0.0-cp311-cp311-win_amd64.whl", hash = "sha256:2802f84f021907deee7e9470ed10c0e78af7457ac9a08a6cd7d55adef835fede"},
{file = "torch-2.0.0-cp311-none-macosx_10_9_x86_64.whl", hash = "sha256:01858620f25f25e7a9ec4b547ff38e5e27c92d38ec4ccba9cfbfb31d7071ed9c"},
{file = "torch-2.0.0-cp311-none-macosx_11_0_arm64.whl", hash = "sha256:9a2e53b5783ef5896a6af338b36d782f28e83c8ddfc2ac44b67b066d9d76f498"},
{file = "torch-2.0.0-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:ec5fff2447663e369682838ff0f82187b4d846057ef4d119a8dea7772a0b17dd"},
{file = "torch-2.0.0-cp38-cp38-manylinux2014_aarch64.whl", hash = "sha256:11b0384fe3c18c01b8fc5992e70fc519cde65e44c51cc87be1838c1803daf42f"},
{file = "torch-2.0.0-cp38-cp38-win_amd64.whl", hash = "sha256:e54846aa63855298cfb1195487f032e413e7ac9cbfa978fda32354cc39551475"},
{file = "torch-2.0.0-cp38-none-macosx_10_9_x86_64.whl", hash = "sha256:cc788cbbbbc6eb4c90e52c550efd067586c2693092cf367c135b34893a64ae78"},
{file = "torch-2.0.0-cp38-none-macosx_11_0_arm64.whl", hash = "sha256:d292640f0fd72b7a31b2a6e3b635eb5065fcbedd4478f9cad1a1e7a9ec861d35"},
{file = "torch-2.0.0-cp39-cp39-manylinux1_x86_64.whl", hash = "sha256:6befaad784004b7af357e3d87fa0863c1f642866291f12a4c2af2de435e8ac5c"},
{file = "torch-2.0.0-cp39-cp39-manylinux2014_aarch64.whl", hash = "sha256:a83b26bd6ae36fbf5fee3d56973d9816e2002e8a3b7d9205531167c28aaa38a7"},
{file = "torch-2.0.0-cp39-cp39-win_amd64.whl", hash = "sha256:c7e67195e1c3e33da53954b026e89a8e1ff3bc1aeb9eb32b677172d4a9b5dcbf"},
{file = "torch-2.0.0-cp39-none-macosx_10_9_x86_64.whl", hash = "sha256:6e0b97beb037a165669c312591f242382e9109a240e20054d5a5782d9236cad0"},
{file = "torch-2.0.0-cp39-none-macosx_11_0_arm64.whl", hash = "sha256:297a4919aff1c0f98a58ebe969200f71350a1d4d4f986dbfd60c02ffce780e99"},
{file = "torch-1.13.1-cp310-cp310-manylinux1_x86_64.whl", hash = "sha256:fd12043868a34a8da7d490bf6db66991108b00ffbeecb034228bfcbbd4197143"},
{file = "torch-1.13.1-cp310-cp310-manylinux2014_aarch64.whl", hash = "sha256:d9fe785d375f2e26a5d5eba5de91f89e6a3be5d11efb497e76705fdf93fa3c2e"},
{file = "torch-1.13.1-cp310-cp310-win_amd64.whl", hash = "sha256:98124598cdff4c287dbf50f53fb455f0c1e3a88022b39648102957f3445e9b76"},
{file = "torch-1.13.1-cp310-none-macosx_10_9_x86_64.whl", hash = "sha256:393a6273c832e047581063fb74335ff50b4c566217019cc6ace318cd79eb0566"},
{file = "torch-1.13.1-cp310-none-macosx_11_0_arm64.whl", hash = "sha256:0122806b111b949d21fa1a5f9764d1fd2fcc4a47cb7f8ff914204fd4fc752ed5"},
{file = "torch-1.13.1-cp311-cp311-manylinux1_x86_64.whl", hash = "sha256:22128502fd8f5b25ac1cd849ecb64a418382ae81dd4ce2b5cebaa09ab15b0d9b"},
{file = "torch-1.13.1-cp37-cp37m-manylinux1_x86_64.whl", hash = "sha256:76024be052b659ac1304ab8475ab03ea0a12124c3e7626282c9c86798ac7bc11"},
{file = "torch-1.13.1-cp37-cp37m-manylinux2014_aarch64.whl", hash = "sha256:ea8dda84d796094eb8709df0fcd6b56dc20b58fdd6bc4e8d7109930dafc8e419"},
{file = "torch-1.13.1-cp37-cp37m-win_amd64.whl", hash = "sha256:2ee7b81e9c457252bddd7d3da66fb1f619a5d12c24d7074de91c4ddafb832c93"},
{file = "torch-1.13.1-cp37-none-macosx_10_9_x86_64.whl", hash = "sha256:0d9b8061048cfb78e675b9d2ea8503bfe30db43d583599ae8626b1263a0c1380"},
{file = "torch-1.13.1-cp37-none-macosx_11_0_arm64.whl", hash = "sha256:f402ca80b66e9fbd661ed4287d7553f7f3899d9ab54bf5c67faada1555abde28"},
{file = "torch-1.13.1-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:727dbf00e2cf858052364c0e2a496684b9cb5aa01dc8a8bc8bbb7c54502bdcdd"},
{file = "torch-1.13.1-cp38-cp38-manylinux2014_aarch64.whl", hash = "sha256:df8434b0695e9ceb8cc70650afc1310d8ba949e6db2a0525ddd9c3b2b181e5fe"},
{file = "torch-1.13.1-cp38-cp38-win_amd64.whl", hash = "sha256:5e1e722a41f52a3f26f0c4fcec227e02c6c42f7c094f32e49d4beef7d1e213ea"},
{file = "torch-1.13.1-cp38-none-macosx_10_9_x86_64.whl", hash = "sha256:33e67eea526e0bbb9151263e65417a9ef2d8fa53cbe628e87310060c9dcfa312"},
{file = "torch-1.13.1-cp38-none-macosx_11_0_arm64.whl", hash = "sha256:eeeb204d30fd40af6a2d80879b46a7efbe3cf43cdbeb8838dd4f3d126cc90b2b"},
{file = "torch-1.13.1-cp39-cp39-manylinux1_x86_64.whl", hash = "sha256:50ff5e76d70074f6653d191fe4f6a42fdbe0cf942fbe2a3af0b75eaa414ac038"},
{file = "torch-1.13.1-cp39-cp39-manylinux2014_aarch64.whl", hash = "sha256:2c3581a3fd81eb1f0f22997cddffea569fea53bafa372b2c0471db373b26aafc"},
{file = "torch-1.13.1-cp39-cp39-win_amd64.whl", hash = "sha256:0aa46f0ac95050c604bcf9ef71da9f1172e5037fdf2ebe051962d47b123848e7"},
{file = "torch-1.13.1-cp39-none-macosx_10_9_x86_64.whl", hash = "sha256:6930791efa8757cb6974af73d4996b6b50c592882a324b8fb0589c6a9ba2ddaf"},
{file = "torch-1.13.1-cp39-none-macosx_11_0_arm64.whl", hash = "sha256:e0df902a7c7dd6c795698532ee5970ce898672625635d885eade9976e5a04949"},
]
[package.dependencies]
filelock = "*"
jinja2 = "*"
networkx = "*"
nvidia-cublas-cu11 = {version = "11.10.3.66", markers = "platform_system == \"Linux\" and platform_machine == \"x86_64\""}
nvidia-cuda-cupti-cu11 = {version = "11.7.101", markers = "platform_system == \"Linux\" and platform_machine == \"x86_64\""}
nvidia-cuda-nvrtc-cu11 = {version = "11.7.99", markers = "platform_system == \"Linux\" and platform_machine == \"x86_64\""}
nvidia-cuda-runtime-cu11 = {version = "11.7.99", markers = "platform_system == \"Linux\" and platform_machine == \"x86_64\""}
nvidia-cudnn-cu11 = {version = "8.5.0.96", markers = "platform_system == \"Linux\" and platform_machine == \"x86_64\""}
nvidia-cufft-cu11 = {version = "10.9.0.58", markers = "platform_system == \"Linux\" and platform_machine == \"x86_64\""}
nvidia-curand-cu11 = {version = "10.2.10.91", markers = "platform_system == \"Linux\" and platform_machine == \"x86_64\""}
nvidia-cusolver-cu11 = {version = "11.4.0.1", markers = "platform_system == \"Linux\" and platform_machine == \"x86_64\""}
nvidia-cusparse-cu11 = {version = "11.7.4.91", markers = "platform_system == \"Linux\" and platform_machine == \"x86_64\""}
nvidia-nccl-cu11 = {version = "2.14.3", markers = "platform_system == \"Linux\" and platform_machine == \"x86_64\""}
nvidia-nvtx-cu11 = {version = "11.7.91", markers = "platform_system == \"Linux\" and platform_machine == \"x86_64\""}
sympy = "*"
triton = {version = "2.0.0", markers = "platform_system == \"Linux\" and platform_machine == \"x86_64\""}
nvidia-cublas-cu11 = {version = "11.10.3.66", markers = "platform_system == \"Linux\""}
nvidia-cuda-nvrtc-cu11 = {version = "11.7.99", markers = "platform_system == \"Linux\""}
nvidia-cuda-runtime-cu11 = {version = "11.7.99", markers = "platform_system == \"Linux\""}
nvidia-cudnn-cu11 = {version = "8.5.0.96", markers = "platform_system == \"Linux\""}
typing-extensions = "*"
[package.extras]
@@ -8039,39 +7632,39 @@ opt-einsum = ["opt-einsum (>=3.3)"]
[[package]]
name = "torchvision"
version = "0.15.1"
version = "0.14.1"
description = "image and video datasets and models for torch deep learning"
category = "main"
optional = false
python-versions = ">=3.8"
optional = true
python-versions = ">=3.7"
files = [
{file = "torchvision-0.15.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:bc10d48e9a60d006d0c1b48dea87f1ec9b63d856737d592f7c5c44cd87f3f4b7"},
{file = "torchvision-0.15.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:3708d3410fdcaf6280e358cda9de2a4ab06cc0b4c0fd9aeeac550ec2563a887e"},
{file = "torchvision-0.15.1-cp310-cp310-manylinux1_x86_64.whl", hash = "sha256:d4de10c837f1493c1c54344388e300a06c96914c6cc55fcb2527c21f2f010bbd"},
{file = "torchvision-0.15.1-cp310-cp310-manylinux2014_aarch64.whl", hash = "sha256:b82fcc5abc9b5c96495c76596a1573025cc1e09d97d2d6fda717c44b9ca45881"},
{file = "torchvision-0.15.1-cp310-cp310-win_amd64.whl", hash = "sha256:c84e97d8cc4fe167d87adad0a2a6424cff90544365545b20669bc50e6ea46875"},
{file = "torchvision-0.15.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:97b90eb3b7333a31d049c4ccfd1064361e8491874959d38f466af64d67418cef"},
{file = "torchvision-0.15.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:6b60e1c839ae2a071befbba69b17468d67feafdf576e90ff9645bfbee998de17"},
{file = "torchvision-0.15.1-cp311-cp311-manylinux1_x86_64.whl", hash = "sha256:13f71a3372d9168b01481a754ebaa171207f3dc455bf2fd86906c69222443738"},
{file = "torchvision-0.15.1-cp311-cp311-manylinux2014_aarch64.whl", hash = "sha256:b2e8394726009090b40f6cc3a95cc878cc011dfac3d8e7a6060c79213d360880"},
{file = "torchvision-0.15.1-cp311-cp311-win_amd64.whl", hash = "sha256:2852f501189483187ce9eb0ccd01b3f4f0918d29057e4a18b3cce8dad9a8a964"},
{file = "torchvision-0.15.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:e5861baaeea87d19b6fd7d131e11a4a6bd17be14234c490a259bb360775e9520"},
{file = "torchvision-0.15.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:e714f362b9d8217cf4d68509b679ebc9ddf128cfe80f6c1def8e3f8a18466e75"},
{file = "torchvision-0.15.1-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:43624accad1e47f16824be4db37ad678dd89326ad90b69c9c6363eeb22b9467e"},
{file = "torchvision-0.15.1-cp38-cp38-manylinux2014_aarch64.whl", hash = "sha256:7fe9b0cd3311b0db9e6d45ffab594ced06418fa4e2aa15eb2e60d55e5c51135c"},
{file = "torchvision-0.15.1-cp38-cp38-win_amd64.whl", hash = "sha256:b45324ea4911a23a4b00b5a15cdbe36d47f93137206dab9f8c606d81b69dd3a7"},
{file = "torchvision-0.15.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:1dfdec7c7df967330bba3341a781e0c047d4e0163e67164a9918500362bf7d91"},
{file = "torchvision-0.15.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:c153710186cec0338d4fff411459a57ddbc8504436123ca73b3f0bdc26ff918c"},
{file = "torchvision-0.15.1-cp39-cp39-manylinux1_x86_64.whl", hash = "sha256:ff4e650aa601f32ab97bce06704868dd2baad69ca4d454fa1f0012a51199f2bc"},
{file = "torchvision-0.15.1-cp39-cp39-manylinux2014_aarch64.whl", hash = "sha256:e9b4bb2a15849391df0415d2f76dd36e6528e4253f7b69322b7a0d682535544b"},
{file = "torchvision-0.15.1-cp39-cp39-win_amd64.whl", hash = "sha256:21e6beb69e77ef6575c4fdd0ab332b96e8a7f144eee0d333acff469c827a4b5e"},
{file = "torchvision-0.14.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:eeb05dd9dd3af5428fee525400759daf8da8e4caec45ddd6908cfb36571f6433"},
{file = "torchvision-0.14.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:8d0766ea92affa7af248e327dd85f7c9cfdf51a57530b43212d4e1858548e9d7"},
{file = "torchvision-0.14.1-cp310-cp310-manylinux1_x86_64.whl", hash = "sha256:6d7b35653113664ea3fdcb71f515cfbf29d2fe393000fd8aaff27a1284de6908"},
{file = "torchvision-0.14.1-cp310-cp310-manylinux2014_aarch64.whl", hash = "sha256:8a9eb773a2fa8f516e404ac09c059fb14e6882c48fdbb9c946327d2ce5dba6cd"},
{file = "torchvision-0.14.1-cp310-cp310-win_amd64.whl", hash = "sha256:13986f0c15377ff23039e1401012ccb6ecf71024ce53def27139e4eac5a57592"},
{file = "torchvision-0.14.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:fb7a793fd33ce1abec24b42778419a3fb1e3159d7dfcb274a3ca8fb8cbc408dc"},
{file = "torchvision-0.14.1-cp37-cp37m-manylinux1_x86_64.whl", hash = "sha256:89fb0419780ec9a9eb9f7856a0149f6ac9f956b28f44b0c0080c6b5b48044db7"},
{file = "torchvision-0.14.1-cp37-cp37m-manylinux2014_aarch64.whl", hash = "sha256:a2d4237d3c9705d7729eb4534e4eb06f1d6be7ff1df391204dfb51586d9b0ecb"},
{file = "torchvision-0.14.1-cp37-cp37m-win_amd64.whl", hash = "sha256:92a324712a87957443cc34223274298ae9496853f115c252f8fc02b931f2340e"},
{file = "torchvision-0.14.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:68ed03359dcd3da9cd21b8ab94da21158df8a6a0c5bad0bf4a42f0e448d28cb3"},
{file = "torchvision-0.14.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:30fcf0e9fe57d4ac4ce6426659a57dce199637ccb6c70be1128670f177692624"},
{file = "torchvision-0.14.1-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:0ed02aefd09bf1114d35f1aa7dce55aa61c2c7e57f9aa02dce362860be654e85"},
{file = "torchvision-0.14.1-cp38-cp38-manylinux2014_aarch64.whl", hash = "sha256:a541e49fc3c4e90e49e6988428ab047415ed52ea97d0c0bfd147d8bacb8f4df8"},
{file = "torchvision-0.14.1-cp38-cp38-win_amd64.whl", hash = "sha256:6099b3191dc2516099a32ae38a5fb349b42e863872a13545ab1a524b6567be60"},
{file = "torchvision-0.14.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:c5e744f56e5f5b452deb5fc0f3f2ba4d2f00612d14d8da0dbefea8f09ac7690b"},
{file = "torchvision-0.14.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:758b20d079e810b4740bd60d1eb16e49da830e3360f9be379eb177ee221fa5d4"},
{file = "torchvision-0.14.1-cp39-cp39-manylinux1_x86_64.whl", hash = "sha256:83045507ef8d3c015d4df6be79491375b2f901352cfca6e72b4723e9c4f9a55d"},
{file = "torchvision-0.14.1-cp39-cp39-manylinux2014_aarch64.whl", hash = "sha256:eaed58cf454323ed9222d4e0dd5fb897064f454b400696e03a5200e65d3a1e76"},
{file = "torchvision-0.14.1-cp39-cp39-win_amd64.whl", hash = "sha256:b337e1245ca4353623dd563c03cd8f020c2496a7c5d12bba4d2e381999c766e0"},
]
[package.dependencies]
numpy = "*"
pillow = ">=5.3.0,<8.3.0 || >=8.4.0"
requests = "*"
torch = "2.0.0"
torch = "1.13.1"
typing-extensions = "*"
[package.extras]
scipy = ["scipy"]
@@ -8202,44 +7795,6 @@ torchhub = ["filelock", "huggingface-hub (>=0.11.0,<1.0)", "importlib-metadata",
video = ["av (==9.2.0)", "decord (==0.6.0)"]
vision = ["Pillow"]
[[package]]
name = "triton"
version = "2.0.0"
description = "A language and compiler for custom Deep Learning operations"
category = "main"
optional = false
python-versions = "*"
files = [
{file = "triton-2.0.0-1-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:38806ee9663f4b0f7cd64790e96c579374089e58f49aac4a6608121aa55e2505"},
{file = "triton-2.0.0-1-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:226941c7b8595219ddef59a1fdb821e8c744289a132415ddd584facedeb475b1"},
{file = "triton-2.0.0-1-cp36-cp36m-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:4c9fc8c89874bc48eb7e7b2107a9b8d2c0bf139778637be5bfccb09191685cfd"},
{file = "triton-2.0.0-1-cp37-cp37m-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:d2684b6a60b9f174f447f36f933e9a45f31db96cb723723ecd2dcfd1c57b778b"},
{file = "triton-2.0.0-1-cp38-cp38-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:9d4978298b74fcf59a75fe71e535c092b023088933b2f1df933ec32615e4beef"},
{file = "triton-2.0.0-1-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:74f118c12b437fb2ca25e1a04759173b517582fcf4c7be11913316c764213656"},
{file = "triton-2.0.0-1-pp37-pypy37_pp73-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:9618815a8da1d9157514f08f855d9e9ff92e329cd81c0305003eb9ec25cc5add"},
{file = "triton-2.0.0-1-pp38-pypy38_pp73-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:1aca3303629cd3136375b82cb9921727f804e47ebee27b2677fef23005c3851a"},
{file = "triton-2.0.0-1-pp39-pypy39_pp73-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:e3e13aa8b527c9b642e3a9defcc0fbd8ffbe1c80d8ac8c15a01692478dc64d8a"},
{file = "triton-2.0.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8f05a7e64e4ca0565535e3d5d3405d7e49f9d308505bb7773d21fb26a4c008c2"},
{file = "triton-2.0.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bb4b99ca3c6844066e516658541d876c28a5f6e3a852286bbc97ad57134827fd"},
{file = "triton-2.0.0-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:47b4d70dc92fb40af553b4460492c31dc7d3a114a979ffb7a5cdedb7eb546c08"},
{file = "triton-2.0.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fedce6a381901b1547e0e7e1f2546e4f65dca6d91e2d8a7305a2d1f5551895be"},
{file = "triton-2.0.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:75834f27926eab6c7f00ce73aaf1ab5bfb9bec6eb57ab7c0bfc0a23fac803b4c"},
{file = "triton-2.0.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0117722f8c2b579cd429e0bee80f7731ae05f63fe8e9414acd9a679885fcbf42"},
{file = "triton-2.0.0-pp37-pypy37_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bcd9be5d0c2e45d2b7e6ddc6da20112b6862d69741576f9c3dbaf941d745ecae"},
{file = "triton-2.0.0-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:42a0d2c3fc2eab4ba71384f2e785fbfd47aa41ae05fa58bf12cb31dcbd0aeceb"},
{file = "triton-2.0.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:52c47b72c72693198163ece9d90a721299e4fb3b8e24fd13141e384ad952724f"},
]
[package.dependencies]
cmake = "*"
filelock = "*"
lit = "*"
torch = "*"
[package.extras]
tests = ["autopep8", "flake8", "isort", "numpy", "pytest", "scipy (>=1.7.1)"]
tutorials = ["matplotlib", "pandas", "tabulate"]
[[package]]
name = "typer"
version = "0.7.0"
@@ -8432,7 +7987,7 @@ name = "uvicorn"
version = "0.21.1"
description = "The lightning-fast ASGI server."
category = "main"
optional = false
optional = true
python-versions = ">=3.7"
files = [
{file = "uvicorn-0.21.1-py3-none-any.whl", hash = "sha256:e47cac98a6da10cd41e6fd036d472c6f58ede6c5dbee3dbee3ef7a100ed97742"},
@@ -8458,7 +8013,7 @@ name = "uvloop"
version = "0.17.0"
description = "Fast implementation of asyncio event loop on top of libuv"
category = "main"
optional = false
optional = true
python-versions = ">=3.7"
files = [
{file = "uvloop-0.17.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:ce9f61938d7155f79d3cb2ffa663147d4a76d16e08f65e2c66b77bd41b356718"},
@@ -8593,7 +8148,7 @@ name = "watchfiles"
version = "0.19.0"
description = "Simple, modern and high performance file watching and code reload in python."
category = "main"
optional = false
optional = true
python-versions = ">=3.7"
files = [
{file = "watchfiles-0.19.0-cp37-abi3-macosx_10_7_x86_64.whl", hash = "sha256:91633e64712df3051ca454ca7d1b976baf842d7a3640b87622b323c55f3345e7"},
@@ -8703,7 +8258,7 @@ name = "websockets"
version = "11.0.1"
description = "An implementation of the WebSocket Protocol (RFC 6455 & 7692)"
category = "main"
optional = false
optional = true
python-versions = ">=3.7"
files = [
{file = "websockets-11.0.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:3d30cc1a90bcbf9e22e1f667c1c5a7428e2d37362288b4ebfd5118eb0b11afa9"},
@@ -8801,7 +8356,7 @@ name = "wheel"
version = "0.40.0"
description = "A built-package format for Python"
category = "main"
optional = false
optional = true
python-versions = ">=3.7"
files = [
{file = "wheel-0.40.0-py3-none-any.whl", hash = "sha256:d236b20e7cb522daf2390fa84c55eea81c5c30190f90f29ae2ca1ad8355bf247"},
@@ -9090,73 +8645,6 @@ files = [
docs = ["furo", "jaraco.packaging (>=9)", "jaraco.tidelift (>=1.4)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-lint"]
testing = ["big-O", "flake8 (<5)", "jaraco.functools", "jaraco.itertools", "more-itertools", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=1.3)", "pytest-flake8", "pytest-mypy (>=0.9.1)"]
[[package]]
name = "zstandard"
version = "0.20.0"
description = "Zstandard bindings for Python"
category = "dev"
optional = false
python-versions = ">=3.6"
files = [
{file = "zstandard-0.20.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:c4efa051799703dc37c072e22af1f0e4c77069a78fb37caf70e26414c738ca1d"},
{file = "zstandard-0.20.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:f847701d77371d90783c0ce6cfdb7ebde4053882c2aaba7255c70ae3c3eb7af0"},
{file = "zstandard-0.20.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0aa4d178560d7ee32092ddfd415c2cdc6ab5ddce9554985c75f1a019a0ff4c55"},
{file = "zstandard-0.20.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0488f2a238b4560828b3a595f3337daac4d3725c2a1637ffe2a0d187c091da59"},
{file = "zstandard-0.20.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:cd0aa9a043c38901925ae1bba49e1e638f2d9c3cdf1b8000868993c642deb7f2"},
{file = "zstandard-0.20.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:cdd769da7add8498658d881ce0eeb4c35ea1baac62e24c5a030c50f859f29724"},
{file = "zstandard-0.20.0-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:9aea3c7bab4276212e5ac63d28e6bd72a79ff058d57e06926dfe30a52451d943"},
{file = "zstandard-0.20.0-cp310-cp310-win32.whl", hash = "sha256:0d213353d58ad37fb5070314b156fb983b4d680ed5f3fce76ab013484cf3cf12"},
{file = "zstandard-0.20.0-cp310-cp310-win_amd64.whl", hash = "sha256:d08459f7f7748398a6cc65eb7f88aa7ef5731097be2ddfba544be4b558acd900"},
{file = "zstandard-0.20.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:c1929afea64da48ec59eca9055d7ec7e5955801489ac40ac2a19dde19e7edad9"},
{file = "zstandard-0.20.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:b6d718f1b7cd30adb02c2a46dde0f25a84a9de8865126e0fff7d0162332d6b92"},
{file = "zstandard-0.20.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5499d65d4a1978dccf0a9c2c0d12415e16d4995ffad7a0bc4f72cc66691cf9f2"},
{file = "zstandard-0.20.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:302a31400de0280f17c4ce67a73444a7a069f228db64048e4ce555cd0c02fbc4"},
{file = "zstandard-0.20.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:39ae788dcdc404c07ef7aac9b11925185ea0831b985db0bbc43f95acdbd1c2ce"},
{file = "zstandard-0.20.0-cp311-cp311-win32.whl", hash = "sha256:e3f6887d2bdfb5752d5544860bd6b778e53ebfaf4ab6c3f9d7fd388445429d41"},
{file = "zstandard-0.20.0-cp311-cp311-win_amd64.whl", hash = "sha256:4abf9a9e0841b844736d1ae8ead2b583d2cd212815eab15391b702bde17477a7"},
{file = "zstandard-0.20.0-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:dc47cec184e66953f635254e5381df8a22012a2308168c069230b1a95079ccd0"},
{file = "zstandard-0.20.0-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:84c1dae0c0a21eea245b5691286fe6470dc797d5e86e0c26b57a3afd1e750b48"},
{file = "zstandard-0.20.0-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:059316f07e39b7214cd9eed565d26ab239035d2c76835deeff381995f7a27ba8"},
{file = "zstandard-0.20.0-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:9aca916724d0802d3e70dc68adeff893efece01dffe7252ee3ae0053f1f1990f"},
{file = "zstandard-0.20.0-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b07f391fd85e3d07514c05fb40c5573b398d0063ab2bada6eb09949ec6004772"},
{file = "zstandard-0.20.0-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:2adf65cfce73ce94ef4c482f6cc01f08ddf5e1ca0c1ec95f2b63840f9e4c226c"},
{file = "zstandard-0.20.0-cp36-cp36m-win32.whl", hash = "sha256:ee2a1510e06dfc7706ea9afad363efe222818a1eafa59abc32d9bbcd8465fba7"},
{file = "zstandard-0.20.0-cp36-cp36m-win_amd64.whl", hash = "sha256:29699746fae2760d3963a4ffb603968e77da55150ee0a3326c0569f4e35f319f"},
{file = "zstandard-0.20.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:78fb35d07423f25efd0fc90d0d4710ae83cfc86443a32192b0c6cb8475ec79a5"},
{file = "zstandard-0.20.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:40466adfa071f58bfa448d90f9623d6aff67c6d86de6fc60be47a26388f6c74d"},
{file = "zstandard-0.20.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba86f931bf925e9561ccd6cb978acb163e38c425990927feb38be10c894fa937"},
{file = "zstandard-0.20.0-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:b671b75ae88139b1dd022fa4aa66ba419abd66f98869af55a342cb9257a1831e"},
{file = "zstandard-0.20.0-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:cc98c8bcaa07150d3f5d7c4bd264eaa4fdd4a4dfb8fd3f9d62565ae5c4aba227"},
{file = "zstandard-0.20.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:0b815dec62e2d5a1bf7a373388f2616f21a27047b9b999de328bca7462033708"},
{file = "zstandard-0.20.0-cp37-cp37m-win32.whl", hash = "sha256:5a3578b182c21b8af3c49619eb4cd0b9127fa60791e621b34217d65209722002"},
{file = "zstandard-0.20.0-cp37-cp37m-win_amd64.whl", hash = "sha256:f1ba6bbd28ad926d130f0af8016f3a2930baa013c2128cfff46ca76432f50669"},
{file = "zstandard-0.20.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:b0f556c74c6f0f481b61d917e48c341cdfbb80cc3391511345aed4ce6fb52fdc"},
{file = "zstandard-0.20.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:862ad0a5c94670f2bd6f64fff671bd2045af5f4ed428a3f2f69fa5e52483f86a"},
{file = "zstandard-0.20.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a56036c08645aa6041d435a50103428f0682effdc67f5038de47cea5e4221d6f"},
{file = "zstandard-0.20.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4af5d1891eebef430038ea4981957d31b1eb70aca14b906660c3ac1c3e7a8612"},
{file = "zstandard-0.20.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:489959e2d52f7f1fe8ea275fecde6911d454df465265bf3ec51b3e755e769a5e"},
{file = "zstandard-0.20.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7041efe3a93d0975d2ad16451720932e8a3d164be8521bfd0873b27ac917b77a"},
{file = "zstandard-0.20.0-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:c28c7441638c472bfb794f424bd560a22c7afce764cd99196e8d70fbc4d14e85"},
{file = "zstandard-0.20.0-cp38-cp38-win32.whl", hash = "sha256:ba4bb4c5a0cac802ff485fa1e57f7763df5efa0ad4ee10c2693ecc5a018d2c1a"},
{file = "zstandard-0.20.0-cp38-cp38-win_amd64.whl", hash = "sha256:a5efe366bf0545a1a5a917787659b445ba16442ae4093f102204f42a9da1ecbc"},
{file = "zstandard-0.20.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:79c3058ccbe1fa37356a73c9d3c0475ec935ab528f5b76d56fc002a5a23407c7"},
{file = "zstandard-0.20.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:39cbaf8fe3fa3515d35fb790465db4dc1ff45e58e1e00cbaf8b714e85437f039"},
{file = "zstandard-0.20.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f199d58f3fd7dfa0d447bc255ff22571f2e4e5e5748bfd1c41370454723cb053"},
{file = "zstandard-0.20.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0f32a8f3a697ef87e67c0d0c0673b245babee6682b2c95e46eb30208ffb720bd"},
{file = "zstandard-0.20.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:4a3c36284c219a4d2694e52b2582fe5d5f0ecaf94a22cf0ea959b527dbd8a2a6"},
{file = "zstandard-0.20.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2eeb9e1ecd48ac1d352608bfe0dc1ed78a397698035a1796cf72f0c9d905d219"},
{file = "zstandard-0.20.0-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:6179808ebd1ebc42b1e2f221a23c28a22d3bc8f79209ae4a3cc114693c380bff"},
{file = "zstandard-0.20.0-cp39-cp39-win32.whl", hash = "sha256:afbcd2ed0c1145e24dd3df8440a429688a1614b83424bc871371b176bed429f9"},
{file = "zstandard-0.20.0-cp39-cp39-win_amd64.whl", hash = "sha256:e6b4de1ba2f3028fafa0d82222d1e91b729334c8d65fbf04290c65c09d7457e1"},
{file = "zstandard-0.20.0.tar.gz", hash = "sha256:613daadd72c71b1488742cafb2c3b381c39d0c9bb8c6cc157aa2d5ea45cc2efc"},
]
[package.dependencies]
cffi = {version = ">=1.11", markers = "platform_python_implementation == \"PyPy\""}
[package.extras]
cffi = ["cffi (>=1.11)"]
[extras]
all = ["aleph-alpha-client", "anthropic", "beautifulsoup4", "cohere", "deeplake", "elasticsearch", "faiss-cpu", "google-api-python-client", "google-search-results", "huggingface_hub", "jina", "jinja2", "manifest-ml", "networkx", "nlpcloud", "nltk", "nomic", "openai", "opensearch-py", "pgvector", "pinecone-client", "psycopg2-binary", "pyowm", "pypdf", "qdrant-client", "redis", "sentence-transformers", "spacy", "tensorflow-text", "tiktoken", "torch", "transformers", "weaviate-client", "wikipedia", "wolframalpha"]
cohere = ["cohere"]
@@ -9167,4 +8655,4 @@ qdrant = ["qdrant-client"]
[metadata]
lock-version = "2.0"
python-versions = ">=3.8.1,<4.0"
content-hash = "a8fde2558f92b4c5ec1dce45f830adc6158dd2cf8c425a34a06523ee8e74487d"
content-hash = "56e8666167102cc23b605c8d91b26a62c0858216637cc281b866315da766ad71"


@@ -1,6 +1,6 @@
[tool.poetry]
name = "langchain"
version = "0.0.137"
version = "0.0.136"
description = "Building applications with LLMs through composability"
authors = []
license = "MIT"
@@ -28,7 +28,7 @@ spacy = {version = "^3", optional = true}
nltk = {version = "^3", optional = true}
transformers = {version = "^4", optional = true}
beautifulsoup4 = {version = "^4", optional = true}
torch = {version = "^2", optional = true}
torch = {version = "^1", optional = true}
jinja2 = {version = "^3", optional = true}
tiktoken = {version = "^0.3.2", optional = true, python="^3.9"}
pinecone-client = {version = "^2", optional = true}
@@ -75,7 +75,7 @@ linkchecker = "^10.2.1"
sphinx-copybutton = "^0.5.1"
[tool.poetry.group.test.dependencies]
pytest = "^7.3.0"
pytest = "^7.2.0"
pytest-cov = "^4.0.0"
pytest-dotenv = "^0.5.2"
duckdb-engine = "^0.7.0"
@@ -88,19 +88,16 @@ pytest-asyncio = "^0.20.3"
optional = true
[tool.poetry.group.test_integration.dependencies]
pytest-vcr = "^1.0.2"
wrapt = "^1.15.0"
openai = "^0.27.4"
elasticsearch = {extras = ["async"], version = "^8.6.2"}
pytest-vcr = "^1.0.2"
wrapt = "^1.15.0"
redis = "^4.5.4"
pinecone-client = "^2.2.1"
pgvector = "^0.1.6"
transformers = "^4.27.4"
pandas = "^2.0.0"
deeplake = "^3.2.21"
torch = "^2.0.0"
chromadb = "^0.3.21"
tiktoken = "^0.3.3"
[tool.poetry.group.lint.dependencies]
ruff = "^0.0.249"


@@ -27,7 +27,7 @@ Any new dependencies should be added by running:
```bash
# add package and install it after adding:
poetry add tiktoken@latest --group "test_integration" && poetry install --with test_integration
poetry add deeplake --group "test_integration" && poetry install --with test_integration
```
Before running any tests, you should start a specific Docker container that has all the
@@ -55,12 +55,4 @@ new cassettes. Here's an example:
```bash
pytest tests/integration_tests/vectorstores/test_elasticsearch.py --vcr-record=none
```
### Run some tests with coverage:
```bash
pytest tests/integration_tests/vectorstores/test_elasticsearch.py --cov=langchain --cov-report=html
start "" htmlcov/index.html || open htmlcov/index.html
```
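The cassette record/replay workflow used by the commands above can be sketched with pytest-vcr's marker. This is a minimal illustration under the assumption that `pytest-vcr` and `requests` are installed; the test name and URL are hypothetical, not from this repo:

```python
import pytest
import requests


# Hypothetical example: pytest-vcr's `vcr` marker records the HTTP exchange
# to a cassette file on the first run and replays it on later runs, so the
# test can pass without network access.
@pytest.mark.vcr()
def test_example_api() -> None:
    response = requests.get("https://httpbin.org/get")
    assert response.status_code == 200
```

Running with `--vcr-record=none`, as shown above, makes requests that have no matching cassette fail instead of recording, which is how tests are verified against existing recordings.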


@@ -1,20 +0,0 @@
from langchain.document_loaders.bilibili import BiliBiliLoader
def test_bilibili_loader() -> None:
"""Test Bilibili Loader."""
loader = BiliBiliLoader(
[
"https://www.bilibili.com/video/BV1xt411o7Xu/",
"https://www.bilibili.com/video/av330407025/",
]
)
docs = loader.load()
assert len(docs) == 2
assert len(docs[0].page_content) > 0
assert docs[1].metadata["owner"]["mid"] == 398095160
assert docs[1].page_content == ""


@@ -1,40 +1,31 @@
interactions:
- request:
body: '{"input": [[2059, 7341, 527, 264, 1912, 315, 658, 10753, 677, 81, 3581,
7795, 32971, 555, 264, 7558, 321, 351, 61798, 30535, 11, 4330, 311, 8254, 342,
484, 1776, 1220, 389, 279, 11314, 315, 279, 2010, 11, 323, 281, 1279, 278, 66079,
430, 527, 539, 75754, 311, 279, 2010, 13, 18766, 61535, 527, 21771, 2949, 279,
1206, 1037, 24082, 613, 318, 269, 4055, 320, 269, 24082, 613, 3893, 8, 323,
527, 279, 13219, 1912, 311, 279, 426, 4428, 42877, 320, 66243, 323, 24890, 570,
4427, 8336, 13334, 279, 4751, 330, 939, 847, 1, 439, 459, 42887, 5699, 2737,
69918, 3697, 315, 921, 2159, 14172, 339, 9891, 320, 11707, 321, 351, 61798,
7795, 8, 449, 264, 44892, 12970, 79612, 11, 1778, 439, 6409, 65, 86815, 82,
323, 53265, 582, 276, 17323, 13, 61536, 12970, 523, 2159, 14172, 27520, 598,
1778, 439, 2493, 5670, 301, 1815, 323, 25227, 3205, 355, 1176, 9922, 304, 279,
60434, 1122, 26572, 320, 19391, 12, 19192, 11583, 705, 3582, 1063, 31376, 1534,
523, 2159, 14172, 339, 8503, 12970, 29505, 527, 439, 2362, 439, 279, 36931,
31137, 869, 12734, 320, 21209, 12, 14870, 11583, 570, 578, 24417, 6617, 61535,
320, 9697, 613, 5493, 8, 527, 3967, 505, 279, 23591, 84474, 11, 922, 220, 1049,
11583, 13], [2059, 7341, 2134, 304, 1404, 505, 279, 2678, 50561, 74265, 939,
847, 320, 36, 14046, 2985, 46109, 281, 5515, 72, 705, 264, 5655, 9581, 9606,
430, 374, 1193, 220, 1114, 2960, 86366, 417, 320, 21, 13, 22, 304, 8, 304, 3160,
11, 311, 279, 51119, 44892, 320, 73262, 2910, 77152, 3666, 355, 705, 279, 7928,
7795, 304, 279, 1917, 11, 902, 25501, 13489, 220, 717, 37356, 320, 1272, 10702,
8, 304, 3160, 13, 2435, 527, 1766, 304, 682, 52840, 323, 527, 4279, 311, 43957,
709, 311, 220, 17, 11, 931, 37356, 320, 21, 11, 5067, 10702, 570, 2435, 8965,
656, 539, 3974, 304, 80744, 11, 8051, 1070, 527, 264, 2478, 3967, 20157, 11,
1778, 439, 279, 17231, 44892, 323, 279, 15140, 44892, 11, 902, 649, 387, 1766,
304, 2225, 67329, 977, 323, 80744, 8032, 18, 60, 71923, 617, 264, 18702, 315,
2761, 14991, 294, 4351, 645, 430, 36236, 872, 6930, 505, 5674, 323, 79383, 304,
5369, 311, 18899, 872, 15962, 30295, 13, 2435, 617, 12387, 7437, 315, 8454,
481, 18311, 13, 220, 26778, 9606, 527, 72627, 56217, 11, 902, 527, 44304, 430,
527, 520, 279, 1948, 315, 872, 3691, 8957, 13, 8593, 10507, 2997, 279, 52835,
44892, 11, 6437, 44892, 11, 2294, 4251, 44892, 11, 296, 29886, 44892, 11, 270,
86524, 44892, 11, 323, 24354, 2025, 44892, 13], [2059, 7341, 527, 10791, 555,
12966, 369, 44892, 13339, 477, 44892, 1913, 19724, 13, 9176, 44892, 22673, 527,
21699, 555, 3823, 7640, 13, 8876, 220, 4468, 15, 11, 44892, 22673, 617, 1027,
11293, 555, 220, 6028, 13689, 10213, 505, 927, 69, 11218, 13]], "encoding_format":
"base64"}'
body: '{"input": ["Sharks are a group of elasmobranch fish characterized by a
cartilaginous skeleton, five to seven gill slits on the sides of the head, and
pectoral fins that are not fused to the head. Modern sharks are classified within
the clade Selachimorpha (or Selachii) and are the sister group to the Batoidea
(rays and kin). Some sources extend the term \"shark\" as an informal category
including extinct members of Chondrichthyes (cartilaginous fish) with a shark-like
morphology, such as hybodonts and xenacanths. Shark-like chondrichthyans such
as Cladoselache and Doliodus first appeared in the Devonian Period (419-359
Ma), though some fossilized chondrichthyan-like scales are as old as the Late
Ordovician (458-444 Ma). The oldest modern sharks (selachians) are known from
the Early Jurassic, about 200 Ma.", "Sharks range in size from the small dwarf
lanternshark (Etmopterus perryi), a deep sea species that is only 17 centimetres
(6.7 in) in length, to the whale shark (Rhincodon typus), the largest fish in
the world, which reaches approximately 12 metres (40 ft) in length. They are
found in all seas and are common to depths up to 2,000 metres (6,600 ft). They
generally do not live in freshwater, although there are a few known exceptions,
such as the bull shark and the river shark, which can be found in both seawater
and freshwater.[3] Sharks have a covering of dermal denticles that protects
their skin from damage and parasites in addition to improving their fluid dynamics.
They have numerous sets of replaceable teeth.\n\nSeveral species are apex predators,
which are organisms that are at the top of their food chain. Select examples
include the tiger shark, blue shark, great white shark, mako shark, thresher
shark, and hammerhead shark.", "Sharks are caught by humans for shark meat or
shark fin soup. Many shark populations are threatened by human activities. Since
1970, shark populations have been reduced by 71%, mostly from overfishing."],
"encoding_format": "base64"}'
headers:
Accept:
- '*/*'
@@ -43,7 +34,7 @@ interactions:
Connection:
- keep-alive
Content-Length:
- '2675'
- '2001'
Content-Type:
- application/json
User-Agent:
@@ -156,230 +147,230 @@ interactions:
P/+czq0V1GhW1ZwGYymlI96ZZ5Rc2IW69iZm03vrvmHxU0QMOicfi2fMQ5BMYTS3yaWj18Mrgp2k
UKydxir98VD0401/801PpAfqfMfD52i0uqkJ5hjGyzGj2mq86zOZuDNsSZbhoDMQmn2kJxBGJaXa
sD0hpthaDZqVDTi8wgqxNLlwsOhpqiPrnpIgKc+/+RiJpTsw8jtPKK7ihijdcfTn/RquiMhtj3Wv
Fxl5ip784+/Yy/xUF47Fifwvr52r5XyTKGJtXcmcHsWO/fQ3TdoVEWf35Pe5XgioEt8T1q/erWNA
uysS5fsqErwLTUeFBw92OtJx5I8kJd+1cwY3zIufH0TjjfP3aDcx+78BAAD//6R9SY+DzpPlvT9F
63/1tFgMZNA3zGY2Z2LjVRqNwDvYxiyZkCn1dx/h+mmkkfo0cyy5qILIiBfvvQjbLKlXi2LiExge
heyS80coBZvqER6WkRA7vz1KEd0jGa7Be/6br7Tc6cIGyPygYlQmTczet+xoGvo1IthbDMVUH/Wv
/vDgnGwxfy1ha8jXrsfzo/YUbCVf3vCm6z2x5u+TQ6fnhaB7yQwHmz0asla4P/wnk35GY9CsNXO6
Pxa2ldLSvR67YKbRwIKZCUI4Y7SF0257J8HTbAr+zI9HY8HbC1ns1qdiaNPLDNVFs2KWLc2dLpHT
O3DLLhjWutk0n4uuUIv6zsjVTtvv7BJt4djPJYLNmmfdk982sFKSmHnEYU5f+TsbkvG8x0rBRzQM
n/kWatHcsbhWbqte9CWFM6cPPPzO8yo9a7TjxpElsXgi5ni5/PNziS0GRdBfPIt3uMXy+bwW/TSv
R7tlpND7dl63fMoP41+/rYD/+h//DxsFyn+/UVBZo0nC/jJmfaonshFG7p5tHSlDQsnmRzSvgpht
KkXENK7QG9G+mDo+3B3+DM0OnEVr4GyR9q3I4BwYwzZqCa4/ZcZLx3/DPnllxN9p54JLIaJgNouA
JJ4dCXGkZo1UyTWwrn0Ogn/Tbgun88ejr3sexYO7um1gPvAPsUAx4vr8aAyQi/7JlreqRv2ZbwFa
flvRwzZ/xqO7tSdGpT6YK+d9/JUO1wQZ+6+MRyFdULsN+RbN/G81MZBDJrL0e4Wt37msKPbgjKeW
unDwLJ+4uX93Pt7hdIUy0r7MNSgqxrGtrmg4PFdsJX98odh+Z0Ni+D5xD+EJfUFfqHAc6zO++w/m
9GmupUaIBpfWB+mR8VfavQGLrUbsYz4vh6MkdTDOnxP/8cdSbB0nQcdDqbCkSojg0tUCgHyBidcp
XjvOdTtBiz3bs5hmj4LZh3WHkH2OibNr67IdldaGji3feAzbTcztq+fCc7ZZYYONqGTICSp01klG
R+63LRUK26AQZybVevuUcRMuDbysUWc7V6kdOlS6D0c2a1k4y2rEvBDtoXcjnd7dlSXGkO8TeOXa
DQvb9+MBcr1GN0/5EGzfNUcQx3mbeNFGxDIebTvKbZdALXkqnS+rJhsMfsCA+lwmvlRFQu3SykdX
Ahk7kuqVsT4cKpCHQWFR9dmi8cgvGBnP84o4XvstxTr9VtDu5JGsz3lRir72IrDqIsIazWghdrC1
9V/8VkMSIVlaFSkA8++EUNTHg/3oGrBvQcCCTTZkbFMvLcA380CcubRF3c2nOfq4cUYiSheo3Sh9
gOjyfiDLueQU495vapReswvBZD84THZCH+ggCAs4bbJh9Wg143giHyxZySvrS/87wLhTDySBBGf0
fI336GSgNTVCrUW9oS8Aruc3I96bHovB1LELi+AUENfKh5KDk8gQ18ecxMOHCdHnHwvFs2wkPihW
PG+U9g4e4nvMT2kaD83W4TDTDhtiv8Eqx8PqVMOwDVqW5tA4fEalI0jpsGSR/MkENw4bLLWbG2FT
vWaC1SvbsG5RSuKH9hJjons++pDlDBu6zQQfriHAR8cli5RLUgyY2TYYweFNlXeG0GCsigYF2nGi
v9xqeZlWFNm5KzO83A+tIIejjE7lPmI2kZCgWLfecCz3mA5vfyhH07/bJt7ZHYlXl0jML6FOYf+y
A7peHpxWbGpiwVdPCrIWkol442AXMX5FxNn4r7jLH62NIO4tqh7zdzzU9SmAfDdHFI1aJcbSf3BI
mbki171nlJWfmykwHS+Iu6wOxXBllgFrz7pTuGgLNBxyiJDSF5j5y5Bng6rjNzhouJK42J8dMTiL
LcTwkohlpvuWpSGvAPvdlfhIWmbzmxNd/34/fF3WQtiOewXOlgkJ1eybjXM9wnBp0IIFh6pCYum4
LnzC5EOOpWRlQ1Wvc2jv44U55zxzWo2ip8E8+cFin1oFayG/w+NTrqhYpH3J00pgQFtzRryt8nEe
122QQrONPWK30gkNSs4jcI+5wexTmjpf+XAF9DGMHfvhE5srTQOsL2o8j8MejVLbngFJfkWsTimc
EedaDk7jU7Z6IrWsK53I8GyMlkVkvJT9XY+pMWe+ylbfyz3jOU8p0P7UsYUPS+eH10h9qQssdm3Q
qtfQBPjexYusOFJK/gwVCnVXOeTEs1Bw1t4x7HY2ZstWqooWOcEb5IErmB/zQyseShWApfZHbOq2
LYS5IgMUSWUS4pFrybdt4xrKS/Xo6PB7zK6O/UZPPdH/4qeyUFZhvJzuLGCVU6jisKHwNowTC+Lw
O9X7s4bD56Gz+DPOW24odAD33QfMkZW+bTVYN2b1xSNz2uqUDYpfGZCoXUW8m9SL4ZTLtvmMjJ5F
6wtFXZSbNoiifxNfhCSrXWXkYMYdsJOudaLjkgmoYLMLbpRLl3GUtj4ot6WFtTL8IP6BQwRipxKq
m7ZA/ejXA5JSvpzqscuqNJcbgxd9Q/xZthPdfBtpsMHrHfFvUlj88AloKtZ4fs0yMUS5aRmdG5gs
OeyPMXulNYYLm+3IolYOMUsPxw4hfl0S68OfzvDRl29QL6eI6qPWF2zI6QBx4yYsPIRu0fmK3sCG
ZhlxtHztqNUh64B0H8ZwaC9Q6ynUNqZ4k2Rth0JW/E6D/YkwLD3v63LYb1cBRNkhIQGmHRqjVf5E
gXPY48dZOcZMy6QA7HaoSLzPnuKX76YROUdmgZLHo7fKDWjC5MSsXauUYy/pgLCzBxIvbJKNj61l
m8t2wHQ+2mbGnml9hnnbb4m7aF9lV9WnHJzIHcjlfTec2mVRDVmj98TeKrLTr7eLQH/kxkjsRYud
7od/g6de8OyNNvFoXr0zWqsUk9j9cNGTbbgBpT9hOlOas/OVr9hFNzdWKNI+pKgvW7c2T+GqpKql
VDHXYWdA/bwVLLKpK6b8qo2Xp6xYko1yyXBNNsgwFluyb0OGety+VX2m7TbYuI3n8pVflzIsDF9l
0/9HI/bLAB3XzZVYG4jaTtnG9q9+2LKs6mwo/HcD+2vmkGhDHTG0q22N0L4dqOGM13YwdLw3jp5t
U948KkdYTlKh6f6weRiHeDhL0hmF8RDQbxmWxYAeNUerd3cnN9s7lOyVsycwS6wo6xstZtE1wH/9
w9HTsJ2XSmeBanGG6yys0fiVOAa0OYfExv63pMnVveoTf8TyXYlaqjsrH3bXbMmm/oDmE78xz917
TpxASVt+4EeANjc8Yp/SIaaeHmI4haQky0MVFpRc3QSF2SHF850tFb0lAfzh25RvQlydqNIm/KNQ
f1okhNJoet1oa+LNKyZo39Z7MC9591e/6o6vffQ54dff9b94Q7GoNyR60icSc6WpUZL7wY9/OJ3t
HF0Qq9ORqp5mZn2xWufmmcGJJcu97ozyYQ9wXj5eBIfNzRHT+BP01XHG4vESFPzd3lX98XmsyMpA
ijOeJeQiK/eBwmd/cIaI2RuImPkhFju8Ylal9y1kj5qzUM3CYjxV6gytyu2M+F5qxbL6+Aagbq43
PJzaXcz7a4RROS8dRjaIOIJSmDb4niPB8zFs1ec1moFXH2WCP2PUDme2sFGyp28WR5kjeu2w75B9
PNlkcQWvpbtrLINxHwpyvY2xM674BWDNpJAt0MGK1VdavSFxdi5LcoTKXk/ZGyZ+RRHaGw57Q67B
DaovSb6ahcaNnnQ/PsqI6m3jfs4L96cfsEnGtOwlSaM/fCZLn47opz/Q1I/xd2UfMhGFfIb2n4eJ
X+8MiS6ssw2c1k1OmUdmJfd57iOhqCuyWtmeoC7PE0A3/0275n51Ro9fz4jevCsWqzZ0xAJ2R2Rr
hYVn8diVg8iNFL6NtmLR6UIzql4XlcEL1rB4bdNCiNodkGINglzrj5Pxx6GY/fgnI2L/jblyyCvQ
UI8ZVsitHG7SHMPL0C7M9v3G6ePt4omc43FGZ+0ex92XBZHhXM4bduqSc/Hjt2jG3CM5vOxZRoe0
VYEOI6HZ+3GPR+Hfa/TbYAyWkl3078NhBnQbCOYdqlEIlFML6ZLfYWhH4YgdNTU99iyDhE61QMN7
m9jo5s0/2GD7m8MSpX+CevMBC0naZiNrqwr053WN511ulUqquzIckhchh4t2yBh3Fn/8n0VOuCjG
Ld+doe+LAX/vyaUQ7mFdwS7XX3S457eWQyjANCS3xkoW7qb6aiPwiCmR+HIJUbdRtAH98H9RK/P4
HfMThQReiLmHUEf8nPYYMr8LqA4JyoZlXeTGY1kSLBn3IRYOXDZGwswbS6YNWX5ovzIo22BHtX3G
EHf5JpkmtDNmWcosHi/VjMLW2T+pvKzm2V+/GxsHSNiGT8Q37RObv/xz6pw7A9ratbmMh5SRJ6LO
6DzuKYB5pCyMq0fBC4f4KM91QlbOXomHxu9r2Cj2EevmJc36kF8xku3zl9mLlsbim382aOpX7JJ+
wkyUjo0hX9R74qCwQOPo1xyonnhsOm9HdOFAYe5GK4KfdzX+q6/v/Pllq8Yzy/ZeYxu4G5Rk/X4E
5Xi9kgr9+mfWhiBoWY0zM3EOLrNW7d4ZG/+Rwjx3qj9+1LmKPkDB4EKWolpm3JG0K0rHZkZRPOrl
ePSfT7P9JhnxTqkTD9bqdkd7NhMM14mP+D7tK2O6nmG072PxcRbw05/EWqWnUmj5O0Xrl3Vl589+
Ho/+asNNfinOLDknQ9brfr2BB5s5P/xovxd+lpE0cI+KQBnKXtMjFz5dtWSp4I94dPxWA/Dbjrld
vir55zpNcN3g86cX53PHUs0b3RQEpx8XzXvIc7Qfm4p5svKKhypHG/jsFIc521yUPK/42yTZVqWz
XXN1fvoU7V5WQvw6f8VjVakYPrl2IKc2fIrOenQD2mbbO3NFJRcj8uvgx3+YfwtTxNW0vSJdjhYs
cD8SYjM4bSBs3C3xPe1SCCUcXNDdwGP7WaYgHrSNCjQd1xgU+1MMrU72aNILVEFSLGS0jWpj/7IC
Zt3zWzna2ygybofygGda4qEh3q4MWDe6QkdJAtT7q81gPgkssK5m34L99AuzrxKxFqnR8q7St8iO
+RvzHBalutKDDj7D2E/npZdds40HUPj5gVUm1dk4SDpFFho6hp/3fTxakgzwes1jFtuZW4znaUI/
+TPEVtqoVIy2VeGTvEO2eD+Ctr9urY0ZqV3HvD3Vi/7bVoA8IkkkmPRRFyr6EUUq7ZgX5G3L9ofj
Hq3K/WzKVzUeEc9lJPpiT83cu5a8CNUEUp/uWRBlqejUVaaB8byucPe6uELZtXSP7opyI16UmYJ/
qrHT08XXZt6tGoue8+yK+ufFY0Fv69lYr453dO+qFxYfaSe4xfMr+JFbMV+ErBhofRygVWRGFT19
xOO37WZwYMBIGvm87XJl2Pz6PbNPbVIKEQ4Yej0JyfJxydHw609fT0V00zzcf/DjStMbRVP8ZLF1
IihPyZzFit2iEV29N9y/K4eUR4W1vEkrjPrlvSBbV+naVsvmAYTHY0Uu34tVCO8aVFB/7gwrBwln
gwf1gGLDj1h0/uwy7lbgAt9cR7Z8Z6IQgxPu0YQv5MbvaTls/U8E1TcZiXeQcKEeDucK3fDmQqLx
kogxybUnXE9kRSV+H8rWCdHZuIWrkFgbn8eDkesDBEHeUPUmfYtB30YcWkNzyNIJcdG9/ZeBForz
xPNeU7P+9/yb70oiS0myUeez6I6C47EmGQlnSKSHI4VeDhQWZGEgxAzWG3OQvO1Uj54jomuQQLyn
DZaK8RPzW8UT5Bd8S/DOvovekHQfGn5ZU6RobqaOoUyhvYsLSS4Xr5jjmqTgXq6YubEksn7iAwZs
rt7EF15ouCqDbUrM3bLsFl6QsA5rFa5jbbPDOX+0bOkkLhKNIxNc7nE57B/lBmnvL7DgctkjodXJ
EYQcFMTNqVmMKwlcWPTXHOuTPuQ8bTuEtTzBcpf3Lf+E5t7Y7GlCPC91S6XzvwGS5ehA7PzhOWJN
1Qgqb76e6qlued/et798IovJDxxEfczRlG+US6GEhtF/BSCnw/zv9R+ew5z42oS3aTbKh+vs56cQ
N1ESZzQlTYaFc5AZeTS603xDJUGfz1MwPHyeqKPSLIHmJSsMXxFvu+zRPsEbW8o8Oa/LMVudNLQV
2yu7rGwPDQULIyDX9YP5d+UZM+qEV/TzZxfl4dG2dtq8oX/ePIK5d2zFTHkGgMycM3fXuvHYt10H
E97gckalou/1SIaZ31aYkX1cstnjrEEmzSSWrLSnGNJ6wxFYw46Fk98wNryYweRf4GbiJ1M8Ixgu
RcWiOLTQ/OjYM9RIS0oSFent+MOLSW/h9nsJkHE6nM+wotmXJFO/ovUhO5uT/iE427NS5IfUhZF4
G+JpSlXynB87qPXkxhLt0yDu6Qtskmyv0tEAp1Vsfu3AEYf6509lIrhaW6ifl4IczMsRTf28Qk/7
MqfItONM2FQKUHo8+sTWlHXJadoCchu/Yc495yX9+QeNYSzxzEZyS4/OcotWikWpNOm1QfBbh7Zv
ajO3kHjGA0nzje/8/iXnSV+M920QgX48NCQUo9QOj/rUQHaTFszu29jpu8qkP/z+6el2qgcLGvM0
aUD+cJjuBB0sd4uIRLrNEINspsF9c5PYqk7iYnCZ3cC7q2K2mPx7/nb8LdzDlceiGbWQrObcQv0J
x8w101fcUgqR8euXE39weiIh+NUvNrwLQcyk6B+/ZC0OViusw0k2Tus6xwyNnjP3rsEbnpHWs+SN
jD++CYuW18yZV1ox6QEXLVdnG4/q41OORBIzWKzOZ5ZRv3SGSl/KP/+ZLDRl7wjuhBjMfBGxpLnP
/vAG/fi3OfG7xjnswPx6MqK3BzmXfS6JxKylpYpFp2BH2UnorWmIYfznl5ePJkdZua/I4tKmLV/q
C9c8r+uETf22HeNcq+Hv5w+3HfXePhKY+DAuomxqJ4cTRsudE7HAzsZibj52FtKGYfJvtUCodXun
8DA0RiLPzgqRXAMfpUxaUUHCmRiIoltIc6OQeYmybPnrGnVI9uSYnPrLtMHMC0DawEsWL7QPGoJH
l8Jh3Twn//NTTP3Dhwnv8GfSYyJKny4cXpbL3LJSstpldg2/+YQ01WdFWGgYq5tZMLeURsRmyjOC
5fh9s2le0vKtQyrj8Cl1Yodt3/Jb+93CV1py8uPTrILMNj+D6Kd8aothufUjSD27IhEbUUvx4YKN
ZX3saN3bekH3FVfN27qxqPrSUDb629CAw8t28ai0zBlz/5n/zWuwNH4dsUibDgz7SqgRfFTxN3+Y
ET8n1iXV2mHhpAncvmRJrAbCWNlvrQEN/Donblntsi9u3zJCce+yWM0sMS9CXYXkZcn0eauuxXfF
LzP405dzSUY0uCYUbMPnE/7cxTjNtyDb2XvchXZT8AlfwFudI7Zl0iC+kqR1cKfpgUTrCxbdxLfh
u43DPz+EPw4ZwJxfTyS4VW42yI/HxnzI4ZdZtp/Fc+LEFeDjEZPVHh1abl4XLkz1j+efsEVdsLrZ
yNZOFlmutWNGr2nvg6S2Z2zs7E0xGPyG0evwNFn0zPyCp7p1hXW57Yj1aLfl5B8f0VhFL/rZ2beM
1tX4Rv3nefrhecne4ehDuu+2LOLZq+hYfWwgyN2ctj//vtFjH0ay3NDTaMuCbw5bMCZ+wJZMKgWF
UMxgObZv+ury8Ke/GjhE6MKWVl6W48V/RqbUOBZeqXc97lTdr5AlRZg5lzRu2S3UK8Pcmupv/iOU
af73wxM2zVNaxWzLN2yY5BG3ynE85Y8M404+MIdJu0wkIZdh62yfFH3GQzsoW8fSp/kMixZa9099
JcHxSDypYoWQ4WQgocgrNvXbUqhUNKjjl5BESZIXvV7JKizrvMPqqBnFaPlVAHNraMlm8usnPFNB
f57XU7+OS/mjkzc65Sij2qdyxDjTF9R8MHDYKvdm7Z8fNfGnqb+UZT3xK5heJ5u9H8bjh2/xr3/R
VtGqTEhOsDc2Y6OSJPmsEEuulgvAr850vu+C82sow96zInaWxnVZVyx4QlznOYl7uysqLpkz5ChO
TiXDy0rxhU394zMM3z8lGk452BDuHJtgLXkh/qgMF8a7cGj/nGzIfZ0f4fuSdRKpdJGpTVolQHXs
ESKNbsnl64KaP3+WBIkrxAkyjqb57M8vicd9JSVw3imCRBd7RMPE983zNf2yMM/aQuxqPMCRQD/x
XSXj7Gq7cJil5l/+D3UubDT1J+LbdF8MWq5raGFej3/X04yfEgRbyWDW2Iq2d3I9gurwWBDsjL0z
hI8u+PEj4s+lD3p3aeeDI3Y1hYcdCLYNjT1MeEDwq7nEYuqHaNLzdJj0wVDW6zuaqd87+c2HhzSH
GoHhJJN/zLK+u/oVNIa2JKcyLLOxWqUcXlARtuo+y4xnoUqNCA0W2xGJomGzXR3/fzYK1P9+o4C2
ZY0lKz61YrFq7tCvI4eF8j0shkQSA1rYzZF5aWyVipt/cxgLkbDoKi1bnu96CxwyA3z5BiMSlpyp
UNRBTK7dnJbjcz1gk+7IjM672EdKuLz75mH2NYkvyw9nsMu6Qlx3NCyh581pWW9Z5hz0lrnRPS2E
/NFmYMc2EHevfbI+zw45kMtO0BMhXUtDiyWwutor+p0GaZ1YfWpYQVMy73OLCwFrqqGLsZ/j6qm/
yrY9HW0Im02CTbGCsuvneYDu74wzl+VbR8BzPJvl2AcY7dVLK9yi28OZLRMqV8tlq+zn5QbYQ08Y
3iWRkO+Hmw97422TlUA4ZvLT5qDH1pl5j+SWDbBN7ug83AcS6ZelGJxxo0JzitdUr2+tM8jvWQq1
UiFi15koqePUR7gmO4359blxGEehimZ1c8Wwxo+YL9dNAt9Ye7ETdb1WOMapArgnC1y5SBGN8OkT
3KSOyKYFo6yjLOnATZoIT/GfEPiIwUXzGVU3ygZ1Rh9XEN6+GlsZNCr77/N2hfaT2uy0iy1H3I8n
Dvm+UulMJ6+WqeslBxv5CjVG0MWQLCXNmCebLQW2q4vhtav3KFdWCbHKq1UO5LSxzVm3tEns6G3R
+Z6twfBWEjpnco/ufcRrFJzjgmAuHkI8pbxGx9Nmzch42WZDMI4zI6d7hwWe2xU8aDQZTdthdKTz
pxguh+Ud0GaZEpsTXyj+Mr1DeQCdhMWnbIW9e7+R/BokFvlmGwsN1YBovXSxdL3b8eig6gke7t3f
eZfjfjRdNJiRSlbhmSCx2lcyPLhkYcRMt+C+VtzBalCEh8/nGLdh0FpwuUQuCVjZlD2KFI5mjbtg
2+k8+Nk42hDflJI2vIkLrsxGgFAKMnKRIMk48eFuKPy8YokydrEwYXqPW/bOyGK/3IpGW3oNrM8z
i+El0Z2R3u8dst/JpGDWUFA+aAH61aMLm6IdeHfWoNyXe6wGJ1J0xXO7hU6ZrclCWHkxLrB51ttZ
HLFotk0FjUVyReZ9fsRANn1JLfP5lt45ubL4lUktTWuSoMvTToibeG3MsmqWgF/SGWXdO87kA3Jk
YIXtsGUXvwWPZ7UMGLs+O/jJse3eZf5Gp8w9UXBv62KKrw+XQClIYK2cQnka8yOkiYyZvx3fGUfL
S4I8zFw6P1FX8L0+vecN+/5Kxw/kjFqQbtFO2h+wfCB3h30kfQBrTBRyQ89bPG5tLQJbbzfEn7N9
y41WqyEFbUnwYxU6InvWW2APlFCduq+2x9BwSMajTnaKVxVsuV91MNseFKzbVlIOldqeUbInSzoq
56vTqztko0Uvh8zX3pKg32FxNcth/cIP9+iWw5M8K+QdLxlZ2aHj8MNCsVGWb15Y3Y5+Ju9OegSP
c7sji62eOQ1aXjAYJ9qSZdVoYjgEagT9gz2Ys7UyMQbq7gyGbFzI8sDXGdvE7K0vfB+YFd5T0cmn
xELIzBbM3tYE8dVHatCriV8s9nfMaZaxbCFBdk9iseAdD9/jiuuCHJ5kIWiLhrTjV0B1bjAyxY8f
yWdj7G95ypZmec1Yb1MVFWv1Qc307cXq8X23TOItArIYvErwff3ew9X2PebdlJcQATtvIafowWJv
ts9474xP8PjqycK9HcY8nK1TdHm+GiqSfulwrNsYNLUXxArnYaueDOsMRZa6ZPvUvXJevDoNOfSz
IeHT+WTD4J1tRErXYsS8btpxbd/fphuOLsGL5tr+8g+droJRcaSvbMyz2xHWh5VMpSlfx7HEAdLU
YEe8tB1EE71yjJKTu2IXJUxjWV3NMTzyeovVFgdxr9j5G9RyWVGjF3YxyFcjgQ1qCJUsvXVo7y+P
YN6VI2ZWfCr7AVb4r16jlS45QwMah9l2p9DZq9ezYZaktplm7IDfwYlkn+pdvP/wXeXaquWZQBFo
Z6li8bHQMv6Zm4B2ZHPF/U5+C16v3QaW3fL5h6cjqzuO9LDS6WwR4YLuctlF1+SgMeINctzvO6wh
+XLOaNtdtqibJUcLwmS1xcg6IdSdF+vE3KwdIJ4IU9RrdhOgJ4xLFmXXU8kbbfVEt6fY0X453zn1
a/a5o3hDGzLliyMmPNCX7d1lC+V8jfutrQUQKPuEIn4KYuZQEwPTsEpFNIpSmBC40F6X6wlfbu0w
pEMOodSPZEX3ohzK0rEAhUPFErarM1HpYMFskCI8Wzw+8UhDBsaiV0Oq50+9oI22uoNQep8khJti
3O3Tt1me8jNb6g6OBzZ/dGAn59uE15ozFHtphq5dMZKVhHPRVu/sbU54R7zi2jvj+7nUYA6oxeMC
7mLsnsyGd3kkJEVhldFlDBYIO/niwQuPDkufx+GXL3SmM0WI4tUZv3oh2D+msYj2lw1026tCpnoo
urjLtrC+H18kuq/Ccnxo3yeqC7XDUhMty3Z3WXSm5rcGlZLkgIbeDPaA0Mai0ugX7WC3XfOLL3NR
wYsmJzMVTh+4k2z8NILth2cAzd24YXE/RqifxastjK7msi3BC8Tl9vSEND625FgNOBt2DxODsPGX
zriXZty7JQY46v5OteJB42aTd09j89m9GZ5dlpnA5ovC97YLiWu5nei/vVCNFz1j+rGu77ZDyrKB
h60vmT/TYzTVCzZclN6Jf1t9xZB+kg2Ikzfi7/naocGRPB/wEC6IddzG7fz9LgNY+Wtr4n9DwU6C
vNH5OD4ZrkMzZvM6eMLVap54vu7f5TgWtm9uK79nQVidf3g+mMo3O5PQPnhZR9O7DKfv0SH57TYX
7Pja7aH2SpsF++bldJqW5tAbxpr4t8puh9jwqj88tqUldhSFcheEoB5b4AspVOHTOyqMOKH6/rQQ
40cPalSeTIPY79cik+N3HwFzuxPZd+IpmNimd7O9PWU8WxI9rj+zuY+Kq5TjsmJOOZwXpwQ1Cn9j
s7Drcjij58wAxQ1YHl9QNh6vXQLLzWGLTdp3BbWbEzVep/eSxfN0jMf0ezJQM09CghcRzVhxucvw
LtoFW672vRDM4BvIChowX28bh0p3dDSC+9WjyNyB00381GBaotLZvGnQ2KC3i+r5C6hCv/eWm3Tv
GmRpYKye1mrLz6rUoZ20PbBIevJWXC5aDplyPpBbvKNt/37ViX7A9pvhbo5btjtkd7ScQY3zg7Ip
hEVfNZKfSCe2kXqOeHnH8995hLaVtOp20Z9R0t1HRu6vVfzrZ/B+QMdWU3yEJQ0J9OFRIcv5Mm+H
9LF3Qc92AbPWDxL3WSvtkbNPP1i1WIjq5XX9RJqz2pPACTTUxTx8wqR3qClcz+F1GLx/+cwCr/Cc
8TQ3VDTxabKa8FZU8kpG6mvWkAlPHTre+gqUw6bA3H/6pXwecAWHD1eI45Vr8UWRyVGa9QcWXN66
M36fhzOcleOckU2NioHNv50hm/fkd77FQNO7CijkFZ0t/Uc5zr/ZFQIvzZiXf6qYn4zgDDTDJou2
pRtzg+LnL7+YnTk7pxq8s4U0pjZ4tM/reNQLjRs4KzvmPp+doLNlx9HE30mYyp+MngzralqNHjH3
6H+L4UKc849/EbIeN1kVG6s3hNnsSbXvcunIzXg+omyMKYm5zYrHft5u0MPjK0rj94hYUM5n0Otn
ylzTWjl1IiEO6wORCbl+9ZJb3E+gj9QjcZ2VU6qzLAzgWNp3gm9rpeweQ09BzblGlUz+tPRSNAPo
q8WFLFJ72ijf9BUcxofLrP1CKvj9cb+bM573ePCse0uneKBXb7V/erTbncYIpnpgqZBYPOy29gwc
2TCZneUrNOUvRdvnN8Kv826fie5xdo1s4Sn0fty2LV+sYx9mGT9gzVqVhTDq1EV0W9+x0R2kYrD0
5R6GrRyzw2alomFtdBSWnfekkvXkWX82tTeYwb2jQjMO7TDzH6opFOaTZE3Hgqr+zAWJxkDH3cct
5x9pHOAbGy/62qdtMSb24gwwiBux0FduOXslCZjPFFEz00Q2BurlCuZV3TNHc3A5xlBgyIouYOm3
bMuxPEFqaGq0o2a+ncdiHZ4oCHYqWGydij8+qXcLe42l+WnZ9lvPt81mvJgsnPSzMJoXoLOSz4mN
ky5m5UZEaFOLkMXecpV13vjMwXxFC8qlq4+E/R4DOFZnh+ULFmY81owGhU2asIU4iPi78McjEOfz
ohD2wuFd06jGvkqP+L4n93ZIb1YCjZtFzGkuu3bopX2C5DBsmOVUl4LDvVWR7/k1WR7vpTO8D2qC
dAnOzCuWcsHd05VCSdOS7Tg/xsP9cPARNDcZA+Em6hoJ16iRDw5dRuWYMRFAAPftZmQuoZt4IAE5
A94fObPrLCuH9zfg6OdfZMc5oOGn76WgOxCXGiVi7269hcvbyFgUPD0072V4QvoJEfEwMzNhNuYT
zrwcscRjT8iA91uknXOfBBdzjcbQXXLYnx8ROauBiwaa1jKyr7c71S5vPR7vt40BN+P+ZBtdT7Lh
gLs9uhTdg2GpDeM/vn231BWxyAu1QjirmfE1pRuL7qtvK953eQb3bTqSaHXCqL9ppxl6fvclble3
KhvLk7xB/JZ9KZo5q3KY6hV9zFFmAfAIKT4CHzXsjJh7vDzjP7y11/sPhQsj5fjpqw2Ui5aQZSlI
K9/DSgbRH3Osndng8MV+TKFI8xmxg8/Ydu3xsoWwlr8//6QUM41xRM58x0LjtXVqP9J948a2A4ss
7Z4Nl+3YgNO91xhsQy3HefnawCwbDsw/u1nc9vZbBfxV9mThLmIkHkJNIZSijIS02iGxVsYENYcg
Zev9UkZ8ttoArOb8zJz49orH8T5s0NLaZywa4SQmf0VG88ZoWDxzponEnKSIPOoLZp3pOvOPfMmN
gH0k/MC+WvLcTiqwg1hl/uLNET/VpobcaqZS9bHZilENrAHMqOzxWCPSsvaU2qiWwyNZGI+6EPUp
OEMpopbOG7NCYuL/oFblklhweBfDkGo5youzSVbfW1OOV95y+OmtlTJ/tCOJLj7arD4Hqrb0WIgf
X5xLw0jfk96VvXII0Ca9Xll8QB8kxgdKoM6nHUrNoe3wLZsApcO2I6daLcXE12q4bxRMwlpCGV3l
qyMapncMaM9tVIrFEvbocs4kRkylzWh3w+dfvFmky1arxDXnoOObRRbfTZBxpf8cUdTtfJJUzy4e
VA9FBsddTA7x+in69GI+f3qVniZ/5p6HHjXk6LUiyRovHNYYRWS06exIogX7FuI8hhvkF9KAeTa+
2z69KHfDjfGHaoNTOKOl0MGwNu8bCT/rbzy+/PKI7JvuEJeWbUyNdmjMzLofKIT7PBM3d9ybU/8l
1gTwk15LYFQeBVkVdtBysvC3qEhmJ+Lz6BwLgZvtjw+TuLvIQvhBdv3zIxNDWiGlTh6B6dwtlS27
9uCIj+FUMPUTspj6MaeXIIJrLa2YZwg1e/JhCEC91RKzjq2bDWtrA2CvDJuE0tl0+rQsKVjmkNL5
8e7EymluyLDEGqM8y3s0+NHomhFvRnxm3SJmp0HvwI4tINZnlqLx1O2v0EIUskkRi/m3u2tmf6M3
8uuPg34qI5S42GfLTcXaHx8BoniLf+53sddT0Hz3/POb4rETegV07zv4vYz7gs6yMILpPNnGMhNH
7IcmArPKgMSqrhXsFy+99XfM3+Z9yWp72aGvad6Yn+/vaMza+dZY2PWROJ4RtUOUJRRSMJZUmOsh
E8bzq8H0zlR2sUMnHq3+SdF9R7cMP6qF+HpisM0VIhaWvQ0tufR+vGHiI2Sl3ANB74WxN/jHnrOE
zkj585Ng8dhQ+qWuV3KveaSm+nz6eCiXH9Qr0O2N1Ly9qMGbOBuS+Jig1EoFC7/7rSNiJffhUjkr
Op/8rDFtP4ORSNKO2TfuIjnm4R0Ufl0R3NN5y4OX3UAbNT0e1toS6b/8chbmjARm6LT81/8vwbwg
Hn57YuJDEVJLr5r4po9G+25Z5vDtNnQjXd+IPxRrZt6Wtkdcb4Nb+Yb43Rhte0asg/8txYzcK5TW
lkUOL1Y6nM+aI6R4Tqj5UDxHSOXChqM9+9DZqgVB9e1cg/P0aVKSv0zQUEfZE9oqB9y8N9ts+nvv
P31WXkXqDFkf2XCY+Tc2PV856X38i9f0GWxnZyQu1dDE/8jPXx26N3nrdZAHLD7d7VI09m0Da7Uj
ZP3zmwbpNRh2cr1hAy+tTGkW5y3Et3lJ7HgIJvwbjnB8VCY2Jr0wxurLhouxnTPLfNUx74aT/fOP
//i06IOdBcci27BfPCd9e4dPdg1IKIqZ6J6zGyB3fd8QJ+k/jmCfk4p+em/ymxH/nr+WsXiklJDm
Md2PnmJzqm8MsGKIudl9gyZ9iVncfYpRLEcMk19CZUWdi9/zok2GbObcogCJBNcDWJ/rm/z0Mpdv
ZoRWYV/RFH235bg1F//0d7/FtdN594MMtJlFxA6Wi0xoYZbCebef4Xa7sYRs9lmE0l1g0HnpzRye
2+4beCi7JESZ1/5dXx5mOn24x6rlz+B7hoglCfHj9lJyU3ZTCGv1y1xbbcS4S5+5ocf2mSXfkxLL
dlvV4GWvhPlZ6WYjcd+Gga5NS/x7ESP2LjA3kpO/Yu7Uv+drRU9Q4M1H5t+52yrmOKvR6r3VyUXm
h2KYJUcb3FC4LAn7zBnX8yQAfeVcyOqxzFp+lfUcXLS5MxI4VjHX0B0g3nQNRddDXA47fB6QH84Y
fcet2fbsOXv/+CshR7RshYp1AxThtPTHxwVCwxOm+RSezWQrFmU+T9AXza5k9VBeznCYNkwP5BrS
da74BZd2NxuC580n9ut+cMZh+WigPB3PzFvM1ZirR6sxJ/6Nd5MfNPm5b/jhp3vZto6Y5jFwW1oe
NXfyG7Hx7FnwLr4LEie54dBTrWgoDmYFNofNMlYd0wggP9onElQDLhRk6xTmyZyTePLHVJPufVjQ
04ei67wQg4jXHM69G+OZ/JDjyY+moHTRE/ef5JoxN4g1o6qaJR2SbFX0J/NrgayvSrIIHhCz75tT
g9fnnCWDUcb9b55RWqSjPz7aqfF3C9FRe5Iz+b6L4VE/GtjxAYibnEsh1NChv/thCywH8bw+WWfA
HT2TKK8xGpvFdvvT+3QejyKuu/fyjWy4H8nibKwzWjt5DtHhLJFMVbqWzl+GBh/uX1mwfuWIzULj
CEk/25JkfTiWY3S55mh0DRcLVUnaYesfKYw9ZpOf7IlhwjukfNdngif/VKjO5YkmPUwPszURQm6w
jI5LmU7+qI86Ptc3SHILypzOTjP1soMKxe8Yk2R/fopxPyo+JIemYXgVJ6UIlOgJFQDH5nK+i8e0
Zdy4J4/yNz8q2WG2zZHXojlZqp8oU1/m9v3nV/pS6rZcuMcGJn5E/L6JEef9QYafX/K1ra7sLtux
/sPX+WHziOkrNX1g3nrOoln7bYV02ajI2r7udNZ+k1bIn2EGknz7YEV5bcuxEY3941tYrb2xoHEb
5D89OOXvPOYfs4tAWMIn1jfhiN7mZPPjl//ob8c0IiQGVcLivuCO2KvHPWwvwp70ltIOc/5MoCNV
TyY8iht909hAd6sZCYs0QuLnvx/i8E7l4+Xp8H1Nt6hrhhsJU38n2rVT3g37U6/Z9dQVsfjNY35+
euHocUYvqZ+D6POcOGYdOtM81kBvfzAYtoRSCv6VUng5mwuLYyNseXfDV+h2xpEExQPH/a8/6LO7
ThKy6dtu368r82WdDsQq3rro6LEzwDQ+b9zUL6vklZI+zU3Qp8R7jzpiATvv0T2eeQxrWjf5W9T9
mzcGtvQohvH6rH/+LltgX20nf9zWfvM30pFNOdTpgoJ6xgWJHi5vh9/8xvsaKgk5ou0gfeQn/PTX
wohYy9bhqYMniCULymcy8WGWw3InNGIVF7sQtU0oMuibk8Xu47Y0em2w+dMroXLYZMN4lK8w4Sme
TX7nz/8zfv4i7ua0HfibdmjqP1g6uTdnPGZLGdJExdRYbh1EP0rtghy8E4xU/ZhN9bqHImgHihr2
bMfa5zOENl7K8F1Ni3FVHF34Z6Pg3/793//n71sQ3vXl+poWA/rr2P/H/1kV+I/8kv+HLKv/wdS/
b0ugXX6//us//1lC+Ne3rd/f/n/1dXX9dP/6z3/XDPS3b/Cvvu7z1//1wr9N/++//u1/AwAA//8D
AHQhrXeJYQAA
Fxl5ip784+//DQAA//+kfcuOg0yT5b6fovVvmZYBA5n0DnO/ORODL1gajQDbGGyMuWQCKfW7j3B9
ammkWc2sS1WVREacOOdEYGM/CxJDPN8v5L/92qVe55tE3b7sgizJeduzH/+mcbch28W7BENu3EVQ
b5sZG4V/6xmkfQG2ymODRP9Kk0nloQ8PBjAwCiaSkK/gptDb5/efHgTTjQuO4DAzh4btfpevfALB
Z86buPgwIadrPcKnpoRYzx7Pmvmlz8O722x/85V+MQavg3h7ERGowy6gzSNJVUW++xhZuylf66P9
1R+ajKvOtm8HHhX+Poxom0oVo3v+1sCGHM5Y2zZXg6zPC93hzVPkxmcwJT0zf/iPV/0MZrc7SOp6
Pur1L6EnZzkwoRr5E3U5FTJmzP4RXk/HEruV2uVLlaWpslv6G96dDtd86qMbB9q821NN32yNIeSj
Ei6anlMkDdw6n/PvsGVtSfFdj/ovd/OPMB23G4zUdkmGannEcC+EAbWwQY3xZZ90GM7FGQn5MoNp
+myPsGVdidj9ZfbiTXYILBbyRNPvPu+bqgWnRUlpGLAKUMPK+J+fi3U2CYz84pk33hHxRXFg4zqv
ByfHF0h53Lb9suaH8q/fVsB//Y//h40C4f++UdCqpxveH8eG0cGIiRzh15legHsJllOwgzBQU4ve
9s8kIAnXVKAa2UIRfL7rScnUFzzHjoHi+8qA7IcIlezsP3BgtWPN4od8h8ct/8Be8QnAMmdBBp/G
E2N9O58Zg0lbAAO0PJLCx81YmgRosKwOLyLLvpXPfY11qAR2ja0KnFmXD587fG2+LdXkxQ7Ii7d5
WEROslfIPg8WXipLGKiZRd36KfVd5hQQHHayhORwrsF4PpMIVOO8ULuhPpv3hRzD0udtejl3cj2D
K3eEH5VT0XRW9JwgPW8gNaI1w+5SPi3XXgGcu6RUOzpWLRwafIae/DWxnYw06PjLO4Nal93JN4t3
9bz3yhI0thoQPjXaegLPIIateXxjqxMVNrUBx8ODVsXYfTtLPXsSShV4v2yoM7HRGNv3qIGTpR6x
42tWPvVdAkGk1ykN0u6bE+w+I1lVXy8cii+WvLvDPobX+nJFmyg+9HP3OmuQs8EebRzcJIMbDy7Q
h5dJyjLX8gkV3wogKeaJKo2HevG32Id+9VVppDzPCX0RCcFxG+yoIQcvQLgA6rCLZ4GMEHOMBurr
BU+XiiKhaELA2FK9thKbd9hEHmNLbBuxyotOiC3WtvlsVMU/+bQD1ZhMW18XoZ8pFOvnx94Qbo+i
U+rFPdKivO6Scb6CFhAZOtQd5TJg4iuqoBDeGZoPsGVsLGQXJmCYcC7sK8YuJ24B1MYzWjKpBvMn
qWMQHoUa78r3HmwnDQ7w8s3e2Nt1Q/C7T2jL0KN+COZkDOarBmtD6LAuXCZAg0tgA2sbGNib8S4f
tENcQEwlG5uHk9Mv6nIqQY6DC94HYcvGzSQucNnkHrWFlDJ29vUBFCz0sH37GGyccr+FgqTE2HnO
h6Qvb2cfBM4ZEbABRj/t6RLB8auH2Mt3BzA3+9sLduZXw1hQF2MezwGE0ni9Yp8PezZDVkzgjSId
P1pD7dm1ylzo21mCSnHjBeytvGJIi0+KUZOIBmGHywDx/djTS33pjVnRHhK8LBSRbfa+1fPJX9Bm
Z1CH4ipsjMkw3AyQEUnYRfW3nqPxpStRqmwQz0NQLyXnlZBnfkD3mxLlcxCdFBiEbkatRAbBFPET
B5/uKSfN0baDxdkoIuDcKaU7GbJ+/uX39MjuFB1OjrG49w0B02mxySbdCzVrg4cN3UdW4sDM/VoA
n5ZAi0c6KQPm9wvfcS7UuvSOL9lb7df7n4C6ZDPWZi3tqYK+Cxj24Z5Aq3j2U/SxXzCaeJ36AeuM
aZI3BKCnvMcZp4T1KO+IBG2Z87Dxsop+uuKvDw5fUUF3/7MqYgAyIEmfPTVsttTz477v/vANf7+X
ZNZVPYWYKjYOXP6Z06rALngYIY8E5ewlgv6VO8ilRYv3ZY0YS8KWQDGva2w03tdY41nBbcgQxfn9
HUwP81LA4KPdcS5AZtDS32dgg8oH1SOZJW90OjZKmEYeNWyzyodjIBfwxrsmDU/xt55wHlcwIHOP
fU4Je0rzzoVFnth4D/EdTCbi75BWG40a9xdXv19toAFQ6He6H7ZePSUprKBw1xD2PW3MpxrmGozP
8RWH9uZuTBnKfNjiKqW+sLxrIsMoVDeoelCH3CtGr8RGQPceiJqbY8Wm6qJVUJF+E4GNWzNTb1sp
fdwLdOmsMBDwfiNBRGCAY63ja3azbAL7RcvwpXyhZL7H1wZoL31L3Y+Cc6I23zv05N5E03Bocyb4
iwlzJ/URd+kOxvwJ0QsOh6HDu8/8TGZuM7lA3SKVzEMugLE8+A0QnPZDtRsHwWSVIoIfKro0LK+7
XJiSNwfqhiupHQZVsHScGcJzMObUq6AAFv4yplAWXw41hOc9f9C88lXE7i3Fu29Ys8225uBeSRQE
QDWw6dtXnHpzyJsih/eCBdjqGXZFo+Mwy6a6b6+7O5xqCdAo/y716F+WF1jPT5SsoMYsFcUZql/b
RSInj/2E2lMD49m9kUm40py08aCB7eHQ02DrnY0h9fK7hKSIx3j3HWoaTt8S8t+Tik3/Y/fsPA0h
VOzogk2BvQ224r1cPo4V1XXyDIY2EEVItFuH9ZTfJi3YKyWYhPsZO0JYs+V5af/6M5n4cMjJ9CxN
9aCVMfVOsg+o4Cs2FKJNhC1/PhqihJ8alEpeo1FW0L7RPRQDz3jvsbXWBy8aqgv5vZlgo6xxsqz9
GwZZgvF+3kz9fDRfFdi8XhcyUdMKxmfBtb98+uG5MZ1vDKnF60MpqrkAsE8TVlCbJvsP3xeyAxDs
d4mOf/k3F8ODQBZ1AinDChjf5yWN/vDfwr7PRu3QIegewxGfPkuZDHoUFNDTuhDriurXYyrGmfKQ
4x7bFlPqDqYbG5CHbSBRSDGYKlzzYBzFFAd7i2NkuEkZ/A6jSoCYWYxqr6AA2knS6Z52Td51h5Om
uuPzROakqvqZHnkOFmLg06B0sp4scUPAMhQu1a0F1AN/rxr4CccYnzpRAUM+njmA7fyLIHq1/TcD
VxO25vlNDWsvAeba6RE4rs1h4/xV+g9UPA52WGTUa9xvPZnOZYGHe2/jgKLYmC0yHcGjkk26n6My
/+UbGPKLTrp33TBW3u4uaG/9Hk3VhfWsCYkJ8CK46L3Md8BE45bJ94Q/4kzva4OMgCA4vL49meJn
39Mf/u2i+5n6qe4FQqfCAtINQUTonK5n9Gjd4ZIpIwLY65Ohzi+SMvJIQ5yXGcFQifszDHPsU4c7
+oCXorRU+SaDOAzLIphPvhJCr5AFjCSXgPGpygsk2qPD/s765rQd4hCYcwbRHNE3GLiZ02BQdlts
g1Fms7upTeUB8UCE13MCy+nMaYomvGLshExPpsN4jKFyzMMfvtZ8s9+LCtO499qvunraecIEHxUw
0ba+9MFk7eRUuQQWwOFtP9ZjEJ0keEUiRB9y19mkNjtOFZPpTHfwoLL5ClsT7nYCxFZSP+vpNBsQ
bi19R/VnY4K5W9xMOeH3gI1cn5Pla7UQpGb0Jcy9hvWUWPUC+Z1UYtyPNaA7uinB7W6GNJhrmA/b
+hkrB9/tsN52m3yqLm4JP95yQGKZlWDWuEMGH9cmp0alnpKlFm48tEYxw4Fy8gBPCk+BWjYgvH9B
q5+6NAslCTdHqvNtWQ/nGwjBIGIbbYWwBuNpzgcoxHyOH43j//rvsBY4R1HkmoHgbmob7rhDRu3u
IBlE/FIfTvGpJJPfOPWyme0SErF+Ikk4z/nQyKoGdSwKK5/c58OukyQoy08X7yQ61a3wkmL4wxcj
xXy/4gkH9nctIsP+8v7j44C7RTwSLxgmVEHPBQ6vvidNmWmMv7rmHSzP+ITt3n3WFLmyBtuTpKGp
957GLOfWEa71hDbzCyQ04q+hkklsQnP5Htk8KU4B3YtH6P6229XL5fTRgfX+tNT0P03P2OFTgMc9
/eCDmffGeHD8Cor5s6ZmL0/59MvHBnhbuv/QKvnhM7Tz74MaiTOueGTcwcBfAOmoafY0xCSVfTtN
aCy+EjB31hWBW3cRMW6czqBjOyJ4OSoeEfrxlc/a9NFlfm8n1NiUCHz3k6/DRt3pFNmIJkuhXHlw
k7gIwZ3RMWbAWAJsOlnYw4oSDJ1PRXDjfRPN16tifJv9rYFendyQMt59xo4zCyH2RIxE67YztlL+
4mHZfgN8f77ebLiSNwdnXhSpln+Xfiy3LxNuNouPFA09wIKFqICJvH2T/r5J8umQGaX6i/eKr4xx
7w5B5XAPMJY9i5UP81H8+D9FKnF7OhgZgSbYHalR1jRnSVosv/6Dus0XsqVZtrwSbysTbQfpG0zW
yffBtVDnX/2B8dl9NahmN0IkGxMwmW0s/fKDupmmgGXtXzA+FHsi2Kma/PW7nGw/OPjMJZjeu8Oi
QqYffv2rnrJnJamx02/I9j0IycwvgQmq6+FMTWq9gpmFUgMsZ6qx88OLXV2E0AAdj/hRehrURAUB
dYYPZPkAAuYMfRVgat6BXhA91UwNjQX20USxHSxBMI3wRCDk8y11XurbmNPTIwan9/eF/de5BgR8
prv63HAjxat+6CthbODKL/FtU5Jk9K67F/Sf0YU+nPeGUeM0Z+rQGzH96TVWAxZBd7+16J6YYU/Z
0r2ghVwf7+q0q+lJ0hTQ+iQkSxfJBtupT0V1AmODpvEe5Gx30VKYtsyieMlNMIvfR6rs1aGk+8Zr
+/n+ro7wHS8PvPKrZKkRPCorX6GPeNj2i4JiXXVmZ0t96a3X87P76pCdbjvqaI9P33WHmwYSUXqR
mQg5Gyrc8/ArlgHNldMXsIKFEiRloFD3FqJ6dpqwVZKyVFe94/VbhB4Z/PkZHllUMG/p21aA1W3p
bgNewRIaXQPX/kbd8KEay0gXX13xj/D3NK7ZRZQlsH0NB2zvzC5np4u1wL46+vjE2jahzfgNwfr8
hH/brJ8uxSuE5z7vqaVK5346XcYKvLZKvj5f1VPAYAxJvG44JVUVTDerKYA8MPlPTy1vpah+/RQp
Z+UJ1n5sg0OfnQjbvn22/VrSAJ4ns6N671T1LONDDKaxTpC4T61goscR/fQD4ZZPBCZUPEtVh9ER
yc5GDMaRdtmvX2FLQHq+NWAmgcXODghGvl7zjmy1sMf7BGufh1KTnz5EDb0iuOoRlvG5Aiz8nKgO
hgws39d7goEQJFQvtCBgwpw30CqiDfaOg8O25nO8w5wIHxrmn6QfPQllsAu0D/U/ntYza3dtwTOY
dv/gp+5yJeijhVLTyRcwOA8aKhvjPFAb2hNYikV1Qcw7d8KVY5HMHy0uYTgsb2pod7futZdRwO6Q
YcRsU0/4JpsQPF1Kih2hUpNliZvhp+eovcZreZPyDsaqfdMdEQCb9kZggrjoNCR+zZMxX4Ugg1yn
GRQDSw6mRhY0CI/jB2XUfPVTCNUzvHPDBd/uKg8I987uYP85yjSU5Fs9qyk9g4/sz2hOpCKY2usT
wdUPQUtJQTKeX2kLPF6UCP+6GLnABa8YFvnBpo654wFLwpLA/n0I8At5DEzJYMXgx2dz8niCDiGn
hc5miPARbjf9yo8mOC3GhJaUvyTsQm8hOO+bhu72dWzM8TWMYO5fAurluxkwbXtESsjcPS76G6jp
LF84KHOnIzYeuRvwgZmZwD2iEYen2Kvn5HHlYLUrPbpb/ZuveKuPyvs0ltjutzNYRsOHv/smOzPv
A/bSPR1+Xt0JKXnwyEmzbERFuCIdCQ5u2FgLyATzdt7ioORewUiEyYQvu51wtuPlns2On8EHMbfU
vV8P9TTzh7P6ak4a9pKXVrOoafgf30ezXmT5PD3TI/Cic/HL74RdgdTAVR+STRXahpASgcBC2Zyx
cU52gXg3OQiLHXenJneTE3qz7AFoQhNjd3N6gkk0VB9+TGjRq30t8ukSTT7suQ+jBhyXgB6tUPv1
b7zjz1fGzrf8Do5ULmj4NU49k59KCVtc/ukH8MsvGGNlRkxMP2wx9VQCh7dO0CC+WD6ZoO6U1e/E
zh2bBm/t5Ew5iI2H7WLnGz8/Dh6SR0Ldb/nNp5vmmjC0CcE+LJv+169A+SY+GaerCJZgbie45jvW
2+MbMBZ6NtysW8Xu820ly0l9VHDlx9jZNh5b+YQGozCY6H71B7/9JYNgPS/d3SOj/+OPWA4pNQ29
CEYDhwgObrqjF/Ac2YKoP4HdKaro/XXRwfzjQ/5396K+Ono9+flDHYxibH5Kue4sBfqwLhsZ29CO
wAxDFsFiEJ80OG3XvWQZSjDNugiJzHj3w5qPULxzHmricjDaX39VHs2Gmpn4TpZXGy9gN7kN1RKC
6sXjehEeKSjIkz/LYAoj24cn/BloaG+4YAEf6Q5sK5Hwzx+e32xjA6QjlbyPvQvAz2/87o3xp5/z
QQ3rUg1FB+HwA0jCHofOBNOym3CQRbUxeYP/gvKVl2ggQBbMRTOf1VWfkC2564Av/dGF4XFb41Xf
sqWv7xP44e1FqG75kKSwBKabGWSRJ6eeUtFvQQXTEGvztjKWwt5DsPYzar96LhmSlK/+/G6w+lXk
6vZQWfkJqp9TnCyPV22C42FqqYnpWM9rvcmbb1rjQ6Nc2LStvzF0LwFZ9cmuF39+gHfleGo7248x
DnC8wx04X/FO2Nf5jAnT4KI+OeqSdkrofdNG0BT3Lv7TI9x9jOCvn//0zZ+/YdeviqI8vOerPxNC
2zpIf/xs5T8SAIt9ppoeD+C11RtXGZ3pufoPHRuPFzWC43IW0HZjAOMNFPn+82+QFHlGP5/uPgSj
JZ8Rl4km2/7wpto2GjV5CHoGmzmEKx7QkJNI8OdXNs8tJv0Btsny3skxPLdFR4uvcaonFrY81EKH
x454tRm7Hj8LHItIpft3vDFmeddISnrBHRkbKQU9+gi8CuLURlw1t2xUnommrnwdMS4JmPhN5+H3
PCu/fbMfnoCLcSYY8c2znzfc01WvxWYms7rd9qzs3RJupw0git4bhqD1WgvddvTQ66k8GDtd9gtY
9RM12jvLBfN0lcDKN4jMvk4iYMJ0WF+iCBv9IQpYZUQ6iKQ3pn4ebJKf3wXyEJ2pfhmiYKJCe5ZX
PYmLiZP7uVB8E9zCaqa7Kvj0LLgEJlSefo610/HH57ozPBl6gn56ms1sfsGIRg21ZM9OxsRiJQSL
eaaOeG1YA2UFAV2QRoqSQ5bTn//PzfOBWltf6KdV78v896JiM+C//XJ+tek/fioi5/5bwWpRm87c
0XTOx3zOHQv9+j0O+uEFfv6a8sO7zconh1XvqSt/In3cS/VCXa+DCbtxiLGQN2Y3Zgos0HOLPdl/
J4sBA/fnTxPpFpL6D19v41Jg59zJ/ZyNZPjlC/7hlzDcyhbExxxhz77CvOsv3QCCh5VT37prBi9X
WQwTNQrQIvl78BK6ewd/fkd5xWE/PSNFhKaIXRrodpmw4wxCqJ0UHdV7twALF/AaZHGp0cMGmEaL
86yEu+/EYXOrucaXW/YTnK/cgq3sUPUM0bj86UG8o4OVLKYeKeoJ7gL6w1NhHsMUWnyoY+uaXQPG
Hgf+h2eotvYpIBM/iX/+iu2HczJudVWBqVAecTC2aT7jz/0MFqVRqXMAXr6ADNl//GOdBzK2bakk
r/FDvYM+CXXt6Awemy7GKK46g943ZQx//HUOkANoaChnyG/oC518vWZs6/U2TBugoEryRzZZuzlV
WvVyo6v/X5NdzQhkamaQhi0nNr9aQ4f3KCc0nIXamA3AMtWtJRV9e1LnZLaVFoiyuKdelMxBz/eW
Iq96HBsrHxR++bPOv6jvUbVfYJkskGs+N+xMbG98YjvR4D1tyxUv9/Vi73YS3JiKQ6ZeOuSL4Wwi
uRWaiFqr/l7WeSbYgeMVe6+Kguk0BxxIszZa/ZCXMZ3MYgFefbjhcOKuPenr8wJ3SFEQlxyUfLnU
hxTWmvTFyZqfU2BmNsz9U4CdxA2TrQkMBR7S5EHmNR9YaCxnlYyhRI2XBfsfngOubguKH1cvGCCz
dJgkjODgtkT5QnaMg2MRqySehbpeyMkcFOkRv7C1vtQ4HsNdCAv3bFHH1975bBHpCKvtS6O5bpc5
bYOLDetLHOHdxcSMHgZYgdf1M5P5ZpT1LHRNCgH3pdS7nYuAFQ/oQ1s/bzFy+G+wqM23gOlEXMJW
f519srSAg+ZhbF34jTGdz/fyh9+//pIQo10y9XyLRKxJb8OYM3C1AazPO2oRnfST0N3b3zwK7+4b
Bla/OFSD6eRQr/j0gFnlvYDZrNVUe7zEms2OnkKjOofYtblHsFwrP5OzobTxYeWLv/xVmtt5oet8
DzBd3IegK146vciAD8j0bG34/U4QB63cGkzbFuHPf8BJ4uxz+n2pFWTxuudxR2ZPjZOcwed91PH+
nJT90r7hBPC1epDN4cH3f3yhMOUH3l+FPWN+JKTgLisBUZp4a9DLSZzg7Rpp+OZt3ozZ2WTDn5+4
zhuMibu/Y7BZZ9X+0sxgOU/HAvx/bBSI//eNAtLXLdpowbVnu31XwvHgG9TjSy+fwg2bwE7vUmpF
gVYLZvbN4JyzkPr3jdMv2WnUoIE5iG5fd1UYfCLCvHUDfB+2pJ6rw4RUcsIc2Q6BDQTPKW31wn1V
bPP805j0un2BRTYktAHVw+jpqGnqFso9Nf0yyhn/kTioBzrE5ln6JGOWXDKIbydGrhgPPfE0GsL9
Xd+TL3x+6oHtPy3cw66m1ucR5AweiARuynmLXpX8rvv+murQ6+IQqWwP62HcZi4om2ShJs2OBoPV
XKj1PLoInMVbz8x8OMOCOiHhX47TC+dtHUP6lEOKTqHP+PLysOFZaXS8ZwAFlK/0BcqBVlDrGT6S
CR7DEhRTOWFfvjlsMuZYhN01OBC5ffTGxDdcBFvhBbDeJqwmhtGm8B6eJGq3RWfQBXgi4NrujuAB
PYPFOXQh/AbSm16JafXMUK4vCMtwh14mEFjHbFJBM2x9HPdQqVs/CQdohp2P1vizuZNTBE2w5YgY
CzEYlDF4Qe/xleheIX49fqvHHfafSKfXU6AZrEyvC8zOL5FwMn73VDw4C9SBLRBlhjKbQmcjKdsw
PhJIT20+vU/tGWTCPsRafdfqCV9jXeUGR8eBIff5YFu6BKdGCMmW8iMoR39pgVsEOUYLezJWbbIW
pNf4QPF8OyaTO8+ckpGzQV3LHPLF7SQerNthZCbbik23i1NCEDsR1hdsM8F2ohLWFyhjL//UPdNP
TQP497Shvq32AZNACwFpHRNt7qW+TohfFbTQaP7uu57Ps2qCSfVFvPcKDNj+/OLhc9loCFDVzBdb
ykuodcBH0+eTBr3n9hq83XwTu7Tu6hH4wgK4lcEc1/tYCiXVYfAQatItXZAvAjdD6G3cBN82MEwW
bMNSEZZiT0NhHgKmwvUdt6RJ8O7sHFknOVYHDwWnUeRg2ZhJWQ5Ab0KCQ/EAc7JMkgt+9WjCOO+n
ZSgkWJ/rMxLdK86HvDoe4SBwB7xjWpbPO6QWcs8FPvW5Y8RIwMI7UMttiiCOx5poatVsmgzfafBO
Nj2JWhyCW6WH2AytPqDJiwuhXROO0KEJEv4CDB7SXDeoMwQNWwKu5SFCpk0vdpj2Q1NnDbgm5pVA
83HI1/ja8OYKOXa1vZELlbJNYRTyiNrHuUkW4NxCYCFqku2VmGw5y+s7b8i29zJ6AmOW3OgITpvz
BfEXXBr0s5EnqM2hgB+gegTzUZd8qMt9jO0tPfeL0kstjKDkYPTcewZLqvYI6ROERCbmux8R7BYY
zqmMT4L1yqlz3g+QO14EJOtaWE8vsS9AeMYOmYXiboziCehgN/IetaVmw8h32t3Vejq80dNMzXqq
cPUCVnpL8F73DGO57AQdJFn8RuJxthP+dJV9+Cz6E94d5cTogHNDULmSHjuvTmLTxRV9OD7pkxpH
LWGzK54KqPDKDTuX5ZDQOKCNvLNtSDWvjNjAX0MNADXZUf3YYrDsP5sOvLvgTQP7RI3OCXgNMHyq
sEbdJpi+6X6RGb5UeMdID6ZoWO4QtOtKzxq/JcWfWDk/sog6an1P6KgTEeQH8UnUqLECMW1KTcXW
zsW7yXqx5dw2Z3jXbYtaD+HNmEuLI8wIeNLA4s7JMhpzBa1lX1HvrHvB4nGHCNyqd0dYODrGgmQd
QUkcGda8rdeLV0UrYJ5EJj5WslVv8/cgAYN8YuxVxieZJqvQAa5NjWL1HvfzQS8b1fRmE6Ndd+9/
+Qeud0YJS8k7mbPkkcLDZc+TzZqv81wjF0iie8JW1E+s898ZAuHV3NOb4EUBL+63CD6z9ojEHrnB
KOhZA8XaeRFlZHo+8XclhDHoMNlocm+Q0XZSqJZCiqgWXOtxgnv0V6/+Xt4YUwelBXLHk0C49ygn
ExdGuhol9IIa94qTz6vJmz98Fxdp3y8JAz6Uis2LBmkuJctnq0JwwvEdjSe+YUt7MDvoDE71h6cz
bYcFyN5LJtzORzk5ZbwJ7uFFotia+GA8D0gC/K1ISD/cjmDgwlSDXrg/IqBdARiK3SFU44MBscW8
CIyS3rmggrND/eR+rZdO2lfgUbETGZ3tyWjf3KcEQUw6vOaLwVY8kJ2+NOlOKO7BeNQlF7rCOSRg
uboBNYiKIJWQSJg/s5qp0DVhf3cOK748+mmKpgx6m3HGe3Jm9VTXhgaBN71oSE9twl4y1CA3bXzE
7Z6fYCYehcpuFD0iZ5Wck07al5AJo41DvKhsPp2jRq2vWUEd2UDBRLfPAeph8VjxWjKm/LzhwH3I
Z7zfoIz1ryZp1BXvsJXfR2NuKkeCWwh6NO9gyeahojps6hTjCHivhDgB1CDTwy+aLC81aFSl0y9f
CCdTgbH8PSi/esHITqOA+edbDIfjXcBrPeRDMCRHeCjTN/bLvVfPT+lbgTYXB7TpfKfuT7fdoEp2
r5BNGF7ANKruGQIQa2Qz23k/6f3Q/eJLTZAveZdhToTXDyxxMn86Rs9T5cKuVB6IlakPRi7YH+Fs
SiY9YrQDC99fKxgFaY/T14SS6fRUEWQ6+hJusaJksR6hAg3xXBIpf5Kgi7OhUuLPqaGIuzkJQ+qb
wO/j5GFTMwc2fkcmKm9SIPLR7k0/AMHp4FOXHWpzcgDWekGKCaIS24/9l03RJ4whu1oz+hb3AUzG
xrIhmrwd1tJj0G+bpnbh3j5oK/+bcnpluAFFOlcUtZ4a0G3rVvCudRXaHsamnudct9Xjyx6p672K
H55PqvBNCuzpFysZSFTy8PpNDZw9HltG0/fpDFur1ql77t7GIElRBkdFOWD78dL7KVCs1x8e6xsH
GYJAFhMyRiy6Qzeci8wmJciVICTy+bpj80d2W1BfVQXrzXuX8EEz+pCawxWfB1Yxyo5RqfaPikec
g+Wg/XBbG+T3TYbqFzXqqdhdQ9AJS4PUXG/rqQAVp0DBdGkW3EAyp/chhE58OSKVjENO9O5KlPe1
cWiwjeZgjr5XBXTb0MNo55OE5reSh03e76izP4+MUWWJYZITl9py3xlkU4JUccu7RYB6gsaw8lOF
SqFIuG3XgbkDjQna7RsSgXzLflHJ2VSwoyAkXg9ivxTiZgCnzfFC/U219Oy2bsgkQnHBj+BE+rF5
t6F8QXpD0bBFPT1dkhI4HGxRdhHinGnk3QK+AjLWlcgy2NtKi7/78HQt7MXjbixAOJQzxeV7H/z6
GWyecKD7NT5M20whHL1UwM7Wyfopep5NKCcnl2qHJw7GpN+cgXGOPkjUqAda536ogGTsz9g1XAkM
weJVcNU7RGWmZSyt5za/fKaulVvGfN0qIlj5NN6veMte/J4H4pvr8IqnBpkf4wsKlzhHi13ZNV9M
6AUvn0XAhlUf2Bf46gKiZLxQ99bIxvytLgUshHRLcdyCfKLb76Dwahn+7jefSFSKEHjLi3CO/azn
7Te5Q9eKEmpln1ewXBW3gCRBKvWPtRksCkHVL7+onhgn4zVZhQYkKv4mIodglnNpUVBSD9SsqoER
zhkWsPJ37EX8JyFXRburWif71EztdaKAjeLHvzA+zHHyCpR9A72Eq4j0dRyD7+YiBckcEBwsOs2f
520fg6e17AkJmhlQt95ycJQLQk1V2xttuAELPFwwj/H9K9eLttghHH0xxaaxN2qRSzwXprVeYvQ4
CPXwnEYCxWyRiJDwn57c8m6C8n53w7tI94PJjMcXvMxPk2rn3SZfymdZqtySjWiytLInazzAe9T6
Pz06nK6zD9d6oBHb0GA6HXUOGryiUj3J9mDNXwKO1ddH7+J0TtjwLEwl2VkCKdNj3y+7Q2BDLlku
SNL2dc6UNjIBObYlUobLJp802TnD6cgH9BLvRTAdlIFAZ7AqstGqJRkLVWqg6pYDYZJy6SfOfooq
E6iNwwOZcyLanAk3JIBkPn3MevvZzBP8BsqbvM9Rn8+hvisgnNgDa+DL9wt9hyFUqwgQNZFYMrvi
7Q7Vu3imhmSgeg5gjmCSDy6NvnVfz/UVRook+ieiZsdtwA7elUBGrzkNtGv+xyflYacf0GZ7dfrx
aNm62s03lXqrfmZK94agELIt1lE4BLSOmQ/ilnk0sJx9MlhzlUH17e/IsrnbgOnN7ML0VRg021Ev
WQJJ6YDXRSHdsQsLvjt7TiE2Pm8CvZEZy9B1onJ+RSkqz7jsp+ihhbAzE58a3e3UT+PmHALe8zqq
Ga9bvsCyF4Ft2S120rI2puYihkDewIJaucPni3m9E1iTqKanZUmDqbxcbAC7B48gXlQwdBvUgo6/
GMTx6zmhzIUuLI/xTE1M4mDCLi4gOqcL1dskqafm6y7g518k6RaC6afvN+5wwSZRakCb4XCEt0ZJ
qO9WFtiOPKxg9PEAthBVE6Z2agWLpZ7RZgksxkN0PgKpyGzs3tQDmD3TWeC5ePq4EF0TTCRqeaDf
H6sj3cjBXD5iBT6UsqKxLIfJdEHDGdzy4UnRpveCP75dauIea/gNesaMPad81c2D+uX+27Om5DlY
HqMZ+/srAuNDunKg+p5r1O8fr2Sur3wMlkfyJYAz9vW01iv4qDNPXbj4QLABtEFHC0DN9FYFf3ir
H84fAm8U1/NnfMWw3vUYOzXDPV96Lx6yMc2QVNDJWHbnOYJ5lHFYdz9zP/Tp7Qi9lv/+/JOacRJd
AC6WE/WU99FobV+2lQc9TtTXpDKZbse5g8bQHBDUFbGet/U7hlwyXahdmEnQj3ojQvQVznhn7gLA
nkyMoLfxE+yR1wmwgzCHoLu4ET2cHR4s3D6GcL9dCmoEj3cwz+UUA0c7J9Sf4ZWt/goPtp3S0YAz
xn55bHEE8LO9ITqoprH98LdMcelng57IFusl08MX1N1ApPauWcBybVUJmC9OJOIzPrJZdLUJrqum
aG4B7ml/jXTQ8l6Kd8qzzVl7dQtYM78n2059Abbyfyi+agdr8NLk0xRJGcjyQsX776Or5/vSL/Cn
t/bC9tnP2L/ZIN5/LkTsSZqzH1/cbqaZNKve5a16ckEc3e80uIAPYPMThLDNrh6uJYP007fuXBBN
xwFfW7FmK19rYRkLCHvtBiRkn+1TMGGHQ1J19Gu2c+AZ3IpkQ7Eq9AkZHqj4xZv6Mq/1QtAuC5TR
Q8O7b+wmizB+UuAPJxuHr2oIJtECvrKgIcCX4FCxMbqp1U+vkuvqz5SZZxGF9997HB7QzqCdkvtK
H3Ep9nf0m7Ni9mJg55sJLcnc9GN0E0rFDNCHSJORG7MmkEnR4uaBvc/hG8xvu06B/pANbJK6D4jS
T52aaOWFQO+cJexhzmd17b9YWwF+1WshnIVnjve57vYL3tlHkIfcFduLXwSMoe7448M4GG48Y7ab
3P/8yFDZ7IHQhk9XNUpNpM7QXwz2UYwXXPsJ3q39eCE314f3drOnlsLEpFqmyYXio91QLe3NZDpo
MYT6XtGxtylUY4zqmkBNnSKyTUsjEK5bhYcOkihZkmwEk+3Ppuov3YwKOuwCep3kAeqBBrH24SIw
X4fzHfbQ9+iqiNn2O5SSOj7IA//64yRfax+EJrKpE79o/+MjEAvW7p/z7s5yBCXbLH5+UzAPTH5B
crYN1DjBmBMu8Xy43ieNNTU02HnqfKi+EogDUZZy+ouX3Nsnah+zsaat7gzgq6oPamfnEsxJvz0q
O71NsWEpfj/5SUhgBBWHMPUwJUypvhJc30ylN90zglkbKwLKEzlS9Hzt2Ndik67uAdYQb8WkXjbN
s4ErH8F7oXQZKXPlrCwffUtDwuH65yfB3TMm5EtMq16s7hmpYlXZaKqdDxgFOJyVSH28ibJ0QTKF
QRqCSIsY9b7no8ECIbPh7WXsyXb1s+ao/0xKuNmcqP5YTMAHi1dCYbnvMRrJtl/ct97B3u9GNB0k
B8i//DJ2Kodd1TP65df/b+42xxZqLLbyIR+ItfVa+aYNZr3UNHX6DjGJN/cGLE9B49SHo1vYtGLU
8w+wlMqs6xzWLva3ZhwuXyBqNQ1f3rQ2loXrUhihLSbqU7AMtql3Okx17kO4fQ8ZkY9bCRbrp0lt
bCcEU+snFexfGURdEx+T9e81f/qsvrPImJLR1+GFsx90fb561fvoF6/1M9gKY8YmkcDK//DPX52G
Bjdy62YuDa6lXrNOf8TwIA4YH35+07R5T4oe3h9IQY6WCN2uOMLgsa2xHkzuin9TCtPnS0XKqhfm
QHzr8KYct1RT322wDNNV//nHf3yaje5Jg2mexPQXz1XflvCT3F3ssZxjQ8U9IDAPZYyNcPwYjH6u
IvjpvdVvBsu3+GrK7hkRjLvneh45Qupa3wjCPQXUTMoYrPoS0WD45DNzZgRXv4Twgrhlv+cFcQJ0
ajx8F7AQtRPUPvcG//Tywj9UH+y98UUi8D3W81Hd/dPf7R61xmCVFx6SjvOx7jq7hEleEsHidOZQ
f4w1xqtj4oPo5CpkW1ucsWS62cDF403sgcTq/36/vnAyeZrpq18q91tAn4YhtoP+Vi8qb0bQa8Uv
NXWxY/MpqjJFDvR1Y+kqBLzev1poJe+Q2kltJjM2G0UB967HdpkHgDY5WpTwau+pufbv7UGQQ+Ba
25na5WL2gjpzLdg3Rxnf+OWST1yY6tD0mElDb0yM+bANXSjvjRveP52kX+68nEETxCXFrqHlWwmU
EAbx0BFwvwT1dELFBGyPo6QJerUfacU1P/6KcQqcnolIVqDAjJ78+DgDYKrgOp9CHMdrAauzbQi+
gLvj/VN4G9PFbmN4wXePHDLBzpfN6aFDt3rYWH+XF2OenGcH62taUGu3FYNFTLVOXfk3Oq1+0Orn
NvCHn+bt2BtsncfAh6NZRD3xDaBzYWmwyb87HISZYpBrK0ggcLkcqVPsBKKhKi7MUv2K3deEcgHo
MoHbcLvgYPXHRJWcbbgj1w8B923OJhYcFliMZoA4/skHqx9NoDD4FRo/4T2hphtIyuvVOWQKk30+
XtWvBnl5X+Od+4QB/TYLUZa2yGg4KXUw/uYZtYYH8uOjgxh8j9BPpQoX+Nvk07N9dvC0TBCbYVEz
JnoG+Z2H7hDvBtv2qhUQDaTAftYiMHe74/Gn98k2mFnQDo3TAB2WKd4VyiEhrZFl0L8UG5yIwtCT
7VuR4Gex79Q9vDNAOU9JYThyRxweLmk9+7d7BmZTMREThbCfjnZK4DwiuvrJFptWvAPC91BgtPqn
6xsqFVj1MLlwB8wY3yEepA5PVn/UBsOylWOwMXNCjUGPEvF2gi8QNAHC4bmo2HyeBRuGl66jaB+E
NXMFv4IvCBekOttTMEc9XZQyfNa/+VFNL9wxA1YPttgRP34ivtVj8+dX2pvI7Bdmph1c+RG2xy4A
yzJeePjzS766NtTD7Ti3f/i6vcTPgLwj1YbUOmypz/Xfnm1usQi047skXP8Ne8Z/Jg5u+McHCcL7
WM8d6/Qf30Jia805CXo3++nBNX+3wfJRBx8yjdlY+4YLII8tjn/88h/9baiKD9gkbhArd4vBzmJ6
hscb01e9JfTTdqlCOODXiFc8Cjo57nRITnsOe3nkA/bz3y+BVxI+vVXGcm7JEQzd9MBeZJ9YfzDq
UtE/7YHer0MesN885uen54YcJOQW2RlkY5ZhQ209Y53HKqCxJ4UijQk1W76bCL6N+EaDQPH6ZXig
OxxOSord/ImC8dcfZK6UcYjjsR/O4+GlvrXrBWt5I7OBpIMCVeXToK59a/XyEqJKjd0xwlYzy4C6
tDiDMuAsiiRpWP0tYv7NG11988yn+V61P3+X7pAt9qs/rku/+RsecFxPbbQjUCxQjv2nufTTb35j
fRURewsg/bT58BX86a+d4tOeHrzrACvIHOrWVbjyYZpB58QkrOU3PWetjglQSLPg3elj9sR/x0j9
6RVPuMTJNKf8Ha54irjV7/z5f8rPX0TDlvTT0pABrP0Hba7mw5jTxOFhFIqIKM7RAOQjtCbk3SZE
QJTTZK3XM8zdfiKgo1U/t/bCARBbEUWlGOXzPk9N+M9Gwb/9+7//z9+3IDTt7f5eFwPG+zz+x3+v
CvxHdsv+g+fF/6Di37clkCEr7//6z3+WEP717dvmO/6vsX3dP8O//vPfJUX52zf419iO2fv/+MG/
rf/vv/7tfwMAAP//AwAqDwRBiWEAAA==
headers:
CF-Cache-Status:
- DYNAMIC
CF-RAY:
- 7b50c20eeb771678-DME
- 7b3ba19cfce39235-FRA
Connection:
- keep-alive
Content-Encoding:
@@ -387,7 +378,7 @@ interactions:
Content-Type:
- application/json
Date:
- Sun, 09 Apr 2023 06:40:20 GMT
- Thu, 06 Apr 2023 17:08:10 GMT
Server:
- cloudflare
Transfer-Encoding:
@@ -399,7 +390,7 @@ interactions:
openai-organization:
- own-45h3iv
openai-processing-ms:
- '331'
- '247'
openai-version:
- '2020-10-01'
strict-transport-security:
@@ -411,12 +402,12 @@ interactions:
x-ratelimit-reset-requests:
- 20ms
x-request-id:
- 1931f0d4b02fd3de5f2966b3cc43d4f4
- ddfb63c63597d48ade18afb82091b9f5
status:
code: 200
message: OK
- request:
body: '{"input": [[939, 7341]], "encoding_format": "base64"}'
body: '{"input": ["sharks"], "encoding_format": "base64"}'
headers:
Accept:
- '*/*'
@@ -425,7 +416,7 @@ interactions:
Connection:
- keep-alive
Content-Length:
- '53'
- '50'
Content-Type:
- application/json
User-Agent:
@@ -439,120 +430,120 @@ interactions:
response:
body:
string: !!binary |
H4sIAAAAAAAAA1RWy86rvJKdn6fY2tO0FCAEF2cGmDsBJyE3egQkIZAL4WIbfF6+9X2/1K2eWHJ5
uWTXKq1a//nXnz9/26K5lePff//5+6qH8e9//cSu+Zj//fef//7Xnz9//vznd/1/yNu7uF2v9af6
hf8e1p/rbfr77z/S/0b+D/TvP393NX/Rs3mpi4k81zdEz77HcnP1KjjHE8DdKAOW08BvKFGnkyY9
l0/mstOrmdwoPED9tptkTndyxL/q10bmx3cIPliTmMqjnCBLXynEoodtxDdWsQe2+vbEIV+BhfFZ
7+EkhedkkRjHno9EHmDpKhUL3Y1UTLt+cQMjjnTitaupaDxyvUG2tE/MaTSrmPhcaMjBqyOFexoX
HG+2HGZUyYRIuwBN7+n6RAwHt2TBsjb6evPuBopbyiy+X/KeXiNeozYjL+Y7T6kRgVmVOovaQ1KZ
qR+JzEIzZJaTshi6Nv0E95eF+vw4s7DUjFRhRR0CYTUmlok8NJHndIPT+fUhOAgeeDKceL+uFT1K
uuX9KObDNl4gl5tmot7wKLi8QD6s5u5BAc1uM0tg7iEsd8Cc09qOaDafTvDDD9kI95yO+Oza8Ihi
wYK1PUXT1/EotAhqEjVlGIkX7SngZ5IRz7vd+vnxKCSkbCmQwPU+ePLmrETpOjkT266XEdWPagnl
W3YoOi8YFpfi+QZN9zfs3LtYzOrKqoA/O8Ycx9n1QmYrC46+nDLznO9QL+7shrRsgUhyvfGeBvEp
h4cdLhJwJQlPyuF50i/Xl0McoCGSku30BNf6fJPxvjs2fPBVCni35sQ2ihwLe+lIsLxEJTPGV59O
z3q5RdfivKcS2CIdU3At7fo2x6T7qb+sqFvQ84xP5AhVJeZd01tgO/qXGL1PezpyfgPPuB5ZWB6V
iBeNOqDXpjqSXEJSMePvgYL5CR0WHjuWCtVcqfDVqEJsq/H6+Vy7MziVlCXy9mj10sM1fegOC53q
Be77mUo5IE0PN5TzIMZMNdwOzOtpz0hNYizO8zMEd/keCNHkqBDh47aF9IRdEh/eDM/lO3/CPi6X
yRxntWgv4e2t6aq7IeRrqujb2PMA3MQhnXljCSUqpAEUlJ1IxOdzL8LHaas13krQZU3iiM+8lCD4
TIxs7ps2ZXp9NoAHtUEil7mI7sM5gYmIK7PmQsZcsRPQzM8zI8HOA9EdS2EtWbncss2jrjBX3uEB
LZEwk1VWoXQy2rUK30kMiRSXajrHtuzDCKrB7O5ToSn88o2W+aPEQrCrYurlR44oa9SkfC2kaHjU
5Q11tnskgadKxVRsvyVUn0hL1HDbCBE2XgWNJwsaHAcPc/mZuWA4nURfLlsLzp0+gT7d6iRY5m0/
93UewuBNISmGa9uPTiNCyI0sJTfFTgQ7jtuLfgS0+amvhbiTHjlcV5pF1zuvROwhuhLJAb4S07zn
RZcddhw2z7edNHSvFvSubTpNk4hF9Wvlp2KfLvbosqFXtpkSB/Prd3P7xZNEfGbMY0efNcb9mvhs
8SqmKi4WAFN6SqbyEKXio4cWHBsvJ3GbZZhXoNRQ3TclOY9N1UzFfk/BG8sLI4p1jsZeflx0dVOm
5Pq0w2iS98KG1tO+LHyYTvp+lVcbiQpjhkvZxdxrxgMonZszd+NUaA6J16FkHTyIq5hDIez7xUC/
ehXkphKJb1TlkK7bPtHWH9zP0uXQatcFrJjBLR/Pi2y/0HNfCxLpVy8kXavAVK9molvXOOX1eNlD
907XyYp7Vjq908yA7DWorJDPpuDXTqdIqbHK3MJRir7rbwkqYqVj/vnywdMkK5d1tAoH4uPBiqRQ
pj4UD/WZ6PIDUv6+Jws0+fsD2aSwKl7Wbj4BT7nHipVPIm5CEIKd3O4kXOEHGkUbXEBd9Cs6ZQvR
iLAhFYRL9EjWy9xv5i2kEtSN0Gg331b9vCjsBH3SvGTezXn1U3CDHB39VcqcaUvE9+CZGoT8rFBe
H/OCmX3BQX3TnOHh0UfTPeoGOM+3cwJXW0MjfVEFFsplw+J2L+FpkpwQekv6/vDzxvya1Dlg4y3R
94l7Kd0H/gL95E+mvjejV2NrFMSmuDLTk3dCOP62RN+tURI3YAHm6H0/odqyOvKjV8Uk9b0GxXxQ
mcnWEf7pLwOMu7FnUedx0TvG0teEfTMoLcxUDGmb+DBAeibJSQRoSpZFBYumJ1TX3HcjXsN8g0Hu
FGYP70vBeRXvoVaqFT12A8VcHIIKaUYUEEtSJ8G3p/gE65CnZPPgXcS2380TNs+nzS56PIpBq0GB
ZadlJG6rBRZdnLUwa23MSstcivY7kRld1dMpWfNwJfi5qwaQ0q4lpi7VQszL9ROl665PZmmpoJmk
RfmrZywW9RMNllkCwOmJkijOx0a40bwA85hyqsdSU9BP0rroEYo7lWYgQnjTVoH4gVki2tTEPL+A
9o9/EW36wONHq3wAh6YkTA+rhj4ax9Xk7D6R+PGJU1rd66ee71JEpfiYFtx7gI3m9vxMhqdrisle
XBMYuwgTx+B1//1oVag/aGGTaNeeGwr0u0VKZ+fEXaBX3z4v5gK5B3X/z57y+6aFH35ZtHvse3G+
hC7qjsWOzvb927dZ+ghheXlxFvjurRGHqjnBs6YNSWQ9QDPwffWPH4lO8QMNLJ8q4AEcmHWJaM/f
72qvzQ+Gidn3j2iMPFBgqNdrZjBUo2lY7Tsw7ppGsDs80ylAZED35COoOrIZ0ZukajBln20yPfk9
nfUjv/2jR7/v5fg9K/DTHwlCCKe3Y3apod0N5a//w9w5ehZcw7JjTrwMkdhMo6RZwflLZdUq+jlu
vzP8+BOqjIkaDcl2/UY7eJ7ZZnFqC55tiIRe1ZsQssRyyjRenfT3S7dZWJZFJEKcDbps2gY5UPpC
n9cwl9Bb+ZGQ3jLT+SZxTd8YcKfDozawOAYN1w5FhogrrmHDDekjqd3tVBCMctRMvfy9gH4oP8yd
jtdoMj5+i27Zd88i+eaKSTkd38DDvCVB6jvp6u6rEooze0e8zUXF4oO2HFbH75WE83nfi3UU3iBS
6CUBSDRB59TQANZ8k6BQ52hw0uuMGklascjYh6n0frdbwDvTJ+FyXIpBsV34vc82qrEV47DGGjBr
DAm2DBnz+FtfIN6+CoaXS0coqrzbIk6NmHlj5/QDep9PYA3LhLjwotHY4KDUdrrV037WXpHYfooT
yknVMetj3yOBT/ENkqm9kDhGbSGeTn9CZAlnFgenRcQNiSkg4c0h+e6MFep///czL1iyomdMxUEL
UYsWNeXZ+p2yVxBL6Pi6YLK7uGY6L43ah8lTDBbvt6uCw6h2ICvWg7nNkvWDI2chfLdWSdHKZxEl
jPJf/fyZX6eoo4q9gee92VFZw3Y0vz8XFdZ7NiUzW6gF/fGDmtatGmIXloGUOTVU2D/sF4m7Nmpm
svJtxFZ9T1fsMIppynoXAq0aaGXmI+bnrqWw3o8TcTT8xGO5fkh6cFp7xI6uFRqO+c5H97O+p1o9
zw0LnRDA07UhocHcN2Ivr1uQXoVKheHXgkoQbEEay5zlr9MhHYRYHtCR3KVE42OXcmNevLVnlWls
Q9WxmGtqqHDc9+7PvInwPHxGDbajgplDRRqxzTQqaDVv4kQYvoUEWr3+yce8/wEAAP//TJpbD7K6
tobv56+Y+W7NjIhIy7rjJCBgiyAqycoKKCIgIocWaLL++w64s7MvMcYDpWM8zzsqFJdkcmInBf2N
N7H+5txinOTJlx7vw568BRW0lOvOOWzvrkldR105sz9N0lx/ybJ/ty+W3+GwEjf4aHqvotdu6zNI
6tceu5j0SbO7Z3eQvxuMxunhsjFPRH/hOawF3KcYA/E8QLGYCNYwlgIqZ1cIpfcVY/Tq4qJPyyCH
9hDyVC+uFFCVrHzQ6HpIBK3Ltab6eCKgwWcim/cuAoO3G6GUJ9ydDJr/DIZVnFSQak6KTZ/gdiqv
hzM4YXtN1WHsQJ/KlQgLM3rOfCqDsS7qCmJVOmMTNiAYyX6ngoWXnJDei251ay6AO/AbrMbxy5nm
/wcTEx1Qlj1qh5lM1cFtOGv0SpMJsKc18NI+4yNqudy7GCwzVuGIwwN2CArBOG5WMYxKzsP7pkPa
cOtOUML9l6Bt8tIY72qaAGc/JtLo9sl0oNEAQh9NhB3TwRmxonMQbndbbH/SF6jbdWUs/I7emWK2
E4kEEVqwlameqBkb6kBD4E4HTKTmVbXMc2RVqtsM0XNcJsXPT1WFzxB4osyZxBXOf8+TvNOqYNge
oxzGEKr4UWep1nPYq2DacQU1ynPKmJT1E2zvdULN0HGTb2kUEIZJ5VH8nIbk23pODgaiHhG3+1xa
Bh+OL95dq6H2+8IFbJt5k+SDc0vyCALQ6UhV4XQF3o/Hyfm0RTDZhxyd1ydZ+ot4zEUdHzfxNvj1
Jz8dAUUjr4PtcNkhOPM71YZ60Jp7rfEQTUZHdiHrHLov2wncd6VKGonUxeN2f7tQfQsRjQzTdKaj
Llnws1UVenjbuTbBIc6h2ac3rKa142yW5wftnNfsYzcwvrhCh5ZJnlQhOAfslnQVNOOTifeCCoop
9Tc5hFrUEB52gbZhapH99ncvHtt2XNuEB6pPrdl3Raef8wjR24wvqj20wWFve62LwpAfqS6fQ411
jy8C78twX9avnRS4m0RyLRN8E7e8M7ZBpcLaTEOqL/cbwW8DoZY0aLI3ijOWRrEC2/3xQO3dRyv4
GGTC7podCFX6USq6tKtW0JUzh/Bzf/v5bhNGJ7o3aAQmM+86GFQ7gmcfcIjm3GyYr4UQ39Z3OeGb
6+UM00AeqdVTn03iC7pw1as9kUq90bpYFDxoMcqjc5Ge2cBFoQ9mnyFCBGs292sCQiteo4fM3k6Z
xMSGT094YO0rHQJeOx5iIO9RhjXi35IXSogIoy6MqPrJy2TmhQGERUHQoPnrhKzvmzvchu2Dytae
AQIOWgzc08aksn1UEioZDgH0nkPyNWTV6T3XakCgbjZUfQcnQCZ6Rzu0N2syUJAztl09zmCfmglZ
UV/Ttsu1Z6P0V3/HXZ3egKvcGFXwXgVDz5ozPFmGTZFrFclvvyqC3GO120stW+rpxXQCBO6C1Y47
V9PBfaiFOd86JaxeAR/E0fmx8DnrTi/FFsH0Ukk383U5+TcPbqSiIeIxzJIvf0McPG+zK+Kf47sd
WX2IgbEuOwQuZwOMeTJ5wKiOCL9FrXSmmd/AhbOu+PLImUMu6g3CK4of2PLZnVFv/1yJK/m5wlh4
MjaNRsKJlmUzaht7Gny571TD0TdM6qrKGkw65+XwZOk2dp1MS7hbF60A95AN7Lx3BthO67GC6MO9
sXX35WIAJTFAtr81aPAEy2FOF+uLjyPh7svt0K4rHdYnz8J79UXBFINa/PGXDvUg+QZPsQMfW5Ww
/nnvWKv0exma7YHihc9YkE4GXL8kjK1RgqCbuukCt/qwwQjJYcFAlE0SZ58OWIOG40z+QYYS3IIt
kWr/7ExjuyMQ0lWElevrAjrjnl0AdEcLdfhTBwObkAG8w2DjMNhKztQdvmcoM92l8ewL9MobDexy
sEO5cTiBzrduFwhOrkG94dA7pJ++BMz9j6LHOyzG6POSYSV0CvauYVkMURjHMMvM/a8fDH51FmCM
VysU0Y3VbmZfhrMPos+T9my8snKA3/F6Q9yOgJb1+nCDAeUFtK5FxibQCTJ8ke0NjW9UFiMveKvl
/iKR43DCE85fwahjPdYPCCXjlXWTqN+6is78n4xp3njwycELYgZaaw2JBlGSn/eKYnQvNEIVy4Dd
qpIxMkPEJqpYOixWKKQ4adugO+p6ttQzAn2nccbSerhAK4Y9Ael3A75Py0IwjqaR2lu7Yr3xNQTY
7GSeegJUkq6qo3zphwg0cgvGhFQqHPMVxe7Fxxp7H/Y86HFaY1PC74Lu7QMB+d0f0Ga8ntmXOu4A
z7a+JrvbQQPbDV3L8PBhFMEbbQG9B9oZyl8vok4w7DVyOKY3oN71imqnd6gteS20NpVDnUncO1yP
NwTWm7NC94dRBlvPseSf/1722g4MJ+meQu7AbWaesxMQmmIGzz0psQ74r5YjLECYTbyKUXhQik2h
3BBYfs/Oams2XLSohI1zsBHXVUIyXERjBSdgvLDSj492CL7bG2Ru9MA4NL+gX/L6ud/jq3sKnEl9
iy5YlWZBtuPYtPTte/HPf/kVeBesbBX06w+bx+MNRiuqG6j2UU76y/UeTC9jHEDYqSVW+HIErVAw
H8LPsyB8sueDcR2MMZzzIZyeWxj0S//RimmP1VyKnaG4eARa7JSTKa1bbWQGUpdrxD5Z1w7h3b7B
t/5k6FIIdsHbt9iFJTQUrO2+KCF8XbrwkHUiPa6SsGVn/abCtxgrJLNcAfzyVEVPLKq+9ibYPJ2G
QDTpHVXbLQtq62Ny4H2Z7tTGxrFgCL4asV0l7c9n2ZzfL3kOlrPy6Ay8k3kwWhsXrJ71Q7G5RiqB
e7FxyG6OAPpJnjwou/2Xap/P2E7Xb63DYG+nZPaLYIqGtwGxc7eprhaforsRxwULP5hVzLUlC8VB
aPtCQ9LsT72txxfQG0VKtqZ3bofOGjppzpOQYCljQMR9JkgL77i+Y2vbg1Kn8OqaHNm6NtHIvD5w
e9mP6KTcwoSdyz0HX+ePTz4SfrfDmSlQumYOIbR862yrDp8M7tZeRt3nvgi6t6LGYN2ke4rMkACu
mMqZV8cH4qOjrm0MF1TQgqjDct8NQVOEng6fp/NI0bze49q3LGhTdUexLK3bkfuKjdjfOBP75Og4
xH0+EVAeU0WtnM8X/oBw498gtteVCL7BcyJL/0LN/Q6c2Z9LaDubmpqa2hSTuDIz+BW9HLHXKdbG
5f1XSiM0+9m8/p0NC3lakZVmlmB6nqsJzjyJvev92w5LvrXkN5pqihphlyKW2H07krW4aROC8LAC
AD0sbK52PWPabjKgPyUvsvscdkm/8YEBh+4gYN1X8mJUg7aCtSl8sf7pVmyy+iyHoXK2qVPcG41u
s9uwOzT6DSvoIxXz/rjBh3C+LPnEnLd6KiTXKsE28fp2mPN9YMY0wUeZ7Z3NtJZX4qfgB4pFUgV0
4cmdsjkgMX5oLT2KQyyZQV3ROT9qR3oIst88w/K7LaPdjsugndon9BEKPmmEU9cAR3UdqmH8CHqm
3wd4zAUd60zjABuYm4F5foH1y2Vs+8vIchhL8WXm62vAWi4bwMJP+y13Svixse5wrh9oO/N4K2dP
COf6iK2Z35rdvU5hJ5MQIy5Qir4KbR/24ueK3ZfzBEyXziUsYutOl3xq7NMvBy+R4CGRA+eANW3q
ijNPYG3rnpb5lyf6r3yHPqNAiy4UHh3YtENMf/tXyq/qMn8gTpx7rC/zrQ/DTi4RUKBfsKa9IJF/
fUqqTatPMUhjYMFVV0Z4zoe06ZRZ+ZJXUyWpEzYA45VLfnr9kO3sj7/6pR5GCfGX4ZP0y3zNOsEK
m1muBls/pDkon+s3lddZDhaegaPJyTjNTyXrwjiy4DXlRqrPvsU+186CjyT0Zz9I2vHFtToQHmdA
DzuHdwbUBjV8QsXB7jwfmYLhZsAxentkCM93p599GihalmG3NBQwbqWOAK6USiJdrjCZoulyhpPY
HDF+7sKk26gPAhNALtjm37CYjIe3AlPNBCS+LDsYh4tVQTBJl19+ya7c4wZsge/QQB+PdsmXRUq7
I77Lnt/O/vnLS8n6qwiM7atQBsL2rWKcyXFbe9dVCWjFNHoI36rT24ZTLfkclT/Kt6Amsw2ow7OD
r3N9+oosRr96i2hUa8OSb31WhU1RdfuC8eWtUoiHuMOo3lTFzGMdnD8fZfl9nZCFL2YeRLv1qy+Y
rLsNlIQsQ8Lg0rZ3DsRY8n1Uim7GfrwPL9aH6s/DG4xbwFIw8zk+eqfMmYRN5MHX0bDm+XGfkOgc
zTw1dguvaONhl97BgxgYG677BtOrXP3qN1Ybo3Km5pqeQXYJfGpEx9IZK4b4hd/p/VwbjHnuzofo
WKXkdTNeSfbBWScpemTh6HOIkgHargrXXh1QF5qzsW6vMdycVjVV5vezTxTGAH4UTI+Xzaud88gY
KI+zj+XFTyy+rED/7Sh162Rg78T3Owie1wHtOKpqw8IrJbO3s8/tGDt5wSC6tx2ken36BjRIJx3+
WU4F/Pevv//+93LCoKof6Xs+GNCnY//P/x0V+Cd+xP9wHP8P5X8nEUgXZ+mff/3vIYQ/37auvv1/
+rpMP92ff/3N/04b/OnrPn7/v5f/mr/rv3/9DwAAAP//AwCq5bGX4SAAAA==
H4sIAAAAAAAAA1SaWw+ySrel779fsbJu6S8iIjVZdwjIWUpBETudDigiCCKHKqB29n/v4Luzu/vG
CyBI6jDmM8as//jXX3/93aRldh/+/uevv6uiH/7+H8u1RzIkf//z1//8119//fXXf/x+/78nszrN
Ho/ik/8e/90sPo9s+vufv/j/vvJ/H/rnr79terqSq58WKUse2wC9r2DSe42qdLrIJx6yAGx6Qncr
pMfDLpJSbVNR97iuGBsu7RnQJa78dUDXLiOG7aF2+97j3cSmckJwMdCjQwI26P3ojkmCAthXQ4dd
fmYhqz9bH/pKvfpM31wQq6r1CC7yc2pKiE8nt+QyUNcph/GJKO6bpI8M6CePqBF81W7a7DwVJeE1
JGPqHNIpzo897LLzGh92s43GRzXkyJq/sS9i0rntF14ZrKxYoF7s3dOBIqVGyRVX1Lw8+HK+f8dY
rrX45pe2bLkzVboZ7t9DQDXO+bDPZn+uUR/fZupfQAn5sVZVmGtHw/o5M9HMZbsEwqr6YNcLXuFY
70Vuq7h24hdeG2nscJIlVIgN56+8ZmBje+w8UKukJOIqNkK2jC+o1sRRo4rVbuhxFMF24Ctsvqqb
RmJdsOB72TKqZYfJnbSL2YIy8CX2hMZGTG1QC28xu2H9cM66+VOkI+r7GWGjiD7avFltY9S1xhVr
VOA6cnGbO8jlziZStKIa69C5hhOcPXp6O7rGLC64w43jCFXk+oRYbtccxN0UUqvkzm4bNjRAPTIQ
1v31iIZyI0Tw0DrFF0uO18Y30iPZNA8GthvOQYKnvRoY9tr8Zzymdxu3wITviPXt7cbY3b/wIBnm
nZpF14WzldEzKjA6Eqk9sXI4HKpIYs5u9MnLUEI+QSPIGIsTvldyzib8QSpge91gF/GkI/vLWMDp
8LpQ1WsFd0SfmEdz2l9wbOz5lAVEJzDgTKf2F2jIVu4VYGUlAt7vDbNjnFoLkPtx4gvLfPBm+NKh
Qo5MBPnQdaOQdBYKNoFH2AN7ZU9aoQbJCkK6+3YHbbKDsw9F7PTYyh6uO/U4O4OVPw3suh5hM0JO
DmfPVn1h1Rfh92RxmRTEnIfVXS64bRPMDayFp0vW/Kiyzflz7sE5VFd8UE7XbnySqJG6js1kEpOD
O89qL8K2+xK8Q/cmJNLW1EEKyA57R2/f9c6l8IGX6YPuzWqtTW+F5lL59G54p5kS+wZJqK7Kr3uk
iis+y5lspDe6FZudP5oXpE2WZIuge8/O30ofMZw4stZB7lcrAt8iR6zoc136KDue+v0xT9m1OwXo
49wNvzlhPu0vXJ+guVUv+Le/57XxvcMx7iR/ErYlm9KvmcNh0nNiVoPBxh35GnDdJJRUerll4ydz
fahvdxkr4r5BI5ZaB7bmycFHU2xQX72ZD0l4CfHjovshScc8loXSdP/MH2viywjrJFLJBvOPjupB
m6N1e31gnD9y98vTV//bz37zfomIXvKeSFe/25EpuFnlbFw4H0Uu96CejPfhqEb3AkhDdIzvlzmc
OC7zJS0/l9jzPlU3706IA+fwufqi8nbD8eO3BqgnLcFmT2/a+FsvV5Y/8KM65+UkR4UAn76JKTZW
ESLye4rlbdqE+HF82WjaXzUdiOp86e5lWWXtzgcFBQXSqGsf92zaaUMMOVIT6un6y2VEaFRUy+sX
1hjXp+ND//BIMDgV73JVSOfXJU+AHvTGnztD65b16kms77dUOxSWNj0fBSenx8zxeRm4lKq3JIcx
FSV/nLJDyMza8uG4DiVfCL5qOUZHW4FNyos0ZscdY09FFlAcliLdCZHgtnjIHNRs55but+EnnNa1
cNw+L+2AzU2muoIncRYYk1X4cFAhZNTwJbTZz2fsCgTSTx4ECVT626TJKGKXidHWAeypT2wd5sIl
/OF2h0gKRSLtfMamLKU59AbO/XnjW9p8SJkITtgB6WZ3g0Zu9w5QWrR36h29qhuPAWToVtOQ6oeN
F34L5SXAc2IC4Qm+IZLrLg95TxJqB22L5uvD6WGcs+i3P9x+t85muIaKS23yZGyi88WBGb8/1GZV
rTEmFQEcrGgmtdiaWn9631R0V7uNLwy56dY537bA78yM4u4ZaHNXH88oLZo79teRHU5MxgnqBu6L
7Wym6azrKQf87bylFiauNtaXRAe04gNqll/KOiXDuvTgW5sMfRQwEjw5C6y5i/FhvbI7djm5bygI
PhJxw+qQ+WWQ/fSPursydqejcohAfdgmedgSCWf6+t4Rerg2Vj7BxCbrJUeANC/E3pm2LmW3/g3U
VHT6OD1JSFZ7EKA4Bwk2jiqnzVpvN4At60DvynOn9aKLCfrVf7h+N+X0jZQGWFJ88F63Sm1+VXaO
xjpq/FGdhY4xx72DfPEV6itu5ZJa7EVIxUE50HQeyomGgfSrL4R577KjAttwqLmaL7KxL0dtstp8
hp296n2YTIXN9OhJYHn6y0ei9NJoIIwWiHIU4kNSbcJhfq8daTTLCWsrywsJf5/fsiWlQECD0J2i
S2+hebPJ/O/82ZVT2Tx8OPpUww65vdLOO42+7AjXPbbzMC5/6xMVxTvBfiS9UfM1ZwNx+z7Ah9Sq
XJqJXgPDpfhSPy+DbkzcxEHPQ3oiW95qUcehyQC+PI1UN5ysHDVFS0DvggJ70sFG86qecxi5WsO2
PL060q2nO/Cn94V6Tkm6eXpRZ7voBbbt9wsRKfEEeHRbgS7j7c5kM9dA5VrCiiG9w3GucY9uwDaE
0/Zz17+9hoM8vp78Wbs/tTnmlOSPHmFjF6TTvJ8l0GLT9OelfmVqGRdwfx5TbHY+Ziy7Ghw03b2l
Fj+57qIPvHTIP1/CQZx208u8CTCenYhI+0lMycexC/Sd85iq2q5Jx5NEeeRkho+1427NyEfKI9kt
5D1VgmfqTtL528srA6+wu7qRrgY+yIFF7QW7rbvTxq94lOStFoskF/NV2e+34VlyYYuw83Kdcmqr
z31zV7g79vGIylmabzEID6Wm9sq5o8l9WCN6dsOJWlptlOPF39fQQfDFXo32IV/vRUA763jCjnAT
tZE+jiKs28sD+9o+6NiNdxJo5ezqT0u97Ye3IsEpUzx/vcFjSn96ybW6QF3aO5ogbKwjgHCzsbJ8
L/UaU4E+Ile6v0zHkjjfUIIt2zrY0K48Y/UmOAP3WqfU+cx7xh/El4L8g3egh+m+d8m+NyNophLj
gyATlzqb9i7Fb5WSQb9V6ehmaYS6Ab5U1e5Pd+yLQwbDCiKsvp9NOqoGStDHP8fUVFoZjVaJJbg1
ouPTl4DS7tA6GaTcbFPPel5Z/747BkKmUZEtR2uNhoLMo7m2NJym9502epbqATS1Qg8V3aRTbzUt
2FX0olZb0Y5+k5sP8Ul4EAgs0lF09HnYpzOPvbV57ZqJOx+h7jpMJuOku2MZxgDC1qX+lj7EdAhL
z5FycVNiD38VxM94FP/wtLFt3HLefpsj0rdmR1a2NrBJfXcGANe8ySt7DdrkRfH82294Z5ze2nA6
vURZsycTu9P4cvsdeeno4shnIjvizIYhl0SQ5UzwP9a1K//o3U1xRbJ2h4LRk3E7w3PUUxrfb+eQ
/Pi2qhnvo7xow6lus0gKsteWKlY6dOMrUQAMERnU8e9OyUJBFn77m1rCcHbpUX9IyN2cfV/AutpN
ZlOdUZmjNd2dxShlq7OboPeVM/GvnjEI1EDeaNWD9HILLiUDX0O5bUy6LxPOnbzoOMtLfSHsbGul
8NOTXd+usfU2Co00DvaQ7U97rCjC0LXV7hgjdxP5/lQYB43NuyT68RzW7fOHMRfzIzzwPGDHOQPr
9+NHhC88faw6WlLS7zbMII12G7p3PmM3rGsuQH3UX4nAm0XZjFEuoe8FMcI+YeKOW/kFcnE+JmQU
82fJXCVtYdFfbO0KjJj3up1R1ksrqjW3vqMlGBIM9PFceGjnTpugqX9+DuNBQeGyXji01Cfqye1d
oxaXZGgVEQG76f3lzrfNMf7Dg+XJbFzWluoZrZOzSu8rMqNp5x0F+SNxMfV2j6ocx9pRgRtka+H3
yJ2uA5eAZXkYH6TC10YYdyDPj9voz6qmMSFUmAjRoRPJpB+GdNoUdo9+vLvZj6M7i583D3H4ErFf
fd+ovQSRA0HNLv5bfpndKPKiAMnzqVAvi3M2KV1ooKW+Ln6yTtnGV1TZyXSfZhCni56dEnD9OvdX
lzB3J1vCNUzbQqPW6NflXGS3AvihV/HFNrJw2AR5DXvBK6iW9xljVSXP8DyNKdVvOy9tf9+/f6tH
epAK4nblOS3Q4Vj7/iqwIzQFX3Ck5mC11GQRH052Oc4yf7l/yLC9I9Rv9ECFxW+Q6V0liDSO6UNh
ftdUaRhyR8ubDImaqo4PON2E804NBDhENqJ+9dXR5pcfLPxOVZFOYfs4lQJEkFHC3+s+JQsfoU6M
r6SgLym8FXTvQVXpN3ohtpnOu9dDBzS0yjLzL8Z2uVNAVRm3P35t8/N/n/b5wvbFjrvJVpkOxyp4
0v23LhAjA9Rgo9se+5OPyrEv9tnP/xOoklDb+DN7o5/frUnWdeNhlRGk+hsTa/NDcom9ywupu1U5
NdhudJf7vXTKVI8aDpzZ6K5u6o//6KO0UrT4L0laxh8Hi5+doVhzwK+kM1XUOeomnn4JeOb166PW
3bnzrWICEpOvQ9W508rN3X/w4uIPqeXWckjTd83BzXk7ZOsXD/eP382OU0B1a0pcduKh/+MnvTN1
XBoVjQOTMkY4C65KukaucITLKx/pIXgEbJQucPzlBWQ7li0jJBbPsOQ//llML+EUzlWA5E0ckDH6
NGxWtt8WsfMz868rrXBrT+IcuN3H/+Jd4VXZBTroWY73HH8Mn9nV52D3viUUr87vdHpvtyOS4pD4
qxSvUM+d9vGf9eCkBet67GgRMsStQffiRUWkkNMZhTuskUZa7xAxsFWgqp54ukufp4766n3engP8
IetiV7BJFw5nNLs4I3yINI2fzcMdhXaWUXc6hOkInyxH9iZnf/IoVupSDDEEDtWsVdm1sS44kDn9
gK1olFN2aNU75HV38ufTaKWMOdodCXIuYrVIT+kIq85Ap9x7YrcxnZAs47FVT7uEDBedpDXcrSOY
/qolqxPJ3U58EP43Xj7q7Kpb+DxDt2ND/U3AGYjNu+CMusvg49cpfbvjwm+//YCP9m12F54DOJ+T
DLv35F4u9SiStELjsLP4J3a+ordkWAlPF54q2yoPGqjBMSiubys0rcaxgFS1HKzVopaulXTL/fw+
Nr+BgQR1nlqQmrHCPlsr2vzZZQZa1ovP7ifLHd93R4ep8zUfslhB8+se6fAqR2vRc4p++Rlw9vu9
6IvvtovfQvVnlvHuXUmsvahrBRY9XerLHY0LT0Nn3nzs0Bk6WupzAqu1Iiy8eGF/9AfsycYWRK47
Pj85yFlcCmR7rM/uxIQbgXpUb9h3D1d3SIORoNTbK/6rGZpyzr6cgZJL7uBY9OWU3d7fI4T7t0fT
2d2wvsiFFgo+XPsvkE8daQcrgi/X7Glgvns0yKZN0CcOMf19zxgddwps7fsOPyvlXbI350TgDUj/
Uw/mWnyLELXBwz+Kewut/Si6gyX2L5/EKdGYZ+sjuPVm2RYx6ua3pcSANvXGl54mK1kvxwoYinvz
xx/P1/uRA5LEur+FDqfrm1hwsMq7AR9syU8n4Tg40hbnFcV99e7Y9tkeYdzzkb+99bLW0MdRksVC
rKl+zUuNNqfGAIQ4BbvV3mfsXYk6LPkOtb6fTus9oXpL52PhETg5rTvla9lDAnurZFryh847iT4c
N9lEbVWuQ8qpNQ8//l3yA0TM67ZFy31fsJUOzb/6d1Mjih13jTXmPi8S+vG3Q68VG4RmS9Dir/2J
epHWaVM/wo/35+OsoY3X4GW9pKPPFj7o07N2hCoab/SQWnttqNvsjET9XVN90asZCpmDiCcuPWTD
3uWNU9UCwmeFet9CQZu5iBXYfiHE53K/RaMzQAZkbpCP7rKTIr9M7sDZ9RurQzWF+VeMxV/+hG3J
25Uboxf9P/VhCm5NOd+M7RsWvfKlIxJTFs8GB+Ywv7CZcI9u+jDzDuNLfmLlGzWI3IxtjYxjc8T3
fhW64zWWLETu5pNsE9yiYfG76KvRF4FnVIVsEl8OyAkbfOne1O407MQWbjBtSGcY93K+pLs3WunF
Gy/5BWoW/YTcwyWZzqIQzo12KmD76kWcGgaU1OrXEvSBsF/qWeJOrB5rYOdHRsTJ67R5jDkV6t2h
8NkD9x17PKUY6lN/879j6TBeH1sPNjtJxf5d8d2e272P8Muf9MK8pGz6WipwuaOQ5nYV/+Q1wD07
i+qxZiLh+nAIfFSvW3iY17qib3SUCOqdYls7MLY6a4mklrjHGNF3OMfrMwevVFtj9fXwEPNOo/fr
P2B/Z9ul8ERBCwu/ECi3NlvyUg+s5tDS3TaeOtacGv2nl2TxF+G8P+wNWPTgT57VL3k++umHcjOE
rp5e1BK9Xbj3Z/ZtXLrwMVryHSI4cEZTcst7+fVpE5+/rUbWj1EuygMYOnbZydGEmLMS+EDHCNc5
AxvS4sSDQl+cf1a+l44V7DLCwcM66YSw6v7w8ZJXEOqaern+rZcjez+pXrSlRkSrqFG7rffU4R/U
Xd+ltwGZcXn700HQtc2DpgWU+txjZcfPy3jmOhwdfqJL/hayOBUtYFmypUrhrbrpQDpVWvpFOE5E
1yVLvoB40/lQK4bCnTY7XYHh0yPsfnYb1Np5McN9FQp+/uO7IK3ev7yeOi+3LScmmzGopl76bH4k
2hxOAQFNCm8+dzgRdzpwngOPMVQIhy9vNOofYYbKaTqc7exvN7/umQ4SL4VYi20ppMv8y0zoRrJ5
Gl3am+HLQPJ0sTFe/N4YB6rz4zuy6enWHbQHMuCm2CI2zGfxX/6iw80Xq29ppf2pd/c0tql7l9uQ
vK/ffPv6NAk2MS+X7GSczkD6d4T1rFinbH0eVXi0fopx+xq6cctzBZKtT4oP693e3eTqFEjnuh6p
u+8qRge+r8HwToa/DgMVkdvmmMhrrq/pz++y+cjucFSriLreIGp0/OhvgD4L/KK8rVEHk1egcPY8
at/6h0Z//LboPd4n57XLxo/3Rp9zImMltmaXvlKtANLX0cLX13A6qcqI/vC0Xpw6wbLiHBb98Cf/
tChtvAJY5/EH74vVJ2wjXcyAP9UXfMCKUvZm6fjgpOiKcYqfCx/zb4hceNBfP24667YIVSCavuTF
IRuPE9dL7zow8a7UT+VI+1mRutEB/7OVKaOdI4/otYaEKg//HTJ7/VF//UCyz9GxJOr2E4GmKqUv
b1BQTo9ASKRsXr2pfs4+bBzXpQVLPw/v/FPC5v1FLH5+i3pzftfm9/VVyD//L14F3C39Q/Lrl/iS
ca7dHsmfFrZgVdhkW1VbC5QmKGrMmu7doUA/HgDZEBX8WO8qjW4eXwu8QzxTHPVpOIWlZ0F02Z4X
v5mmI/9KLfSZe/lPHs6CrHwDyysPL/rlstSPHejVi0/GjZygXt0kPsJi88R2HO7QwuMExbJcEbTk
4dP3Lpxh8Qt4zx8ubq89D+2f+XTvCZRTtj8SdDlpWx8tedIk7a0aquNwJNLTDLvp7hyO6MMJrS/o
YubOUYoDacmLcEaHIJ1H+yyB4JWUiLeryKYnVB46i5P6px41d8Q16Oy5KtVSoqU01roaefvWp/7O
/pa//AcuB8vF6Vasuo5zkwA6Mbli7WQ22vzLtzZEc6hHtS+a/CjLoKizHluVXpezXt0aCOWN6r/4
YJXSjW8pMAXQ+KvZGErWWNBCej7f/e30oikRZJmDKWeTPyROXpLBTGc0g/f57S80XpswQDendjCu
b0/EcLg9Q75xLLKqYeiG4XQbf/1IAobSaot/j5FBOIzdc1KhaeazAC39M/x73/QIuBhlRxbQ3cLX
rCZEQtfKe9BrUhqMXbtbADJ1BFJn08d9pcHYy4p5snCEn7d0kljHgcSLIdWK8VIO28Jc+jFGQ93n
7uX+8kX0iU+Y2pL36n7vQ9DfA7ysP8YET69R/FYotU6YD+tbFPQgrjTib1aauvSb1xKs1qpAdYK3
jFkH1kiWdAPqe+G37B/yrMPfv1MB//mvv/76X78TBnXzyKrlYMCQTcO///uowL+TR/Jvnhf+TYU/
JxFIn+TZ3//81yGEv79dU3+H/z007+zT//3PX8Kf0wZ/D82QVP/P5X8t//Wf//o/AAAA//8DAL+P
H9bhIAAA
headers:
CF-Cache-Status:
- DYNAMIC
CF-RAY:
- 7b50c215fcf91678-DME
- 7b3ba1a41cc19235-FRA
Connection:
- keep-alive
Content-Encoding:
@@ -560,7 +551,7 @@ interactions:
Content-Type:
- application/json
Date:
- Sun, 09 Apr 2023 06:40:21 GMT
- Thu, 06 Apr 2023 17:08:12 GMT
Server:
- cloudflare
Transfer-Encoding:
@@ -572,7 +563,7 @@ interactions:
openai-organization:
- own-45h3iv
openai-processing-ms:
- '14'
- '737'
openai-version:
- '2020-10-01'
strict-transport-security:
@@ -584,7 +575,7 @@ interactions:
x-ratelimit-reset-requests:
- 20ms
x-request-id:
- dfb9f22839291be2e6a38403a0861937
- f2f02f88b1097ce728e4b79198958fde
status:
code: 200
message: OK

View File

@@ -2,22 +2,26 @@ version: "3"
services:
elasticsearch:
image: docker.elastic.co/elasticsearch/elasticsearch:8.7.0 # https://www.docker.elastic.co/r/elasticsearch/elasticsearch
image: docker.elastic.co/elasticsearch/elasticsearch:8.7.0
environment:
- discovery.type=single-node
- xpack.security.enabled=false # security has been disabled, so no login or password is required.
- xpack.security.enabled=false
- xpack.security.http.ssl.enabled=false
- ELASTIC_PASSWORD=password
ports:
- "9200:9200"
healthcheck:
test: [ "CMD-SHELL", "curl --silent --fail http://localhost:9200/_cluster/health || exit 1" ]
interval: 10s
retries: 60
interval: 1s
retries: 360
kibana:
image: docker.elastic.co/kibana/kibana:8.7.0
environment:
- ELASTICSEARCH_URL=http://elasticsearch:9200
- ELASTICSEARCH_USERNAME=kibana_system
- ELASTICSEARCH_PASSWORD=password
- KIBANA_PASSWORD=password
ports:
- "5601:5601"
healthcheck:

View File

@@ -1,6 +1,4 @@
"""Test Chroma functionality."""
import pytest
from langchain.docstore.document import Document
from langchain.vectorstores import Chroma
from tests.integration_tests.vectorstores.fake_embeddings import FakeEmbeddings
@@ -16,17 +14,6 @@ def test_chroma() -> None:
assert output == [Document(page_content="foo")]
@pytest.mark.asyncio
async def test_chroma_async() -> None:
"""Test end to end construction and search."""
texts = ["foo", "bar", "baz"]
docsearch = Chroma.from_texts(
collection_name="test_collection", texts=texts, embedding=FakeEmbeddings()
)
output = await docsearch.asimilarity_search("foo", k=1)
assert output == [Document(page_content="foo")]
def test_chroma_with_metadatas() -> None:
"""Test end to end construction and search."""
texts = ["foo", "bar", "baz"]

View File

@@ -8,7 +8,9 @@ import pytest
from elasticsearch import Elasticsearch
from langchain.docstore.document import Document
from langchain.document_loaders import TextLoader
from langchain.embeddings import OpenAIEmbeddings
from langchain.text_splitter import CharacterTextSplitter
from langchain.vectorstores.elastic_vector_search import ElasticVectorSearch
from tests.integration_tests.vectorstores.fake_embeddings import FakeEmbeddings
@@ -43,6 +45,16 @@ class TestElasticsearch:
yield openai_api_key
@pytest.fixture(scope="class")
def documents(self) -> Generator[List[Document], None, None]:
"""Return a generator that yields a list of documents."""
text_splitter = CharacterTextSplitter(chunk_size=1000, chunk_overlap=0)
documents = TextLoader(
os.path.join(os.path.dirname(__file__), "fixtures", "sharks.txt")
).load()
yield text_splitter.split_documents(documents)
def test_similarity_search_without_metadata(self, elasticsearch_url: str) -> None:
"""Test end to end construction and search without metadata."""
texts = ["foo", "bar", "baz"]

View File

@@ -1,10 +1,7 @@
"""Test functionality of Python REPL."""
import sys
import pytest
from langchain.python import PythonREPL
from langchain.tools.python.tool import PythonAstREPLTool, PythonREPLTool
from langchain.tools.python.tool import PythonREPLTool
_SAMPLE_CODE = """
```
@@ -14,14 +11,6 @@ multiply()
```
"""
_AST_SAMPLE_CODE = """
```
def multiply():
return(5*6)
multiply()
```
"""
def test_python_repl() -> None:
"""Test functionality when globals/locals are not provided."""
@@ -71,15 +60,6 @@ def test_functionality_multiline() -> None:
assert output == "30\n"
def test_python_ast_repl_multiline() -> None:
"""Test correct functionality for ChatGPT multiline commands."""
if sys.version_info < (3, 9):
pytest.skip("Python 3.9+ is required for this test")
tool = PythonAstREPLTool()
output = tool.run(_AST_SAMPLE_CODE)
assert output == 30
def test_function() -> None:
"""Test correct functionality."""
chain = PythonREPL()