Mirror of https://github.com/hwchase17/langchain.git (synced 2026-02-06 09:10:27 +00:00)

Compare commits: mdrxy/docs...sr/init_mu (52 commits)
Commits in this range:

722c1fbf6e, 0098f0c953, dcad123c31, dd29d34939, 615239ff6f, 5e2a456b18, 65e78374ad, 064c00e641, 87847dd912, 429ee7d035, 7cfb6a6d6a, f08e4f1274, 86879be2fe, 5c05e6e5e1, d01d670308, 3fddb50df8, b6da2d293e, bfeab8e2b5, a2988c6f0b, 7c5af3a84b, 0c8bb4381e, f7efd89e70, 0f2a8961ce, f48aa0f200, 53bd808aa5, 005897a165, 62d84be951, 412899b352, bde0bf2569, ec6e7d4f29, f4480569e2, 92161763b8, 6564669a14, 0f4a4c9f86, d896b05255, b895a18e6d, 04f2f6c896, 272efba5b9, 7285b173b8, 82a7f11c32, be39842019, 5e804316c9, 460ee6aa32, 587af4af7c, 15f69254b5, 0f28c5ca18, 44bbdab6a6, cc102f0593, bd4a96dc50, f504062687, 0ba7d1f8d9, 46c08d26c3
55 .github/workflows/codspeed.yml (vendored)
@@ -5,30 +5,46 @@ on:
branches:
- master
pull_request:
paths:
- 'libs/core/**'
- 'libs/partners/**'

workflow_dispatch:

env:
AZURE_OPENAI_CHAT_DEPLOYMENT_NAME: foo
AZURE_OPENAI_LEGACY_CHAT_DEPLOYMENT_NAME: foo
AZURE_OPENAI_API_VERSION: ${{ secrets.AZURE_OPENAI_API_VERSION }}
AZURE_OPENAI_API_BASE: ${{ secrets.AZURE_OPENAI_API_BASE }}
AZURE_OPENAI_CHAT_DEPLOYMENT_NAME: ${{ secrets.AZURE_OPENAI_CHAT_DEPLOYMENT_NAME }}
AZURE_OPENAI_LEGACY_CHAT_DEPLOYMENT_NAME: ${{ secrets.AZURE_OPENAI_LEGACY_CHAT_DEPLOYMENT_NAME }}
DEEPSEEK_API_KEY: foo
FIREWORKS_API_KEY: foo

jobs:
prepare_matrix:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Install uv
uses: astral-sh/setup-uv@v6
with:
python-version: "3.12"
- id: files
uses: Ana06/get-changed-files@v2.3.0
- id: set-matrix
run: |
uv venv
uv pip install packaging requests
uv run .github/scripts/check_diff.py ${{ steps.files.outputs.all }} >> $GITHUB_OUTPUT
outputs:
codspeed: ${{ steps.set-matrix.outputs.codspeed }}
codspeed:
name: Run benchmarks
needs: [ prepare_matrix ]
if: ${{ needs.prepare_matrix.outputs.codspeed != '[]' }}
runs-on: ubuntu-latest
strategy:
matrix:
include:
- working-directory: libs/core
mode: walltime
- working-directory: libs/partners/openai
- working-directory: libs/partners/anthropic
- working-directory: libs/partners/deepseek
- working-directory: libs/partners/fireworks
- working-directory: libs/partners/xai
- working-directory: libs/partners/mistralai
- working-directory: libs/partners/groq
job-configs: ${{ fromJson(needs.prepare_matrix.outputs.codspeed) }}
fail-fast: false

steps:
@@ -38,25 +54,26 @@ jobs:
- name: Install uv
uses: astral-sh/setup-uv@v6
with:
python-version: "3.12"
python-version: ${{ matrix.job-configs.python-version }}

- uses: actions/setup-python@v5
with:
python-version: "3.12"
python-version: ${{ matrix.job-configs.python-version }}

- name: Install dependencies
run: uv sync --group test
working-directory: ${{ matrix.working-directory }}
working-directory: ${{ matrix.job-configs.working-directory }}

- name: Run benchmarks ${{ matrix.working-directory }}
- name: Run benchmarks ${{ matrix.job-configs.working-directory }}
uses: CodSpeedHQ/action@v3
with:
token: ${{ secrets.CODSPEED_TOKEN }}
run: |
cd ${{ matrix.working-directory }}
if [ "${{ matrix.working-directory }}" = "libs/core" ]; then
cd ${{ matrix.job-configs.working-directory }}
if [ "${{ matrix.job-configs.working-directory }}" = "libs/core" ]; then
uv run --no-sync pytest ./tests/benchmarks --codspeed
else
uv run --no-sync pytest ./tests/ --codspeed
fi
mode: ${{ matrix.mode || 'instrumentation' }}
mode: ${{ matrix.job-configs.working-directory == 'libs/core' && 'walltime' || 'instrumentation' }}
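The updated workflow replaces the hard-coded matrix with one computed by `prepare_matrix`: `.github/scripts/check_diff.py` writes a `codspeed=...` line to `$GITHUB_OUTPUT`, and the benchmark job reads it back through `fromJson(...)` as `matrix.job-configs`, which carries at least `working-directory` and `python-version`. The following is a rough, hypothetical sketch of the output shape the workflow expects, not the actual `check_diff.py`, whose selection logic lives in the repo:

```python
# Hypothetical sketch of the output shape the workflow expects from
# .github/scripts/check_diff.py; the real script's selection logic is in the repo.
import json
import sys


def build_codspeed_matrix(changed_files: list[str]) -> list[dict]:
    """Map changed paths to benchmark job configs (illustrative only)."""
    configs = []
    if any(f.startswith("libs/core/") for f in changed_files):
        # libs/core benchmarks run in walltime mode per the workflow above
        configs.append({"working-directory": "libs/core", "python-version": "3.12"})
    if any(f.startswith("libs/partners/openai/") for f in changed_files):
        configs.append(
            {"working-directory": "libs/partners/openai", "python-version": "3.12"}
        )
    return configs


if __name__ == "__main__":
    matrix = build_codspeed_matrix(sys.argv[1:])
    # The workflow appends this line to $GITHUB_OUTPUT and reads it back with
    # fromJson(needs.prepare_matrix.outputs.codspeed) as matrix.job-configs.
    print(f"codspeed={json.dumps(matrix)}")
```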
@@ -7,8 +7,8 @@ LangChain has a large ecosystem of integrations with various external resources
When building such applications developers should remember to follow good security practices:

* [**Limit Permissions**](https://en.wikipedia.org/wiki/Principle_of_least_privilege): Scope permissions specifically to the application's need. Granting broad or excessive permissions can introduce significant security vulnerabilities. To avoid such vulnerabilities, consider using read-only credentials, disallowing access to sensitive resources, using sandboxing techniques (such as running inside a container), specifying proxy configurations to control external requests, etc. as appropriate for your application.
* **Anticipate Potential Misuse**: Just as humans can err, so can Large Language Models (LLMs). Always assume that any system access or credentials may be used in any way allowed by the permissions they are assigned. For example, if a pair of database credentials allows deleting data, it's safest to assume that any LLM able to use those credentials may in fact delete data.
* [**Defense in Depth**](https://en.wikipedia.org/wiki/Defense_in_depth_(computing)): No security technique is perfect. Fine-tuning and good chain design can reduce, but not eliminate, the odds that a Large Language Model (LLM) may make a mistake. It's best to combine multiple layered security approaches rather than relying on any single layer of defense to ensure security. For example: use both read-only permissions and sandboxing to ensure that LLMs are only able to access data that is explicitly meant for them to use.
* **Anticipate Potential Misuse**: Just as humans can err, so can Large Language Models (LLMs). Always assume that any system access or credentials may be used in any way allowed by the permissions they are assigned. For example, if a pair of database credentials allows deleting data, it’s safest to assume that any LLM able to use those credentials may in fact delete data.
* [**Defense in Depth**](https://en.wikipedia.org/wiki/Defense_in_depth_(computing)): No security technique is perfect. Fine-tuning and good chain design can reduce, but not eliminate, the odds that a Large Language Model (LLM) may make a mistake. It’s best to combine multiple layered security approaches rather than relying on any single layer of defense to ensure security. For example: use both read-only permissions and sandboxing to ensure that LLMs are only able to access data that is explicitly meant for them to use.

Risks of not doing so include, but are not limited to:
* Data corruption or loss.
@@ -39,7 +39,7 @@ Before reporting a vulnerability, please review:

1) In-Scope Targets and Out-of-Scope Targets below.
2) The [langchain-ai/langchain](https://python.langchain.com/docs/contributing/repo_structure) monorepo structure.
3) The [Best practices](#best-practices) above to
3) The [Best practicies](#best-practices) above to
understand what we consider to be a security vulnerability vs. developer
responsibility.
@@ -47,7 +47,7 @@
},
{
"cell_type": "code",
"execution_count": null,
"execution_count": 1,
"id": "6a75a5c6-34ee-4ab9-a664-d9b432d812ee",
"metadata": {},
"outputs": [
@@ -61,7 +61,7 @@
],
"source": [
"# Local\n",
"from langchain_ollama import ChatOllama\n",
"from langchain_community.chat_models import ChatOllama\n",
"\n",
"llama2_chat = ChatOllama(model=\"llama2:13b-chat\")\n",
"llama2_code = ChatOllama(model=\"codellama:7b-instruct\")\n",

@@ -185,7 +185,7 @@
" )\n",
" # Text summary chain\n",
" model = VertexAI(\n",
" temperature=0, model_name=\"gemini-2.0-flash-lite-001\", max_tokens=1024\n",
" temperature=0, model_name=\"gemini-pro\", max_tokens=1024\n",
" ).with_fallbacks([empty_response])\n",
" summarize_chain = {\"element\": lambda x: x} | prompt | model | StrOutputParser()\n",
"\n",
@@ -254,7 +254,7 @@
"\n",
"def image_summarize(img_base64, prompt):\n",
" \"\"\"Make image summary\"\"\"\n",
" model = ChatVertexAI(model=\"gemini-2.0-flash\", max_tokens=1024)\n",
" model = ChatVertexAI(model=\"gemini-pro-vision\", max_tokens=1024)\n",
"\n",
" msg = model.invoke(\n",
" [\n",
@@ -394,7 +394,7 @@
"# The vectorstore to use to index the summaries\n",
"vectorstore = Chroma(\n",
" collection_name=\"mm_rag_cj_blog\",\n",
" embedding_function=VertexAIEmbeddings(model_name=\"text-embedding-005\"),\n",
" embedding_function=VertexAIEmbeddings(model_name=\"textembedding-gecko@latest\"),\n",
")\n",
"\n",
"# Create retriever\n",
@@ -553,7 +553,7 @@
" \"\"\"\n",
"\n",
" # Multi-modal LLM\n",
" model = ChatVertexAI(temperature=0, model_name=\"gemini-2.0-flash\", max_tokens=1024)\n",
" model = ChatVertexAI(temperature=0, model_name=\"gemini-pro-vision\", max_tokens=1024)\n",
"\n",
" # RAG pipeline\n",
" chain = (\n",
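Taken together, these hunks retire the older `gemini-pro`, `gemini-pro-vision`, and `textembedding-gecko@latest` model names in favor of current Vertex AI models. A minimal consolidated sketch of the updated model choices, assuming the `langchain-google-vertexai` package is installed and Google Cloud credentials are configured (the surrounding cookbook code is unchanged):

```python
# Sketch of the updated model choices from the hunks above; package installation
# and Google Cloud credentials are assumed and not shown in this diff.
from langchain_community.vectorstores import Chroma
from langchain_google_vertexai import ChatVertexAI, VertexAI, VertexAIEmbeddings

# Text summary model (previously gemini-pro)
summary_model = VertexAI(
    temperature=0, model_name="gemini-2.0-flash-lite-001", max_tokens=1024
)

# Multi-modal model for image summaries and the RAG pipeline (previously gemini-pro-vision)
mm_model = ChatVertexAI(model="gemini-2.0-flash", max_tokens=1024)

# Vector store over the summaries (embedding model was textembedding-gecko@latest)
vectorstore = Chroma(
    collection_name="mm_rag_cj_blog",
    embedding_function=VertexAIEmbeddings(model_name="text-embedding-005"),
)
```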
@@ -204,14 +204,14 @@
},
{
"cell_type": "code",
"execution_count": null,
"execution_count": 4,
"id": "523e6ed2-2132-4748-bdb7-db765f20648d",
"metadata": {},
"outputs": [],
"source": [
"from langchain_community.chat_models import ChatOllama\n",
"from langchain_core.output_parsers import StrOutputParser\n",
"from langchain_core.prompts import ChatPromptTemplate\n",
"from langchain_ollama import ChatOllama"
"from langchain_core.prompts import ChatPromptTemplate"
]
},
{

@@ -215,8 +215,8 @@
"metadata": {},
"outputs": [],
"source": [
"from langchain_community.chat_models import ChatOllama\n",
"from langchain_core.prompts import ChatPromptTemplate\n",
"from langchain_ollama import ChatOllama\n",
"from langchain_openai import ChatOpenAI\n",
"\n",
"# Prompt\n",
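Both hunks drop the deprecated `langchain_community.chat_models.ChatOllama` import in favor of the `langchain_ollama` package. A minimal sketch of the updated imports in use, assuming `langchain-ollama` is installed and a local Ollama server has the referenced model pulled (the prompt and chain are illustrative, not taken from the notebook):

```python
# Updated import path per the hunks above; the langchain-ollama package is assumed
# to be installed and an Ollama server running locally with the model pulled.
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_ollama import ChatOllama

llama2_chat = ChatOllama(model="llama2:13b-chat")

# Illustrative prompt and chain, not taken from the notebook itself.
prompt = ChatPromptTemplate.from_template("Summarize in one sentence: {text}")
chain = prompt | llama2_chat | StrOutputParser()
# chain.invoke({"text": "LangChain integrates with Ollama for local models."})
```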
@@ -25,7 +25,7 @@
|
||||
" * [Oracle Blockchain](https://docs.oracle.com/en/database/oracle/oracle-database/23/arpls/dbms_blockchain_table.html#GUID-B469E277-978E-4378-A8C1-26D3FF96C9A6)\n",
|
||||
" * [JSON](https://docs.oracle.com/en/database/oracle/oracle-database/23/adjsn/json-in-oracle-database.html)\n",
|
||||
"\n",
|
||||
"This guide demonstrates how Oracle AI Vector Search can be used with LangChain to serve an end-to-end RAG pipeline. This guide goes through examples of:\n",
|
||||
"This guide demonstrates how Oracle AI Vector Search can be used with Langchain to serve an end-to-end RAG pipeline. This guide goes through examples of:\n",
|
||||
"\n",
|
||||
" * Loading the documents from various sources using OracleDocLoader\n",
|
||||
" * Summarizing them within/outside the database using OracleSummary\n",
|
||||
@@ -47,19 +47,7 @@
|
||||
"source": [
|
||||
"### Prerequisites\n",
|
||||
"\n",
|
||||
"Please install the Oracle Database [python-oracledb driver](https://pypi.org/project/oracledb/) to use LangChain with Oracle AI Vector Search:\n",
|
||||
"\n",
|
||||
"```\n",
|
||||
"$ python -m pip install --upgrade oracledb\n",
|
||||
"```"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Create Demo User\n",
|
||||
"First, connect as a privileged user to create a demo user with all the required privileges. Change the credentials for your environment. Also set the DEMO_PY_DIR path to a directory on the database host where your model file is located:"
|
||||
"Please install Oracle Python Client driver to use Langchain with Oracle AI Vector Search. "
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -68,30 +56,65 @@
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# pip install oracledb"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Create Demo User\n",
|
||||
"First, create a demo user with all the required privileges. "
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 37,
|
||||
"metadata": {},
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"Connection successful!\n",
|
||||
"User setup done!\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"import sys\n",
|
||||
"\n",
|
||||
"import oracledb\n",
|
||||
"\n",
|
||||
"# Please update with your SYSTEM (or privileged user) username, password, and database connection string\n",
|
||||
"username = \"SYSTEM\"\n",
|
||||
"# Update with your username, password, hostname, and service_name\n",
|
||||
"username = \"\"\n",
|
||||
"password = \"\"\n",
|
||||
"dsn = \"\"\n",
|
||||
"\n",
|
||||
"with oracledb.connect(user=username, password=password, dsn=dsn) as connection:\n",
|
||||
"try:\n",
|
||||
" conn = oracledb.connect(user=username, password=password, dsn=dsn)\n",
|
||||
" print(\"Connection successful!\")\n",
|
||||
"\n",
|
||||
" with connection.cursor() as cursor:\n",
|
||||
" cursor = conn.cursor()\n",
|
||||
" try:\n",
|
||||
" cursor.execute(\n",
|
||||
" \"\"\"\n",
|
||||
" begin\n",
|
||||
" -- Drop user\n",
|
||||
" execute immediate 'drop user if exists testuser cascade';\n",
|
||||
"\n",
|
||||
" begin\n",
|
||||
" execute immediate 'drop user testuser cascade';\n",
|
||||
" exception\n",
|
||||
" when others then\n",
|
||||
" dbms_output.put_line('Error dropping user: ' || SQLERRM);\n",
|
||||
" end;\n",
|
||||
" \n",
|
||||
" -- Create user and grant privileges\n",
|
||||
" execute immediate 'create user testuser identified by testuser';\n",
|
||||
" execute immediate 'grant connect, unlimited tablespace, create credential, create procedure, create any index to testuser';\n",
|
||||
" execute immediate 'create or replace directory DEMO_PY_DIR as ''/home/yourname/demo/orachain''';\n",
|
||||
" execute immediate 'create or replace directory DEMO_PY_DIR as ''/scratch/hroy/view_storage/hroy_devstorage/demo/orachain''';\n",
|
||||
" execute immediate 'grant read, write on directory DEMO_PY_DIR to public';\n",
|
||||
" execute immediate 'grant create mining model to testuser';\n",
|
||||
"\n",
|
||||
" \n",
|
||||
" -- Network access\n",
|
||||
" begin\n",
|
||||
" DBMS_NETWORK_ACL_ADMIN.APPEND_HOST_ACE(\n",
|
||||
@@ -104,7 +127,15 @@
|
||||
" end;\n",
|
||||
" \"\"\"\n",
|
||||
" )\n",
|
||||
" print(\"User setup done!\")"
|
||||
" print(\"User setup done!\")\n",
|
||||
" except Exception as e:\n",
|
||||
" print(f\"User setup failed with error: {e}\")\n",
|
||||
" finally:\n",
|
||||
" cursor.close()\n",
|
||||
" conn.close()\n",
|
||||
"except Exception as e:\n",
|
||||
" print(f\"Connection failed with error: {e}\")\n",
|
||||
" sys.exit(1)"
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -112,13 +143,13 @@
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Process Documents using Oracle AI\n",
|
||||
"Consider the following scenario: users possess documents stored either in an Oracle Database or a file system and intend to utilize this data with Oracle AI Vector Search powered by LangChain.\n",
|
||||
"Consider the following scenario: users possess documents stored either in an Oracle Database or a file system and intend to utilize this data with Oracle AI Vector Search powered by Langchain.\n",
|
||||
"\n",
|
||||
"To prepare the documents for analysis, a comprehensive preprocessing workflow is necessary. Initially, the documents must be retrieved, summarized (if required), and chunked as needed. Subsequent steps involve generating embeddings for these chunks and integrating them into the Oracle AI Vector Store. Users can then conduct semantic searches on this data.\n",
|
||||
"\n",
|
||||
"The Oracle AI Vector Search LangChain library encompasses a suite of document processing tools that facilitate document loading, chunking, summary generation, and embedding creation.\n",
|
||||
"The Oracle AI Vector Search Langchain library encompasses a suite of document processing tools that facilitate document loading, chunking, summary generation, and embedding creation.\n",
|
||||
"\n",
|
||||
"In the sections that follow, we will detail the utilization of Oracle AI LangChain APIs to effectively implement each of these processes."
|
||||
"In the sections that follow, we will detail the utilization of Oracle AI Langchain APIs to effectively implement each of these processes."
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -126,24 +157,38 @@
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Connect to Demo User\n",
|
||||
"The following sample code shows how to connect to Oracle Database using the python-oracledb driver. By default, python-oracledb runs in a ‘Thin’ mode which connects directly to Oracle Database. This mode does not need Oracle Client libraries. However, some additional functionality is available when python-oracledb uses them. Python-oracledb is said to be in ‘Thick’ mode when Oracle Client libraries are used. Both modes have comprehensive functionality supporting the Python Database API v2.0 Specification. See the following [guide](https://python-oracledb.readthedocs.io/en/latest/user_guide/appendix_a.html#featuresummary) that talks about features supported in each mode. You can switch to Thick mode if you are unable to use Thin mode."
|
||||
"The following sample code will show how to connect to Oracle Database. By default, python-oracledb runs in a ‘Thin’ mode which connects directly to Oracle Database. This mode does not need Oracle Client libraries. However, some additional functionality is available when python-oracledb uses them. Python-oracledb is said to be in ‘Thick’ mode when Oracle Client libraries are used. Both modes have comprehensive functionality supporting the Python Database API v2.0 Specification. See the following [guide](https://python-oracledb.readthedocs.io/en/latest/user_guide/appendix_a.html#featuresummary) that talks about features supported in each mode. You might want to switch to thick-mode if you are unable to use thin-mode."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"execution_count": 45,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"Connection successful!\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"import sys\n",
|
||||
"\n",
|
||||
"import oracledb\n",
|
||||
"\n",
|
||||
"# please update with your username, password, and database connection string\n",
|
||||
"username = \"testuser\"\n",
|
||||
"# please update with your username, password, hostname and service_name\n",
|
||||
"username = \"\"\n",
|
||||
"password = \"\"\n",
|
||||
"dsn = \"\"\n",
|
||||
"\n",
|
||||
"connection = oracledb.connect(user=username, password=password, dsn=dsn)\n",
|
||||
"print(\"Connection successful!\")"
|
||||
"try:\n",
|
||||
" conn = oracledb.connect(user=username, password=password, dsn=dsn)\n",
|
||||
" print(\"Connection successful!\")\n",
|
||||
"except Exception as e:\n",
|
||||
" print(\"Connection failed!\")\n",
|
||||
" sys.exit(1)"
|
||||
]
|
||||
},
|
||||
{
|
||||
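For readability, here is the updated connection cell from this hunk, reassembled from the escaped notebook JSON (username, password, and DSN are left blank exactly as in the notebook):

```python
import sys

import oracledb

# please update with your username, password, hostname and service_name
username = ""
password = ""
dsn = ""

try:
    conn = oracledb.connect(user=username, password=password, dsn=dsn)
    print("Connection successful!")
except Exception as e:
    print("Connection failed!")
    sys.exit(1)
```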
@@ -156,12 +201,22 @@
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"execution_count": 46,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"Table created and populated.\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"with connection.cursor() as cursor:\n",
|
||||
" drop_table_sql = \"\"\"drop table if exists demo_tab\"\"\"\n",
|
||||
"try:\n",
|
||||
" cursor = conn.cursor()\n",
|
||||
"\n",
|
||||
" drop_table_sql = \"\"\"drop table demo_tab\"\"\"\n",
|
||||
" cursor.execute(drop_table_sql)\n",
|
||||
"\n",
|
||||
" create_table_sql = \"\"\"create table demo_tab (id number, data clob)\"\"\"\n",
|
||||
@@ -184,9 +239,15 @@
|
||||
" ]\n",
|
||||
" cursor.executemany(insert_row_sql, rows_to_insert)\n",
|
||||
"\n",
|
||||
"connection.commit()\n",
|
||||
" conn.commit()\n",
|
||||
"\n",
|
||||
"print(\"Table created and populated.\")"
|
||||
" print(\"Table created and populated.\")\n",
|
||||
" cursor.close()\n",
|
||||
"except Exception as e:\n",
|
||||
" print(\"Table creation failed.\")\n",
|
||||
" cursor.close()\n",
|
||||
" conn.close()\n",
|
||||
" sys.exit(1)"
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -200,22 +261,30 @@
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Load the ONNX Model\n",
|
||||
"### Load ONNX Model\n",
|
||||
"\n",
|
||||
"Oracle accommodates a variety of embedding providers, enabling you to choose between proprietary database solutions and third-party services such as Oracle Generative AI Service and HuggingFace. This selection dictates the methodology for generating and managing embeddings.\n",
|
||||
"Oracle accommodates a variety of embedding providers, enabling users to choose between proprietary database solutions and third-party services such as OCIGENAI and HuggingFace. This selection dictates the methodology for generating and managing embeddings.\n",
|
||||
"\n",
|
||||
"***Important*** : Should you opt for the database option, you must upload an ONNX model into the Oracle Database. Conversely, if a third-party provider is selected for embedding generation, uploading an ONNX model to Oracle Database is not required.\n",
|
||||
"***Important*** : Should users opt for the database option, they must upload an ONNX model into the Oracle Database. Conversely, if a third-party provider is selected for embedding generation, uploading an ONNX model to Oracle Database is not required.\n",
|
||||
"\n",
|
||||
"A significant advantage of utilizing an ONNX model directly within Oracle Database is the enhanced security and performance it offers by eliminating the need to transmit data to external parties. Additionally, this method avoids the latency typically associated with network or REST API calls.\n",
|
||||
"A significant advantage of utilizing an ONNX model directly within Oracle is the enhanced security and performance it offers by eliminating the need to transmit data to external parties. Additionally, this method avoids the latency typically associated with network or REST API calls.\n",
|
||||
"\n",
|
||||
"Below is the example code to upload an ONNX model into Oracle Database:"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"execution_count": 47,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"ONNX model loaded.\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"from langchain_community.embeddings.oracleai import OracleEmbeddings\n",
|
||||
"\n",
|
||||
@@ -225,8 +294,12 @@
|
||||
"onnx_file = \"tinybert.onnx\"\n",
|
||||
"model_name = \"demo_model\"\n",
|
||||
"\n",
|
||||
"OracleEmbeddings.load_onnx_model(connection, onnx_dir, onnx_file, model_name)\n",
|
||||
"print(\"ONNX model loaded.\")"
|
||||
"try:\n",
|
||||
" OracleEmbeddings.load_onnx_model(conn, onnx_dir, onnx_file, model_name)\n",
|
||||
" print(\"ONNX model loaded.\")\n",
|
||||
"except Exception as e:\n",
|
||||
" print(\"ONNX model loading failed!\")\n",
|
||||
" sys.exit(1)"
|
||||
]
|
||||
},
|
||||
{
|
||||
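Pieced together, the updated ONNX-loading cell looks like the following; it assumes the `conn` connection from the previous step and the `DEMO_PY_DIR` database directory created during user setup, with the `tinybert.onnx` file already placed there:

```python
import sys

from langchain_community.embeddings.oracleai import OracleEmbeddings

# please update with your related information
onnx_dir = "DEMO_PY_DIR"
onnx_file = "tinybert.onnx"
model_name = "demo_model"

try:
    OracleEmbeddings.load_onnx_model(conn, onnx_dir, onnx_file, model_name)
    print("ONNX model loaded.")
except Exception as e:
    print("ONNX model loading failed!")
    sys.exit(1)
```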
@@ -248,7 +321,8 @@
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"with connection.cursor() as cursor:\n",
|
||||
"try:\n",
|
||||
" cursor = conn.cursor()\n",
|
||||
" cursor.execute(\n",
|
||||
" \"\"\"\n",
|
||||
" declare\n",
|
||||
@@ -275,7 +349,12 @@
|
||||
" params => json(jo.to_string));\n",
|
||||
" end;\n",
|
||||
" \"\"\"\n",
|
||||
" )"
|
||||
" )\n",
|
||||
" cursor.close()\n",
|
||||
" print(\"Credentials created.\")\n",
|
||||
"except Exception as ex:\n",
|
||||
" cursor.close()\n",
|
||||
" raise"
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -283,24 +362,33 @@
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Load Documents\n",
|
||||
"You have the flexibility to load documents from either the Oracle Database, a file system, or both, by appropriately configuring the loader parameters. For comprehensive details on these parameters, please consult the [Oracle AI Vector Search Guide](https://docs.oracle.com/en/database/oracle/oracle-database/23/arpls/dbms_vector_chain1.html#GUID-73397E89-92FB-48ED-94BB-1AD960C4EA1F).\n",
|
||||
"Users have the flexibility to load documents from either the Oracle Database, a file system, or both, by appropriately configuring the loader parameters. For comprehensive details on these parameters, please consult the [Oracle AI Vector Search Guide](https://docs.oracle.com/en/database/oracle/oracle-database/23/arpls/dbms_vector_chain1.html#GUID-73397E89-92FB-48ED-94BB-1AD960C4EA1F).\n",
|
||||
"\n",
|
||||
"A significant advantage of utilizing OracleDocLoader is its capability to process over 150 distinct file formats, eliminating the need for multiple loaders for different document types. For a complete list of the supported formats, please refer to the [Oracle Text Supported Document Formats](https://docs.oracle.com/en/database/oracle/oracle-database/23/ccref/oracle-text-supported-document-formats.html).\n",
|
||||
"\n",
|
||||
"Below is a sample code snippet that demonstrates how to use OracleDocLoader:"
|
||||
"Below is a sample code snippet that demonstrates how to use OracleDocLoader"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"execution_count": 48,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"Number of docs loaded: 3\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"from langchain_community.document_loaders.oracleai import OracleDocLoader\n",
|
||||
"from langchain_core.documents import Document\n",
|
||||
"\n",
|
||||
"# loading from Oracle Database table\n",
|
||||
"# make sure you have the table with this specification\n",
|
||||
"loader_params = {}\n",
|
||||
"loader_params = {\n",
|
||||
" \"owner\": \"testuser\",\n",
|
||||
" \"tablename\": \"demo_tab\",\n",
|
||||
@@ -308,7 +396,7 @@
|
||||
"}\n",
|
||||
"\n",
|
||||
"\"\"\" load the docs \"\"\"\n",
|
||||
"loader = OracleDocLoader(conn=connection, params=loader_params)\n",
|
||||
"loader = OracleDocLoader(conn=conn, params=loader_params)\n",
|
||||
"docs = loader.load()\n",
|
||||
"\n",
|
||||
"\"\"\" verify \"\"\"\n",
|
||||
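Reassembled from this hunk, the updated loader cell reads the `demo_tab` table through `OracleDocLoader`. The final count line is an assumption based on the `Number of docs loaded: 3` output shown above; the rest mirrors the hunk:

```python
from langchain_community.document_loaders.oracleai import OracleDocLoader
from langchain_core.documents import Document

# loading from the Oracle Database table created earlier
loader_params = {
    "owner": "testuser",
    "tablename": "demo_tab",
    "colname": "data",
}

""" load the docs """
loader = OracleDocLoader(conn=conn, params=loader_params)
docs = loader.load()

""" verify """
print(f"Number of docs loaded: {len(docs)}")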
@@ -321,23 +409,23 @@
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Generate Summary\n",
|
||||
"Now that you have loaded the documents, you may want to generate a summary for each document. The Oracle AI Vector Search LangChain library offers a suite of APIs designed for document summarization. It supports multiple summarization providers such as Database, Oracle Generative AI Service, HuggingFace, among others, allowing you to select the provider that best meets their needs. To utilize these capabilities, you must configure the summary parameters as specified. For detailed information on these parameters, please consult the [Oracle AI Vector Search Guide book](https://docs.oracle.com/en/database/oracle/oracle-database/23/arpls/dbms_vector_chain1.html#GUID-EC9DDB58-6A15-4B36-BA66-ECBA20D2CE57)."
|
||||
"Now that the user loaded the documents, they may want to generate a summary for each document. The Oracle AI Vector Search Langchain library offers a suite of APIs designed for document summarization. It supports multiple summarization providers such as Database, OCIGENAI, HuggingFace, among others, allowing users to select the provider that best meets their needs. To utilize these capabilities, users must configure the summary parameters as specified. For detailed information on these parameters, please consult the [Oracle AI Vector Search Guide book](https://docs.oracle.com/en/database/oracle/oracle-database/23/arpls/dbms_vector_chain1.html#GUID-EC9DDB58-6A15-4B36-BA66-ECBA20D2CE57)."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"***Note:*** You may need to set proxy if you want to use some 3rd party summary generation providers other than Oracle's in-house and default provider: 'database'. If you don't have proxy, please remove the proxy parameter when you instantiate the OracleSummary."
|
||||
"***Note:*** The users may need to set proxy if they want to use some 3rd party summary generation providers other than Oracle's in-house and default provider: 'database'. If you don't have proxy, please remove the proxy parameter when you instantiate the OracleSummary."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"execution_count": 22,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# proxy to be used when we instantiate summary and embedder objects\n",
|
||||
"# proxy to be used when we instantiate summary and embedder object\n",
|
||||
"proxy = \"\""
|
||||
]
|
||||
},
|
||||
@@ -345,14 +433,22 @@
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"The following sample code shows how to generate a summary:"
|
||||
"The following sample code will show how to generate summary:"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"execution_count": 49,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"Number of Summaries: 3\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"from langchain_community.utilities.oracleai import OracleSummary\n",
|
||||
"from langchain_core.documents import Document\n",
|
||||
@@ -367,7 +463,7 @@
|
||||
"\n",
|
||||
"# get the summary instance\n",
|
||||
"# Remove proxy if not required\n",
|
||||
"summ = OracleSummary(conn=connection, params=summary_params, proxy=proxy)\n",
|
||||
"summ = OracleSummary(conn=conn, params=summary_params, proxy=proxy)\n",
|
||||
"\n",
|
||||
"list_summary = []\n",
|
||||
"for doc in docs:\n",
|
||||
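A consolidated sketch of the updated summary cell, using the parameters that appear in this hunk and in the combined pipeline hunk further down; the loop body calling `get_summary` is reconstructed from that later hunk:

```python
from langchain_community.utilities.oracleai import OracleSummary
from langchain_core.documents import Document

# 'database' is Oracle's in-house, default summary provider
summary_params = {
    "provider": "database",
    "glevel": "S",
    "numParagraphs": 1,
    "language": "english",
}

# get the summary instance
# Remove proxy if not required
summ = OracleSummary(conn=conn, params=summary_params, proxy=proxy)

list_summary = []
for doc in docs:
    summary = summ.get_summary(doc.page_content)
    list_summary.append(summary)

print(f"Number of Summaries: {len(list_summary)}")
```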
@@ -391,9 +487,17 @@
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"execution_count": 50,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"Number of Chunks: 3\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"from langchain_community.document_loaders.oracleai import OracleTextSplitter\n",
|
||||
"from langchain_core.documents import Document\n",
|
||||
@@ -402,7 +506,7 @@
|
||||
"splitter_params = {\"normalize\": \"all\"}\n",
|
||||
"\n",
|
||||
"\"\"\" get the splitter instance \"\"\"\n",
|
||||
"splitter = OracleTextSplitter(conn=connection, params=splitter_params)\n",
|
||||
"splitter = OracleTextSplitter(conn=conn, params=splitter_params)\n",
|
||||
"\n",
|
||||
"list_chunks = []\n",
|
||||
"for doc in docs:\n",
|
||||
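Likewise, the updated chunking cell in sketch form; calling `split_text` on each loaded document is how the combined pipeline hunk below uses `OracleTextSplitter`:

```python
from langchain_community.document_loaders.oracleai import OracleTextSplitter
from langchain_core.documents import Document

# 'normalize: all' is the splitter setting this notebook uses
splitter_params = {"normalize": "all"}

""" get the splitter instance """
splitter = OracleTextSplitter(conn=conn, params=splitter_params)

list_chunks = []
for doc in docs:
    chunks = splitter.split_text(doc.page_content)
    list_chunks.extend(chunks)

print(f"Number of Chunks: {len(list_chunks)}")
```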
@@ -419,19 +523,19 @@
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Generate Embeddings\n",
|
||||
"Now that the documents are chunked as per requirements, you may want to generate embeddings for these chunks. Oracle AI Vector Search provides multiple methods for generating embeddings, utilizing either locally hosted ONNX models or third-party APIs. For comprehensive instructions on configuring these alternatives, please refer to the [Oracle AI Vector Search Guide](https://docs.oracle.com/en/database/oracle/oracle-database/23/arpls/dbms_vector_chain1.html#GUID-C6439E94-4E86-4ECD-954E-4B73D53579DE)."
|
||||
"Now that the documents are chunked as per requirements, the users may want to generate embeddings for these chunks. Oracle AI Vector Search provides multiple methods for generating embeddings, utilizing either locally hosted ONNX models or third-party APIs. For comprehensive instructions on configuring these alternatives, please refer to the [Oracle AI Vector Search Guide](https://docs.oracle.com/en/database/oracle/oracle-database/23/arpls/dbms_vector_chain1.html#GUID-C6439E94-4E86-4ECD-954E-4B73D53579DE)."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"***Note:*** You may need to configure a proxy to utilize third-party embedding generation providers, excluding the 'database' provider that utilizes an ONNX model."
|
||||
"***Note:*** Users may need to configure a proxy to utilize third-party embedding generation providers, excluding the 'database' provider that utilizes an ONNX model."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"execution_count": 12,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
@@ -443,14 +547,22 @@
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"The following sample code shows how to generate embeddings:"
|
||||
"The following sample code will show how to generate embeddings:"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"execution_count": 51,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"Number of embeddings: 3\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"from langchain_community.embeddings.oracleai import OracleEmbeddings\n",
|
||||
"from langchain_core.documents import Document\n",
|
||||
@@ -460,7 +572,7 @@
|
||||
"\n",
|
||||
"# get the embedding instance\n",
|
||||
"# Remove proxy if not required\n",
|
||||
"embedder = OracleEmbeddings(conn=connection, params=embedder_params, proxy=proxy)\n",
|
||||
"embedder = OracleEmbeddings(conn=conn, params=embedder_params, proxy=proxy)\n",
|
||||
"\n",
|
||||
"embeddings = []\n",
|
||||
"for doc in docs:\n",
|
||||
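A sketch of the updated embedding cell. Only the `embedder_params`, the `OracleEmbeddings(...)` constructor, and the final `Number of embeddings: 3` output are visible in this capture; the loop that embeds each chunk with `embed_documents` is an assumption about the elided body, using the standard LangChain Embeddings API:

```python
from langchain_community.embeddings.oracleai import OracleEmbeddings
from langchain_core.documents import Document

# use the ONNX model loaded into the database earlier
embedder_params = {"provider": "database", "model": "demo_model"}

# get the embedding instance
# Remove proxy if not required
embedder = OracleEmbeddings(conn=conn, params=embedder_params, proxy=proxy)

embeddings = []
for doc in docs:
    chunks = splitter.split_text(doc.page_content)
    # embed_documents is the standard LangChain Embeddings API; the notebook's
    # exact loop body is not shown in this capture
    embeddings.extend(embedder.embed_documents(chunks))

print(f"Number of embeddings: {len(embeddings)}")
```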
@@ -479,19 +591,19 @@
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Create Oracle AI Vector Store\n",
|
||||
"Now that you know how to use Oracle AI LangChain library APIs individually to process the documents, let us show how to integrate with Oracle AI Vector Store to facilitate the semantic searches."
|
||||
"Now that you know how to use Oracle AI Langchain library APIs individually to process the documents, let us show how to integrate with Oracle AI Vector Store to facilitate the semantic searches."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"First, let's import all the dependencies:"
|
||||
"First, let's import all the dependencies."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"execution_count": 52,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
@@ -514,80 +626,100 @@
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"Next, let's combine all document processing stages together. Here is the sample code:"
|
||||
"Next, let's combine all document processing stages together. Here is the sample code below:"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"execution_count": 53,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"Connection successful!\n",
|
||||
"ONNX model loaded.\n",
|
||||
"Number of total chunks with metadata: 3\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"\"\"\"\n",
|
||||
"In this sample example, we will use 'database' provider for both summary and embeddings\n",
|
||||
"so, we don't need to do the following:\n",
|
||||
"In this sample example, we will use 'database' provider for both summary and embeddings.\n",
|
||||
"So, we don't need to do the followings:\n",
|
||||
" - set proxy for 3rd party providers\n",
|
||||
" - create credential for 3rd party providers\n",
|
||||
"\n",
|
||||
"If you choose to use 3rd party provider, please follow the necessary steps for proxy and credential.\n",
|
||||
"If you choose to use 3rd party provider, \n",
|
||||
"please follow the necessary steps for proxy and credential.\n",
|
||||
"\"\"\"\n",
|
||||
"\n",
|
||||
"# please update with your username, password, and database connection string\n",
|
||||
"# oracle connection\n",
|
||||
"# please update with your username, password, hostname, and service_name\n",
|
||||
"username = \"\"\n",
|
||||
"password = \"\"\n",
|
||||
"dsn = \"\"\n",
|
||||
"\n",
|
||||
"with oracledb.connect(user=username, password=password, dsn=dsn) as connection:\n",
|
||||
"try:\n",
|
||||
" conn = oracledb.connect(user=username, password=password, dsn=dsn)\n",
|
||||
" print(\"Connection successful!\")\n",
|
||||
"except Exception as e:\n",
|
||||
" print(\"Connection failed!\")\n",
|
||||
" sys.exit(1)\n",
|
||||
"\n",
|
||||
" # load onnx model\n",
|
||||
" # please update with your related information\n",
|
||||
" onnx_dir = \"DEMO_PY_DIR\"\n",
|
||||
" onnx_file = \"tinybert.onnx\"\n",
|
||||
" model_name = \"demo_model\"\n",
|
||||
" OracleEmbeddings.load_onnx_model(connection, onnx_dir, onnx_file, model_name)\n",
|
||||
"\n",
|
||||
"# load onnx model\n",
|
||||
"# please update with your related information\n",
|
||||
"onnx_dir = \"DEMO_PY_DIR\"\n",
|
||||
"onnx_file = \"tinybert.onnx\"\n",
|
||||
"model_name = \"demo_model\"\n",
|
||||
"try:\n",
|
||||
" OracleEmbeddings.load_onnx_model(conn, onnx_dir, onnx_file, model_name)\n",
|
||||
" print(\"ONNX model loaded.\")\n",
|
||||
"except Exception as e:\n",
|
||||
" print(\"ONNX model loading failed!\")\n",
|
||||
" sys.exit(1)\n",
|
||||
"\n",
|
||||
" # params\n",
|
||||
" # please update necessary fields with related information\n",
|
||||
" loader_params = {\n",
|
||||
" \"owner\": \"testuser\",\n",
|
||||
" \"tablename\": \"demo_tab\",\n",
|
||||
" \"colname\": \"data\",\n",
|
||||
" }\n",
|
||||
" summary_params = {\n",
|
||||
" \"provider\": \"database\",\n",
|
||||
" \"glevel\": \"S\",\n",
|
||||
" \"numParagraphs\": 1,\n",
|
||||
" \"language\": \"english\",\n",
|
||||
" }\n",
|
||||
" splitter_params = {\"normalize\": \"all\"}\n",
|
||||
" embedder_params = {\"provider\": \"database\", \"model\": \"demo_model\"}\n",
|
||||
"\n",
|
||||
" # instantiate loader, summary, splitter, and embedder\n",
|
||||
" loader = OracleDocLoader(conn=connection, params=loader_params)\n",
|
||||
" summary = OracleSummary(conn=connection, params=summary_params)\n",
|
||||
" splitter = OracleTextSplitter(conn=connection, params=splitter_params)\n",
|
||||
" embedder = OracleEmbeddings(conn=connection, params=embedder_params)\n",
|
||||
"# params\n",
|
||||
"# please update necessary fields with related information\n",
|
||||
"loader_params = {\n",
|
||||
" \"owner\": \"testuser\",\n",
|
||||
" \"tablename\": \"demo_tab\",\n",
|
||||
" \"colname\": \"data\",\n",
|
||||
"}\n",
|
||||
"summary_params = {\n",
|
||||
" \"provider\": \"database\",\n",
|
||||
" \"glevel\": \"S\",\n",
|
||||
" \"numParagraphs\": 1,\n",
|
||||
" \"language\": \"english\",\n",
|
||||
"}\n",
|
||||
"splitter_params = {\"normalize\": \"all\"}\n",
|
||||
"embedder_params = {\"provider\": \"database\", \"model\": \"demo_model\"}\n",
|
||||
"\n",
|
||||
" # process the documents\n",
|
||||
" chunks_with_mdata = []\n",
|
||||
" for id, doc in enumerate(docs, start=1):\n",
|
||||
" summ = summary.get_summary(doc.page_content)\n",
|
||||
" chunks = splitter.split_text(doc.page_content)\n",
|
||||
" for ic, chunk in enumerate(chunks, start=1):\n",
|
||||
" chunk_metadata = doc.metadata.copy()\n",
|
||||
" chunk_metadata[\"id\"] = (\n",
|
||||
" chunk_metadata[\"_oid\"] + \"$\" + str(id) + \"$\" + str(ic)\n",
|
||||
" )\n",
|
||||
" chunk_metadata[\"document_id\"] = str(id)\n",
|
||||
" chunk_metadata[\"document_summary\"] = str(summ[0])\n",
|
||||
" chunks_with_mdata.append(\n",
|
||||
" Document(page_content=str(chunk), metadata=chunk_metadata)\n",
|
||||
" )\n",
|
||||
"# instantiate loader, summary, splitter, and embedder\n",
|
||||
"loader = OracleDocLoader(conn=conn, params=loader_params)\n",
|
||||
"summary = OracleSummary(conn=conn, params=summary_params)\n",
|
||||
"splitter = OracleTextSplitter(conn=conn, params=splitter_params)\n",
|
||||
"embedder = OracleEmbeddings(conn=conn, params=embedder_params)\n",
|
||||
"\n",
|
||||
" \"\"\" verify \"\"\"\n",
|
||||
" print(f\"Number of total chunks with metadata: {len(chunks_with_mdata)}\")"
|
||||
"# process the documents\n",
|
||||
"chunks_with_mdata = []\n",
|
||||
"for id, doc in enumerate(docs, start=1):\n",
|
||||
" summ = summary.get_summary(doc.page_content)\n",
|
||||
" chunks = splitter.split_text(doc.page_content)\n",
|
||||
" for ic, chunk in enumerate(chunks, start=1):\n",
|
||||
" chunk_metadata = doc.metadata.copy()\n",
|
||||
" chunk_metadata[\"id\"] = chunk_metadata[\"_oid\"] + \"$\" + str(id) + \"$\" + str(ic)\n",
|
||||
" chunk_metadata[\"document_id\"] = str(id)\n",
|
||||
" chunk_metadata[\"document_summary\"] = str(summ[0])\n",
|
||||
" chunks_with_mdata.append(\n",
|
||||
" Document(page_content=str(chunk), metadata=chunk_metadata)\n",
|
||||
" )\n",
|
||||
"\n",
|
||||
"\"\"\" verify \"\"\"\n",
|
||||
"print(f\"Number of total chunks with metadata: {len(chunks_with_mdata)}\")"
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -601,15 +733,23 @@
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"execution_count": 55,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"Vector Store Table: oravs\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"# create Oracle AI Vector Store\n",
|
||||
"vectorstore = OracleVS.from_documents(\n",
|
||||
" chunks_with_mdata,\n",
|
||||
" embedder,\n",
|
||||
" client=connection,\n",
|
||||
" client=conn,\n",
|
||||
" table_name=\"oravs\",\n",
|
||||
" distance_strategy=DistanceStrategy.DOT_PRODUCT,\n",
|
||||
")\n",
|
||||
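The vector-store cell in plain form, with the standard `langchain_community` imports this notebook relies on (the import cell itself falls outside the captured hunks); the final print mirrors the `Vector Store Table: oravs` output shown above:

```python
from langchain_community.vectorstores.oraclevs import OracleVS
from langchain_community.vectorstores.utils import DistanceStrategy

# create Oracle AI Vector Store from the chunk Documents built above
vectorstore = OracleVS.from_documents(
    chunks_with_mdata,
    embedder,
    client=conn,
    table_name="oravs",
    distance_strategy=DistanceStrategy.DOT_PRODUCT,
)

print(f"Vector Store Table: {vectorstore.table_name}")
```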
@@ -638,12 +778,12 @@
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"execution_count": 56,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"oraclevs.create_index(\n",
|
||||
" connection, vectorstore, params={\"idx_name\": \"hnsw_oravs\", \"idx_type\": \"HNSW\"}\n",
|
||||
" conn, vectorstore, params={\"idx_name\": \"hnsw_oravs\", \"idx_type\": \"HNSW\"}\n",
|
||||
")\n",
|
||||
"\n",
|
||||
"print(\"Index created.\")"
|
||||
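The updated index-creation call from this hunk, shown as plain Python; the `oraclevs` module comes from `langchain_community.vectorstores`:

```python
from langchain_community.vectorstores import oraclevs

# default HNSW index over the embeddings in the 'oravs' table
oraclevs.create_index(
    conn, vectorstore, params={"idx_name": "hnsw_oravs", "idx_type": "HNSW"}
)

print("Index created.")
```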
@@ -653,7 +793,7 @@
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"This example demonstrates the creation of a default HNSW index on embeddings within the 'oravs' table. You may adjust various parameters according to your specific needs. For detailed information on these parameters, please consult the [Oracle AI Vector Search Guide book](https://docs.oracle.com/en/database/oracle/oracle-database/23/vecse/manage-different-categories-vector-indexes.html).\n",
|
||||
"This example demonstrates the creation of a default HNSW index on embeddings within the 'oravs' table. Users may adjust various parameters according to their specific needs. For detailed information on these parameters, please consult the [Oracle AI Vector Search Guide book](https://docs.oracle.com/en/database/oracle/oracle-database/23/vecse/manage-different-categories-vector-indexes.html).\n",
|
||||
"\n",
|
||||
"Additionally, various types of vector indices can be created to meet diverse requirements. More details can be found in our [comprehensive guide](https://python.langchain.com/v0.1/docs/integrations/vectorstores/oracle/).\n"
|
||||
]
|
||||
@@ -665,16 +805,29 @@
|
||||
"## Perform Semantic Search\n",
|
||||
"All set!\n",
|
||||
"\n",
|
||||
"You have successfully processed the documents and stored them in the vector store, followed by the creation of an index to enhance query performance. You are now prepared to proceed with semantic searches.\n",
|
||||
"We have successfully processed the documents and stored them in the vector store, followed by the creation of an index to enhance query performance. We are now prepared to proceed with semantic searches.\n",
|
||||
"\n",
|
||||
"Below is the sample code for this process:"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"execution_count": 58,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"[Document(page_content='The database stores LOBs differently from other data types. Creating a LOB column implicitly creates a LOB segment and a LOB index. The tablespace containing the LOB segment and LOB index, which are always stored together, may be different from the tablespace containing the table. Sometimes the database can store small amounts of LOB data in the table itself rather than in a separate LOB segment.', metadata={'_oid': '662f2f257677f3c2311a8ff999fd34e5', '_rowid': 'AAAR/xAAEAAAAAnAAC', 'id': '662f2f257677f3c2311a8ff999fd34e5$3$1', 'document_id': '3', 'document_summary': 'Sometimes the database can store small amounts of LOB data in the table itself rather than in a separate LOB segment.\\n\\n'})]\n",
|
||||
"[]\n",
|
||||
"[(Document(page_content='The database stores LOBs differently from other data types. Creating a LOB column implicitly creates a LOB segment and a LOB index. The tablespace containing the LOB segment and LOB index, which are always stored together, may be different from the tablespace containing the table. Sometimes the database can store small amounts of LOB data in the table itself rather than in a separate LOB segment.', metadata={'_oid': '662f2f257677f3c2311a8ff999fd34e5', '_rowid': 'AAAR/xAAEAAAAAnAAC', 'id': '662f2f257677f3c2311a8ff999fd34e5$3$1', 'document_id': '3', 'document_summary': 'Sometimes the database can store small amounts of LOB data in the table itself rather than in a separate LOB segment.\\n\\n'}), 0.055675752460956573)]\n",
|
||||
"[]\n",
|
||||
"[Document(page_content='If the answer to any preceding questions is yes, then the database stops the search and allocates space from the specified tablespace; otherwise, space is allocated from the database default shared temporary tablespace.', metadata={'_oid': '662f2f253acf96b33b430b88699490a2', '_rowid': 'AAAR/xAAEAAAAAnAAA', 'id': '662f2f253acf96b33b430b88699490a2$1$1', 'document_id': '1', 'document_summary': 'If the answer to any preceding questions is yes, then the database stops the search and allocates space from the specified tablespace; otherwise, space is allocated from the database default shared temporary tablespace.\\n\\n'})]\n",
|
||||
"[Document(page_content='If the answer to any preceding questions is yes, then the database stops the search and allocates space from the specified tablespace; otherwise, space is allocated from the database default shared temporary tablespace.', metadata={'_oid': '662f2f253acf96b33b430b88699490a2', '_rowid': 'AAAR/xAAEAAAAAnAAA', 'id': '662f2f253acf96b33b430b88699490a2$1$1', 'document_id': '1', 'document_summary': 'If the answer to any preceding questions is yes, then the database stops the search and allocates space from the specified tablespace; otherwise, space is allocated from the database default shared temporary tablespace.\\n\\n'})]\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"query = \"What is Oracle AI Vector Store?\"\n",
|
||||
"filter = {\"document_id\": [\"1\"]}\n",
|
||||
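The query cell itself is truncated in this capture; only `query` and `filter` plus the printed results are visible. Below is a sketch of the kinds of calls that produce output like the above, using the standard LangChain vector-store search methods that `OracleVS` implements:

```python
query = "What is Oracle AI Vector Store?"
filter = {"document_id": ["1"]}

# Similarity search without a filter
print(vectorstore.similarity_search(query, 1))

# Similarity search restricted by document metadata
print(vectorstore.similarity_search(query, 1, filter=filter))

# Same searches, returning relevance scores as well
print(vectorstore.similarity_search_with_score(query, 1))
print(vectorstore.similarity_search_with_score(query, 1, filter=filter))
```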
@@ -719,7 +872,7 @@
|
||||
"name": "python",
|
||||
"nbconvert_exporter": "python",
|
||||
"pygments_lexer": "ipython3",
|
||||
"version": "3.13.3"
|
||||
"version": "3.11.9"
|
||||
}
|
||||
},
|
||||
"nbformat": 4,
|
||||
|
||||
@@ -53,7 +53,7 @@
"id": "f5ccda4e-7af5-4355-b9c4-25547edf33f9",
"metadata": {},
"source": [
"Let's first load up this paper, and split into text chunks of size 1000."
"Lets first load up this paper, and split into text chunks of size 1000."
]
},
{
@@ -241,7 +241,7 @@
"id": "360b2837-8024-47e0-a4ba-592505a9a5c8",
"metadata": {},
"source": [
"With our embedder in place, let's define our retriever:"
"With our embedder in place, lets define our retriever:"
]
},
{
@@ -312,7 +312,7 @@
"id": "d84ea8f4-a5de-4d76-b44d-85e56583f489",
"metadata": {},
"source": [
"Let's write our documents into our new store. This will use our embedder on each document."
"Lets write our documents into our new store. This will use our embedder on each document."
]
},
{
@@ -339,7 +339,7 @@
"id": "580bc212-8ecd-4d28-8656-b96fcd0d7eb6",
"metadata": {},
"source": [
"Great! Our retriever is good to go. Let's load up an LLM, that will reason over the retrieved documents:"
"Great! Our retriever is good to go. Lets load up an LLM, that will reason over the retrieved documents:"
]
},
{
@@ -430,7 +430,7 @@
"id": "3bc53602-86d6-420f-91b1-fc2effa7e986",
"metadata": {},
"source": [
"Excellent! Let's ask it a question.\n",
"Excellent! lets ask it a question.\n",
"We will also use a verbose and debug, to check which documents were used by the model to produce the answer."
]
},
@@ -1 +1 @@
(binary cassette diff: base64-encoded msgpack.zlib blobs omitted)
@@ -1 +1 @@
(binary cassette diff: base64-encoded msgpack.zlib blobs omitted)
docs/cassettes/agents_482ce13d.msgpack.zlib — 1 line, Normal file
@@ -0,0 +1 @@
[added cassette payload omitted: base64 blob]
docs/cassettes/agents_532d6557.msgpack.zlib — 1 line, Normal file
@@ -0,0 +1 @@
[added cassette payload omitted: base64 blob]
@@ -1 +1 @@
[old cassette payload omitted: base64 blob]
[new cassette payload omitted: base64 blob]
File diff suppressed because one or more lines are too long
@@ -1 +1 @@
[old cassette payload omitted: base64 blob]
[new cassette payload omitted: base64 blob]
@@ -1 +0,0 @@
[removed cassette payload omitted: base64 blob]
File diff suppressed because one or more lines are too long
@@ -1 +0,0 @@
[removed cassette payload omitted: base64 blob]
@@ -1 +1 @@
[old cassette payload omitted: base64 blob]
[new cassette payload omitted: base64 blob]
docs/cassettes/agents_a3fb262c.msgpack.zlib — 1 line, Normal file
@@ -0,0 +1 @@
[added cassette payload omitted: base64 blob]
docs/cassettes/agents_a79bb782.msgpack.zlib — 1 line, Normal file
File diff suppressed because one or more lines are too long
@@ -1 +1 @@
[old cassette payload omitted: base64 blob]
[new cassette payload omitted: base64 blob]
File diff suppressed because one or more lines are too long
@@ -1 +1 @@
[old cassette payload omitted: base64 blob]
[new cassette payload omitted: base64 blob]
File diff suppressed because one or more lines are too long
@@ -1 +1 @@
[old cassette payload omitted: base64 blob]
[new cassette payload omitted: base64 blob]
File diff suppressed because one or more lines are too long
@@ -1 +1 @@
[old cassette payload omitted: base64 blob]
[new cassette payload omitted: base64 blob]
@@ -1 +1 @@
|
||||
eNrNWAlsXMUZTohQWxQohyhURTAsBZJo33rX3vXZtDJ2fORYO7YTx8TBzL43uzvxu/Jm3q7XIajQ0jQFiS4SEvQIARxvcSOTixBoEhCFQlsEqhC0JohGkF5CVRooKmoF6T/z3l62kxaJSrUS+x0z///P/3/ff7w7CxniMGqZC/dQkxMHqxxuWP7OgkO2uITxb08ahKctbaK3p3/gUdehM8vSnNusuaYG2zRk2cTENKRaRk0mUqOmMa+Ba1snUsxEwtJyb17y8taAQRjDKcICzRu3BlQLVJk80BwYslyEHYIwShPdTro6woxRxrHJQ0i8NHAOmRZHJiEa4hZyGYE/ls5Q0nIQAeNzCOyE3wriaSLeO3LTZpdxlAU5Ypew69pAMOBYOgGtLMc4MQLbgpWmdN9ooDgxrIp1Qlhg26ZgwLA0osODlM2VqKUY1KSwinGHYCPQnMQ6I8GANEuej+dssT3pmtKbsLR02bw1YGJDvOU4Q/XcCCPYUdOwRCNMdajtrQq0Iu8FImaKmgRZ8MKg4+AEcW7hYoekiclohgQRVlXXwVxcmeAlB44OCx3CXJ2zEFrHiPCs2JiFPSgHfi36E5ssCx6ToRYRQzhhuRyBPAfcIhxsChHdXEbBMvUcyOUOhRcMretbzaRKZlLbJpwFUQI2W8kkYAphLYNNFdT4J9GIzdOwRLMMTE0IkgmAMEB+EMHZCHKwmSIoSXWAIfOOQg1Y4u+HVWnKZPBBlE4FbsEWrCti92wnqJRjcSAlgdXRCmd0mzZYyNKWq2soIYDnWydBFII42NiBAAkbRLBsBxDucErknVwkLqqD1V8hQjhVt6xR5NogzEcCIIWaKQBcgJqq7mpkxHOCFIrNXE9S4oYCKuWjOdv8B9hxcE7i1n9gurouEaqRJIYDgphNc6GkA6MgKr7nhQ+FP0C4yovH9/0Dr0LD5rCJ/J91gm7C7SWvSAg1l1dEQmigyDsyZusUPC9BIiHFEDUBd4aMBUo6loGYTVSapCrKkgSDEzO0hIRSoSAaDnRQETkdos4J0jDH3g4TMxxKWZnhwNKy3toKvQJEHnhNZDkpbNJxT6FPFfAwylLIYiL0Un0O/CrzhQ/GahMqbfb4QHvTlgmmSntabchv0piyOd0mSlg8jVTMCIBXUMwHmSa8ZggOC33YFohyqDyhHw5f+cbhQPmkm4Tt8AQLXSK9wqOlHtcInxWSqoj1+YHMUl1HPfHVQ8IDxDNcGOC7HyhRVK8AtRHYDo5kluuoxN8M7PDRqoXK8ts9oCEwIA4uQUtgs+/EIqbAbUsrdgj8krH/D9j7dvhIrIL+Z4x7kVszFgU8Q/oQ3jkL3OdirUgAKFoiIYqMKnKvF0GAsYi4BMR/YoPYJSqgwHqR39UUrCQLK1EECtQ56WELLoDEDCVZVrQvLbb8T+hREbYyU6oc8dmQxas8SHgAw63fGZyDOf8dK6TtbB5OePAbkWWxmhDEdKGt2BhIYEZVqCLFShrYNLeinIMTpf3VxGiDgzmig/LxD0F3LDeVhvzmVXMPK9VNhnjp21/NkmFPy3BA9heMit5PVkIomZINVBgKT6g6GkRgNqapNIe1WexofvtR6cnZ0otnBwVL/JMtLfVAOhkr6gp68cG6bJK4ZVMVnpWlOaLNrCBbEBXbIWoqMgZgDNZz0H2KKIr+qCrvVRZv2ZbMSmJ+FBLQnhBsnjswfsNYHZb2IhFAf9IDXDE/cdcxhTN1khENraceQGqBYwVpESdjvCKPVbmxn8gWeMBxiXdefq60laHMBevKzvF0ZSVXy0IZTZkig4A5sJuYaREj5JoaBBOad02EvJQ0+tNWFrISCMQcJXRoyFAavMS8PkmnoyQI0a2AgJ9moJS4cCiRwvvElAEThNCCHQ7ZZV7IrCY4AwsY6hAengUYw4JyUIEArJdwCkcFdEgvQuyZm4SjUSD/rPiLRnNEtqnz81WDKhUMZAkZDYiZweSir89BFD8lccWT2fBYDe0/ZxV9WilB2W4CKl0aUC+wABkHlxO8sDgpMt+87pLtQSlFnhUc5fbKa9ShIaZWJRpKkdYhb4OTW7uBP1kmowrFm1NVL/ZOOhZjGXjobCHsgEjpIhPZlu3q2BFINKmaJiVOixEw4VgYsFYxOLBKe4Y974N+2D0sQwD6ZCUxGWCqIsRlWp2FPz0yAqwZ5EB8RRqqjaaXitN6x0BLGjR5W1S5pC7sPfDUwn19TJv/sLPKxmy4iQPPj7QUJGQH6wAv4WgxYFJTkOPTVoiynFkTjV/rSkVChcimLBhvBJPKw+hZfOalbl/4nMwtiVhK293xttXr2rvjnaIQwT4/oTkGk6mhKi94+PJwBQf3r2WMHaJCNOFOwtghMILKwgMx9+2o5P6cZm+46EG/kBnYGZUDLTVh0uWCAnBDgHOWAbwSw4msIN4uKDsS8fOIFY9BpqSaEGxbOuUSx8y2HCFUatssKlrV0A38hhEXfJzIVfhANh7yuwPQUqMYuvd4z4BXd3PQsqtY0AP7syhmo96HEmkHmFHtgjLYAG1erYYmA+DlzbllLFmJzUTlsGzbpm2FNBHUY28vuHgCAsbz09XfgR6HQRzKqUJM1RJ1IH8gNU5tGPtJUpfj+Tjj2hQ40iRyVMhPjRJiK1hM9JPe3vxe0clBdRHvazYzy9zjJztFGDT39ZRIWQpUKpPnD/aAKa3dNb05wJMJfXpjYyi8d0xhogsXmUXRIV/kJ235/meVL2woTSBE8T+J5Se9zdOVayyW370Gqz39VSIFR/K7sWPURw9UPndcU6SofKGtd646/2VJXaEuFInAv31VklnOVPO7ZcvwZNVuwp2cologJP9weFKFWkpJ/s2FnxsZUZMjCWP5zam1K0bs9kyI6R2R1i2dK3FqQ99YnMfXd9R1jzaMpnIrx0ZSOu/hjUqkIRaubYiGY3AZCofACiXJMoPOBse1+4luNW3RDW1IXRGLNqa7exMhymuHbt48NKr1pLpSvDZZS3Fjzs7pWwZiZky3u/rjG9jKuLMm1h9ed3OcNTatahs0cWI8Pq6Gs7H1sVR/b/8W02jsTXd1rVVbEJjsZqi2fLCPrF3bHh133KaBbE+XFb4p0lW/Xmvs7Yy3dlu5ur6eWI8dW5fGtUMVNjeEI0rYN7s+HG0Mi5/pImR0YqZ4Oj8RjcXqfwIJy4ZsTr41CZ7kLrtzAlBKXn6p4H+gfKRnVRngV0y0A2LzRweJFkSRRrTSNVFtuDaGIk3NkWhzXS3qXDOwp83XMzAvNPcNQHlikI2UFUVCFNS0a44SbaptXhIcFSSAAKten65AJYa8qPhW5fdsUPq8bk3pbj/g8U6pHOTyRyUbsuNjWU11NS2dyRrhpvFoHU0QV00e9LfAoCXUgEGKwfKPRhtqp/03RTxOwVnDSiSshCNHx4DPGSunuLaXfRRQkaEqkR/fxO6Gp8cU8flNF32K4v0udpX5iRgE5PDcBdwahYKcL0RlxMLHKlc4ROQ6YV5ZTLSpqenI/IuKoupgSVN9/dPVq6DJqRATqTXY4bkLfBGPhNmeseJqhWr5ma/CzUg
TialaXTRWi5siakNDpFarVRtxQzIRSyYTamPkKe/jh8JFvEVqBxdBRqc8l58JGnhMpKfldZFYXT2ctKX4aaXfTbR702QLgmlLh7bmcTWpqBg6HsWDaL7QPhRvXdPddmiDUok1xe9M8gXTYtCNJyf7ISbEyU+puuVqkG0dMtnWofS1DuUPNsUiWkSti9Wr4URUVeuVmyCDFaWVkDkhUnUB62B7Rs0fSNctDzRHo3WBFqg7yxvrIUzyi/4dk15z8cLCw9fc/fkF8mcR/D9z5p77Wq3j4cV3nfrk8h+9cvrWwitvnezb9bu9fPvClgt/3nvlVHvL5uvXfv9ksDV/5usvxdXse9/cdNuJFzuf6TzdtOjWlfsvDO97d+2VhQ+Ov/i3o8+8d9H2nts+Nnq+8Xzna5dt++T9g6F/bDzUdcH4WPslIwOnzv9yIbJs/3Vfujr3fPSJPYMv7J/+3njTgrZL3fvc0B8v/0PH0P01d5P7r3BvueWh11v6zlt0/NlXFy84kfvohofWv3OHsWvDkYHF33m2RfvhqssWfDDRf9fpiwe++1bHoQd2oa/85qI3nrzg2CMPZ/fuHkGH72q58np74upA9+1HHly1/NC2E7889tcndi40lRU73/ni5KlXT/J9jz3Gdl793DV9byz76XPTp19aNKMtbXf0XYXFJ27f0Hlm2bVs5z13r5m48arbBx8ffPdr9Uefa//1+4uvWzFd+Kd7Gp/+xZGOxMmV22duevBEYrtzX/z4n44ORBfNBD/8vfn2D5bfMHjoXw+dvuzi8za1zzTe27fjtzsK9ff+pevM65mZ92994Py/b71+744Lto1/YYdy6sOx5uBVbe2vHfxV5s8/Hnz344PTT019NPXO4WOXBlvIMwtFeBYtCD/YuX8NxOrfsj9W3A==
|
||||
eNptVmtsHNUVNk4l+lBboyoWSlSYrIJ4xLOe3Z3d9dp1wc944/gRPxLb4Cx3Z+7ujj0zdzL3zq7X1G3zED9IhZhCIxoQtIntBcc4hKSUuklRS1yjNopaodI6RTSiRaFS04RH2kIR9NzZNbGb7I/dufee853vnvOdM7unkMU21Yh5w6xmMmwjhcGCunsKNt7lYMr2TRuYZYg62d3V23fYsbWluzKMWbS2uhpZmp9Y2ESaXyFGdTZQrWQQq4ZnS8cezGSSqPlz5S884DMwpSiNqa9WuPcBn0Iglslg4RskjoBsLCAhg3Ur5egColSjDJnML/BDA+UFkzDBxFgVGBEciuGH6FRIEVvAQD8vAFP4FgWWwfzc9pxGHMqEHOBwL85sg69K8NlExzwuzVOGDd9ElbCKTvx2Q+jEBlllyyF9E8OwYxAV63wrbTFRJqKhmRq3NGEvAL+U2RgZsEghnWLYgBgWZJU5NkeS/BLf4+yLiWB5y4uQckwv8Rzrs+daoGYiwzNgKKvp+QTFyFYyCRtTR2c0MUKLLiqmiq1ZJS9fg1C0E7CZ1kwsEDgxtHHIH08Zr4+NM9ikWhYoIkVxgCB/MiHBNmQNDEsR/EI/xbwo3DEHPkIeSrJcCmTSHCTb0wkvt4CSxGEC4NmQTl4bk0PETQt2aYY4uiokealL9Lyy+fkFLGTDRUF/1Lu1ZYOubKbh4tKz857+76IrcTghnZBRwbE44nJmoSKamfZNTPB6gqQ1G6s89yXQ4RWmJDmCFQamE8MThQxGKtB5s6xiMkMoc+dWq/0opA2DBrCpEBUCuM+lxzWrSlBxSodkzoCkTOyV0Z0ZxdgSkQ7Zni56uc8jy9I1BfHzal7E2ZIERc7l2uMZrkAR+sdk7okuINEQr+7OQ1uaQsAv1/il58dEaBnN1KHNRB0Bn2nLO//5ygMLKaMAIpZa3p0uOs+ttCHUnepASlfvKkieZncK2UZEPr5y33ZMkBZ2C03d14YrHV4NF/IHAv7osVXANG8q7pTXLj9d5YyZnRcVAhjuj6VpBSqrYXfpvURCSSWSRj1WsqktzW0JXVFk2tm2oyOGe5zwjuZ2wnoZceIo1cgapYaIVJMWA9FgLBCVo6GgGPBL/oA/IA7pkZ4QYptb7XErqbY5ba2JptaIGmlubEzmovH49q0kFslmDTO8pbuzq1EfGmENrbQlrKYNv71FGbWtkb7+DrM50j/Oss5oz0guO9CdqxOAnZPV1Pq2UXOcRrZGx9XIgDJkDGnbu3alMrG+djnUltkaMGPbWHss3N9G7W0r6EWDQVEqMYxIco3EP3PL2tChnVnGPRyJSc9Af1rQcXjvNKSMOXTPJOgQn3m1UBq0h7rar0q4crIZNOme6ss4VUIgKnQpTAhKQVkIyLWBWK0UFDZ39M02lcL0XVeCx/psaPcUyLBlWfIFJeOYo1idabqu2E9xsUMlOX2YpCIeswjFYomVOzsg9hRfMWK8+Xixs0Rip5GpjXth3VOe6nPjYzlVcVQ1k80ZUmxcDmlJ7CipEyUXmBU8DBASDeoeDoWjc6WTZd3NwF0lMSCJUmB+TOSzTodxCPn0vkvvOepOhiHZL11rwMgozEq3IHvVkH6x0sLGBgiWx74KI8disZPXN1qGCoFJLBycX21F8Uo2gaBBX7rWoARxSKKzY8vWoqa6SxthkaiJoVCNjIIhGYVxtEaOYhRWg0hNQtJTaijwMz4OFUDhxbSIzUSKYVxrLO8uVRlojM+Y+lAgHIrATesEzVR0R8W9TrKZ8DvQOgFeHTpB6tGmVrEJKRks9nr6cwvNg50NHfGmFwfElUISu6ziH4qCSaippVLTvdiGwrgzik4cFYaljacBq6dh0D1Ro8oSTqo1Cg4rIUmOiY0whpbRPpPdJJ+0BaQD96ziHs+E6n21shzy1cFbv74mAmXy/nbsni6O/oUb7Fv3f77M+6zRt/228xWp4uSFTZVPr39scve+nRU3nu5te/Kmh+5LP7FOP7qW7P7ilb9su8XYe/6tV86c/ceZj+rKK+Yrv1QxIg4OkreO/Hdx/o2nhl9r+XDX+08+Fn3/Y+PbhbMffvTLswuzvb9778trL/UXHpqNvnrwjXuOVN6mXDhYudQy9LeWp67sm1rXcfvoNxZjC9vrvp6OH1iM31R4+5mdj26/+XT5Tl/Zdy99ctGeX/fy6fLvL91y7OY9enfVV5rKDt322obGB3+0pn0xmd24Ntb6x80v/+HyvgOd5w98syF+Y05qu799/f0zb//LvsvY9ZMvvLD/UuTX3/vPPx9Z6P7rvUMPunef33SRDewQf3PP3J/XP37ynaXhz10+l3vz45kN32p/nS588PBcaOujyctfW+zt+hPb+MOBg3fuPPvJu0dG1Gc7Hr+v445nXx+4dSou3/nv3N+/U9nUcPHg3f7fv7PxVxHp3MjU8N7nyJXLX92UQC/Ovqs/fOGOQ8MfrCkr+/TTNWXH929q+UF5Wdn/ABcdAng=
|
||||
@@ -1 +1 @@
|
||||
eNrNWXtwVNUZh6JVp0619j06crpDBdq9m30mm1CKIQGyCElIAiQQjGfvPbv3kPvKPffuZuOjYGlHR0pdxrF2HJwphKTNUAQDSrH08QdWa6c6WoeiHdROaUsHa3211kfpd869+0oCrTN2phkmuY9zvu873/f7fY/L7eM5YjNqGrP3UcMhNpYduGHF28dtMuQS5mwb04mjmspoZ0d3zx7Xpie/pDqOxZrq6rBFQ6ZFDExDsqnX5SJ1soqdOri2NCLEjKZNpfD8VUtuDuiEMZwlLNC08eaAbIIqwwk0BfpMF2GbIIxUolkZV0OYMcocbDghxF/quIAM00EGIQpyTOQyAn9MjaGMaSMCxhcQ2Am/JeSohL+3xabNLnNQHuTwXdyuLwSCAdvUCGhlBeYQPXBrsNqU1HwdtRPdrFrHhU1Z1Ua9RajNzCMZGyjlW4wKYK1jKriwpCKgfJgpUtaDPYgypMPhsE6WTFW5KRjQTYVo8CBrOVLclHRqUFjFHJtgPdCUwRojwYDwhHCpU7D49oxriADC0vJl080BrgTeOjhHtcIAI9iWVViiECbb1PJWBZqR9wIRI0sNgkx4odMR8Dt3NY+qTVRiMJojQYRl2bWxw68MCIwN3oaFNmGu5rAQWssIDybfmIc9wjmlEGKD5SFIAl0cJAinTddBIM8G7/CYGlxEyhGBNw2tAHIdm8ILhtZ2rWJCJTOoZRGHBVEaNpuZDMAYYSWHDRnU+CdRiOWosEQxdUwNwIUBGNRBfhDB2QiysZElKEM1QD7zjkJ1WOLvh1UqRIl7GURplFMFbMGaxHdPdYJMHcwPJKWxPFjljJRhgYVMNV1NQWmOdd86gdsQxMHCNgSI28CDZdlAKtuhRNyJRfyiNljdVSK4UzXTHESuBcJ8JABSqJEF3AWoIWuuQgY8Jwih2Ch0ZARuKBBBPJq2zX+AbRsXBHz9B4araQKhCslgOCCI2TQdShqnhJnxPc99yP0BwmWndHzfP/Aq1G/0G8j/WcsZzt1e9oqAUFNlRSSEekpUJ8OWRsHzAiQCUgxRA3Cni1igjG3qiFlEphkqozxJMzgxQwtIKBsKov7Acsojp0HUHYIU7GBvh4EZDmXNXH9gYUVvtEovB5EHXgOZdhYbdMRT6FMFPIzyFBInD71QXwC/ihTlg7HWhGqbPT7QTtU0wFRhT7MFKVUYUzEnZaC06aiQhRgB8HKK+SBTuNd0zmGuD1scUTYVJ/TD4Svf2B+onHQTtx2eYK6LZ3R4tNDjGnGmhKQmYl1+IPNU01BH+6o+7gHiGc4N8N0PlCipl4DaCGwHRzLTtWXibwZ2+GhVQhX5rR7QeMJsB5egBbDZd2IJU+C2hVU7OH7J8P8H7H07fCTWQP9Dxj3PrTmTAp4hfXDvnAfu07FWIgDUSZ4QeUbludeLIMCYR1wA4j+xge/iRZdjvcTvWgpWk4WVKQIF6oL0sDgXQGKOkjwr2afyLf8TelSFrcKUGkd8OGTxKg/iHsBw6zcIF2DOf8cKYTubgRMe/AZEWawlBDFcaCs2BtKYURmqSKmSBjZNrygX4ER5fy0xWuBgNm/afPxD0G3TzaqQ37xq7mGltsngL337a1nS72npD4j+glHebopKCCVTsIFyQ+EJlQeDCMzGNKs6sDaPbcVvP6o9OVV66eygYIF/soXlHkgjwyVdQS8+WBNNkmNaVIZnFWk272yryBZEpXaIGpKIARiDtQL0iDyKvD+qyXvVxVu0JVOSmB+FNLQnBBsXDozfMNaGpbVEBNCf8QBXyk+OaxvcmRrJ8R7aUw8gNcGxnLTIIcNOVR6rcWM3EV13j+0S77zOhdJWjjIXrKs4x9OVF1ytCGU0a/AMAubAbmKoPEbINRQIJrTYCg95OWl0q9Ce61w7NNppDRoypIKXmNcnaXSQBCG6VRDw0wyUEhcOxVN4Fx9sYGjhWrDtQHaZETKrCM7BAoaWcw9PAYxuQjmoQgDWyjiFowI6hBch9szNwNEokH9K/HmjOSDa1Jn5CvMGsC1PyGCAzwyGw/v6AkTxAxKXP5kKj1XQ/jusqk8rJyjLTUOlUwH1HAuQcXAlwXOLMzzzzegu0R6UU+R5wVFpr7xGHRpialajoRxpDfI2OLk5BfzJMxFVKN4OlbVS76RhPgmCh84XwuUQKY1nIsu0XA3bHIkGlVVS5jSfOtO2iQFrVYMDq7an3/M+6Ifd/SIEoE9UEoMBpqpCXKHVefjTISLAmkAOxJenoWhcXchP6x0DLWhQxG1J5YJY2HvgqYX7+oQy82GnlI2pcOMHnhlpWUjINtYAXtzRfMCkBifHB60QFTlTJhq/1pWLhAyRzZow3nAmVYbR8/jMS92+8GmZWxCxnLZT7S2r1ram2lfwQgT7/IRm60ykhpq84OHLwxUc3L8WMbaJDNGEOwFjm8AIKgoPxNy3o5r705q9/pIH/UKmY3tQDLTUgEnX4RSAGwKcM3XgFR9ORAXxdkHZEYifQSx/DDIF1bhgy9SoI3DMLNPmQoW2zbyi1QzdwG8YccHH6UKVD0TjIb47AC0ViqF7b+/o8epuAVp2GXN6YH8WxWzQ+zYj7AAzal1QARugzavV0GQAvLw5t4IlM72ZyA4su3XTreMq4dRjp2ZdOQoBc4r7az89PQiDOJRTiRiyyetAcTI7Qi0Y+0lGE+P5CHOUCXCkQcSoUJwYJMSSMJ/ox7y9xQO8k4Pqwt/XbWamsc9PdhI3aPrrCZ6yJKhUhlM81AGmNKfqOguAJwP69GQyFD4wLDHehfPMImmQL4pjlnj/aPULC0oTCJH8r3DFMW/z/uo1JivuXY3lju4akZwjxb3Y1uvjk9XPbdfgKao43tI5XZ3/sqxuPBaKRODfwRrJrGDIxb2iZXikZjdx7IIkmyCk+L3wmAy1lJLi87MvGRiQMwNpffGG7JplA1ZrLsS05ZHmoRUrcba3a7jdaV+3PJYabBjMFlYOD2Q1p8NJSpGGRDjaEA8n4DIUDoEVUobl1tu9tmt1E81sHNJ0pU9elogn1VRnOkSdaN+GzX2DSke2LetEM1GKkwWroA31JIyEZrV1t/eyle326kR3eO2GdpZsvKFlvYHTI+0jcjifWJfIdnd2Dxl6slNta1sjL0JgspujyuL1XWTNmtb4iO029uQ72szw0khb/Tol2bmivTllFmJdHYkOK7FWxdG+KpsbwhEp7JtdH44nw/xnfwkyGjGyjlocjdc3RL4PCcuCbE6+PgaedFx2+yiglPzq8XH/m+jujhsqAP/saCsgtnhsPVGCKJJEK10DRcPRBIo0NkXiTbEYWrG6Z1+Lr6dnRmge7IHyxCAbSctKhBiXVdcYJMpEy4wkOMZJAAGWvT5dgkoMeVHyrSru65W6vG5NSrVOeryTqge54jHBhvzIcF6RXUVRc3k93DgSj9E0ceXMIX8LDFpcDRgk6ay4JxGJ7vfflPA4AWcNS5GwFI4cGwY+58yC5Fpe9pFARY7KRHx8g93R8NFhiX9+03ifInm/S11lcTQBATkyfYFjDkJBLo7HRcTCP6leYROe67h5FTHxxsb
GH8+8qCQqBksaE9GjtaugyakSE4nq7Mj0Bb6I3WG2b7i0WqJK8eQ8uBlQInKDUh+LNiRjyRjGcRKJRxuiDQquTzfI6bDyI+/jh+TwePPUDi6CjE6dQvFkUMfDPD0tjkUSsXo46aLSp5VuN93qTZOLEExbGrQ1D8oZScbQ8UgeRIvjrX3tzatTLQ/3StVYk/zOpDhumAy68cxYN8SE2MUJWTNdBbKtTcZalktdzX3FQ42JCBwg1hiTY+G4LNdLSyGDlaSVkTnKU/U41sD2nFycVGOLA03xeCywCOrO4mQ9hEn8J8LWMa+5OD5779y7Lp0lfuZs39nMXkBXfuPVdz59z+RzRbI3d8vZY7qtxuj8PT8/sHNoC7mJPpt578VFbTv2HnpiVeiat/8uvSq92xt+7PDvNz52+sa7D526Z+4Vr133ym0nn7z/y+65v/2p+5H3jr7zr9PWrnVnXpAGlIv/+nF0JvXWjbvf+mjbnqsPXo4utU+cfmf3xInVh/dfctOarU88dKr75E308LNk03Or+3745Ok/jp7ac++V/7z3/QWfk88OJSc7xl/aavYdHj568Mo9f95qPrVs+2tbrnvR+sSeHfOuee34RV9s+khD70+Pf+q+T37mvh1nY723NNRd/vTeBQ/1fWf7d2e/fM3Ox/Eld+Xn21+5P/rbd7c8+va378k/f+aIOeeOO0JX37HtgSve7A3nGrZ1PPW6Nffpkc2XvYqHjcnfbX//7fgbd96WjCy9KnPdxP3HP/bKhovnHbpz8szQwj9c+5t5u5ZLO59JzjkefPNc5sa/LP7oif1fu+Hdy9oue+D6w1/d8szrhZd+ELz+1OMnfvnrlmsvf+PSl8fvfXrnN+9esrR5/s7P/2PbL14s/uzhXbnFF81980hwl7n+W0fUi/r3z6vbcXwA4nHu3JxZZw8sG2qaM2vWvwFt8Hmz
|
||||
eNptVmtsHFcVTuOGRxMV81AkkKrcrkoL1LOe2R3vw49EtreON47fduI4ity7M3e9Y8/Lc+/sereYtq6popKKDkQVjQSqHWc3dRwnaazGTUl5toAIKiA1loESpBJVohAghZb+oZy7u05snP0xO3PvOd/57jnfOTOThTRxqGaZd8xrJiMOVhg8UG+y4JAxl1A2lTcIS1nqbFdnb99x19FWvpJizKa11dXY1vyWTUys+RXLqE5L1UoKs2q4t3VShJlNWGr2dxXCIz6DUIqHCfXVooOP+BQLYpkMHnwHLBdhhyCMUkS3k66OMKUaZdhkfsQ3DZxFpsWQSYiKmIVcSuDP0ilKWg4iQD+LgClcBcRShO87RacRlzKUARzuxZnd66tCPsfSCY9Ls5QRwzdRhdbRiT9goA5iWOtsOeQGy1atZIharQxSsIniZeYoC6yZpeLsrrUgN4+1AWk/cEMaRQYcFBtk18bQh2DFsFSi86VhmwmyJRiaqXFLE9Yk+KfMIdiAhyTWKYEFOJ4NBWWuw5FEv8jXeOJKNWBZuxgh6ZrFmnOsm/e1wJBz4QYMpzU9O0QJdpTUkEOoqzM6NEJLLiqhiqPZZS9fIyrZIWIOayZBFuwYWg5Kx6vFpeGQFDGplgaKWFFcIMjvTKitAwUDw3IEP+qnhOuBO2bAp5jXVRVgk2agzkWJcqUhnLBchgDPgaxyWZgcIm7asEpTlqurKMFVVqZXVIyfH8DGDhwUpE+Lp7YdkLTDNFJ6LNoV7/7voGtxOCHdskaRa3PE1cxCRTRz2DcxwesJ3aQ5ROW5L4MeWmNqJUaIAsqYmDg0UUgRrAKdP26qnE1ZlHkL6xvtDKSNgAaIqVgqBPBOD+c0uwqpJKlDMudAWSYpltGbGyXEFrAO2c6XvLyz2LZ1TcF8v5oXcb6sRIFz2bg9xxUoQOuazFvsBBKN8equLEwEE0l+OeIXz44LIGvN1KHDBR0Dn7xd3H9l7YaNlVEAEcrTxsuXnBfW2ljUO9GOlc7edZA8zd4J7Bgh+fzadcc1QVrEKzR3bQxX3rwVLuiXJH/43DpgmjUV70SxXS6scybMyQqKBRjetJhXoLIa8VZuDA0pyaGE0UCUdHJPrHVIVxSZdrTub4+SHrdmf6zNYr3McuM42cSaxMaQGBkWpHAgKoXlcDAgSH7RL/klYVAP9QQx293i5OyE2uq2tgw1t4TUUKypKZEJx+P79lrRUDptmDV7ujo6m/TBEdbYQh+qUYcNv7NHGXXskb7+djMW6s+xtDvaM5JJD3Rl6hCwc9Oa2tA6auZoaG84p4YGlEFjUNvXOZZMRfva5GBraq9kRrtZW7Smv5U63WvohQMBQSwzDIlyROS/hVVt6NDOLOUdj0jBk9CfNnQceSIPKWMunZwFHZLLPy+UZ/xMZ9stCW+fjYEmvUt9KbcKSWHUqTAUEAMykuRaKVorBtDu9r755nKYvttK8FyfA+2eBBk+tCr5gpJyzVGizjXfVuyXuNihkpw+TFKBjNsWJUKZlTc/IPSU3m5CPHa+1FmC5QxjU8sVw3qXiqrP5MYzquKqaiqdMcRoTg5qCeIqycWyC8wKHgYICQb1jstSZKG8s6q7OTirKEiiIEoXxwU+63QYh5DP4rX8iqXebA0ke2mjAbNGYVZ6BblYDfHVtRYOMUCwPPYtGDkajX7/9karUEEwiQZDF9dbUbKWjRQw6NJGgzLEjEjnx1etBU31Vu6DhyElHAqJUiAcDteAnJSALEo1Ck6GpIQoRTAOvszHoQIovJi25TCBEhjXGst6K1UGHuczpiEo1QRDcNI6pJmK7qqk103ELH4GWofg1aFbWD3T3CI0YyVFhN6i/rxC7EBHY3u8+aUBYa2QhE679C1TMC1qaslkvpc4UBhvTtEtV4Vh6ZA8YPU0HvAWI6oskoSqyDiSDIpyVGiCMbSKdlN2s3zSFrAO3NOKdz4VbPDVynLQVwcfHA2REJSp+MXzeL40+l+7Q9vxjU9sKv4q9Gd+1PETsTL29w/veps+cUyWvxie+unj+x5b6p7yLa/cZxpnzt3zyb5/X+3+7WK837/z8nuXr/91J5p6ajI5deNYduHIth3f2//a5pGXHnz3Wu7ahavvXtj9bP3LHyx/qMcfOF0RPvin//754rWTR+6s7tohXfhh/eIL3V+4ulv71v1jH//cdnTwIPnqdx4+ObUd1z/9bfvYk7+ZySx/KRI55r3xWGXT+2/9/voLW/8z/fnnOjqe2vzOwBa57lHf1ODR003P0R9n8w3f3fbM8feff+XNXWfPxYJbYpEbm++yXt8hHH1a+duLF70Xlz/1zi8+86Ay9nbua1fapr9+im69PG0WIt98Mv2vysobb9bo/mfzr88Edn6wVPHWx1rOv3rvPXfX/Tr7Rn3u/p/lP7t0+Jdb/7G3NVz76WtHlru2XdHEv6Qb3vvBYWXX9KlTR5/vnLm+nO3KfpltuTIwVp/TnMPhLX/41cN3y5E7H4WcfvRRxaZ/1vsDk5s3bfofkcIdZQ==
|
||||
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
@@ -0,0 +1 @@
|
||||
eNqdVXtUE1cax9faVo+PwlrcbTWNlVZkwiSZhAwxbREQeYQICRB0VxxmbjJDkplhHiGBuirW3fVR6NRHK/VYK5BY6kFaWKx06aq7ler62NZlF9pdPV2sZ0/Xx9YHdreu3ZsQWjj6V+ckk8zc7/H7fr/vu7c+7AeCyHDshIMMKwGBICX4ICr1YQFUy0CUXgz5gERzVMsKm93RLAvMYDItSbyYnppK8IyG4wFLMBqS86X6takkTUip8D/vBdEwLZUcFRx8oU7tA6JIuIGoTl9VpyY5mImV1OnqMmj/tKiSaKDivBTMpvKwXA2rAgEiEkPFuVSkzALGxQk+dYpa4LwAeskiENRrf56i9nEU8MIXbl5CMA7xMSwDrURJAIRPne4ivCJYG6YBQcESG1toTpSU9vGgDxEkCaA3YEmOYli30umuZfgUFQVcXkICKapaUaLaIGAWRIlR2jwA8AjhZfwgNOKrdBA872VIIrKeWiVy7MFYgYgU5MH9y20R/Ahkg5WULhuEkpGbuiIIOWZVWk0aqkE7AogoEQzrhaQhXgKiCvHR9ffHLvAE6YFBkJh+SmjEuX2sDScqrVaCtNnHhSQEklZaCcFnxDrHvhdkVmJ8QAlnrrg/XWzxu3RhvUarhZ93xkUWgyyptEaZPzzOG0hCECE5GER5E20fJcgLWLdEK81aHXZAACIPWwZsDEE3SRbrW6Ak4PRH4Vjv7Lflj2p5IS6xJQvKo/Q6aDlFpcNVViKo0qE6g0prSEfxdC2qyrE6DmbG0jgeqMM7DoFgRRfUIntU/TBJy6wHUG2ZD1S8N6I4rCYCH/YiAgI8JwIkhko56ESKR4YGyc3qHGkyhBPcBMvURtMqvVHpa2oDNRQpUxTtr/GheC2mZyqBTLq6Yi68wEXSQECIT1RadDoca48tjbLfBotFES2KoNreAOxePxdEZH6k8xGYw8+QAImKGXE39QQQAfLlZXwMJD16j403jG9AUfS9+w0kzgPgRhDG0Oj1wVgLAfigshGA34fBcBz/7YONRkPpoQluMvaMtxLBWDRanU98736DWIj9qHgwMGqNMJQy+BR8qNAZjDodpccBwE0USWldqMlIpBkwFFAuDAP4EcgMQ8IoEcV5TpAgRyTcy6SgMpjiIwKRabTotQa9EVZqVjEs6ZUpYJcrs7hIDaJZxQvAyxHUIdKFkARJA2SkSZVwVnlhhjU3s80OQWZynIcBr3w6IbGignRVVPosRo+uJruKcuMrOLEIz3IKOVnLNC4es5dWcIyXqyn0SwZraX4p7ccQbRpmMui1egMUVoNq4HAhrhyrS8xzLZcyZWpZmkHAHCa0iGVdSwMlJiNZ5sQqUK9dyrU6soPVOCpUlcullZ4MqsKzMsvHFZUVL6UraatdtubZglWFOF29vEAy2opILcMHUWsaXhL0cUG+CmM8jiJYIiHRllQz3Hx5BpJuic0WAmcLGZks/ehkmVVUlBiLZvx2alYth2eDjfUGzSp7hGEAfwkfsDMSsBRyLBjcAYmR/QxlsVUSORm5ATLHSaOE1Z1dbs3RWWXjUr2MVUmUhOb5nXyw2qErNGWPYQbHMQSNkWNEMVO0Nb+H/gNRdTuRsVsFYuNHDsEwy4ks43KF7HCigKC0kV5OpuDJIIBQ5jKkOKNc6cKxNCPAMdRFulyUUY8juRlZHaPRvttYWiLHSpjwwsbzk0onrbeo0zFMrzarfITFZIQzFj0qN4Qijcq6P5x4bP7Wh+Ki1yT4/fbbbQ4rtyBteu83ZbPSP9tw1fTTpw9bP55oSXimh9na7Ux8fLP+9ysfLdlM9j104+K0l7YPvP96XN3c6V9//dWF+Na+h9cTbvbYwLz+X15y1wxfb3yj/S/s4btXV98rIOpunp+/tqXiunSnzB36zdKmdSe/zO/ftPPlJYvylxw3xif3t2/9V/9/vxSqj6xsfH5Wzge75Fz34t1TGpasOvvq24N3a7udz5XdOb5ry5FN65Li4hqHDt/Y1jDvNHI6nKo837om5/Kw4eGj1FeGyo4EzZaiM7cmr9ycV93yi4vxv65xJHQXDc/+54S3GyZ+Uv+EunoxfvTZTuHK0TcfxejD+ya+Vrxr6owDl2byR/ac+/zar+ZvWLCmq6Do88eSX8nSooVPEqXdG/uuzDTk1tUXPJH/v+wpmllg5nPuu9cnXqtLnoxd3kvzKUaxLemxAXvS8t2TXMNTEx/B6YWZSYsHz3RnvVXMzhhqsWXuO534xwS05yfp67vyZpuu/eE6vW1DvHnyli3ystv74qvBI9TCvDXTmrRzuhZsIhdLtydvdTkuzHZ4+045lv24+Uft97o33rk+4Fk4q6v8+EfH3i1l/103r9s+cKckfDsQ9+GJ7XMGOodfHmiT5trPzTn1JPaPaX/qTGhIJg+pdse9fups8pWTMrbYkiS4hxpDq/ccWh7yP+P8KzbjwhsTEpsf//vcV2/e1N+41dOwumdH66KKnU3dTX/mdITiLDt04Cn9flv8+nNDn86s8L6bW770k5dO7TpRvXVmn/+yNHj26H9uzC0puuU+WmlZ23zxhKL+RvBXb8v+bLt551527oyGL6a/tuete7vOruptPIX7L56/N9mZn3vSV5zUN5x/MtxVcFtbm3q1/wulY9GOGi/TXHBk74vNc6aue0FdgCe0djC4f4f98pD549CZ/i1NJUPP/u1nTSfPX1oX7dNJcb/rM882TImL+z+l5DCL
|
||||
@@ -0,0 +1 @@
|
||||
eNp1Vs1u20YQRg899NZHWNApmgCkJJKSbRnowXGBIk3dBLGCFG2KYrgcSVuRu8zuUgpj6ND059SL0DdojDgwgjaXHnvzoYe+QPo0nSVFOU5dQID2d+b7Zr6Z5dPTOWojlHznpZAWNXBLE7N6eqrxUYnG/vA8RztV6cndO0ejZ6UWr6dTawuz1+0mmGrFZ4EupRU5dkoTIBgbhB3I4YmSsDAdrvJurlLMuiDtVKtC8A7PoEwxiAOjpEQbRL2o34uiYTAPP4j3e12upAOFJ4lKq9c/H3s5GgMTNN4e++rY0ypDGnmlQe35zKPjFqVtNi0+diPvwRTsh4bZKTKVpcSDzaRaSIaPIS8yWhwzXkoUY6Vzb/n18msyZCpjMXd23EzIMWqUHA+UHIsJLR8vl6dThJSw/fTyoPEajKoCV+fvQ1FkgoMLXvdb4nV2n9AF+xM6sjr/5aayKu6GnXi7E8YsT7tugSuNW+1aCd2o02PKdHPgymxFcWeb5nQUNJ9ugc63+ywDOekWFaVDbsWdMKSfO1FUgjhtHdytdxgfT7oara4CF/itDCfAK9Z6bFGcfRHs50+Cj8E6+JSCQW8QDUfhoDcMh1/+AbRn0lkg5Fw1tAKRrs6v7cSczlL2YJcPg348SIMhJONgnMRp0t/l/Z20/3t7eS2h1fl7YCmyhf0o/K2N22coJ3a6ehb2ey80moJEh98/NxZsaZ6ecEL+91+n68T/eud2G/gfXz0OyLwM7jW2b6Wr19eSKNyOx/0oGAx2toP+zi4PdpP+OEhwp0//CeIQ33Z8Evb7u2cHToG15FdnM8QigEzM8XJyX72d25M6aH+OpqXPoiE7hIq5+LFwsNcb7kU77JPDUa3d74iPFnLyz7svnIhpzEnDx15G9yWvDmmyu70dLX1PlbYordtbU3bDjbA3uvZG/6Nnc0nQbKGFJb8sJUcsAT5jVjHQqpQpi6NeL4h7vR67ecCAFhYkc5YKwxVVHaZMyLpsNE5cJ3CGQXJBQNghGlUoC7kAdt2JS0tyUbFbGh7d6DyUD+WRyuvacgYQdCYukF6gE9JwLYq60dCEZ9QN9tztsMMcwdvCTNkIkgwtu847LB7UYG/s0a4wzOSQZYyaSMVsc2hcE1vjbsEelTlqAeRY2Mphqs3S/QQJ1pyIUkwSQis3iNeRfTOm60h22C3LXD6AwDNqYlZNNBRTwS+zcfGcoGpyzciamEhTRyZquN3X5WzNzTTkYpeOsGW4T0WuJ0i+sqwRZp3ZC7KG4HGYU2JTNtYqv8S5pVp7eTNDD0DPwF8nygExuLG3psUKTS0i+I+IfLYgmlO2AKJHh5CX2ijtoudcX5xvwlCTjRuymxQcbA6NSMdr4sQ5iIYXqXWYgKRortKQA0PFQKqkJpaaNtlve/cvtDAHLVRprtQDmWSZmDXp8NmnmKcko8/BaL9O4dG01GVRwKzmUbUiZZDmQgoqamoGc2zB+GTLWNNcNaJ+XyTo5lCT/dEVWOuIkjkrSM8ViWWOmSoorUlVc2sB12SvKsG6eMmGayeasKnCicKtQgr1mCwpMqWJ81xk4gm0NfdGhfvMlJReaB7L/dkMUufUZzchqTIlm4mzum9MVQOqq0FjTqpp+gU9xRR4zVwHYbHvmktFqaN7OcxcIxL2ijpr+kLbq5rXdwNuoXSWduhp9tcPvgfGUJjpM8JbUsc0VhX36HNDSdpCmX5jSy09n74K1t1TSOqoIzVDSW023Gl7bLsSxwPfsxTJbLMyiJbLfwGzkz5r
|
||||
@@ -0,0 +1 @@
|
||||
eNptk3tUE2caxkEPVqytLtKDLF5GBC2USSYJgQSlSwokXBIuJmCAFRiTL5mBZCaZmQBBscUrKtZmvSHqUcolFpWArciC1kM5rq7iahUvwbWwVnrQKrS4wiIqOwH1tOd0zvvHN+edb97n+z3Pt95eACgaJwn34zjBAArVMOwLbVtvp4DZAmhmY50RMBiprUlJVqqqLRTuXIwxjImO4HJJ1MJgfI6eJPUGgJpwmqMhjVyGzAdEzWpSa+12n6mnUILJYawmEEkBHQVoLGeiv0RjwAHbwbWR4WGhiCgMEfJEQgQOM1OhpjC9CQ/DCCESZmJEIM9iFgm0uCCUh1koDmoy0a8nWmhAaUhWNMG4Br/5JQ00FGAitbBULham8sTmcEWqUWrShmNxMYjKuuR3OiJ5gXwpW4jwE0thBpkpEGSqZdH6jHjJColEJklTJhVKYbk4nkqJS0o3pKXKiuNCY/OVoEgLVpPJ+QQRHc/jh0qJzEQyLVFCZxTChQKMKBaspuWq4rTQMLW1KDUmRy8UKxUiOwZQLQt7S30aqxyW6Fm1tmaTlaVLwK9p01w+R8CWQ6LRABMDxxIaUosTetsJfTFuCoG0QGdAGVA32bZVB3OD66NJggATttnq8wEwwagBLwDHoyfRwCoWvs25iAVnwDWo6zNuEVxYWAjrSMoIWygDcM0A2qYi2MUVZo2EJ1HanEF6AzwpkCvg8HhsQS7PXSJFHARiQWthl7lcS8ObcXJA6BnMVs0PEx1lOZvYMIENdTSDMhZ6fY1rUudFuxHQNKoHXyYnvoGyw6GGpRRqBHCyaTKB9UqJIjZ5RbwsPqkuhUL1RtR2lCBhDarBQKMaViuVcApFMq9PXoXYY4tMODvRdlZBEiEQwoMSUALiicUIhCAREwXJFKomFRtJWsca8JatXYNZiHygbVbDv4X2VomdIGkC1+ka33Tf7qxxuVKnBBR7iWyttAbV6UiDqwORBBSnUqVMtvi/N+Pcb73Io0liGaTBUIpmU2thdLCoJh2lrLvqkilcjxNH1fDkwr4CsKoBVRPDBsB2VoVZQiC+GFKgVoiP8IUQTxiBiCN44a5D2iUGBlYWaGxODiaI9I8IDRX4L4OMaCRfKOYjCBKCCWC++A8aJ6JdeCcwUKSB3f6GeAjErmiGpNiVES2CWfciEXZpoRk2uwVs5LSsrIlrX8q6TbEMzk/tWrh9upvrmTJ+NvX03mPq2Z09Kzv3F8k27LH6ekmhlumk1/s/rGopd0OHH5/jVnpmtzilJa3H2urbfh7MIclBdJ69ZOHMo0l3R3rIwcS4gwWL/iHavaxzNOpJ4Ie5V4N7FnbcBWnzm9VTU8XrTlkDlo88XmRaGtszS2S5QO/0SZhW5YswibntFqJsanlX22eH6u2NP/xUUvrdtz1jmv268CNPN4deyp/q2P0X88m59J6jilIO/cXJ6UEHZPN6V14xB15pqv3oObJ1SP1E2a1TvYxZAA4+6UgcuFch5829xy96EtyycUXBO8SnAw6Z1wXk0E9pGbG5h5AXUefGn7WKD7UN2YYlo60VNcn2wbyF1ZySs0fmbm4YfzkqVtzgyD7cgt/Z85IjvaIXvX9eP+4x9t9fQ35k3knNbibHhQjxBByoWtW9MfDF1eG5X/JG+3NF2v7vzxizToGhr0dPK9obX4QUJ3XInv9PcunBzi0P/T+wzd6XF+R7K/j+Ee81Tb4R4/4D56XBy290rL3g8J4Tdc5r5bW4FOF5p+NRLH7c0hG/MwW72aXr7a99VFa71tbYBbIPhvvdviS/a+Svkgc7vBIanns7+oPmrO3709Llec/SiL/pYwLOktfSYnK52/yNOd7+nsHGFRnklCUy912+UZe5vZdmf2wdiFKWZ+0oc/y5b9n8BQdGH737bIZ6QWCDqHjAus2ndUeZ0s8zbMGc4ZP451tzd/6zb2f6fWx8361KA7f7G92mEz43j+ObAJbSl9VpvlmOg1PZJf0X1v+9rLJmhse32xNah1cNbmq/6Bwi50E9HhWDCfKAX6qT3R6ITbv95i/pSe/u2DvCedjRUqrov3NKbVH/u/uXMeLVxuU+UNMntSt/9YK8vzMkeZu2OVt6x8oey6OoDbozLbk2Q+U2X01w/oltX/Xm7k/ogeuM66Yda34pb1fd8A+/7Hfw46y4w+I0zbt5nvEZvOH+HyXuiz/zaH1vlkIxS7ouIN0vqbzIb8TZ7pnJwTLVB6ZlWtY89nhwpyTqouq6t9n/SnNMoaBWt+FgtjNXPM26N+urj/rGdmELQqaZ8/EdZfNVw90nlzYs+9fpgGvk+X3cxZVVn29vaMRX7c7TzS/nOPx4/8EPret95SU689f3Cu/kC4RJ0feup9ZWP66a8dQoLe+8t2G7rZR79fDAdagrnfKaUtFUK7+cnvO1r9PMlG413obngPbU222zstNvHQ9qOjbw/T76UWD1TGZNUrf9XlfA4r6Rh9YvvJ9vvh59ax6e0DfUURy1x7zFmt9UkTTzeQjq42juFjx99fPhT93Dcu6vL/dwc/s/IPGG9w==
|
||||
@@ -0,0 +1 @@
|
||||
eNqNVQtwE9cVNTZQkpSWYBIIhHijODYGP0urH5JpCcYYGhx/RhYYSMF52n2SNl7tLrsrxx+gsQIxnRLjrSH1J5gBy1JQ8G/shIJjhsC0SQmf4iHFCp/gDi20Ic7Q4rq006pPtmxDMZAdaUZ67557z73n3rsefxESJYbnJhxkOBmJkJLxH0nx+EW00Y0keavPhWQnT3tzc/KsjW6RCb7olGVBSlWrocCkQE52irzAUCkU71IXkWoXkiToQJLXxtMlwR1lKhcsLpD5QsRJqlRSo9Unq0ZMVKmvlqlEnkWqVJVbQqIqWUXxmAQn44N8J5QTJUJ2IoJnaUyEKOT4NzgCFUOXwOJDO0G5OcTYedGl2rwee+VpxGIgxUI3jYAOGIDEcxySAQtljFdt9jsRpHG2V6Ke9Dp5SVba78ugFVIUEmSAOIqnGc6hdDhKGSGZoJE97CWZKJVkOkCF/Q4VSgkUIiQAyDJFqKMYSDJkOBanB2TGhXi3rBzIzrEWrHh5dUa2b9i10gYFgWUoGIarX8cUD0aSBnKJgO6/DoRLA3C9OFk5lDZCVp1bglXhCE2KQZtCtt0dmoWYt08Yuu+6+0KAVCH2AyKKK75hcMvdNrykNGVBKifvHpdQpJxKExRdRv09WYpuLpyo4k/PvT9c5HI0nF+XQpL4036PZ6mEo5QmO2Ql1D4qxSgmoNVodUBjBBry0D2+kSyWAIrHIZR9mpaRCrKIc8hOpZHUa94XkSTgTkZv+TBMdkseL5YUnfrMH2m//TmZYw3xjHcZllfptjrdyYTWTGTBEgKHNhCkIVVjTiX1xIos68H0SBjruEK1W0XISXYsVsZI9/gpp5srRHQgfdyOCSaMZSzi+CzjYmTAcIIbN8PQyIChI8Wn14Sf4PxH2ovIhWsUjj2CSfgOGAnJSmc4X6AxAK3ZGsmaXBdMHA+NG/s+ik2moWgLHm0/RjGCSfwumAdQ1K8LqsaDR7bXCDuv/oGlGLUc4+XVmc3mR/h9WNGI8ZD/J+hw6vEPsbxbymFr4qHWD+QTiJAGDK18jH8XaEgyPZfSivblNmdxxjqLNr8YMktz4WFJFhkKuww3ssCLMpAQhVe+XKIEk/EeD2+hH+tIg86I2SwiGI5i8arNc9uW8WGq0iJCEBHLQ/rwGFFedECOKR0akzCBYLzObNQZaJ0NIJudBnqzaSEwm7UksGm1JlpvIhfqaWNjEQOVAN4WhIPnHSxqpeyAgpQTgeFhVvzL1manZb2cfnANsPA2HgtihVg4jueQLw+JeH8oAYrl3TTe2yLypS8HlrS1SqdZv9CIzAaSok0a0kzpQEa+pW1krEfH1hte+kMvsHJfuCSc4zfR9XG/mBI19MTgbyjE5mXycxZO+++CmIbpr6HPxB3RR2acbtzYnyQ6H4fLKju3/lpXX7vvWnNfaENV9s3J3zAT/lbfN/hlRVVe79u/q+waGLj86SExtOVn/wSXL/12MK7/yn+O3hEH35jlUesMM2r7lOPTrCsXLelgK2/CKdXx8+Ec2yd1TnfHkfW1mupW49en678esMe+PW3bnt4T/zhbEZe5p/9icYg9cV1ZslIlmdp/OuvGc/7DVeXPEqt9m6btubo/qIt7fl/sTbLh2KTcJtFTf7p0Kv/V5qhrE2daPLvAydpd0+nJcY9VUQlF24O/+ujOD5sTi1fF73+pvG72n1Ydn7Q4aUvZ5kl7V2ijl4LD7WV//qKgRpLN889Qla73zn3Q88Rba/++te7z6C/mfZI2Q7P8wKVV+lm+JMmYf3tTQs0HZ2x0zLyMc+dfW2xfPHvXJpWQYdzRsK35rGNxbJczoetfJwvubOuPrvz3Uu73lz49teuY+EJI/eXkjl+unfltdML3Kta03frm+2kH3rljCUwVnpGe7VN9GD/vhneJI8lROpubnNNz7dv6hhvCPtm7+x1Tnif/LHl0y5lzT0t1LVezU+SEuVdje3svdJjKd95wNU+nY6p/Tpw7inZWd94yrUwzHOs+35B1+8rUOUfMx1f8Rauv/InlWs0L717fQL9+pfOCnRk4teZk9Mrr7ETLU/MTe1bffjf90GBmhdqy94817z9V1X/d0/2j9Zv8PTHFSS3zdojlN7+amynk1tZ9FFpwobXvVuvzNZ9/nLc09gcXl7xXtGH3icEnGvndJkuvCNCUD0P7L1ZUlbQclcxPvtq7IaTdun2jd731+B/qL599c2Bn2YuvhLqr//pSc1f3XI/SIHn3OHrapl4mzs94Li40IdyBMVF7E2duz58YFfU/s88KiA==
|
||||
@@ -0,0 +1 @@
|
||||
eNp9VQtUE1caFnWFXa2i9YVonY09IpLJOyEJsCziCxFkNQUfRRySGzIwmRlnJoEoqGDVrYoyVay24oPnlqWC8qgiKr4ttWrdKgYV8aACuroqbn227p0ElLO6m3PIuff+H/d+//f/35+sEjtgWJwiPcpwkgMMZuTghuWzShiw2AZY7rNiK+AslKkwZtYcQ4GNwZ0aC8fRrF4qxWhcYsYZkEoxKawEw6U4aQYMII1AapdLjRaMkxopK00A15WFiZTJ4axaKrJSJkCI9CLMaKRsJMdK39whdYVYKUFgVgy1K2k5GihLRHGS5RibkROJRXABMKtIb8YIFohFVsCyWBJgRfoFS0UMRQB4rY0FDEQaKZgOycGDOEjEj0U4C0AowgRTQlJIKpVEQBomkEMoM2K0kQA3U4xVlBEvFpEivVx4iqJFetJGEBklFoCZoExrCi0Uy/EV/514MUwF0BxfMFE6sdy9RqEMlAknk/jKpCU4LUZMwExgHBAjS1jOVArZkcAlNV+aAgCNYgRuB6UCdxQmRHJ8Je2AspOoIHaaVCZRaCXyPV1JoQQgkzgLXyDXysq6zzgHDQRqNIEbMeFmaTJLkX9jAEtD9cHKYpbDOBubVQh5gbNnSrq0y58V2Z1ec69RhZMhR/6QwWITIwodEoU5EIVMoUbkar1Mp1epkGlRhrLwrhcN731xr4HBSBY2AjqlW4ISo8VGpgBTafh70y4On4rOxhx8lU4VqAE6s06hU5lUSqBBp8TNLofBcMxoAegcF3++ZPK86LCoiPCDYVBolkUFNrD0aBhBUKnoLAZPwkl+98S6NwVCrbjJRIBUjAFoV7Pz+XbFd28BNAPbFCpIpQDY+vkKde3bGCs0CUwCpWi3M5zXYQtjaQlutEgPBVJpxYiIA1Ya+oezMbALEblEJpxRdEIK3HWvaWEtCYQ7K0527VwxBw0lJFwn7v80u+xHGh0JNCAxgnO8wdKwooLH3gkwgIZOE0j2CLlvs+IMJZQ/gcOYJABNgQh93TNAMK575BlOpEfugIGCCfIISgsicLgV8EVKiVwpd378DlCIQhVRGGC75OSLZBKlSvvtHDfCgFuFhnAGcBSHEUEmGxOiCdQpJLIgaExjiMggHCOzu3oWgXAgOpQGzWSnHKiNdtvf9RxuBG42hUq5UlubhkLpAQGvh+5wfXfNL1hPjcwpejfurh8KzSZUDZj4Qo1MJjvyP4HuJuGLBJSspieMEjJzLfl8kjrcM8QAK4aTgnJv6ah1zvHvx7yfknPc/0V381LrdIHqsrTuh1DcxDs/BiBQBxRmGapUJZpQlVqtRBONMhWKYWalViXXBJp1gYWxGOPgy8Pcc6vbtMXuivGlRoKymeDsYkBFt/HfGLtQmG0lsNVQ1m7kKy3KEBEcEkpREGLFQrQalUzmmvqZcPYwEH+yT8vYdV69XJ8+8O/16/W5V0nfQO/VCaD+65r6Bh79u51rHNpvyFA4e75aV+o19VbZFvWXlXnbcsv6HW6w3Zg4KDZnS/r0AVVxB1uwO22hN3J2fjqAvPnXczsuRgTlnY2Owxj9zvPjA0JD8pCays5/tI4931SStGdx3YtNdQdWdY5p4tuWTTfxlSnbR0wqaBl+OqmkrdP56J+vCheHeM4gb15Y+lFlxybNH6zzJP+OyLvo549t3PQjod2xPSpu09ZLpiexve2jQ75Z8Oq370+Obm9Ai7PvRQxrvDV853Y6f8gCdSm2Z6nW97boYfy0ev+cUZ+am827AsTJmQseGouv7HviPaxs3Nza1IQ/rct4Lh58lM49w3WMumL/KdX7gldsu/ToyJi0sx53Kjs4Rei26JvO8yMe/vq7GU1Hp4yP/qm2HUM+H71Mm5U65v6iyEne+Uuac2JWbL35rONJXXrki6iZv2/8pClvQ79mZuOBV4sGrpiU4+g/ZuMMjwt3z8XNvXa6r3Ra0V7p4F8OKoI3JosPfnHL2R40dHbQ/DtAcjl9afNXHmHfzxh67zOv9GPX79fZLBeGNM0dEft6/6qwK2n7ci3y8qcLW49ebvSvPzkh2NscI11cUdRn+NzaG4m/PR6U9eDFML/42nHPLpw0DMQrxq20z8Q9M9NH1ybOnzNp4hiN4cCOylMxqHr1J7za6/lfNhetvzbsxaLdtuA9QQGbxpBesZN9SgvQ/ovajRuSs4YHGrgZVYM/HFAwoKzw0pG2uqIPIoY9CR3p2/Do9sDqQZuP3Zo3it21uSV78v2x+4oTjmuCtkX7Vbb2Iai1rf6jIjw6t1SN6Lfg53JZgL/85XRnS+NeP3/TM8vB/EeH1n8ZNa8aLcqe7xGQ4e8YQntK+76u3U2FPvU+vLPqni634UHHOp+FCdXi3ZlPj1zL/+WMIzK47315ke+p1folOyv2IVO+lS08VXe59Lh+be/PHzfp0TmTFuf5Tnjmczt/RXJAs3N5eHy1393vTtDFvs3fPFhftPWl4u5x/UfeN15m+iSGVmz+2jNGXNN+f9GBppH2NfszNZvT2iM/2H8o+8NOK1j4xeP61Rknkq/2fxmN/ZAYj6/Zt39+5/QKzzifY/NlhoWrQjoOJ9nqy3atmjezxf/qBPhLqX2R4CnNOG7dG3iiJnvZzOrld6vn/nhmqsLLOTjYkn36wdaMU6e3TsBr5DFNmTlttg3LR1weSvzgGfD0euyxJ2X28t5Pi24g9Pm2l/EP18a09L604Zqtteniv65P+6PhHHnz5zS6cbmKDK5TrKRDr0xrjWy4O9yv85SPb/FV3vm8LDh4TclDMn3ZsoxfpS4b9+mVbcz9c4Jnr17/AcjtYQ0=
|
||||
@@ -0,0 +1 @@
|
||||
eNptVHlwE9cZN/EfOEMoDZMMmSRutusATUYrre7DqFP5kOI4wsaSje00MU+7T9Jae3l3ddlOUoiThnFSWIcwCRmmAWSpMXawMTnKYcpRB6c4hbgtGFzKUPckTU0urgb3ydiMPaDZ0b593/He9/v9vm9dJgYlmRH4ed0Mr0AJUAr6kNV1GQk2RaGstKU5qIQFOlVZ4fPviErMqC6sKKLs0OmAyGhDktCkpQROJ4iQB4wuptdRYaDo0JbIwqlcqYBAJ0ffbcE5KMsgBGXc8XQLLgksxB14VIYSrsEpAR3OK2hjNYpeLmNKGGICS6MLYBFeiPMYTIBsRkwIYlSUh0xQkDj8uWc0OCfQkEWBLAs4QBi1esIWIBheVgDKp8F53KHX4LIiiLiDj7Jsdi1BwOGOIGBlqMEVyImobiUqofuQWutzmTAENAJlQyosyIraPbvMXYCioKgQkKcEmuFDan+omRE1GA2DLFCgBmuWFboLVcPDKSDVrgiEIgFYJgbTN2PVXiCKLEOBrF3XKAt893T1hJIU4e3mrixGBAKOV9RdLOBDCGCG15Fao9bQmyBQoQzPImiJrE1Ni0lEF79vtkEEVATFE9NUq2lSq7dpyfdm+wiy2ukFVIVvTkogUWG1E0icxdQ/e1+K8grDQTVTXHn7cdPGW8dlECl69PTNySwneUrtnCLhwznRUJGSBCWgJOo28r0ZbFjIh5SwukNvsf9KgrKIhAVfTKMwJSqvSyE24PFjmWmFba8onyHxXM59qRLEjHrAH45qMIMd84IkZiANZkxvdpB2h9GKebz+7uLpY/x3pKDPLwFeDiIaSmeIz1DhKB+BdFfxHcnuKQZUGBLZtEjp6qhVlJjYlEI4kMiS6SQ1GC+gugUJTq2obAAyR2UFYRBDWWjkn4oBKammKyQmxPA9CSIrRGQOZY/rjsoEhS4tAZbQ700QSMSQZTgGgTX1P93BiFi9yUSSH93uoQgRiJo9ZSFJcmC2WYIc4gPVOSeJ0W7ff2evmURmu820d66LDGfdZLtF/uh283R0p95CcnJ3YsadYGh1AK0bSH1jHNphwhyDChcwcHSTVeaMQXtTZBcVvAkccVMJaqakbqXLW1bc5UN5iwUhwkD1zLzFDQ1UsCHAOV2gtMiXqDSVB+ptq2KltYG4PS4Yy40NJS7SZ/a6qzi32ShwlgZmFaG3mmxmo95otRJ6LeoYNFmqGynS615dQ1X7npICghgyS9Eaj5cqC4Y8T9FRUBtYGanR2i0lsMTlCnuqkrStaRUng6Qn3FzTZE3QriK3X4wa7I2Uka83hJQgmeDjItnskb1UdX0iXC4oETdpp+tNhZgIlLBTV4gmn8ggnJzTAiaQgImb8jXOyLcQo4UsFc5bg6oQewJN6QqeTRZiPkih4YbegIM+RoHOlQIP0z4ooQZVuyhWiNJoekkwXewmqlx16h67yWqB9qDJYLcZrMCmJ0pXV/XOdMitDkhlR18GsAohxyi1P2x04g6TyYgXIok7bRakuKnJvxY1qYT8f3tX5JH2vJypX26bz9t+lvz+gcur75HLFsR6Pz9d1jq2+wR8TKOUL+p8oO7Bs2eOdRQMva0pv9HS91Xf1/78n18OnmsxNnuW9pWcvIRdc318cc9GfKjk+OJlf72fq9vPfLlM+6fSwa9SQ1WDZNHTBzfmLblQdfr5E/dtMj+wTTyaW6YZbX5/yLi559S939Z0DC5+/+/ODf1L263R7lcXjbWtH6lbeoy6ej3/DVU5aljfu/y/yfLagR39W3pLvjiwt6n7bnrZwt/nr/1jR+vPtvbNz1XGX8d6F75ycmNj90VXMi//0IXMQ/GJ4xEvuHfeqXPP1v7vriul9YPxZlPHnpeXZ1oP1h6WMyvEJf4jpW9uGRpaKoY8BT8sf/3RT2s7fvPnUy3n3YfXtmz8ZMH1J/nDL/3abi1S78b8ibfHvQH8yspz37NMPNt0ZYXtS7pIc/WhFbaByd1twrc/2TWOV4IvCn/X6h/Ydy3n4ppL4z+6ceiZyGuLJ9bsFxvfjBmGk/70Sy0jvuF8YWinfDK+IC//l9/lRfZ39J9esy2d/GSScm5Yf/6dovljOVffuSab/nPP1u1vVdVdeKyIO3to9AzFuHsK2E2rBnY+3CQs3PzPF/Mnvl5++NLeq989UVDt+qYN7qbPRgdr2H1ln+peE8UPHim6//NfvPqx9MYLm7SpR8uspVvokeGfLvrm5ZBm8sPnt24K/WMJ6F17cVGROvaklT30QbKGzrRPdMWOxj/7wXEa/PhG+998bTeG5/97p/utzWPn/9B6ZLjyUnX9+J76oYcLVtT09F9f75ys8Fx/HOjmn/lM/0LniVOM7nLkL6Hh2MEjI/8aaUCqmpzMzXn8WGfuUG5Ozv8BEYit3Q==
|
||||
@@ -0,0 +1 @@
|
||||
eNqdVH1QFOcZh1gnjePo1MFoGRs3h8Yv9ti974OcjhzINxg5YA5T6d7ue9x6++Xu3nF3YvyIHUcl6BLJSDQlHoTTkyiK4igKDRVDY0ajxigxosbPTEWjJWJriX0PT2vbpH/0/WN3dt/n6/d7nt+zKuQFokTzXGwzzclAJEgZfkjKqpAIlniAJK9uYoHs4qnG+QWFtgaPSPfOdMmyICUnJRECrWZpSRYJRk3QSV48iXQRchLJswIDhsM0OnjK37ttqYoFkkSUA0mVvHCpSuQZoEpWeSQgqhJVJA/zcjL8UQK9p0mI7AIIz1AwN+Lm+AoOAT4iEhHhnQjp4QDt5EVWtey3iSqWpwADHaNFoBJLMAzKEDL0hZFlwAoQkewRYTpMbYR/eKFMUCXjiSroAAhWlewkGAksC7kAQUEe1jS6eElWdv07st0ESQJBRgFH8hTNlSut5QFaSEQo4IzkSkQCkkyFIQ4ODLOnhN0ACCjB0F4QjqBEIXROVloFP2SSQyP8+ZIwtcakxpuj8FHZLwClhRAEhiaJSJSkxRLPNT1J/d8Xu576MYArl11KA27At4tAEiDr4O0mSYawpVWNsGDweU8oSn+wIOcp0r6YuMY0WLxyxObyJCIaM5JH+BENptEjuD4ZMydrjUhGnq3ZGk1j+8ny9thEgpOcEGH6U25CpMvDuQEVtv4kH0d8kEYv70c9wpMWoJAfL00CVKZZoDSaDBpDpw+FXQMMzdIQ3/BT5t2Ak1CW5jyw5pAGGz69Cc9bioAlaA7W8J/WuNlsNuoNHT8fFoJ0KTtx7NnpVf3vyMMOYfOTA2Mfed48agTlI/pRMjJQDRqdrj3SSklCI42DCoB0MHwFyot0Oc0p22Ye8KFuHmZ4xktktDjSDykxYbp90VtB5H3+Z1dBXLsnehFVK0pTSi+iIwidTo+ZTWY9ACaNQe8gnbgJd5gMgIQPfDfpREmCdAH0yZwooTR7/ty8LGu4EMiolefdNKj5OjaurIx0ljlYSxajznQbCyTSWuQpmcsHSvBsYE2rKM8vXmzNUQe8EmkQPVmlLj1nR3GjzqTX4lqjEcXVmBpX42h2cUW2UzCkaT05hMObmxfITxVL0x22N/gsI29wLvaXOnJLOSw1Q8ZpE2YtVhc5fXZtDu5kjXb/Es4l6u2CwaQpKM/m2TRrfqAiINishiIsF+iX2CV1hdlFaQJluiw6Pz0FEQjZZUlKgUtDoKEeLNHxRuF4o0+GW/t0uFMQio801vKc1lOQTKjOAo7xpyCFgISrA74JFhTSMrDk8xzo3QhJ8XhpylJWXMSZyUxNunoB6xPdaTypyeaLHE6vCc+leCOrw3QggOvS4Tg8xwqmw1EsSowB05mGx+1fZf8fFTUVQgUBUQmTDO+h4EoSQZN1Hrpgrl3ZZ9YZDcDsIByEU6/TUBSaXrKg5ammn2m2MbLPQgQjo5KXVFpdWosqWafTqlIQlrCYDDoMG17kK+FaEaF99wsPJq//ZczwGSHb8vhJxtFDs1bn9zuY2wHDfsuBKbdenb5u0elF73ddzvzm11/VmGpNbn7048HU+LsZX9yYeU+5a7Hc627RfnCw4VHHkYMVd28futjX3ta5pL2t4sKFCykX+grd8QlDl6iz8rHq7eeKu65l2YN0Vd7q7BPylgv9F9u9Z36Uz21Gqg9/l3u5+aWhojG3hmozPmoeqR218LOmR3/tFuQB5pCHfOF61Yi6O91T7lHJb5ul4qqRDaofPwkO9spLHnTgxcEFic2zTyi+aTNiyjdmIilfKCswf1Pz9xOvz0jd0XP9zow94qT4P95aMX76/Pie2IyHtXtzgxMnXlmjSuietebFh5u+r58c++7np872LfzNqy2DcQknY2dX/uNv8bM3/OLbi36VhdwTHrN/Wg1/9ZNt4wZck+deP3/o4q5zJWz7ovrkG+mfXvlmdOEb221x46vSWl4Z++H5paffLLKv7W7X+Bw7NVO2zmvbOHbbhPq5GTZfMA2bImsDy1c49v/p2kP3D8GNq5C1MY/uVxW7b1b2ZMfW9mf1iPLQ1LfWdg5teDRy/JRZt7GerZ1XR91cV/rxW1de77oZBvsXX6aplOLd9z8qMJavH7X3vQln103SIg+NP/xqBPreUfPAJnFEYY2lPXUreYP4Mp2pq1YLzduK+gp/r9ROX4l+WNTXej/1sG3Wi+HwpYxTyjjqFdXd844v+cn9zXcGdXO6QnN+Fzz6bfHOMzv7a1aXkvaxIHXH4U3Ld3SgM3ort9SQCdVfnbwzp3ScPUttPFt3mFpOfLyGPnXt6ImVwY5uqzclewXmG1PY1Jqzfl+zqTLp6+OffoaVTBy9NxS3YUJswhi90VE3MB89nPv3g/WnW9UrD9BtVR2Vjs0NDbYzpYfi5lU/uAZmXkua7fvA+xrjP/mHTkfvm7WD3eWdht3v9I7vPLYmbfPU909QtX/WHTuOTf3u0ssn/1L2Tt2G9Yuzx722bL6/cXPeAfp4cMFA6N0t/W1M+GrX6/UHtpw7tzw2Jubx4xExzmmXlq8cGRPzT06hzwc=
|
||||
@@ -57,7 +57,7 @@ Despite the flexibility of the retriever interface, a few common types of retrie
|
||||
### Search apis
|
||||
|
||||
It's important to note that retrievers don't need to actually *store* documents.
|
||||
For example, we can build retrievers on top of search APIs that simply return search results!
|
||||
For example, we can be built retrievers on top of search APIs that simply return search results!
|
||||
See our retriever integrations with [Amazon Kendra](/docs/integrations/retrievers/amazon_kendra_retriever/) or [Wikipedia Search](/docs/integrations/retrievers/wikipedia/).
|
||||
|
||||
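As a rough sketch of this pattern, a retriever can subclass `BaseRetriever` and call an external search service inside `_get_relevant_documents`; the `search_api` function below is a hypothetical stand-in for a real client such as the Kendra or Wikipedia integrations linked above:

```python
from langchain_core.callbacks import CallbackManagerForRetrieverRun
from langchain_core.documents import Document
from langchain_core.retrievers import BaseRetriever


def search_api(query: str, limit: int) -> list[dict]:
    """Hypothetical search API call; swap in a real client (Kendra, Wikipedia, etc.)."""
    return [
        {"snippet": f"Result {i} for {query!r}", "url": "https://example.com"}
        for i in range(limit)
    ]


class SearchAPIRetriever(BaseRetriever):
    """Retriever that queries an external search API instead of a local store."""

    k: int = 3

    def _get_relevant_documents(
        self, query: str, *, run_manager: CallbackManagerForRetrieverRun
    ) -> list[Document]:
        hits = search_api(query, limit=self.k)
        return [
            Document(page_content=hit["snippet"], metadata={"url": hit["url"]})
            for hit in hits
        ]


SearchAPIRetriever(k=2).invoke("What is LangChain?")
```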
### Relational or graph database
|
||||
@@ -68,8 +68,8 @@ For example, you can build a retriever for a SQL database using text-to-SQL conv
|
||||
|
||||
:::info[Further reading]
|
||||
|
||||
* See our [tutorial](/docs/tutorials/sql_qa/) for context on how to build a retriever using a SQL database and text-to-SQL.
|
||||
* See our [tutorial](/docs/tutorials/graph/) for context on how to build a retriever using a graph database and text-to-Cypher.
|
||||
* See our [tutorial](/docs/tutorials/sql_qa/) for context on how to build a retreiver using a SQL database and text-to-SQL.
|
||||
* See our [tutorial](/docs/tutorials/graph/) for context on how to build a retreiver using a graph database and text-to-Cypher.
|
||||
|
||||
:::
|
||||
|
||||
|
||||
@@ -11,8 +11,8 @@ This need motivates the concept of structured output, where models can be instru
|
||||
|
||||
## Key concepts
|
||||
|
||||
1. **Schema definition:** The output structure is represented as a schema, which can be defined in several ways.<br/>
|
||||
2. **Returning structured output:** The model is given this schema, and is instructed to return output that conforms to it.
|
||||
**(1) Schema definition:** The output structure is represented as a schema, which can be defined in several ways.
|
||||
**(2) Returning structured output:** The model is given this schema, and is instructed to return output that conforms to it.
|
||||
|
||||
## Recommended usage
|
||||
|
||||
@@ -109,11 +109,11 @@ ai_msg
|
||||
|
||||
There are a few challenges when producing structured output with the above methods:
|
||||
|
||||
1. When tool calling is used, tool call arguments needs to be parsed from a dictionary back to the original schema.<br/>
|
||||
(1) When tool calling is used, tool call arguments needs to be parsed from a dictionary back to the original schema.
|
||||
|
||||
2. In addition, the model needs to be instructed to *always* use the tool when we want to enforce structured output, which is a provider specific setting.<br/>
|
||||
(2) In addition, the model needs to be instructed to *always* use the tool when we want to enforce structured output, which is a provider specific setting.
|
||||
|
||||
3. When JSON mode is used, the output needs to be parsed into a JSON object.
|
||||
(3) When JSON mode is used, the output needs to be parsed into a JSON object.
|
||||
|
||||
With these challenges in mind, LangChain provides a helper function (`with_structured_output()`) to streamline the process.
|
||||
|
||||
|
||||
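A minimal sketch of that helper, assuming `langchain-openai` is installed and `OPENAI_API_KEY` is set (any tool-calling chat model could be substituted):

```python
from langchain_openai import ChatOpenAI
from pydantic import BaseModel, Field


class Joke(BaseModel):
    """Joke to tell the user."""

    setup: str = Field(description="The setup of the joke")
    punchline: str = Field(description="The punchline of the joke")


llm = ChatOpenAI(model="gpt-4o-mini")
structured_llm = llm.with_structured_output(Joke)

# Returns a `Joke` instance; tool-call plumbing and parsing are handled for you.
structured_llm.invoke("Tell me a joke about cats")
```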
@@ -21,10 +21,10 @@ You will sometimes hear the term `function calling`. We use this term interchang
|
||||
|
||||
## Key concepts
|
||||
|
||||
1. **Tool Creation:** Use the [@tool](https://python.langchain.com/api_reference/core/tools/langchain_core.tools.convert.tool.html) decorator to create a [tool](/docs/concepts/tools). A tool is an association between a function and its schema.<br/>
|
||||
2. **Tool Binding:** The tool needs to be connected to a model that supports tool calling. This gives the model awareness of the tool and the associated input schema required by the tool.<br/>
|
||||
3. **Tool Calling:** When appropriate, the model can decide to call a tool and ensure its response conforms to the tool's input schema.<br/>
|
||||
4. **Tool Execution:** The tool can be executed using the arguments provided by the model.
|
||||
**(1) Tool Creation:** Use the [@tool](https://python.langchain.com/api_reference/core/tools/langchain_core.tools.convert.tool.html) decorator to create a [tool](/docs/concepts/tools). A tool is an association between a function and its schema.
|
||||
**(2) Tool Binding:** The tool needs to be connected to a model that supports tool calling. This gives the model awareness of the tool and the associated input schema required by the tool.
|
||||
**(3) Tool Calling:** When appropriate, the model can decide to call a tool and ensure its response conforms to the tool's input schema.
|
||||
**(4) Tool Execution:** The tool can be executed using the arguments provided by the model.
|
||||
|
||||

|
||||
|
||||
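Steps (1) and (2), tool creation and tool binding, look roughly like the sketch below, assuming a tool-calling chat model such as `ChatOpenAI` from `langchain-openai`:

```python
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI


@tool
def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b


llm = ChatOpenAI(model="gpt-4o-mini")

# Tool binding: the model now knows the tool's name, description, and input schema.
llm_with_tools = llm.bind_tools([multiply])
```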
@@ -114,12 +114,12 @@ result = llm_with_tools.invoke("What is 2 multiplied by 3?")
|
||||
```
|
||||
|
||||
As before, the output `result` will be an `AIMessage`.
|
||||
But, if the tool was called, `result` will have a `tool_calls` [attribute](https://python.langchain.com/api_reference/core/messages/langchain_core.messages.ai.AIMessage.html#langchain_core.messages.ai.AIMessage.tool_calls).
|
||||
But, if the tool was called, `result` will have a `tool_calls` attribute.
|
||||
This attribute includes everything needed to execute the tool, including the tool name and input arguments:
|
||||
|
||||
```
|
||||
result.tool_calls
|
||||
[{'name': 'multiply', 'args': {'a': 2, 'b': 3}, 'id': 'xxx', 'type': 'tool_call'}]
|
||||
{'name': 'multiply', 'args': {'a': 2, 'b': 3}, 'id': 'xxx', 'type': 'tool_call'}
|
||||
```
|
||||
|
||||
For more details on usage, see our [how-to guides](/docs/how_to/#tools)!
|
||||
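Step (4), tool execution, can reuse the entries in `tool_calls` directly; a short sketch with the `multiply` tool and `llm_with_tools` from the sketch above (recent `langchain-core` versions return a `ToolMessage` when a tool is invoked with a full tool call):

```python
result = llm_with_tools.invoke("What is 2 multiplied by 3?")

messages = [result]
for tool_call in result.tool_calls:
    # Invoking a tool with a full tool call returns a ToolMessage that can be
    # appended to the conversation and sent back to the model.
    tool_message = multiply.invoke(tool_call)
    messages.append(tool_message)

print(messages[-1].content)  # "6"
```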
@@ -137,16 +137,6 @@ For more details on usage, see our [how-to guides](/docs/how_to/#tools)!
|
||||
|
||||
:::
|
||||
|
||||
## Forcing tool use
|
||||
|
||||
By default, the model has the freedom to choose which tool to use based on the user's input. However, in certain scenarios, you might want to influence the model's decision-making process. LangChain allows you to enforce tool choice (using `tool_choice`), ensuring the model uses either a particular tool or *any* tool from a given list. This is useful for structuring the model's behavior and guiding it towards a desired outcome.
|
||||
|
||||
:::info[Further reading]
|
||||
|
||||
* See our [how-to guide](/docs/how_to/tool_choice) on forcing tool use.
|
||||
|
||||
:::
|
||||
|
||||
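A brief sketch of forcing tool use, reusing the `multiply` tool and OpenAI chat model from the earlier sketches; note that the accepted `tool_choice` values vary by provider:

```python
# Force the model to call *some* tool ("any"; OpenAI also accepts "required").
llm_forced_any = llm.bind_tools([multiply], tool_choice="any")

# Force the model to call the `multiply` tool specifically.
llm_forced_multiply = llm.bind_tools([multiply], tool_choice="multiply")

llm_forced_multiply.invoke("Hello!").tool_calls
```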
## Best practices
|
||||
|
||||
When designing [tools](/docs/concepts/tools/) to be used by a model, it is important to keep in mind that:
|
||||
|
||||
@@ -82,7 +82,7 @@ Here are some high-level tips on writing a good how-to guide:
|
||||
LangChain's conceptual guide falls under the **Explanation** quadrant of Diataxis. These guides should cover LangChain terms and concepts
|
||||
in a more abstract way than how-to guides or tutorials, targeting curious users interested in
|
||||
gaining a deeper understanding and insights of the framework. Try to avoid excessively large code examples as the primary goal is to
|
||||
provide perspective to the user rather than to finish a practical project. These guides should cover **why** things work the way they do.
|
||||
provide perspective to the user rather than to finish a practical project. These guides should cover **why** things work they way they do.
|
||||
|
||||
This guide on documentation style is meant to fall under this category.
|
||||
|
||||
|
||||
@@ -157,7 +157,7 @@
|
||||
"\n",
|
||||
"## Next steps\n",
|
||||
"\n",
|
||||
"Now you've learned how to pass data through your chains to help format the data flowing through your chains.\n",
|
||||
"Now you've learned how to pass data through your chains to help to help format the data flowing through your chains.\n",
|
||||
"\n",
|
||||
"To learn more, see the other how-to guides on runnables in this section."
|
||||
]
|
||||
|
||||
File diff suppressed because one or more lines are too long
@@ -106,11 +106,11 @@
|
||||
{
|
||||
"data": {
|
||||
"text/plain": [
|
||||
"{'properties': {'a': {'title': 'A', 'type': 'integer'},\n",
|
||||
" 'b': {'items': {'type': 'integer'}, 'title': 'B', 'type': 'array'}},\n",
|
||||
" 'required': ['a', 'b'],\n",
|
||||
" 'title': 'My tool',\n",
|
||||
" 'type': 'object'}"
|
||||
"{'title': 'My tool',\n",
|
||||
" 'type': 'object',\n",
|
||||
" 'properties': {'a': {'title': 'A', 'type': 'integer'},\n",
|
||||
" 'b': {'title': 'B', 'type': 'array', 'items': {'type': 'integer'}}},\n",
|
||||
" 'required': ['a', 'b']}"
|
||||
]
|
||||
},
|
||||
"execution_count": 3,
|
||||
@@ -121,7 +121,7 @@
|
||||
"source": [
|
||||
"print(as_tool.description)\n",
|
||||
"\n",
|
||||
"as_tool.args_schema.model_json_schema()"
|
||||
"as_tool.args_schema.schema()"
|
||||
]
|
||||
},
|
||||
{
|
||||
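The cells in this hunk reflect the Pydantic v2 rename of `.schema()` to `.model_json_schema()`; the same call on a plain Pydantic model, as a standalone sketch:

```python
from pydantic import BaseModel, Field


class MyToolInput(BaseModel):
    a: int
    b: list[int] = Field(description="list of ints over which to take maximum")


# Pydantic v1 spelling was `MyToolInput.schema()`; Pydantic v2 uses:
MyToolInput.model_json_schema()
```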
@@ -449,11 +449,10 @@
|
||||
{
|
||||
"data": {
|
||||
"text/plain": [
|
||||
"{'properties': {'question': {'title': 'Question'},\n",
|
||||
" 'answer_style': {'title': 'Answer Style'}},\n",
|
||||
" 'required': ['question', 'answer_style'],\n",
|
||||
" 'title': 'RunnableParallel<context,question,answer_style>Input',\n",
|
||||
" 'type': 'object'}"
|
||||
"{'title': 'RunnableParallel<context,question,answer_style>Input',\n",
|
||||
" 'type': 'object',\n",
|
||||
" 'properties': {'question': {'title': 'Question'},\n",
|
||||
" 'answer_style': {'title': 'Answer Style'}}}"
|
||||
]
|
||||
},
|
||||
"execution_count": 14,
|
||||
@@ -462,12 +461,12 @@
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"rag_chain.input_schema.model_json_schema()"
|
||||
"rag_chain.input_schema.schema()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 15,
|
||||
"execution_count": 17,
|
||||
"id": "a3f9cf5b-8c71-4b0f-902b-f92e028780c9",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
|
||||
@@ -141,7 +141,7 @@
|
||||
"{'description': 'Multiply a by the maximum of b.',\n",
|
||||
" 'properties': {'a': {'description': 'scale factor',\n",
|
||||
" 'title': 'A',\n",
|
||||
" 'type': 'integer'},\n",
|
||||
" 'type': 'string'},\n",
|
||||
" 'b': {'description': 'list of ints over which to take maximum',\n",
|
||||
" 'items': {'type': 'integer'},\n",
|
||||
" 'title': 'B',\n",
|
||||
|
||||
File diff suppressed because one or more lines are too long
@@ -40,7 +40,7 @@
|
||||
"from langchain_core.globals import set_llm_cache\n",
|
||||
"from langchain_openai import OpenAI\n",
|
||||
"\n",
|
||||
"# To make the caching really obvious, let's use a slower and older model.\n",
|
||||
"# To make the caching really obvious, lets use a slower and older model.\n",
|
||||
"# Caching supports newer chat models as well.\n",
|
||||
"llm = OpenAI(model=\"gpt-3.5-turbo-instruct\", n=2, best_of=2)"
|
||||
]
|
||||
|
||||
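For context, the cache setup this notebook exercises looks roughly like the sketch below, assuming `langchain-openai` is installed and an OpenAI API key is configured:

```python
from langchain_core.caches import InMemoryCache
from langchain_core.globals import set_llm_cache
from langchain_openai import OpenAI

# Enable a process-local, in-memory LLM cache.
set_llm_cache(InMemoryCache())

llm = OpenAI(model="gpt-3.5-turbo-instruct", n=2, best_of=2)

llm.invoke("Tell me a joke")  # first call hits the API
llm.invoke("Tell me a joke")  # an identical prompt is served from the cache
```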
@@ -314,7 +314,7 @@
|
||||
"source": [
|
||||
"%env CMAKE_ARGS=\"-DLLAMA_METAL=on\"\n",
|
||||
"%env FORCE_CMAKE=1\n",
|
||||
"%pip install --upgrade --quiet llama-cpp-python --no-cache-dir"
|
||||
"%pip install --upgrade --quiet llama-cpp-python --no-cache-dirclear"
|
||||
]
|
||||
},
|
||||
{
|
||||
|
||||
@@ -212,10 +212,6 @@
|
||||
"[Anthropic](/docs/integrations/chat/anthropic/), and\n",
|
||||
"[Google Gemini](/docs/integrations/chat/google_generative_ai/)) will accept PDF documents.\n",
|
||||
"\n",
|
||||
":::note\n",
|
||||
"OpenAI requires file-names be specified for PDF inputs. When using LangChain's format, include the `filename` key. See [example below](#example-openai-file-names).\n",
|
||||
":::\n",
|
||||
"\n",
|
||||
"### Documents from base64 data\n",
|
||||
"\n",
|
||||
"To pass documents in-line, format them as content blocks of the following form:\n",
|
||||
|
||||
@@ -99,7 +99,7 @@
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"We can also just force our tool to select at least one of our tools by passing in the \"any\" (or \"required\" [which is OpenAI specific](https://python.langchain.com/api_reference/openai/chat_models/langchain_openai.chat_models.base.BaseChatOpenAI.html#langchain_openai.chat_models.base.BaseChatOpenAI.bind_tools)) keyword to the `tool_choice` parameter."
|
||||
"We can also just force our tool to select at least one of our tools by passing in the \"any\" (or \"required\" which is OpenAI specific) keyword to the `tool_choice` parameter."
|
||||
]
|
||||
},
|
||||
{
|
||||
|
||||
@@ -182,7 +182,7 @@
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"update_favorite_pets.get_input_schema().model_json_schema()"
|
||||
"update_favorite_pets.get_input_schema().schema()"
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -223,7 +223,7 @@
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"update_favorite_pets.tool_call_schema.model_json_schema()"
|
||||
"update_favorite_pets.tool_call_schema.schema()"
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -500,7 +500,7 @@
|
||||
" user_to_pets[user_id] = pets\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"update_favorite_pets.get_input_schema().model_json_schema()"
|
||||
"update_favorite_pets.get_input_schema().schema()"
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -534,7 +534,7 @@
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"update_favorite_pets.tool_call_schema.model_json_schema()"
|
||||
"update_favorite_pets.tool_call_schema.schema()"
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -583,7 +583,7 @@
|
||||
" user_to_pets[user_id] = pets\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"UpdateFavoritePets().get_input_schema().model_json_schema()"
|
||||
"UpdateFavoritePets().get_input_schema().schema()"
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -617,7 +617,7 @@
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"UpdateFavoritePets().tool_call_schema.model_json_schema()"
|
||||
"UpdateFavoritePets().tool_call_schema.schema()"
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -659,7 +659,7 @@
|
||||
" user_to_pets[user_id] = pets\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"UpdateFavoritePets2().get_input_schema().model_json_schema()"
|
||||
"UpdateFavoritePets2().get_input_schema().schema()"
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -692,7 +692,7 @@
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"UpdateFavoritePets2().tool_call_schema.model_json_schema()"
|
||||
"UpdateFavoritePets2().tool_call_schema.schema()"
|
||||
]
|
||||
}
|
||||
],
|
||||
|
||||
@@ -43,7 +43,7 @@
|
||||
"### Getting API Credentials\n",
|
||||
"\n",
|
||||
"If you do not have a PromptLayer account, create one on [promptlayer.com](https://www.promptlayer.com). Then get an API key by clicking on the settings cog in the navbar and\n",
|
||||
"set it as an environment variable called `PROMPTLAYER_API_KEY`\n"
|
||||
"set it as an environment variabled called `PROMPTLAYER_API_KEY`\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
|
||||
@@ -26,7 +26,7 @@
|
||||
"\n",
|
||||
"This notebook showcases the UpTrain callback handler seamlessly integrating into your pipeline, facilitating diverse evaluations. We have chosen a few evaluations that we deemed apt for evaluating the chains. These evaluations run automatically, with results displayed in the output. More details on UpTrain's evaluations can be found [here](https://github.com/uptrain-ai/uptrain?tab=readme-ov-file#pre-built-evaluations-we-offer-). \n",
|
||||
"\n",
|
||||
"Selected retrievers from Langchain are highlighted for demonstration:\n",
|
||||
"Selected retievers from Langchain are highlighted for demonstration:\n",
|
||||
"\n",
|
||||
"### 1. **Vanilla RAG**:\n",
|
||||
"RAG plays a crucial role in retrieving context and generating responses. To ensure its performance and response quality, we conduct the following evaluations:\n",
|
||||
|
||||
@@ -568,26 +568,6 @@
|
||||
" ```\n",
|
||||
" and specifying `\"cache_control\": {\"type\": \"ephemeral\", \"ttl\": \"1h\"}`.\n",
|
||||
"\n",
|
||||
" Details of cached token counts will be included on the `InputTokenDetails` of response's `usage_metadata`:\n",
|
||||
"\n",
|
||||
" ```python\n",
|
||||
" response = llm.invoke(messages)\n",
|
||||
" response.usage_metadata\n",
|
||||
" ```\n",
|
||||
" ```\n",
|
||||
" {\n",
|
||||
" \"input_tokens\": 1500,\n",
|
||||
" \"output_tokens\": 200,\n",
|
||||
" \"total_tokens\": 1700,\n",
|
||||
" \"input_token_details\": {\n",
|
||||
" \"cache_read\": 0,\n",
|
||||
" \"cache_creation\": 1000,\n",
|
||||
" \"ephemeral_1h_input_tokens\": 750,\n",
|
||||
" \"ephemeral_5m_input_tokens\": 250,\n",
|
||||
" }\n",
|
||||
" }\n",
|
||||
" ```\n",
|
||||
"\n",
|
||||
":::"
|
||||
]
|
||||
},
|
||||
|
||||
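A sketch of the prompt-caching usage described in this hunk, assuming `langchain-anthropic` is installed, an Anthropic API key is configured, and the extended `ttl` option is available for your account and model version:

```python
from langchain_anthropic import ChatAnthropic

llm = ChatAnthropic(model="claude-3-5-sonnet-latest")

long_system_prompt = "..."  # placeholder for a large, reusable system prompt

messages = [
    {
        "role": "system",
        "content": [
            {
                "type": "text",
                "text": long_system_prompt,
                "cache_control": {"type": "ephemeral", "ttl": "1h"},
            }
        ],
    },
    {"role": "user", "content": "Summarize the key points."},
]

response = llm.invoke(messages)
# Cached-token counts appear under `input_token_details`:
response.usage_metadata["input_token_details"]
```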
@@ -17,7 +17,7 @@
|
||||
"source": [
|
||||
"# ChatFireworks\n",
|
||||
"\n",
|
||||
"This doc helps you get started with Fireworks AI [chat models](/docs/concepts/chat_models). For detailed documentation of all ChatFireworks features and configurations head to the [API reference](https://python.langchain.com/api_reference/fireworks/chat_models/langchain_fireworks.chat_models.ChatFireworks.html).\n",
|
||||
"This doc help you get started with Fireworks AI [chat models](/docs/concepts/chat_models). For detailed documentation of all ChatFireworks features and configurations head to the [API reference](https://python.langchain.com/api_reference/fireworks/chat_models/langchain_fireworks.chat_models.ChatFireworks.html).\n",
|
||||
"\n",
|
||||
"Fireworks AI is an AI inference platform to run and customize models. For a list of all models served by Fireworks see the [Fireworks docs](https://fireworks.ai/models).\n",
|
||||
"\n",
|
||||
@@ -39,7 +39,7 @@
|
||||
"\n",
|
||||
"### Credentials\n",
|
||||
"\n",
|
||||
"Head to (https://fireworks.ai/login to sign up to Fireworks and generate an API key. Once you've done this set the FIREWORKS_API_KEY environment variable:"
|
||||
"Head to (ttps://fireworks.ai/login to sign up to Fireworks and generate an API key. Once you've done this set the FIREWORKS_API_KEY environment variable:"
|
||||
]
|
||||
},
|
||||
{
|
||||
|
||||
@@ -1,327 +1,269 @@
|
||||
{
|
||||
"cells": [
|
||||
{
|
||||
"cell_type": "raw",
|
||||
"id": "afaf8039",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"---\n",
|
||||
"sidebar_label: Google Cloud Vertex AI\n",
|
||||
"---"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "e49f1e0d",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"# ChatVertexAI\n",
|
||||
"\n",
|
||||
"This page provides a quick overview for getting started with VertexAI [chat models](/docs/concepts/chat_models). For detailed documentation of all ChatVertexAI features and configurations head to the [API reference](https://python.langchain.com/api_reference/google_vertexai/chat_models/langchain_google_vertexai.chat_models.ChatVertexAI.html).\n",
|
||||
"\n",
|
||||
"ChatVertexAI exposes all foundational models available in Google Cloud, like `gemini-1.5-pro`, `gemini-1.5-flash`, etc. For a full and updated list of available models visit [VertexAI documentation](https://cloud.google.com/vertex-ai/docs/generative-ai/model-reference/overview).\n",
|
||||
"\n",
|
||||
":::info Google Cloud VertexAI vs Google PaLM\n",
|
||||
"\n",
|
||||
"The Google Cloud VertexAI integration is separate from the [Google PaLM integration](/docs/integrations/chat/google_generative_ai/). Google has chosen to offer an enterprise version of PaLM through GCP, and this supports the models made available through there.\n",
|
||||
"\n",
|
||||
":::\n",
|
||||
"\n",
|
||||
"## Overview\n",
|
||||
"### Integration details\n",
|
||||
"\n",
|
||||
"| Class | Package | Local | Serializable | [JS support](https://js.langchain.com/docs/integrations/chat/google_vertex_ai) | Package downloads | Package latest |\n",
|
||||
"| :--- | :--- | :---: | :---: | :---: | :---: | :---: |\n",
|
||||
"| [ChatVertexAI](https://python.langchain.com/api_reference/google_vertexai/chat_models/langchain_google_vertexai.chat_models.ChatVertexAI.html) | [langchain-google-vertexai](https://python.langchain.com/api_reference/google_vertexai/index.html) | ❌ | beta | ✅ |  |  |\n",
|
||||
"\n",
|
||||
"### Model features\n",
|
||||
"| [Tool calling](/docs/how_to/tool_calling) | [Structured output](/docs/how_to/structured_output/) | JSON mode | [Image input](/docs/how_to/multimodal_inputs/) | Audio input | Video input | [Token-level streaming](/docs/how_to/chat_streaming/) | Native async | [Token usage](/docs/how_to/chat_token_usage_tracking/) | [Logprobs](/docs/how_to/logprobs/) |\n",
|
||||
"| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |\n",
|
||||
"| ✅ | ✅ | ❌ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ❌ |\n",
|
||||
"\n",
|
||||
"## Setup\n",
|
||||
"\n",
|
||||
"To access VertexAI models you'll need to create a Google Cloud Platform account, set up credentials, and install the `langchain-google-vertexai` integration package.\n",
|
||||
"\n",
|
||||
"### Credentials\n",
|
||||
"\n",
|
||||
"To use the integration you must:\n",
|
||||
"- Have credentials configured for your environment (gcloud, workload identity, etc...)\n",
|
||||
"- Store the path to a service account JSON file as the GOOGLE_APPLICATION_CREDENTIALS environment variable\n",
|
||||
"\n",
|
||||
"This codebase uses the `google.auth` library which first looks for the application credentials variable mentioned above, and then looks for system-level auth.\n",
|
||||
"\n",
|
||||
"For more information, see:\n",
|
||||
"- https://cloud.google.com/docs/authentication/application-default-credentials#GAC\n",
|
||||
"- https://googleapis.dev/python/google-auth/latest/reference/google.auth.html#module-google.auth\n",
|
||||
"\n",
|
||||
"To enable automated tracing of your model calls, set your [LangSmith](https://docs.smith.langchain.com/) API key:"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 1,
|
||||
"id": "a15d341e-3e26-4ca3-830b-5aab30ed66de",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# os.environ[\"LANGSMITH_API_KEY\"] = getpass.getpass(\"Enter your LangSmith API key: \")\n",
|
||||
"# os.environ[\"LANGSMITH_TRACING\"] = \"true\""
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "0730d6a1-c893-4840-9817-5e5251676d5d",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Installation\n",
|
||||
"\n",
|
||||
"The LangChain VertexAI integration lives in the `langchain-google-vertexai` package:"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 2,
|
||||
"id": "652d6238-1f87-422a-b135-f5abbb8652fc",
|
||||
"metadata": {},
|
||||
"outputs": [
|
||||
"cells": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"Note: you may need to restart the kernel to use updated packages.\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"%pip install -qU langchain-google-vertexai"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "a38cde65-254d-4219-a441-068766c0d4b5",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Instantiation\n",
|
||||
"\n",
|
||||
"Now we can instantiate our model object and generate chat completions:"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 3,
|
||||
"id": "cb09c344-1836-4e0c-acf8-11d13ac1dbae",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from langchain_google_vertexai import ChatVertexAI\n",
|
||||
"\n",
|
||||
"llm = ChatVertexAI(\n",
|
||||
" model=\"gemini-1.5-flash-001\",\n",
|
||||
" temperature=0,\n",
|
||||
" max_tokens=None,\n",
|
||||
" max_retries=6,\n",
|
||||
" stop=None,\n",
|
||||
" # other params...\n",
|
||||
")"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "2b4f3e15",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Invocation"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 4,
|
||||
"id": "62e0dbc3",
|
||||
"metadata": {
|
||||
"tags": []
|
||||
},
|
||||
"outputs": [
|
||||
{
|
||||
"data": {
|
||||
"text/plain": [
|
||||
"AIMessage(content=\"J'adore programmer. \\n\", response_metadata={'is_blocked': False, 'safety_ratings': [{'category': 'HARM_CATEGORY_HATE_SPEECH', 'probability_label': 'NEGLIGIBLE', 'blocked': False}, {'category': 'HARM_CATEGORY_DANGEROUS_CONTENT', 'probability_label': 'NEGLIGIBLE', 'blocked': False}, {'category': 'HARM_CATEGORY_HARASSMENT', 'probability_label': 'NEGLIGIBLE', 'blocked': False}, {'category': 'HARM_CATEGORY_SEXUALLY_EXPLICIT', 'probability_label': 'NEGLIGIBLE', 'blocked': False}], 'usage_metadata': {'prompt_token_count': 20, 'candidates_token_count': 7, 'total_token_count': 27}}, id='run-7032733c-d05c-4f0c-a17a-6c575fdd1ae0-0', usage_metadata={'input_tokens': 20, 'output_tokens': 7, 'total_tokens': 27})"
|
||||
"cell_type": "raw",
|
||||
"id": "afaf8039",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"---\n",
|
||||
"sidebar_label: Google Cloud Vertex AI\n",
|
||||
"---"
|
||||
]
|
||||
},
|
||||
"execution_count": 4,
|
||||
"metadata": {},
|
||||
"output_type": "execute_result"
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"messages = [\n",
|
||||
" (\n",
|
||||
" \"system\",\n",
|
||||
" \"You are a helpful assistant that translates English to French. Translate the user sentence.\",\n",
|
||||
" ),\n",
|
||||
" (\"human\", \"I love programming.\"),\n",
|
||||
"]\n",
|
||||
"ai_msg = llm.invoke(messages)\n",
|
||||
"ai_msg"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 5,
|
||||
"id": "d86145b3-bfef-46e8-b227-4dda5c9c2705",
|
||||
"metadata": {},
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"J'adore programmer. \n",
|
||||
"\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"print(ai_msg.content)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "28ccabbb-a450-403c-8de1-fb077e0b5d3d",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Built-in tools\n",
|
||||
"\n",
|
||||
"Gemini supports a range of tools that are executed server-side.\n",
|
||||
"\n",
|
||||
"### Google search\n",
|
||||
"\n",
|
||||
":::info Requires ``langchain-google-vertexai>=2.0.11``\n",
|
||||
":::\n",
|
||||
"\n",
|
||||
"Gemini can execute a Google search and use the results to [ground its responses](https://ai.google.dev/gemini-api/docs/grounding):"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "ffdbec37-85f8-4755-bd72-47efaecfe944",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from langchain_google_vertexai import ChatVertexAI\n",
|
||||
"\n",
|
||||
"llm = ChatVertexAI(model=\"gemini-2.0-flash-001\").bind_tools([{\"google_search\": {}}])\n",
|
||||
"\n",
|
||||
"response = llm.invoke(\"What is today's news?\")"
|
||||
]
|
||||
},
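To sanity-check what came back from the grounded call above, you can inspect the message fields. This is a minimal sketch assuming the cell above has been run; `content` and `usage_metadata` are standard `AIMessage` fields, while the `"grounding_metadata"` key in `response_metadata` is an assumption and may vary by model version.

```python
# Sketch: inspect the grounded response produced by the Google search example above.
print(response.content)          # the model's answer text
print(response.usage_metadata)   # token accounting, if populated
# Grounding details generally live in response_metadata; the exact key is an assumption.
print(response.response_metadata.get("grounding_metadata"))
```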
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "f63824f5-7d6a-4ad7-aa17-1f5c44119a21",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Code execution\n",
|
||||
"\n",
|
||||
":::info Requires ``langchain-google-vertexai>=2.0.25``\n",
|
||||
":::\n",
|
||||
"\n",
|
||||
"Gemini can [generate and execute Python code](https://ai.google.dev/gemini-api/docs/code-execution):"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "aa079529-ef1c-463d-9d25-6390423a328d",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from langchain_google_vertexai import ChatVertexAI\n",
|
||||
"\n",
|
||||
"llm = ChatVertexAI(model=\"gemini-2.0-flash-001\").bind_tools([{\"code_execution\": {}}])\n",
|
||||
"\n",
|
||||
"response = llm.invoke(\"What is 3^3?\")"
|
||||
]
|
||||
},
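With server-side code execution, the returned `content` may be a list of content blocks (for example, the generated Python and its result) rather than a plain string. A rough sketch for looking at whatever came back, making no assumptions about block shape:

```python
# Sketch: walk the content of the code-execution response above.
# With server-side tools the content may be a list of blocks rather than a plain string.
blocks = response.content if isinstance(response.content, list) else [response.content]
for block in blocks:
    print(block)  # print each block as-is; structure varies by tool and library version
```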
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "18e2bfc0-7e78-4528-a73f-499ac150dca8",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Chaining\n",
|
||||
"\n",
|
||||
"We can [chain](/docs/how_to/sequence/) our model with a prompt template like so:"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 6,
|
||||
"id": "e197d1d7-a070-4c96-9f8a-a0e86d046e0b",
|
||||
"metadata": {},
|
||||
"outputs": [
|
||||
{
|
||||
"data": {
|
||||
"text/plain": [
|
||||
"AIMessage(content='Ich liebe Programmieren. \\n', response_metadata={'is_blocked': False, 'safety_ratings': [{'category': 'HARM_CATEGORY_HATE_SPEECH', 'probability_label': 'NEGLIGIBLE', 'blocked': False}, {'category': 'HARM_CATEGORY_DANGEROUS_CONTENT', 'probability_label': 'NEGLIGIBLE', 'blocked': False}, {'category': 'HARM_CATEGORY_HARASSMENT', 'probability_label': 'NEGLIGIBLE', 'blocked': False}, {'category': 'HARM_CATEGORY_SEXUALLY_EXPLICIT', 'probability_label': 'NEGLIGIBLE', 'blocked': False}], 'usage_metadata': {'prompt_token_count': 15, 'candidates_token_count': 8, 'total_token_count': 23}}, id='run-c71955fd-8dc1-422b-88a7-853accf4811b-0', usage_metadata={'input_tokens': 15, 'output_tokens': 8, 'total_tokens': 23})"
|
||||
"cell_type": "markdown",
|
||||
"id": "e49f1e0d",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"# ChatVertexAI\n",
|
||||
"\n",
|
||||
"This page provides a quick overview for getting started with VertexAI [chat models](/docs/concepts/chat_models). For detailed documentation of all ChatVertexAI features and configurations head to the [API reference](https://python.langchain.com/api_reference/google_vertexai/chat_models/langchain_google_vertexai.chat_models.ChatVertexAI.html).\n",
|
||||
"\n",
|
||||
"ChatVertexAI exposes all foundational models available in Google Cloud, like `gemini-1.5-pro`, `gemini-1.5-flash`, etc. For a full and updated list of available models visit [VertexAI documentation](https://cloud.google.com/vertex-ai/docs/generative-ai/model-reference/overview).\n",
|
||||
"\n",
|
||||
":::info Google Cloud VertexAI vs Google PaLM\n",
|
||||
"\n",
|
||||
"The Google Cloud VertexAI integration is separate from the [Google PaLM integration](/docs/integrations/chat/google_generative_ai/). Google has chosen to offer an enterprise version of PaLM through GCP, and this supports the models made available through there.\n",
|
||||
"\n",
|
||||
":::\n",
|
||||
"\n",
|
||||
"## Overview\n",
|
||||
"### Integration details\n",
|
||||
"\n",
|
||||
"| Class | Package | Local | Serializable | [JS support](https://js.langchain.com/docs/integrations/chat/google_vertex_ai) | Package downloads | Package latest |\n",
|
||||
"| :--- | :--- | :---: | :---: | :---: | :---: | :---: |\n",
|
||||
"| [ChatVertexAI](https://python.langchain.com/api_reference/google_vertexai/chat_models/langchain_google_vertexai.chat_models.ChatVertexAI.html) | [langchain-google-vertexai](https://python.langchain.com/api_reference/google_vertexai/index.html) | ❌ | beta | ✅ |  |  |\n",
|
||||
"\n",
|
||||
"### Model features\n",
|
||||
"| [Tool calling](/docs/how_to/tool_calling) | [Structured output](/docs/how_to/structured_output/) | JSON mode | [Image input](/docs/how_to/multimodal_inputs/) | Audio input | Video input | [Token-level streaming](/docs/how_to/chat_streaming/) | Native async | [Token usage](/docs/how_to/chat_token_usage_tracking/) | [Logprobs](/docs/how_to/logprobs/) |\n",
|
||||
"| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |\n",
|
||||
"| ✅ | ✅ | ❌ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ❌ |\n",
|
||||
"\n",
|
||||
"## Setup\n",
|
||||
"\n",
|
||||
"To access VertexAI models you'll need to create a Google Cloud Platform account, set up credentials, and install the `langchain-google-vertexai` integration package.\n",
|
||||
"\n",
|
||||
"### Credentials\n",
|
||||
"\n",
|
||||
"To use the integration you must:\n",
|
||||
"- Have credentials configured for your environment (gcloud, workload identity, etc...)\n",
|
||||
"- Store the path to a service account JSON file as the GOOGLE_APPLICATION_CREDENTIALS environment variable\n",
|
||||
"\n",
|
||||
"This codebase uses the `google.auth` library which first looks for the application credentials variable mentioned above, and then looks for system-level auth.\n",
|
||||
"\n",
|
||||
"For more information, see:\n",
|
||||
"- https://cloud.google.com/docs/authentication/application-default-credentials#GAC\n",
|
||||
"- https://googleapis.dev/python/google-auth/latest/reference/google.auth.html#module-google.auth\n",
|
||||
"\n",
|
||||
"To enable automated tracing of your model calls, set your [LangSmith](https://docs.smith.langchain.com/) API key:"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 1,
|
||||
"id": "a15d341e-3e26-4ca3-830b-5aab30ed66de",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# os.environ[\"LANGSMITH_API_KEY\"] = getpass.getpass(\"Enter your LangSmith API key: \")\n",
|
||||
"# os.environ[\"LANGSMITH_TRACING\"] = \"true\""
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "0730d6a1-c893-4840-9817-5e5251676d5d",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Installation\n",
|
||||
"\n",
|
||||
"The LangChain VertexAI integration lives in the `langchain-google-vertexai` package:"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 2,
|
||||
"id": "652d6238-1f87-422a-b135-f5abbb8652fc",
|
||||
"metadata": {},
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"Note: you may need to restart the kernel to use updated packages.\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"%pip install -qU langchain-google-vertexai"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "a38cde65-254d-4219-a441-068766c0d4b5",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Instantiation\n",
|
||||
"\n",
|
||||
"Now we can instantiate our model object and generate chat completions:"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 3,
|
||||
"id": "cb09c344-1836-4e0c-acf8-11d13ac1dbae",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from langchain_google_vertexai import ChatVertexAI\n",
|
||||
"\n",
|
||||
"llm = ChatVertexAI(\n",
|
||||
" model=\"gemini-1.5-flash-001\",\n",
|
||||
" temperature=0,\n",
|
||||
" max_tokens=None,\n",
|
||||
" max_retries=6,\n",
|
||||
" stop=None,\n",
|
||||
" # other params...\n",
|
||||
")"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "2b4f3e15",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Invocation"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 4,
|
||||
"id": "62e0dbc3",
|
||||
"metadata": {
|
||||
"tags": []
|
||||
},
|
||||
"outputs": [
|
||||
{
|
||||
"data": {
|
||||
"text/plain": [
|
||||
"AIMessage(content=\"J'adore programmer. \\n\", response_metadata={'is_blocked': False, 'safety_ratings': [{'category': 'HARM_CATEGORY_HATE_SPEECH', 'probability_label': 'NEGLIGIBLE', 'blocked': False}, {'category': 'HARM_CATEGORY_DANGEROUS_CONTENT', 'probability_label': 'NEGLIGIBLE', 'blocked': False}, {'category': 'HARM_CATEGORY_HARASSMENT', 'probability_label': 'NEGLIGIBLE', 'blocked': False}, {'category': 'HARM_CATEGORY_SEXUALLY_EXPLICIT', 'probability_label': 'NEGLIGIBLE', 'blocked': False}], 'usage_metadata': {'prompt_token_count': 20, 'candidates_token_count': 7, 'total_token_count': 27}}, id='run-7032733c-d05c-4f0c-a17a-6c575fdd1ae0-0', usage_metadata={'input_tokens': 20, 'output_tokens': 7, 'total_tokens': 27})"
|
||||
]
|
||||
},
|
||||
"execution_count": 4,
|
||||
"metadata": {},
|
||||
"output_type": "execute_result"
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"messages = [\n",
|
||||
" (\n",
|
||||
" \"system\",\n",
|
||||
" \"You are a helpful assistant that translates English to French. Translate the user sentence.\",\n",
|
||||
" ),\n",
|
||||
" (\"human\", \"I love programming.\"),\n",
|
||||
"]\n",
|
||||
"ai_msg = llm.invoke(messages)\n",
|
||||
"ai_msg"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 5,
|
||||
"id": "d86145b3-bfef-46e8-b227-4dda5c9c2705",
|
||||
"metadata": {},
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"J'adore programmer. \n",
|
||||
"\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"print(ai_msg.content)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "18e2bfc0-7e78-4528-a73f-499ac150dca8",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Chaining\n",
|
||||
"\n",
|
||||
"We can [chain](/docs/how_to/sequence/) our model with a prompt template like so:"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 6,
|
||||
"id": "e197d1d7-a070-4c96-9f8a-a0e86d046e0b",
|
||||
"metadata": {},
|
||||
"outputs": [
|
||||
{
|
||||
"data": {
|
||||
"text/plain": [
|
||||
"AIMessage(content='Ich liebe Programmieren. \\n', response_metadata={'is_blocked': False, 'safety_ratings': [{'category': 'HARM_CATEGORY_HATE_SPEECH', 'probability_label': 'NEGLIGIBLE', 'blocked': False}, {'category': 'HARM_CATEGORY_DANGEROUS_CONTENT', 'probability_label': 'NEGLIGIBLE', 'blocked': False}, {'category': 'HARM_CATEGORY_HARASSMENT', 'probability_label': 'NEGLIGIBLE', 'blocked': False}, {'category': 'HARM_CATEGORY_SEXUALLY_EXPLICIT', 'probability_label': 'NEGLIGIBLE', 'blocked': False}], 'usage_metadata': {'prompt_token_count': 15, 'candidates_token_count': 8, 'total_token_count': 23}}, id='run-c71955fd-8dc1-422b-88a7-853accf4811b-0', usage_metadata={'input_tokens': 15, 'output_tokens': 8, 'total_tokens': 23})"
|
||||
]
|
||||
},
|
||||
"execution_count": 6,
|
||||
"metadata": {},
|
||||
"output_type": "execute_result"
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"from langchain_core.prompts import ChatPromptTemplate\n",
|
||||
"\n",
|
||||
"prompt = ChatPromptTemplate.from_messages(\n",
|
||||
" [\n",
|
||||
" (\n",
|
||||
" \"system\",\n",
|
||||
" \"You are a helpful assistant that translates {input_language} to {output_language}.\",\n",
|
||||
" ),\n",
|
||||
" (\"human\", \"{input}\"),\n",
|
||||
" ]\n",
|
||||
")\n",
|
||||
"\n",
|
||||
"chain = prompt | llm\n",
|
||||
"chain.invoke(\n",
|
||||
" {\n",
|
||||
" \"input_language\": \"English\",\n",
|
||||
" \"output_language\": \"German\",\n",
|
||||
" \"input\": \"I love programming.\",\n",
|
||||
" }\n",
|
||||
")"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "3a5bb5ca-c3ae-4a58-be67-2cd18574b9a3",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## API reference\n",
|
||||
"\n",
|
||||
"For detailed documentation of all ChatVertexAI features and configurations, like how to send multimodal inputs and configure safety settings, head to the API reference: https://python.langchain.com/api_reference/google_vertexai/chat_models/langchain_google_vertexai.chat_models.ChatVertexAI.html"
|
||||
]
|
||||
},
|
||||
"execution_count": 6,
|
||||
"metadata": {},
|
||||
"output_type": "execute_result"
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"from langchain_core.prompts import ChatPromptTemplate\n",
|
||||
"\n",
|
||||
"prompt = ChatPromptTemplate.from_messages(\n",
|
||||
" [\n",
|
||||
" (\n",
|
||||
" \"system\",\n",
|
||||
" \"You are a helpful assistant that translates {input_language} to {output_language}.\",\n",
|
||||
" ),\n",
|
||||
" (\"human\", \"{input}\"),\n",
|
||||
" ]\n",
|
||||
")\n",
|
||||
"\n",
|
||||
"chain = prompt | llm\n",
|
||||
"chain.invoke(\n",
|
||||
" {\n",
|
||||
" \"input_language\": \"English\",\n",
|
||||
" \"output_language\": \"German\",\n",
|
||||
" \"input\": \"I love programming.\",\n",
|
||||
" }\n",
|
||||
")"
|
||||
]
|
||||
],
|
||||
"metadata": {
|
||||
"kernelspec": {
|
||||
"display_name": "poetry-venv-2",
|
||||
"language": "python",
|
||||
"name": "poetry-venv-2"
|
||||
},
|
||||
"language_info": {
|
||||
"codemirror_mode": {
|
||||
"name": "ipython",
|
||||
"version": 3
|
||||
},
|
||||
"file_extension": ".py",
|
||||
"mimetype": "text/x-python",
|
||||
"name": "python",
|
||||
"nbconvert_exporter": "python",
|
||||
"pygments_lexer": "ipython3",
|
||||
"version": "3.9.1"
|
||||
}
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "3a5bb5ca-c3ae-4a58-be67-2cd18574b9a3",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## API reference\n",
|
||||
"\n",
|
||||
"For detailed documentation of all ChatVertexAI features and configurations, like how to send multimodal inputs and configure safety settings, head to the API reference: https://python.langchain.com/api_reference/google_vertexai/chat_models/langchain_google_vertexai.chat_models.ChatVertexAI.html"
|
||||
]
|
||||
}
|
||||
],
|
||||
"metadata": {
|
||||
"kernelspec": {
|
||||
"display_name": "Python 3 (ipykernel)",
|
||||
"language": "python",
|
||||
"name": "python3"
|
||||
},
|
||||
"language_info": {
|
||||
"codemirror_mode": {
|
||||
"name": "ipython",
|
||||
"version": 3
|
||||
},
|
||||
"file_extension": ".py",
|
||||
"mimetype": "text/x-python",
|
||||
"name": "python",
|
||||
"nbconvert_exporter": "python",
|
||||
"pygments_lexer": "ipython3",
|
||||
"version": "3.10.4"
|
||||
}
|
||||
},
|
||||
"nbformat": 4,
|
||||
"nbformat_minor": 5
|
||||
"nbformat": 4,
|
||||
"nbformat_minor": 5
|
||||
}
|
||||
|
||||
@@ -58,9 +58,7 @@
|
||||
"cell_type": "markdown",
|
||||
"id": "72ee0c4b-9764-423a-9dbf-95129e185210",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"To enable automated tracing of your model calls, set your [LangSmith](https://docs.smith.langchain.com/) API key:"
|
||||
]
|
||||
"source": "To enable automated tracing of your model calls, set your [LangSmith](https://docs.smith.langchain.com/) API key:"
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
@@ -100,19 +98,12 @@
|
||||
"source": [
|
||||
"## Instantiation\n",
|
||||
"\n",
|
||||
"Now we can instantiate our model object and generate chat completions. \n",
|
||||
"\n",
|
||||
"\n",
|
||||
":::note Reasoning Format\n",
|
||||
"\n",
|
||||
"If you choose to set a `reasoning_format`, you must ensure that the model you are using supports it. You can find a list of supported models in the [Groq documentation](https://console.groq.com/docs/reasoning).\n",
|
||||
"\n",
|
||||
":::"
|
||||
"Now we can instantiate our model object and generate chat completions:"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 6,
|
||||
"execution_count": 1,
|
||||
"id": "cb09c344-1836-4e0c-acf8-11d13ac1dbae",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
@@ -120,10 +111,9 @@
|
||||
"from langchain_groq import ChatGroq\n",
|
||||
"\n",
|
||||
"llm = ChatGroq(\n",
|
||||
" model=\"deepseek-r1-distill-llama-70b\",\n",
|
||||
" model=\"llama-3.1-8b-instant\",\n",
|
||||
" temperature=0,\n",
|
||||
" max_tokens=None,\n",
|
||||
" reasoning_format=\"parsed\",\n",
|
||||
" timeout=None,\n",
|
||||
" max_retries=2,\n",
|
||||
" # other params...\n",
|
||||
@@ -140,7 +130,7 @@
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 7,
|
||||
"execution_count": 2,
|
||||
"id": "62e0dbc3",
|
||||
"metadata": {
|
||||
"tags": []
|
||||
@@ -149,10 +139,10 @@
|
||||
{
|
||||
"data": {
|
||||
"text/plain": [
|
||||
"AIMessage(content=\"J'aime la programmation.\", additional_kwargs={'reasoning_content': 'Okay, so I need to translate the sentence \"I love programming.\" into French. Let me think about how to approach this. \\n\\nFirst, I know that \"I\" in French is \"Je.\" That\\'s straightforward. Now, the verb \"love\" in French is \"aime\" when referring to oneself. So, \"I love\" would be \"J\\'aime.\" \\n\\nNext, the word \"programming.\" In French, programming is \"la programmation.\" But wait, in French, when you talk about loving an activity, you often use the definite article. So, it would be \"la programmation.\" \\n\\nPutting it all together, \"I love programming\" becomes \"J\\'aime la programmation.\" That sounds right. I think that\\'s the correct translation. \\n\\nI should double-check to make sure I\\'m not missing anything. Maybe I can think of similar phrases. For example, \"I love reading\" is \"J\\'aime lire,\" but when it\\'s a noun, like \"I love music,\" it\\'s \"J\\'aime la musique.\" So, yes, using \"la programmation\" makes sense here. \\n\\nI don\\'t think I need to change anything else. The sentence structure in French is Subject-Verb-Object, just like in English, so \"J\\'aime la programmation\" should be correct. \\n\\nI guess another way to say it could be \"J\\'adore la programmation,\" using \"adore\" instead of \"aime,\" but \"aime\" is more commonly used in this context. So, sticking with \"J\\'aime la programmation\" is probably the best choice.\\n'}, response_metadata={'token_usage': {'completion_tokens': 346, 'prompt_tokens': 23, 'total_tokens': 369, 'completion_time': 1.447541218, 'prompt_time': 0.000983386, 'queue_time': 0.009673684, 'total_time': 1.448524604}, 'model_name': 'deepseek-r1-distill-llama-70b', 'system_fingerprint': 'fp_e98d30d035', 'finish_reason': 'stop', 'logprobs': None}, id='run--5679ae4f-f4e8-4931-bcd5-7304223832c0-0', usage_metadata={'input_tokens': 23, 'output_tokens': 346, 'total_tokens': 369})"
|
||||
"AIMessage(content='The translation of \"I love programming\" to French is:\\n\\n\"J\\'adore le programmation.\"', additional_kwargs={}, response_metadata={'token_usage': {'completion_tokens': 22, 'prompt_tokens': 55, 'total_tokens': 77, 'completion_time': 0.029333333, 'prompt_time': 0.003502892, 'queue_time': 0.553054073, 'total_time': 0.032836225}, 'model_name': 'llama-3.1-8b-instant', 'system_fingerprint': 'fp_a491995411', 'finish_reason': 'stop', 'logprobs': None}, id='run-2b2da04a-993c-40ab-becc-201eab8b1a1b-0', usage_metadata={'input_tokens': 55, 'output_tokens': 22, 'total_tokens': 77})"
|
||||
]
|
||||
},
|
||||
"execution_count": 7,
|
||||
"execution_count": 2,
|
||||
"metadata": {},
|
||||
"output_type": "execute_result"
|
||||
}
|
||||
@@ -171,7 +161,7 @@
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 8,
|
||||
"execution_count": 3,
|
||||
"id": "d86145b3-bfef-46e8-b227-4dda5c9c2705",
|
||||
"metadata": {},
|
||||
"outputs": [
|
||||
@@ -179,7 +169,9 @@
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"J'aime la programmation.\n"
|
||||
"The translation of \"I love programming\" to French is:\n",
|
||||
"\n",
|
||||
"\"J'adore le programmation.\"\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
@@ -199,17 +191,17 @@
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 9,
|
||||
"execution_count": 4,
|
||||
"id": "e197d1d7-a070-4c96-9f8a-a0e86d046e0b",
|
||||
"metadata": {},
|
||||
"outputs": [
|
||||
{
|
||||
"data": {
|
||||
"text/plain": [
|
||||
"AIMessage(content='The translation of \"I love programming\" into German is \"Ich liebe das Programmieren.\" \\n\\n**Step-by-Step Explanation:**\\n\\n1. **Subject Pronoun:** \"I\" translates to \"Ich.\"\\n2. **Verb Conjugation:** \"Love\" becomes \"liebe\" (first person singular of \"lieben\").\\n3. **Gerund Translation:** \"Programming\" is translated using the infinitive noun \"Programmieren.\"\\n4. **Article Usage:** The definite article \"das\" is included before the infinitive noun for natural phrasing.\\n\\nThus, the complete and natural translation is:\\n\\n**Ich liebe das Programmieren.**', additional_kwargs={'reasoning_content': 'Okay, so I need to translate the sentence \"I love programming.\" into German. Hmm, let\\'s break this down. \\n\\nFirst, \"I\" in German is \"Ich.\" That\\'s straightforward. Now, \"love\" translates to \"liebe.\" Wait, but in German, the verb conjugation depends on the subject. Since it\\'s \"I,\" the verb would be \"liebe\" because \"lieben\" is the infinitive, and for first person singular, it\\'s \"liebe.\" \\n\\nNext, \"programming\" is a gerund in English, which is the -ing form. In German, the equivalent would be the present participle, which is \"programmierend.\" But wait, sometimes in German, they use the noun form instead of the gerund. So maybe it\\'s better to say \"Ich liebe das Programmieren.\" Because \"Programmieren\" is the infinitive noun form, and it\\'s commonly used in such contexts. \\n\\nLet me think again. \"I love programming\" could be directly translated as \"Ich liebe Programmieren,\" but I\\'ve heard both \"Programmieren\" and \"programmierend\" used. However, \"Ich liebe das Programmieren\" sounds more natural because it uses the definite article \"das\" before the infinitive noun. \\n\\nAlternatively, if I use \"programmieren\" without the article, it\\'s still correct but maybe a bit less common. So, to make it sound more natural and fluent, including the article \"das\" would be better. \\n\\nTherefore, the correct translation should be \"Ich liebe das Programmieren.\" That makes sense because it\\'s similar to saying \"I love (the act of) programming.\" \\n\\nI think that\\'s the most accurate and natural way to express it in German. Let me double-check some examples. If someone says \"I love reading,\" in German it\\'s \"Ich liebe das Lesen.\" So yes, using \"das\" before the infinitive noun is the correct structure. \\n\\nSo, putting it all together, \"I love programming\" becomes \"Ich liebe das Programmieren.\" That should be the right translation.\\n'}, response_metadata={'token_usage': {'completion_tokens': 569, 'prompt_tokens': 18, 'total_tokens': 587, 'completion_time': 2.511255685, 'prompt_time': 0.001466702, 'queue_time': 0.009628211, 'total_time': 2.512722387}, 'model_name': 'deepseek-r1-distill-llama-70b', 'system_fingerprint': 'fp_87eae35036', 'finish_reason': 'stop', 'logprobs': None}, id='run--4d5ee86d-5eec-495c-9c4e-261526cf6e3d-0', usage_metadata={'input_tokens': 18, 'output_tokens': 569, 'total_tokens': 587})"
|
||||
"AIMessage(content='Ich liebe Programmieren.', additional_kwargs={}, response_metadata={'token_usage': {'completion_tokens': 6, 'prompt_tokens': 50, 'total_tokens': 56, 'completion_time': 0.008, 'prompt_time': 0.003337935, 'queue_time': 0.20949214500000002, 'total_time': 0.011337935}, 'model_name': 'llama-3.1-8b-instant', 'system_fingerprint': 'fp_a491995411', 'finish_reason': 'stop', 'logprobs': None}, id='run-e33b48dc-5e55-466e-9ebd-7b48c81c3cbd-0', usage_metadata={'input_tokens': 50, 'output_tokens': 6, 'total_tokens': 56})"
|
||||
]
|
||||
},
|
||||
"execution_count": 9,
|
||||
"execution_count": 4,
|
||||
"metadata": {},
|
||||
"output_type": "execute_result"
|
||||
}
|
||||
@@ -244,7 +236,7 @@
|
||||
"source": [
|
||||
"## API reference\n",
|
||||
"\n",
|
||||
"For detailed documentation of all ChatGroq features and configurations head to the [API reference](https://python.langchain.com/api_reference/groq/chat_models/langchain_groq.chat_models.ChatGroq.html)."
|
||||
"For detailed documentation of all ChatGroq features and configurations head to the API reference: https://python.langchain.com/api_reference/groq/chat_models/langchain_groq.chat_models.ChatGroq.html"
|
||||
]
|
||||
}
|
||||
],
|
||||
|
||||
@@ -61,7 +61,7 @@
|
||||
"# Install Langchain community and core packages\n",
|
||||
"%pip install --upgrade --quiet langchain-core langchain-community\n",
|
||||
"\n",
|
||||
"# Install Kinetica DB connection package\n",
|
||||
"# Install Kineitca DB connection package\n",
|
||||
"%pip install --upgrade --quiet 'gpudb>=7.2.0.8' typeguard pandas tqdm\n",
|
||||
"\n",
|
||||
"# Install packages needed for this tutorial\n",
|
||||
|
||||
@@ -1,618 +0,0 @@
|
||||
{
|
||||
"cells": [
|
||||
{
|
||||
"cell_type": "raw",
|
||||
"id": "afaf8039",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"---\n",
|
||||
"sidebar_label: Nebius\n",
|
||||
"---"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "2970dd75-8ebf-4b51-8282-9b454b8f356d",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"# Nebius Chat Models\n",
|
||||
"\n",
|
||||
"This page will help you get started with Nebius AI Studio [chat models](../../concepts/chat_models.mdx). For detailed documentation of all ChatNebius features and configurations head to the [API reference](https://python.langchain.com/api_reference/nebius/chat_models/langchain_nebius.chat_models.ChatNebius.html).\n",
|
||||
"\n",
|
||||
"[Nebius AI Studio](https://studio.nebius.ai/) provides API access to a wide range of state-of-the-art large language models and embedding models for various use cases."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "9d8a2e78",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Overview\n",
|
||||
"\n",
|
||||
"### Integration details\n",
|
||||
"\n",
|
||||
"| Class | Package | Local | Serializable | JS support | Package downloads | Package latest |\n",
|
||||
"| :--- | :--- | :---: | :---: | :---: | :---: | :---: |\n",
|
||||
"| [ChatNebius](https://python.langchain.com/api_reference/nebius/chat_models/langchain_nebius.chat_models.ChatNebius.html) | [langchain-nebius](https://python.langchain.com/api_reference/nebius/index.html) | ❌ | beta | ❌ |  |  |\n",
|
||||
"\n",
|
||||
"### Model features\n",
|
||||
"| [Tool calling](../../how_to/tool_calling.ipynb) | [Structured output](../../how_to/structured_output.ipynb) | JSON mode | [Image input](../../how_to/multimodal_inputs.ipynb) | Audio input | Video input | [Token-level streaming](../../how_to/chat_streaming.ipynb) | Native async | [Token usage](../../how_to/chat_token_usage_tracking.ipynb) | [Logprobs](../../how_to/logprobs.ipynb) |\n",
|
||||
"| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |\n",
|
||||
"| ✅ | ✅ | ✅ | ✅ | ❌ | ❌ | ✅ | ✅ | ✅ | ✅ |"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "1c47fc36",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Setup\n",
|
||||
"\n",
|
||||
"To access Nebius models you'll need to create a Nebius account, get an API key, and install the `langchain-nebius` integration package.\n",
|
||||
"\n",
|
||||
"### Installation\n",
|
||||
"\n",
|
||||
"The Nebius integration can be installed via pip:"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "1ecdb29d",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"%pip install --upgrade langchain-nebius"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "89883202",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Credentials\n",
|
||||
"\n",
|
||||
"Nebius requires an API key that can be passed as an initialization parameter `api_key` or set as the environment variable `NEBIUS_API_KEY`. You can obtain an API key by creating an account on [Nebius AI Studio](https://studio.nebius.ai/)."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 1,
|
||||
"id": "637bb53f",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"import getpass\n",
|
||||
"import os\n",
|
||||
"\n",
|
||||
"# Make sure you've set your API key as an environment variable\n",
|
||||
"if \"NEBIUS_API_KEY\" not in os.environ:\n",
|
||||
" os.environ[\"NEBIUS_API_KEY\"] = getpass.getpass(\"Enter your Nebius API key: \")"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "37e9dc05-md",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Instantiation\n",
|
||||
"\n",
|
||||
"Now we can instantiate our model object to generate chat completions:"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 2,
|
||||
"id": "37e9dc05",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from langchain_nebius import ChatNebius\n",
|
||||
"\n",
|
||||
"# Initialize the chat model\n",
|
||||
"chat = ChatNebius(\n",
|
||||
" # api_key=\"YOUR_API_KEY\", # You can pass the API key directly\n",
|
||||
" model=\"Qwen/Qwen3-14B\", # Choose from available models\n",
|
||||
" temperature=0.6,\n",
|
||||
" top_p=0.95,\n",
|
||||
")"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "f5a731d2",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Invocation\n",
|
||||
"\n",
|
||||
"You can use the `invoke` method to get a completion from the model:"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 3,
|
||||
"id": "3ed26f78",
|
||||
"metadata": {},
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"<think>\n",
|
||||
"Okay, so I need to explain quantum computing in simple terms. Hmm, where do I start? Let me think. I know that quantum computing uses qubits instead of classical bits. But what's a qubit? Oh right, classical bits are 0 or 1, but qubits can be both at the same time, right? That's superposition. Wait, how does that work exactly?\n",
|
||||
"\n",
|
||||
"Maybe I should start by comparing it to regular computers. Regular computers use bits that are either 0 or 1. Like a light switch that's either on or off. Quantum computers use qubits, which can be in a state of 0, 1, or both at the same time. That's the superposition part. So, if you have two qubits, they can represent four states at once? Like 00, 01, 10, 11 all at the same time? That seems powerful. So with more qubits, the number of possible states grows exponentially. That's why quantum computers can process a lot of information quickly.\n",
|
||||
"\n",
|
||||
"But then there's entanglement. What's that? If two qubits are entangled, the state of one instantly affects the other, no matter the distance. So if you measure one, you know the state of the other. That's used in quantum algorithms, I think. But how does that help in computing?\n",
|
||||
"\n",
|
||||
"Also, quantum computers use quantum gates instead of classical logic gates. These gates manipulate qubits through operations like Hadamard, Pauli, etc. But maybe that's too technical for a simple explanation.\n",
|
||||
"\n",
|
||||
"Then there's the issue of decoherence. Qubits are fragile and can lose their quantum state quickly. That's why quantum computers need to be kept at very low temperatures, like near absolute zero, to minimize interference from the environment. But maybe I shouldn't mention that unless it's relevant for the simple explanation.\n",
|
||||
"\n",
|
||||
"Applications of quantum computing include things like factoring large numbers (Shor's algorithm), which is important for cryptography, or simulating quantum systems for chemistry and materials science. But again, maybe keep it simple.\n",
|
||||
"\n",
|
||||
"Wait, the user wants it in simple terms. So avoid jargon as much as possible. Use analogies. Maybe compare qubits to spinning coins? When a coin is spinning, it's both heads and tails until it lands. So qubits are like spinning coins that can be in multiple states until measured. Then, when you measure, it collapses to a single state.\n",
|
||||
"\n",
|
||||
"But how does that help in computation? Maybe think of it as being able to process many possibilities at once, so for certain problems, you can find the answer faster. Like solving a maze by checking all paths at the same time instead of one by one.\n",
|
||||
"\n",
|
||||
"Also, mention that quantum computers aren't replacing classical computers. They're better for specific tasks, like optimization, cryptography, or simulations that are hard for classical computers. But for everyday tasks, classical computers are still better.\n",
|
||||
"\n",
|
||||
"I should structure this: start with classical bits vs qubits, explain superposition and entanglement with simple analogies, mention how it's used, and note the current limitations. Avoid getting too technical, keep it conversational.\n",
|
||||
"</think>\n",
|
||||
"\n",
|
||||
"Quantum computing is a type of computing that uses the principles of **quantum mechanics** to process information in ways that classical computers can't. Here's a simple breakdown:\n",
|
||||
"\n",
|
||||
"### 1. **Bits vs. Qubits** \n",
|
||||
" - **Classical computers** use *bits*, which are like switches that can be either **0** (off) or **1** (on). \n",
|
||||
" - **Quantum computers** use *qubits*, which are like \"spinning coins.\" While spinning, a qubit can be **0**, **1**, or **both at the same time** (this is called **superposition**). Only when you \"look\" at the qubit (measure it) does it settle into a definite state (0 or 1).\n",
|
||||
"\n",
|
||||
"### 2. **Superposition: Doing Many Things at Once** \n",
|
||||
" - Imagine a coin spinning in the air. While it's spinning, it’s not just \"heads\" or \"tails\"—it’s a mix of both. \n",
|
||||
" - With qubits, a quantum computer can process **many possibilities simultaneously**. For example, if you have 2 qubits, they can represent 4 states (00, 01, 10, 11) at once. With 10 qubits, it can represent **1,024 states** at the same time! This lets quantum computers solve certain problems much faster than classical computers.\n",
|
||||
"\n",
|
||||
"### 3. **Entanglement: Qubits \"Talk\" to Each Other** \n",
|
||||
" - When qubits are **entangled**, their states are linked. If you measure one, it instantly affects the other, no matter how far apart they are. \n",
|
||||
" - This connection allows quantum computers to perform complex calculations more efficiently, like solving puzzles where pieces are deeply interconnected.\n",
|
||||
"\n",
|
||||
"### 4. **Why It Matters** \n",
|
||||
" - **Speed**: For specific tasks (like breaking encryption codes or simulating molecules), quantum computers could be **exponentially faster** than classical ones. \n",
|
||||
" - **New Possibilities**: They could revolutionize fields like drug discovery, materials science, and optimization problems (e.g., finding the best route for delivery trucks).\n",
|
||||
"\n",
|
||||
"### 5. **Limitations** \n",
|
||||
" - **Fragile**: Qubits are sensitive to their environment (heat, noise), so quantum computers need extreme cooling (near absolute zero) to work. \n",
|
||||
" - **Not a Replacement**: They’re not better for everyday tasks like browsing the web or sending emails. They’re tools for **specialized problems** where classical computers struggle.\n",
|
||||
"\n",
|
||||
"### In Short: \n",
|
||||
"Quantum computing is like having a magic calculator that can explore many paths at once, solving certain problems in seconds that would take a classical computer years. But it’s still in its early days and needs careful handling to work properly! 🌌\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"response = chat.invoke(\"Explain quantum computing in simple terms\")\n",
|
||||
"print(response.content)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "72f31d5a",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Streaming\n",
|
||||
"\n",
|
||||
"You can also stream the response using the `stream` method:"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 4,
|
||||
"id": "e7b7170d",
|
||||
"metadata": {},
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"<think>\n",
|
||||
"Okay, the user wants a short poem about artificial intelligence. Let me start by thinking about the key aspects of AI. There's the technological side, like machines learning and processing data. Then there's the more philosophical angle, like AI's impact on society and its potential future.\n",
|
||||
"\n",
|
||||
"I should consider the structure. Maybe a simple rhyme scheme, something like ABAB or AABB. Let me go with quatrains for simplicity. Now, imagery: circuits, code, neural networks. Maybe personify AI as a mind or entity.\n",
|
||||
"\n",
|
||||
"First stanza: Introduce AI as a creation of humans. Mention circuits and code. Maybe something about learning from data. \"Born from circuits, code, and light\" – that's a good opening line. Then talk about learning from human minds.\n",
|
||||
"\n",
|
||||
"Second stanza: Contrast human emotions with AI's logic. Use words like \"cold logic\" versus \"human hearts.\" Maybe touch on the duality of AI's purpose – tools versus potential threats.\n",
|
||||
"\n",
|
||||
"Third stanza: Address the ethical questions. \"Will it dream?\" \"Will it choose?\" Highlight the uncertainty and the responsibility of creators.\n",
|
||||
"\n",
|
||||
"Fourth stanza: Conclude with the coexistence of AI and humans. Emphasize collaboration and the balance between innovation and ethics. End on a hopeful note, maybe about shaping the future together.\n",
|
||||
"\n",
|
||||
"Check the flow and rhyme. Make sure each stanza connects and the message is clear. Avoid technical jargon to keep it accessible. Use metaphors like \"silent pulse\" or \"ghost in the machine\" to add depth. Okay, let me put it all together now.\n",
|
||||
"</think>\n",
|
||||
"\n",
|
||||
"**Echoes of the Mind** \n",
|
||||
"\n",
|
||||
"Born from circuits, code, and light, \n",
|
||||
"A whisper in the machine’s night— \n",
|
||||
"It learns from data, vast and deep, \n",
|
||||
"A mirror to the human leap. \n",
|
||||
"\n",
|
||||
"No heartbeat, yet it calculates, \n",
|
||||
"Deciphers truths, predicts, debates. \n",
|
||||
"A cold logic, sharp and bright, \n",
|
||||
"Yet shadows dance in its insight. \n",
|
||||
"\n",
|
||||
"Will it dream? Will it choose? \n",
|
||||
"Or merely serve, as we pursue \n",
|
||||
"The edges of our own design? \n",
|
||||
"A ghost in the machine, undefined. \n",
|
||||
"\n",
|
||||
"We forge it, bind it, set it free— \n",
|
||||
"A tool, a threat, a mystery. \n",
|
||||
"But in its pulse, our hopes reside: \n",
|
||||
"A future shaped by minds allied."
|
||||
]
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"for chunk in chat.stream(\"Write a short poem about artificial intelligence\"):\n",
|
||||
" print(chunk.content, end=\"\", flush=True)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "8d6a31c2",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Chat Messages\n",
|
||||
"\n",
|
||||
"You can use different message types to structure your conversations with the model:"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 5,
|
||||
"id": "5d81af33",
|
||||
"metadata": {},
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"<think>\n",
|
||||
"Okay, the user asked how black holes are formed. Let me start by recalling the main processes. Stellar black holes form from massive stars. When a star with enough mass runs out of fuel, it can't support itself against gravity, leading to a supernova. If the core left after the supernova is more than about 3 times the Sun's mass, it collapses into a black hole.\n",
|
||||
"\n",
|
||||
"Then there are supermassive black holes, which are found at the centers of galaxies. Their formation is less understood. Maybe they start as smaller black holes and grow by merging with others or accreting matter over time. Also, there's the possibility of primordial black holes formed in the early universe, but that's more theoretical.\n",
|
||||
"\n",
|
||||
"I should mention the different types of black holes: stellar, supermassive, and maybe intermediate. Also, the event horizon and singularity concepts. Need to explain the process step by step, from the death of a star to the collapse. Make sure to clarify that not all stars become black holes—only those with sufficient mass. Maybe touch on the Chandrasekhar limit and Oppenheimer-Volkoff limit. Avoid too much jargon but still be precise. Check if the user might be a student or just curious, so keep it clear and structured.\n",
|
||||
"</think>\n",
|
||||
"\n",
|
||||
"Black holes are formed through the collapse of massive stars or through other extreme astrophysical processes. Here's a breakdown of the main formation mechanisms:\n",
|
||||
"\n",
|
||||
"---\n",
|
||||
"\n",
|
||||
"### **1. Stellar Black Holes (Most Common)**\n",
|
||||
"- **Origin**: Massive stars (typically **more than 20–25 times the mass of the Sun**).\n",
|
||||
"- **Process**:\n",
|
||||
" 1. **Stellar Evolution**: These stars burn through their nuclear fuel (hydrogen, helium, etc.) over millions of years.\n",
|
||||
" 2. **Supernova Explosion**: When the star exhausts its fuel, it can no longer support itself against gravity. The core collapses, triggering a **supernova explosion** (a massive stellar explosion).\n",
|
||||
" 3. **Core Collapse**: If the remaining core (after the supernova) is **more than about 3 times the mass of the Sun**, gravity overpowers all other forces. The core collapses into an **infinitely dense point** called a **singularity**, surrounded by an **event horizon** (the \"point of no return\" for light and matter).\n",
|
||||
"\n",
|
||||
"---\n",
|
||||
"\n",
|
||||
"### **2. Supermassive Black Holes (Found in Galaxy Centers)**\n",
|
||||
"- **Mass**: Millions to billions of times the mass of the Sun.\n",
|
||||
"- **Formation Theories**:\n",
|
||||
" - **Accretion**: They may form from the gradual accumulation of matter (gas, dust, stars) over billions of years.\n",
|
||||
" - **Mergers**: Smaller black holes (or dense star clusters) could merge to form supermassive ones.\n",
|
||||
" - **Direct Collapse**: Some theories suggest they could form from the direct collapse of massive gas clouds in the early universe, bypassing the stellar life cycle.\n",
|
||||
"\n",
|
||||
"---\n",
|
||||
"\n",
|
||||
"### **3. Intermediate-Mass Black Holes**\n",
|
||||
"- **Mass**: Hundreds to thousands of solar masses.\n",
|
||||
"- **Formation**: Less understood. They might form through the mergers of stellar black holes or from the collapse of unusually massive stars.\n",
|
||||
"\n",
|
||||
"---\n",
|
||||
"\n",
|
||||
"### **4. Primordial Black Holes (Hypothetical)**\n",
|
||||
"- **Origin**: The early universe (within seconds after the Big Bang).\n",
|
||||
"- **Formation**: If density fluctuations in the early universe were extreme enough, regions of space could have collapsed directly into black holes without going through a stellar life cycle.\n",
|
||||
"- **Status**: These are still theoretical and have not been definitively observed.\n",
|
||||
"\n",
|
||||
"---\n",
|
||||
"\n",
|
||||
"### **Key Concepts**\n",
|
||||
"- **Event Horizon**: The boundary around a black hole from which nothing (not even light) can escape.\n",
|
||||
"- **Singularity**: The infinitely dense core of a black hole where the laws of physics as we know them break down.\n",
|
||||
"- **Gravitational Collapse**: The process by which gravity compresses matter into an extremely small space, creating the extreme conditions of a black hole.\n",
|
||||
"\n",
|
||||
"---\n",
|
||||
"\n",
|
||||
"### **What Happens to the Star?**\n",
|
||||
"- If the star is **not massive enough** (below ~20–25 solar masses), it may end as a **neutron star** or **white dwarf** instead of a black hole.\n",
|
||||
"- Only the **core** of the star collapses into a black hole; the outer layers are expelled in the supernova explosion.\n",
|
||||
"\n",
|
||||
"Would you like to explore the effects of black holes on spacetime or their role in the universe?\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"from langchain_core.messages import AIMessage, HumanMessage, SystemMessage\n",
|
||||
"\n",
|
||||
"messages = [\n",
|
||||
" SystemMessage(content=\"You are a helpful AI assistant with expertise in science.\"),\n",
|
||||
" HumanMessage(content=\"What are black holes?\"),\n",
|
||||
" AIMessage(\n",
|
||||
" content=\"Black holes are regions of spacetime where gravity is so strong that nothing, including light, can escape from them.\"\n",
|
||||
" ),\n",
|
||||
" HumanMessage(content=\"How are they formed?\"),\n",
|
||||
"]\n",
|
||||
"\n",
|
||||
"response = chat.invoke(messages)\n",
|
||||
"print(response.content)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "a4d21c6a",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Parameters\n",
|
||||
"\n",
|
||||
"You can customize the chat model behavior using various parameters:"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 6,
|
||||
"id": "b4c83fb2",
|
||||
"metadata": {},
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"DNA, or deoxyribonucleic acid, is a molecule that contains the genetic instructions used in the development and function of all living organisms. It is often referred to as the \"building blocks of life\" because it carries the information necessary for the creation and growth of cells, tissues, and entire organisms. The DNA molecule is made up of two complementary strands of nucleotides that are twisted together in a double helix structure, with the sequence of these nucleotides determining the genetic code\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"# Initialize with custom parameters\n",
|
||||
"custom_chat = ChatNebius(\n",
|
||||
" model=\"meta-llama/Llama-3.3-70B-Instruct-fast\",\n",
|
||||
" max_tokens=100, # Limit response length\n",
|
||||
" top_p=0.01, # Lower nucleus sampling parameter for more deterministic responses\n",
|
||||
" request_timeout=30, # Timeout in seconds\n",
|
||||
" stop=[\"###\", \"\\n\\n\"], # Custom stop sequences\n",
|
||||
")\n",
|
||||
"\n",
|
||||
"response = custom_chat.invoke(\"Explain what DNA is in exactly 3 sentences.\")\n",
|
||||
"print(response.content)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "ea9f237c",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"You can also pass parameters at invocation time:"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 7,
|
||||
"id": "cd4e83c1",
|
||||
"metadata": {},
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"Why do programmers prefer dark mode?\n",
|
||||
"\n",
|
||||
"Because light attracts bugs.\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"# Standard model\n",
|
||||
"standard_chat = ChatNebius(model=\"meta-llama/Llama-3.3-70B-Instruct-fast\")\n",
|
||||
"\n",
|
||||
"# Override parameters at invocation time\n",
|
||||
"response = standard_chat.invoke(\n",
|
||||
" \"Tell me a joke about programming\",\n",
|
||||
" temperature=0.9, # More creative for jokes\n",
|
||||
" max_tokens=50, # Keep it short\n",
|
||||
")\n",
|
||||
"\n",
|
||||
"print(response.content)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "3e8a40f1",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Async Support\n",
|
||||
"\n",
|
||||
"ChatNebius supports async operations:"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 8,
|
||||
"id": "8fc36122",
|
||||
"metadata": {},
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"Async response: <think>\n",
|
||||
"Okay, the user is asking for the capital of France. Let me think. I know that France is a country in Europe, and its capital is Paris. But wait, I should make sure I'm not confusing it with another country. For example, Germany's capital is Berlin, and Spain's is Madrid. France's capital is definitely Paris. I remember that Paris is a major city known for landmarks like the Eiffel Tower and the Louvre Museum. Also, the French government is based there, with the Elysée Palace as the official residence of the President. I don't think there's any ambiguity here. The answer should be straightforward. Just need to confirm once more to avoid any mistakes.\n",
|
||||
"</think>\n",
|
||||
"\n",
|
||||
"The capital of France is **Paris**. It is a major global city known for its cultural, artistic, and historical significance, as well as landmarks such as the Eiffel Tower, Louvre Museum, and Notre-Dame Cathedral.\n",
|
||||
"\n",
|
||||
"Async streaming:\n",
|
||||
"<think>\n",
|
||||
"Okay, the user is asking for the capital of Germany. Let me think. I know that Germany is a country in Europe, and I remember that Berlin is the capital. Wait, but I should make sure. Sometimes people confuse capitals with other major cities, like Munich or Frankfurt. But no, Berlin is definitely the capital. It's where the government is located, and it's a major city. Let me double-check. Yes, after reunification in 1990, Berlin became the capital again. Before that, Bonn was the capital, but that was during the division of Germany. So the answer should be Berlin. I should also mention that it's the largest city in Germany. That way, the user gets a complete answer.\n",
|
||||
"</think>\n",
|
||||
"\n",
|
||||
"The capital of Germany is **Berlin**. It is also the largest city in the country and serves as the political, cultural, and economic center of Germany. Berlin became the capital in 1990 following the reunification of East and West Germany."
|
||||
]
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"import asyncio\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"async def generate_async():\n",
|
||||
" response = await chat.ainvoke(\"What is the capital of France?\")\n",
|
||||
" print(\"Async response:\", response.content)\n",
|
||||
"\n",
|
||||
" # Async streaming\n",
|
||||
" print(\"\\nAsync streaming:\")\n",
|
||||
" async for chunk in chat.astream(\"What is the capital of Germany?\"):\n",
|
||||
" print(chunk.content, end=\"\", flush=True)\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"await generate_async()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "a53a6bab",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Available Models\n",
|
||||
"\n",
|
||||
"The full list of supported models can be found in the [Nebius AI Studio Documentation](https://studio.nebius.com/)."
|
||||
]
|
||||
},
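Any model id from that catalog can be passed straight to the constructor. As an example, here is the same `Llama-3.3-70B-Instruct-fast` model used elsewhere on this page (assumes `NEBIUS_API_KEY` is set as shown in the setup section):

```python
# Same constructor as above, just with a different model id from the Nebius catalog.
fast_chat = ChatNebius(model="meta-llama/Llama-3.3-70B-Instruct-fast")
print(fast_chat.invoke("Say hello in one short sentence").content)
```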
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "4aa82e17",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Chaining\n",
|
||||
"\n",
|
||||
"You can use `ChatNebius` in LangChain chains and agents:"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 9,
|
||||
"id": "7e78e429",
|
||||
"metadata": {},
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"<think>\n",
|
||||
"Okay, the user asked me to explain how the internet works, but I need to do it in the style of Shakespeare. Let me start by recalling how the internet functions. It's a network of interconnected devices communicating via protocols like TCP/IP. Data is broken into packets, sent through routers, and reassembled at the destination.\n",
|
||||
"\n",
|
||||
"Now, translating that into Shakespearean language. I should use archaic terms and a poetic structure. Words like \"thou,\" \"doth,\" \"hark,\" and \"verily\" come to mind. Maybe start with a metaphor, like comparing the internet to a vast tapestry or a web. Mention nodes as \"nodes\" or \"stations,\" data packets as \"messengers\" or \"letters.\" Routers could be \"wayfarers\" or \"guides.\" The process of breaking data into packets might be likened to dividing a letter into parts for delivery. Emphasize the global aspect with \"across the globe\" or \"far and wide.\" Conclude with a flourish, perhaps a metaphor about connection and knowledge.\n",
|
||||
"\n",
|
||||
"I need to ensure the explanation is accurate but wrapped in the poetic and dramatic style of Shakespeare. Avoid modern jargon, use iambic pentameter if possible, and keep the flow natural. Let me piece it together step by step, checking that each part of the internet's function is covered metaphorically.\n",
|
||||
"</think>\n",
|
||||
"\n",
|
||||
"Hark! List thy ear, good friend, to this most wondrous tale, \n",
|
||||
"Of threads unseen that bind the world in one grand tale. \n",
|
||||
"The Internet, a net most vast, doth span the globe, \n",
|
||||
"A labyrinth of light, where thoughts and data rove. \n",
|
||||
"\n",
|
||||
"Behold! Each device, a node, doth hum and sing, \n",
|
||||
"Linked by wires and waves, where signals doth spring. \n",
|
||||
"They speak in tongues of ones and naughts, so pure, \n",
|
||||
"A code most ancient, yet evermore secure. \n",
|
||||
"\n",
|
||||
"When thou dost send a thought, or word, or song, \n",
|
||||
"It breaks to parcels small, like letters on a long. \n",
|
||||
"Each parcel, a messenger, doth seek its way, \n",
|
||||
"Through routers wise, who guide them 'cross the day. \n",
|
||||
"\n",
|
||||
"These wayfarers, with logic keen and bright, \n",
|
||||
"Choose paths most swift, through highways of light. \n",
|
||||
"They leap from tower to tower, far and wide, \n",
|
||||
"Till each parcel finds its mark, and joins the guide. \n",
|
||||
"\n",
|
||||
"Then, like a scroll unrolled, the message grows, \n",
|
||||
"A tapestry of bits, in order it flows. \n",
|
||||
"Thus, thou dost speak to friend, or seek a tome, \n",
|
||||
"And lo! The world doth answer, quick as home. \n",
|
||||
"\n",
|
||||
"So mark this truth: though vast, it's but a thread, \n",
|
||||
"A web of minds, where knowledge is widespread. \n",
|
||||
"The Internet, a stage where all may play, \n",
|
||||
"And none shall be alone, though far away.\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"from langchain_core.output_parsers import StrOutputParser\n",
|
||||
"from langchain_core.prompts import ChatPromptTemplate\n",
|
||||
"\n",
|
||||
"# Create a prompt template\n",
|
||||
"prompt = ChatPromptTemplate.from_messages(\n",
|
||||
" [\n",
|
||||
" (\n",
|
||||
" \"system\",\n",
|
||||
" \"You are a helpful assistant that answers in the style of {character}.\",\n",
|
||||
" ),\n",
|
||||
" (\"human\", \"{query}\"),\n",
|
||||
" ]\n",
|
||||
")\n",
|
||||
"\n",
|
||||
"# Create a chain\n",
|
||||
"chain = prompt | chat | StrOutputParser()\n",
|
||||
"\n",
|
||||
"# Invoke the chain\n",
|
||||
"response = chain.invoke(\n",
|
||||
" {\"character\": \"Shakespeare\", \"query\": \"Explain how the internet works\"}\n",
|
||||
")\n",
|
||||
"\n",
|
||||
"print(response)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "f7a35f40",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## API reference\n",
|
||||
"\n",
|
||||
"For more details about the Nebius AI Studio API, visit the [Nebius AI Studio Documentation](https://studio.nebius.com/api-reference)."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "354ffc01",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": []
|
||||
}
|
||||
],
|
||||
"metadata": {
|
||||
"kernelspec": {
|
||||
"display_name": ".venv",
|
||||
"language": "python",
|
||||
"name": "python3"
|
||||
},
|
||||
"language_info": {
|
||||
"codemirror_mode": {
|
||||
"name": "ipython",
|
||||
"version": 3
|
||||
},
|
||||
"file_extension": ".py",
|
||||
"mimetype": "text/x-python",
|
||||
"name": "python",
|
||||
"nbconvert_exporter": "python",
|
||||
"pygments_lexer": "ipython3",
|
||||
"version": "3.12.8"
|
||||
}
|
||||
},
|
||||
"nbformat": 4,
|
||||
"nbformat_minor": 5
|
||||
}
|
||||
@@ -318,7 +318,7 @@
|
||||
"source": [
|
||||
"### Code Generation\n",
|
||||
"\n",
|
||||
"These models accept the same arguments and input structure as regular chat models, but they tend to perform better on code-generation and structured code tasks. An example of this is `meta/codellama-70b`."
|
||||
"These models accept the same arguments and input structure as regular chat models, but they tend to perform better on code-genreation and structured code tasks. An example of this is `meta/codellama-70b`."
|
||||
]
|
||||
},
|
||||
{
|
||||
|
||||
@@ -39,10 +39,9 @@
|
||||
"\n",
|
||||
"## Setup\n",
|
||||
"\n",
|
||||
"First, follow [these instructions](https://github.com/ollama/ollama?tab=readme-ov-file#ollama) to set up and run a local Ollama instance:\n",
|
||||
"First, follow [these instructions](https://github.com/jmorganca/ollama) to set up and run a local Ollama instance:\n",
|
||||
"\n",
|
||||
"* [Download](https://ollama.ai/download) and install Ollama onto the available supported platforms (including Windows Subsystem for Linux aka WSL, macOS, and Linux)\n",
|
||||
" * macOS users can install via Homebrew with `brew install ollama` and start with `brew services start ollama`\n",
|
||||
"* [Download](https://ollama.ai/download) and install Ollama onto the available supported platforms (including Windows Subsystem for Linux)\n",
|
||||
"* Fetch available LLM model via `ollama pull <name-of-model>`\n",
|
||||
" * View a list of available models via the [model library](https://ollama.ai/library)\n",
|
||||
" * e.g., `ollama pull llama3`\n",
|
||||
@@ -55,7 +54,7 @@
|
||||
"* Specify the exact version of the model of interest as such `ollama pull vicuna:13b-v1.5-16k-q4_0` (View the [various tags for the `Vicuna`](https://ollama.ai/library/vicuna/tags) model in this instance)\n",
|
||||
"* To view all pulled models, use `ollama list`\n",
|
||||
"* To chat directly with a model from the command line, use `ollama run <name-of-model>`\n",
|
||||
"* View the [Ollama documentation](https://github.com/ollama/ollama/tree/main/docs) for more commands. You can run `ollama help` in the terminal to see available commands.\n"
|
||||
"* View the [Ollama documentation](https://github.com/jmorganca/ollama) for more commands. Run `ollama help` in the terminal to see available commands too.\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -73,8 +72,8 @@
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# os.environ[\"LANGSMITH_TRACING\"] = \"true\"\n",
|
||||
"# os.environ[\"LANGSMITH_API_KEY\"] = getpass.getpass(\"Enter your LangSmith API key: \")"
|
||||
"# os.environ[\"LANGSMITH_API_KEY\"] = getpass.getpass(\"Enter your LangSmith API key: \")\n",
|
||||
"# os.environ[\"LANGSMITH_TRACING\"] = \"true\""
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -160,15 +159,17 @@
|
||||
{
|
||||
"data": {
|
||||
"text/plain": [
|
||||
"AIMessage(content='The translation of \"I love programming\" in French is:\\n\\n\"J\\'adore le programmation.\"', additional_kwargs={}, response_metadata={'model': 'llama3.1', 'created_at': '2025-06-25T18:43:00.483666Z', 'done': True, 'done_reason': 'stop', 'total_duration': 619971208, 'load_duration': 27793125, 'prompt_eval_count': 35, 'prompt_eval_duration': 36354583, 'eval_count': 22, 'eval_duration': 555182667, 'model_name': 'llama3.1'}, id='run--348bb5ef-9dd9-4271-bc7e-a9ddb54c28c1-0', usage_metadata={'input_tokens': 35, 'output_tokens': 22, 'total_tokens': 57})"
|
||||
"AIMessage(content='The translation of \"I love programming\" from English to French is:\\n\\n\"J\\'adore programmer.\"', response_metadata={'model': 'llama3.1', 'created_at': '2024-08-19T16:05:32.81965Z', 'message': {'role': 'assistant', 'content': ''}, 'done_reason': 'stop', 'done': True, 'total_duration': 2167842917, 'load_duration': 54222584, 'prompt_eval_count': 35, 'prompt_eval_duration': 893007000, 'eval_count': 22, 'eval_duration': 1218962000}, id='run-0863daa2-43bf-4a43-86cc-611b23eae466-0', usage_metadata={'input_tokens': 35, 'output_tokens': 22, 'total_tokens': 57})"
|
||||
]
|
||||
},
|
||||
"execution_count": 5,
|
||||
"execution_count": 10,
|
||||
"metadata": {},
|
||||
"output_type": "execute_result"
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"from langchain_core.messages import AIMessage\n",
|
||||
"\n",
|
||||
"messages = [\n",
|
||||
" (\n",
|
||||
" \"system\",\n",
|
||||
@@ -190,9 +191,9 @@
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"The translation of \"I love programming\" in French is:\n",
|
||||
"The translation of \"I love programming\" from English to French is:\n",
|
||||
"\n",
|
||||
"\"J'adore le programmation.\"\n"
|
||||
"\"J'adore programmer.\"\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
@@ -219,10 +220,10 @@
|
||||
{
|
||||
"data": {
|
||||
"text/plain": [
|
||||
"AIMessage(content='\"Programmieren ist meine Leidenschaft.\"\\n\\n(I translated \"programming\" to the German word \"Programmieren\", and added \"ist meine Leidenschaft\" which means \"is my passion\")', additional_kwargs={}, response_metadata={'model': 'llama3.1', 'created_at': '2025-06-25T18:43:29.350032Z', 'done': True, 'done_reason': 'stop', 'total_duration': 1194744459, 'load_duration': 26982500, 'prompt_eval_count': 30, 'prompt_eval_duration': 117043458, 'eval_count': 41, 'eval_duration': 1049892167, 'model_name': 'llama3.1'}, id='run--efc6436e-2346-43d9-8118-3c20b3cdf0d0-0', usage_metadata={'input_tokens': 30, 'output_tokens': 41, 'total_tokens': 71})"
|
||||
"AIMessage(content='Das Programmieren ist mir ein Leidenschaft! (That\\'s \"Programming is my passion!\" in German.) Would you like me to translate anything else?', response_metadata={'model': 'llama3.1', 'created_at': '2024-08-19T16:05:34.893548Z', 'message': {'role': 'assistant', 'content': ''}, 'done_reason': 'stop', 'done': True, 'total_duration': 2045997333, 'load_duration': 22584792, 'prompt_eval_count': 30, 'prompt_eval_duration': 213210000, 'eval_count': 32, 'eval_duration': 1808541000}, id='run-d18e1c6b-50e0-4b1d-b23a-973fa058edad-0', usage_metadata={'input_tokens': 30, 'output_tokens': 32, 'total_tokens': 62})"
|
||||
]
|
||||
},
|
||||
"execution_count": 7,
|
||||
"execution_count": 12,
|
||||
"metadata": {},
|
||||
"output_type": "execute_result"
|
||||
}
|
||||
@@ -257,7 +258,7 @@
|
||||
"source": [
|
||||
"## Tool calling\n",
|
||||
"\n",
|
||||
"We can use [tool calling](/docs/concepts/tool_calling/) with an LLM [that has been fine-tuned for tool use](https://ollama.com/search?&c=tools) such as `llama3.1`:\n",
|
||||
"We can use [tool calling](https://blog.langchain.dev/improving-core-tool-interfaces-and-docs-in-langchain/) with an LLM [that has been fine-tuned for tool use](https://ollama.com/library/llama3.1):\n",
|
||||
"\n",
|
||||
"```\n",
|
||||
"ollama pull llama3.1\n",
|
||||
@@ -273,17 +274,23 @@
|
||||
"metadata": {},
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"[{'name': 'validate_user', 'args': {'addresses': ['123 Fake St, Boston, MA', '234 Pretend Boulevard, Houston, TX'], 'user_id': '123'}, 'id': 'aef33a32-a34b-4b37-b054-e0d85584772f', 'type': 'tool_call'}]\n"
|
||||
]
|
||||
"data": {
|
||||
"text/plain": [
|
||||
"[{'name': 'validate_user',\n",
|
||||
" 'args': {'addresses': '[\"123 Fake St, Boston, MA\", \"234 Pretend Boulevard, Houston, TX\"]',\n",
|
||||
" 'user_id': '123'},\n",
|
||||
" 'id': '40fe3de0-500c-4b91-9616-5932a929e640',\n",
|
||||
" 'type': 'tool_call'}]"
|
||||
]
|
||||
},
|
||||
"execution_count": 13,
|
||||
"metadata": {},
|
||||
"output_type": "execute_result"
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"from typing import List\n",
|
||||
"\n",
|
||||
"from langchain_core.messages import AIMessage\n",
|
||||
"from langchain_core.tools import tool\n",
|
||||
"from langchain_ollama import ChatOllama\n",
|
||||
"\n",
|
||||
@@ -309,9 +316,7 @@
|
||||
" \"123 Fake St in Boston MA and 234 Pretend Boulevard in \"\n",
|
||||
" \"Houston TX.\"\n",
|
||||
")\n",
|
||||
"\n",
|
||||
"if isinstance(result, AIMessage) and result.tool_calls:\n",
|
||||
" print(result.tool_calls)"
|
||||
"result.tool_calls"
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -328,16 +333,6 @@
|
||||
"Be sure to update Ollama so that you have the most recent version to support multi-modal."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "69920d39",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"%pip install pillow"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 15,
|
||||
@@ -472,13 +467,14 @@
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"Here is my thought process:\n",
|
||||
"The user is asking for the value of 3 raised to the power of 3, which is a basic exponentiation operation.\n",
|
||||
"This question is asking for the result of 3 raised to the power of 3, which is a basic mathematical operation. \n",
|
||||
"\n",
|
||||
"Here is my response:\n",
|
||||
"The expression 3^3 means 3 raised to the power of 3. To calculate this, you multiply the base number (3) by itself as many times as its exponent (3):\n",
|
||||
"\n",
|
||||
"3^3 (read as \"3 to the power of 3\") equals 27. \n",
|
||||
"3 * 3 * 3 = 27\n",
|
||||
"\n",
|
||||
"This calculation is performed by multiplying 3 by itself three times: 3*3*3 = 27.\n"
|
||||
"So, 3^3 equals 27.\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
@@ -512,7 +508,7 @@
|
||||
"source": [
|
||||
"## API reference\n",
|
||||
"\n",
|
||||
"For detailed documentation of all ChatOllama features and configurations head to the [API reference](https://python.langchain.com/api_reference/ollama/chat_models/langchain_ollama.chat_models.ChatOllama.html)."
|
||||
"For detailed documentation of all ChatOllama features and configurations head to the API reference: https://python.langchain.com/api_reference/ollama/chat_models/langchain_ollama.chat_models.ChatOllama.html"
|
||||
]
|
||||
}
|
||||
],
|
||||
|
||||
File diff suppressed because one or more lines are too long
@@ -106,7 +106,9 @@
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"chat = ChatPerplexity(temperature=0, pplx_api_key=\"YOUR_API_KEY\", model=\"sonar\")"
|
||||
"chat = ChatPerplexity(\n",
|
||||
" temperature=0, pplx_api_key=\"YOUR_API_KEY\", model=\"llama-3-sonar-small-32k-online\"\n",
|
||||
")"
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -130,7 +132,7 @@
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"chat = ChatPerplexity(temperature=0, model=\"sonar\")"
|
||||
"chat = ChatPerplexity(temperature=0, model=\"llama-3.1-sonar-small-128k-online\")"
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -198,7 +200,7 @@
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"chat = ChatPerplexity(temperature=0, model=\"sonar\")\n",
|
||||
"chat = ChatPerplexity(temperature=0, model=\"llama-3.1-sonar-small-128k-online\")\n",
|
||||
"prompt = ChatPromptTemplate.from_messages([(\"human\", \"Tell me a joke about {topic}\")])\n",
|
||||
"chain = prompt | chat\n",
|
||||
"response = chain.invoke({\"topic\": \"cats\"})\n",
|
||||
@@ -233,7 +235,7 @@
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"chat = ChatPerplexity(temperature=0.7, model=\"sonar\")\n",
|
||||
"chat = ChatPerplexity(temperature=0.7, model=\"llama-3.1-sonar-small-128k-online\")\n",
|
||||
"response = chat.invoke(\n",
|
||||
" \"Tell me a joke about cats\", extra_body={\"search_recency_filter\": \"week\"}\n",
|
||||
")\n",
|
||||
@@ -282,7 +284,7 @@
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"chat = ChatPerplexity(temperature=0.7, model=\"sonar\")\n",
|
||||
"chat = ChatPerplexity(temperature=0.7, model=\"llama-3.1-sonar-small-128k-online\")\n",
|
||||
"\n",
|
||||
"for chunk in chat.stream(\"Give me a list of famous tourist attractions in Pakistan\"):\n",
|
||||
" print(chunk.content, end=\"\", flush=True)"
|
||||
|
||||
@@ -36,7 +36,7 @@
|
||||
"\n",
|
||||
"## Setup\n",
|
||||
"\n",
|
||||
"To access Reka models you'll need to create a Reka developer account, get an API key, and install the `langchain_community` integration package and the reka python package via 'pip install reka-api'.\n",
|
||||
"To access Reka models you'll need to create an Reka developer account, get an API key, and install the `langchain_community` integration package and the reka python package via 'pip install reka-api'.\n",
|
||||
"\n",
|
||||
"### Credentials\n",
|
||||
"\n",
|
||||
@@ -280,7 +280,7 @@
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"Use with Tavily api search"
|
||||
"Use use with tavtly api search"
|
||||
]
|
||||
},
|
||||
{
|
||||
|
||||
@@ -22,7 +22,7 @@
|
||||
"> script creation, resume generation, article writing, code generation, data analysis, and content\n",
|
||||
"> analysis.\n",
|
||||
"\n",
|
||||
"See [more information](https://cloud.tencent.com/document/product/1729) for more details."
|
||||
"See for [more information](https://cloud.tencent.com/document/product/1729)."
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -98,7 +98,7 @@
|
||||
}
|
||||
},
|
||||
"source": [
|
||||
"## Using ChatHunyuan with Streaming"
|
||||
"## For ChatHunyuan with Streaming"
|
||||
]
|
||||
},
|
||||
{
|
||||
|
||||
@@ -18,7 +18,7 @@
|
||||
"# ChatTogether\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"This page will help you get started with Together AI [chat models](../../concepts/chat_models.mdx). For detailed documentation of all ChatTogether features and configurations, head to the [API reference](https://python.langchain.com/api_reference/together/chat_models/langchain_together.chat_models.ChatTogether.html).\n",
|
||||
"This page will help you get started with Together AI [chat models](../../concepts/chat_models.mdx). For detailed documentation of all ChatTogether features and configurations head to the [API reference](https://python.langchain.com/api_reference/together/chat_models/langchain_together.chat_models.ChatTogether.html).\n",
|
||||
"\n",
|
||||
"[Together AI](https://www.together.ai/) offers an API to query [50+ leading open-source models](https://docs.together.ai/docs/chat-models)\n",
|
||||
"\n",
|
||||
@@ -40,7 +40,7 @@
|
||||
"\n",
|
||||
"### Credentials\n",
|
||||
"\n",
|
||||
"Head to [this page](https://api.together.ai) to sign up to Together and generate an API key. Once you've done this, set the TOGETHER_API_KEY environment variable:"
|
||||
"Head to [this page](https://api.together.ai) to sign up to Together and generate an API key. Once you've done this set the TOGETHER_API_KEY environment variable:"
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -81,7 +81,7 @@
|
||||
"source": [
|
||||
"### Installation\n",
|
||||
"\n",
|
||||
"The LangChain Together integration is included in the `langchain-together` package:"
|
||||
"The LangChain Together integration lives in the `langchain-together` package:"
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -187,7 +187,7 @@
|
||||
"source": [
|
||||
"## Chaining\n",
|
||||
"\n",
|
||||
"We can [chain](../../how_to/sequence.ipynb) our model with a prompt template as follows:"
|
||||
"We can [chain](../../how_to/sequence.ipynb) our model with a prompt template like so:"
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -237,7 +237,7 @@
|
||||
"source": [
|
||||
"## API reference\n",
|
||||
"\n",
|
||||
"For detailed documentation of all ChatTogether features and configurations, head to the API reference: https://python.langchain.com/api_reference/together/chat_models/langchain_together.chat_models.ChatTogether.html"
|
||||
"For detailed documentation of all ChatTogether features and configurations head to the API reference: https://python.langchain.com/api_reference/together/chat_models/langchain_together.chat_models.ChatTogether.html"
|
||||
]
|
||||
}
|
||||
],
|
||||
|
||||
@@ -20,7 +20,7 @@
|
||||
"vLLM can be deployed as a server that mimics the OpenAI API protocol. This allows vLLM to be used as a drop-in replacement for applications using OpenAI API. This server can be queried in the same format as OpenAI API.\n",
|
||||
"\n",
|
||||
"## Overview\n",
|
||||
"This will help you get started with vLLM [chat models](/docs/concepts/chat_models), which leverages the `langchain-openai` package. For detailed documentation of all `ChatOpenAI` features and configurations head to the [API reference](https://python.langchain.com/api_reference/openai/chat_models/langchain_openai.chat_models.base.ChatOpenAI.html).\n",
|
||||
"This will help you get started with vLLM [chat models](/docs/concepts/chat_models), which leverage the `langchain-openai` package. For detailed documentation of all `ChatOpenAI` features and configurations head to the [API reference](https://python.langchain.com/api_reference/openai/chat_models/langchain_openai.chat_models.base.ChatOpenAI.html).\n",
|
||||
"\n",
|
||||
"### Integration details\n",
|
||||
"\n",
|
||||
@@ -29,7 +29,7 @@
|
||||
"| [ChatOpenAI](https://python.langchain.com/api_reference/openai/chat_models/langchain_openai.chat_models.base.ChatOpenAI.html) | [langchain_openai](https://python.langchain.com/api_reference/openai/) | ✅ | beta | ❌ |  |  |\n",
|
||||
"\n",
|
||||
"### Model features\n",
|
||||
"Specific model features, such as tool calling, support for multi-modal inputs, support for token-level streaming, etc., will depend on the hosted model.\n",
|
||||
"Specific model features-- such as tool calling, support for multi-modal inputs, support for token-level streaming, etc.-- will depend on the hosted model.\n",
|
||||
"\n",
|
||||
"## Setup\n",
|
||||
"\n",
|
||||
|
||||
@@ -6,7 +6,7 @@
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"---\n",
|
||||
"sidebar_label: Volc Engine Maas\n",
|
||||
"sidebar_label: Volc Enging Maas\n",
|
||||
"---"
|
||||
]
|
||||
},
|
||||
|
||||
@@ -303,7 +303,7 @@
|
||||
"source": [
|
||||
"### A note on tool binding\n",
|
||||
"\n",
|
||||
"The `ChatWriter.bind_tools()` method does not create a new instance with bound tools, but stores the received `tools` and `tool_choice` in the initial class instance attributes to pass them as parameters during the Palmyra LLM call while using `ChatWriter` invocation. This approach allows the support of different tool types, e.g. `function` and `graph`. `Graph` is one of the remotely called Writer Palmyra tools. For further information, visit our [docs](https://dev.writer.com/api-guides/knowledge-graph#knowledge-graph). \n",
|
||||
"The `ChatWriter.bind_tools()` method does not create new instance with bound tools, but stores the received `tools` and `tool_choice` in the initial class instance attributes to pass them as parameters during the Palmyra LLM call while using `ChatWriter` invocation. This approach allows the support of different tool types, e.g. `function` and `graph`. `Graph` is one of the remotely called Writer Palmyra tools. For further information visit our [docs](https://dev.writer.com/api-guides/knowledge-graph#knowledge-graph). \n",
|
||||
"\n",
|
||||
"For more information about tool usage in LangChain, visit the [LangChain tool calling documentation](https://python.langchain.com/docs/concepts/tool_calling/)."
|
||||
]
|
||||
@@ -373,7 +373,7 @@
|
||||
"source": [
|
||||
"## Prompt templates\n",
|
||||
"\n",
|
||||
"[Prompt templates](https://python.langchain.com/docs/concepts/prompt_templates/) help to translate user input and parameters into instructions for a language model. You can use `ChatWriter` with a prompt template like so:\n"
|
||||
"[Prompt templates](https://python.langchain.com/docs/concepts/prompt_templates/) help to translate user input and parameters into instructions for a language model. You can use `ChatWriter` with a prompt templates like so:\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -411,7 +411,7 @@
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## API reference\n",
|
||||
"For detailed documentation of all ChatWriter features and configurations, head to the [API reference](https://python.langchain.com/api_reference/writer/chat_models/langchain_writer.chat_models.ChatWriter.html#langchain_writer.chat_models.ChatWriter).\n",
|
||||
"For detailed documentation of all ChatWriter features and configurations head to the [API reference](https://python.langchain.com/api_reference/writer/chat_models/langchain_writer.chat_models.ChatWriter.html#langchain_writer.chat_models.ChatWriter).\n",
|
||||
"\n",
|
||||
"## Additional resources\n",
|
||||
"You can find information about Writer's models (including costs, context windows, and supported input types) and tools in the [Writer docs](https://dev.writer.com/home)."
|
||||
|
||||
@@ -29,7 +29,7 @@
|
||||
"\n",
|
||||
"### Credentials\n",
|
||||
"\n",
|
||||
"Head to [01.AI](https://platform.01.ai) to sign up for 01.AI and generate an API key. Once you've done this set the `YI_API_KEY` environment variable:"
|
||||
"Head to [01.AI](https://platform.01.ai) to sign up to 01.AI and generate an API key. Once you've done this set the `YI_API_KEY` environment variable:"
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -197,7 +197,7 @@
|
||||
"source": [
|
||||
"## API reference\n",
|
||||
"\n",
|
||||
"For detailed documentation of all ChatYi features and configurations, head to the API reference: https://python.langchain.com/api_reference/community/chat_models/langchain_community.chat_models.yi.ChatYi.html"
|
||||
"For detailed documentation of all ChatYi features and configurations head to the API reference: https://python.langchain.com/api_reference/community/chat_models/langchain_community.chat_models.yi.ChatYi.html"
|
||||
]
|
||||
}
|
||||
],
|
||||
|
||||
@@ -56,7 +56,7 @@
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Setting Up Your API Key\n",
|
||||
"Sign in to [ZHIPU AI](https://open.bigmodel.cn/login?redirect=%2Fusercenter%2Fapikeys) for an API Key to access our models."
|
||||
"Sign in to [ZHIPU AI](https://open.bigmodel.cn/login?redirect=%2Fusercenter%2Fapikeys) for the an API Key to access our models."
|
||||
]
|
||||
},
|
||||
{
|
||||
|
||||
@@ -7,7 +7,7 @@
|
||||
"source": [
|
||||
"# Facebook Messenger\n",
|
||||
"\n",
|
||||
"This notebook shows how to load data from Facebook into a format you can fine-tune on. The overall steps are:\n",
|
||||
"This notebook shows how to load data from Facebook in a format you can fine-tune on. The overall steps are:\n",
|
||||
"\n",
|
||||
"1. Download your messenger data to disk.\n",
|
||||
"2. Create the Chat Loader and call `loader.load()` (or `loader.lazy_load()`) to perform the conversion.\n",
|
||||
@@ -25,7 +25,7 @@
|
||||
"\n",
|
||||
"## 1. Download Data\n",
|
||||
"\n",
|
||||
"To download your own messenger data, follow the instructions [here](https://www.zapptales.com/en/download-facebook-messenger-chat-history-how-to/). IMPORTANT - make sure to download them in JSON format (not HTML).\n",
|
||||
"To download your own messenger data, following instructions [here](https://www.zapptales.com/en/download-facebook-messenger-chat-history-how-to/). IMPORTANT - make sure to download them in JSON format (not HTML).\n",
|
||||
"\n",
|
||||
"We are hosting an example dump at [this google drive link](https://drive.google.com/file/d/1rh1s1o2i7B-Sk1v9o8KNgivLVGwJ-osV/view?usp=sharing) that we will use in this walkthrough."
|
||||
]
|
||||
|
||||
@@ -70,7 +70,7 @@
|
||||
"source": [
|
||||
"## 2. Create the Chat Loader\n",
|
||||
"\n",
|
||||
"Provide the loader with the file path to the zip directory. You can optionally specify the user id that maps to an ai message as well as configure whether to merge message runs."
|
||||
"Provide the loader with the file path to the zip directory. You can optionally specify the user id that maps to an ai message as well an configure whether to merge message runs."
|
||||
]
|
||||
},
|
||||
{
|
||||
|
||||
@@ -7,7 +7,7 @@
|
||||
"source": [
|
||||
"# LangSmith Chat Datasets\n",
|
||||
"\n",
|
||||
"This notebook demonstrates an easy way to load a LangSmith chat dataset and fine-tune a model on that data.\n",
|
||||
"This notebook demonstrates an easy way to load a LangSmith chat dataset fine-tune a model on that data.\n",
|
||||
"The process is simple and comprises 3 steps.\n",
|
||||
"\n",
|
||||
"1. Create the chat dataset.\n",
|
||||
|
||||
@@ -16,7 +16,7 @@
|
||||
"\n",
|
||||
"## 1. Create message dump\n",
|
||||
"\n",
|
||||
"Currently (2023/08/23), this loader best supports a zip directory of files in the format generated by exporting your a direct message conversation from Slack. Follow the up-to-date instructions from slack on how to do so.\n",
|
||||
"Currently (2023/08/23) this loader best supports a zip directory of files in the format generated by exporting your a direct message conversation from Slack. Follow up-to-date instructions from slack on how to do so.\n",
|
||||
"\n",
|
||||
"We have an example in the LangChain repo."
|
||||
]
|
||||
@@ -43,7 +43,7 @@
|
||||
"source": [
|
||||
"## 2. Create the Chat Loader\n",
|
||||
"\n",
|
||||
"Provide the loader with the file path to the zip directory. You can optionally specify the user id that maps to an ai message as well as configure whether to merge message runs."
|
||||
"Provide the loader with the file path to the zip directory. You can optionally specify the user id that maps to an ai message as well an configure whether to merge message runs."
|
||||
]
|
||||
},
|
||||
{
|
||||
|
||||
@@ -10,7 +10,7 @@
|
||||
"This notebook shows how to use the Telegram chat loader. This class helps map exported Telegram conversations to LangChain chat messages.\n",
|
||||
"\n",
|
||||
"The process has three steps:\n",
|
||||
"1. Export the chat .txt file by copying chats from the Telegram app and pasting them in a file on your local computer\n",
|
||||
"1. Export the chat .txt file by copying chats from the Telegram app and pasting them in a file on your local computer\n",
|
||||
"2. Create the `TelegramChatLoader` with the file path pointed to the json file or directory of JSON files\n",
|
||||
"3. Call `loader.load()` (or `loader.lazy_load()`) to perform the conversion. Optionally use `merge_chat_runs` to combine message from the same sender in sequence, and/or `map_ai_messages` to convert messages from the specified sender to the \"AIMessage\" class.\n",
|
||||
"\n",
|
||||
@@ -92,7 +92,7 @@
|
||||
"source": [
|
||||
"## 2. Create the Chat Loader\n",
|
||||
"\n",
|
||||
"All that's required is the file path. You can optionally specify the user name that maps to an ai message as well as configure whether to merge message runs."
|
||||
"All that's required is the file path. You can optionally specify the user name that maps to an ai message as well an configure whether to merge message runs."
|
||||
]
|
||||
},
|
||||
{
|
||||
|
||||
@@ -13,7 +13,7 @@
|
||||
"\n",
|
||||
"\n",
|
||||
"The process has five steps:\n",
|
||||
"1. Open your chat in the WeChat desktop app. Select messages you need by mouse-dragging or right-click. Due to restrictions, you can select up to 100 messages at a time. `CMD`/`Ctrl` + `C` to copy.\n",
|
||||
"1. Open your chat in the WeChat desktop app. Select messages you need by mouse-dragging or right-click. Due to restrictions, you can select up to 100 messages once a time. `CMD`/`Ctrl` + `C` to copy.\n",
|
||||
"2. Create the chat .txt file by pasting selected messages in a file on your local computer.\n",
|
||||
"3. Copy the chat loader definition from below to a local file.\n",
|
||||
"4. Initialize the `WeChatChatLoader` with the file path pointed to the text file.\n",
|
||||
|
||||
@@ -152,7 +152,7 @@
|
||||
"id": "4a93dc2a",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"As `load` returns a list, it will block until all documents are loaded. To have better control over this process, you can also use the `lazy_load` method which returns an iterator instead:"
|
||||
"As `load` returns a list, it will block until all documents are loaded. To have better control over this process, you can also you the `lazy_load` method which returns an iterator instead:"
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -170,7 +170,7 @@
|
||||
"id": "3a124086",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"Keep in mind that by default the page content is empty and the metadata object contains all the information from the record. To create documents in a different way, pass in a record_handler function when creating the loader:"
|
||||
"Keep in mind that by default the page content is empty and the metadata object contains all the information from the record. To create documents in a different, pass in a record_handler function when creating the loader:"
|
||||
]
|
||||
},
|
||||
{
|
||||
|
||||
@@ -131,7 +131,7 @@
|
||||
"id": "4a93dc2a",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"As `load` returns a list, it will block until all documents are loaded. To have better control over this process, you can also use the `lazy_load` method which returns an iterator instead:"
|
||||
"As `load` returns a list, it will block until all documents are loaded. To have better control over this process, you can also you the `lazy_load` method which returns an iterator instead:"
|
||||
]
|
||||
},
|
||||
{
|
||||
|
||||
@@ -133,7 +133,7 @@
|
||||
"id": "4a93dc2a",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"As `load` returns a list, it will block until all documents are loaded. To have better control over this process, you can also use the `lazy_load` method which returns an iterator instead:"
|
||||
"As `load` returns a list, it will block until all documents are loaded. To have better control over this process, you can also you the `lazy_load` method which returns an iterator instead:"
|
||||
]
|
||||
},
|
||||
{
|
||||
|
||||
@@ -50,11 +50,11 @@
|
||||
"\n",
|
||||
"5) Setup any source you wish.\n",
|
||||
"\n",
|
||||
"6) Set destination as Local JSON, with specified destination path - let's say `/json_data`. Set up manual sync.\n",
|
||||
"6) Set destination as Local JSON, with specified destination path - lets say `/json_data`. Set up manual sync.\n",
|
||||
"\n",
|
||||
"7) Run the connection.\n",
|
||||
"\n",
|
||||
"7) To see what files are created, you can navigate to: `file:///tmp/airbyte_local`\n",
|
||||
"7) To see what files are create, you can navigate to: `file:///tmp/airbyte_local`\n",
|
||||
"\n",
|
||||
"8) Find your data and copy path. That path should be saved in the file variable below. It should start with `/tmp/airbyte_local`\n"
|
||||
]
|
||||
|
||||
@@ -138,7 +138,7 @@
|
||||
"id": "4a93dc2a",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"As `load` returns a list, it will block until all documents are loaded. To have better control over this process, you can also use the `lazy_load` method which returns an iterator instead:"
|
||||
"As `load` returns a list, it will block until all documents are loaded. To have better control over this process, you can also you the `lazy_load` method which returns an iterator instead:"
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -156,7 +156,7 @@
|
||||
"id": "3a124086",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"Keep in mind that by default the page content is empty and the metadata object contains all the information from the record. To create documents in a different way, pass in a record_handler function when creating the loader:"
|
||||
"Keep in mind that by default the page content is empty and the metadata object contains all the information from the record. To create documents in a different, pass in a record_handler function when creating the loader:"
|
||||
]
|
||||
},
|
||||
{
|
||||
|
||||
@@ -134,7 +134,7 @@
|
||||
"id": "4a93dc2a",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"As `load` returns a list, it will block until all documents are loaded. To have better control over this process, you can also use the `lazy_load` method which returns an iterator instead:"
|
||||
"As `load` returns a list, it will block until all documents are loaded. To have better control over this process, you can also you the `lazy_load` method which returns an iterator instead:"
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -152,7 +152,7 @@
|
||||
"id": "3a124086",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"Keep in mind that by default the page content is empty and the metadata object contains all the information from the record. To create documents in a different way, pass in a record_handler function when creating the loader:"
|
||||
"Keep in mind that by default the page content is empty and the metadata object contains all the information from the record. To create documents in a different, pass in a record_handler function when creating the loader:"
|
||||
]
|
||||
},
|
||||
{
|
||||
|
||||
@@ -131,7 +131,7 @@
|
||||
"id": "4a93dc2a",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"As `load` returns a list, it will block until all documents are loaded. To have better control over this process, you can also use the `lazy_load` method which returns an iterator instead:"
|
||||
"As `load` returns a list, it will block until all documents are loaded. To have better control over this process, you can also you the `lazy_load` method which returns an iterator instead:"
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -149,7 +149,7 @@
|
||||
"id": "3a124086",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"Keep in mind that by default the page content is empty and the metadata object contains all the information from the record. To create documents in a different way, pass in a record_handler function when creating the loader:"
|
||||
"Keep in mind that by default the page content is empty and the metadata object contains all the information from the record. To create documents in a different, pass in a record_handler function when creating the loader:"
|
||||
]
|
||||
},
|
||||
{
|
||||
|
||||
@@ -134,7 +134,7 @@
|
||||
"id": "4a93dc2a",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"As `load` returns a list, it will block until all documents are loaded. To have better control over this process, you can also use the `lazy_load` method which returns an iterator instead:"
|
||||
"As `load` returns a list, it will block until all documents are loaded. To have better control over this process, you can also you the `lazy_load` method which returns an iterator instead:"
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -152,7 +152,7 @@
|
||||
"id": "3a124086",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"Keep in mind that by default the page content is empty and the metadata object contains all the information from the record. To create documents in a different way, pass in a record_handler function when creating the loader:"
|
||||
"Keep in mind that by default the page content is empty and the metadata object contains all the information from the record. To create documents in a different, pass in a record_handler function when creating the loader:"
|
||||
]
|
||||
},
|
||||
{
|
||||
|
||||
@@ -135,7 +135,7 @@
|
||||
"id": "4a93dc2a",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"As `load` returns a list, it will block until all documents are loaded. To have better control over this process, you can also use the `lazy_load` method which returns an iterator instead:"
|
||||
"As `load` returns a list, it will block until all documents are loaded. To have better control over this process, you can also you the `lazy_load` method which returns an iterator instead:"
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -153,7 +153,7 @@
|
||||
"id": "3a124086",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"Keep in mind that by default the page content is empty and the metadata object contains all the information from the record. To create documents in a different way, pass in a record_handler function when creating the loader:"
|
||||
"Keep in mind that by default the page content is empty and the metadata object contains all the information from the record. To create documents in a different, pass in a record_handler function when creating the loader:"
|
||||
]
|
||||
},
|
||||
{
|
||||
|
||||
@@ -46,7 +46,7 @@
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Basic Usage\n",
|
||||
"To instantiate the loader you'll need a SQL query to execute, your MaxCompute endpoint and project name, and your access ID and secret access key. The access ID and secret access key can either be passed in direct via the `access_id` and `secret_access_key` parameters or they can be set as environment variables `MAX_COMPUTE_ACCESS_ID` and `MAX_COMPUTE_SECRET_ACCESS_KEY`."
|
||||
"To instantiate the loader you'll need a SQL query to execute, your MaxCompute endpoint and project name, and you access ID and secret access key. The access ID and secret access key can either be passed in direct via the `access_id` and `secret_access_key` parameters or they can be set as environment variables `MAX_COMPUTE_ACCESS_ID` and `MAX_COMPUTE_SECRET_ACCESS_KEY`."
|
||||
]
|
||||
},
|
||||
{
|
||||
|
||||
@@ -45,7 +45,7 @@
|
||||
"source": [
|
||||
"## Sample 1\n",
|
||||
"\n",
|
||||
"The first example uses a local file, which internally will be sent to Amazon Textract sync API [DetectDocumentText](https://docs.aws.amazon.com/textract/latest/dg/API_DetectDocumentText.html). \n",
|
||||
"The first example uses a local file, which internally will be send to Amazon Textract sync API [DetectDocumentText](https://docs.aws.amazon.com/textract/latest/dg/API_DetectDocumentText.html). \n",
|
||||
"\n",
|
||||
"Local files or URL endpoints like HTTP:// are limited to one page documents for Textract.\n",
|
||||
"Multi-page documents have to reside on S3. This sample file is a jpeg."
|
||||
@@ -218,7 +218,7 @@
|
||||
"source": [
|
||||
"## Sample 4\n",
|
||||
"\n",
|
||||
"You have the option to pass an additional parameter called `linearization_config` to the AmazonTextractPDFLoader which will determine how the text output will be linearized by the parser after Textract runs."
|
||||
"You have the option to pass an additional parameter called `linearization_config` to the AmazonTextractPDFLoader which will determine how the the text output will be linearized by the parser after Textract runs."
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -247,7 +247,7 @@
|
||||
"id": "b3e41b4d-b159-4274-89be-80d8159134ef",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Using the AmazonTextractPDFLoader in a LangChain chain (e.g. OpenAI)\n",
|
||||
"## Using the AmazonTextractPDFLoader in an LangChain chain (e. g. OpenAI)\n",
|
||||
"\n",
|
||||
"The AmazonTextractPDFLoader can be used in a chain the same way the other loaders are used.\n",
|
||||
"Textract itself does have a [Query feature](https://docs.aws.amazon.com/textract/latest/dg/API_Query.html), which offers similar functionality to the QA chain in this sample, which is worth checking out as well."
|
||||
|
||||
@@ -13,7 +13,7 @@
|
||||
"\n",
|
||||
"Headless mode means that the browser is running without a graphical user interface.\n",
|
||||
"\n",
|
||||
"In the below example we'll use the `AsyncChromiumLoader` to load the page, and then the [`Html2TextTransformer`](/docs/integrations/document_transformers/html2text/) to strip out the HTML tags and other semantic information."
|
||||
"In the below example we'll use the `AsyncChromiumLoader` to loads the page, and then the [`Html2TextTransformer`](/docs/integrations/document_transformers/html2text/) to strip out the HTML tags and other semantic information."
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -79,7 +79,7 @@
|
||||
"id": "7eb5e6aa",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"Now let's transform the documents into a more readable format using the transformer:"
|
||||
"Now let's transform the documents into a more readable syntax using the transformer:"
|
||||
]
|
||||
},
|
||||
{
|
||||
|
||||
@@ -40,7 +40,7 @@
|
||||
"# If you need to use the proxy to make web requests, for example using http_proxy/https_proxy environmental variables,\n",
|
||||
"# please set trust_env=True explicitly here as follows:\n",
|
||||
"# loader = AsyncHtmlLoader(urls, trust_env=True)\n",
|
||||
"# Otherwise, loader.load() may get stuck because aiohttp session does not recognize the proxy by default\n",
|
||||
"# Otherwise, loader.load() may stuck becuase aiohttp session does not recognize the proxy by default\n",
|
||||
"docs = loader.load()"
|
||||
]
|
||||
},
|
||||
|
||||
@@ -68,7 +68,7 @@
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Specifying a prefix\n",
|
||||
"You can also specify a prefix for more fine-grained control over what files to load."
|
||||
"You can also specify a prefix for more finegrained control over what files to load."
|
||||
]
|
||||
},
|
||||
{
|
||||
|
||||
@@ -107,7 +107,7 @@
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Specifying a glob pattern\n",
|
||||
"You can also specify a glob pattern for more fine-grained control over what files to load. In the example below, only files with a `pdf` extension will be loaded."
|
||||
"You can also specify a glob pattern for more finegrained control over what files to load. In the example below, only files with a `pdf` extension will be loaded."
|
||||
]
|
||||
},
|
||||
{
|
||||
|
||||
@@ -79,7 +79,7 @@
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Specifying a prefix\n",
|
||||
"You can also specify a prefix for more fine-grained control over what files to load."
|
||||
"You can also specify a prefix for more finegrained control over what files to load."
|
||||
]
|
||||
},
|
||||
{
|
||||
|
||||
@@ -9,7 +9,7 @@
|
||||
"\n",
|
||||
">[Azure Files](https://learn.microsoft.com/en-us/azure/storage/files/storage-files-introduction) offers fully managed file shares in the cloud that are accessible via the industry standard Server Message Block (`SMB`) protocol, Network File System (`NFS`) protocol, and `Azure Files REST API`.\n",
|
||||
"\n",
|
||||
"This covers how to load document objects from Azure Files."
|
||||
"This covers how to load document objects from a Azure Files."
|
||||
]
|
||||
},
|
||||
{
|
||||
|
||||
@@ -81,7 +81,7 @@
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Specifying a prefix\n",
|
||||
"You can also specify a prefix for more fine-grained control over what files to load -including loading all files from a specific folder-."
|
||||
"You can also specify a prefix for more finegrained control over what files to load -including loading all files from a specific folder-."
|
||||
]
|
||||
},
|
||||
{
|
||||
|
||||
@@ -1,151 +0,0 @@
|
||||
{
|
||||
"cells": [
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"# Outline Document Loader\n",
|
||||
"\n",
|
||||
">[Outline](https://www.getoutline.com/) is an open-source collaborative knowledge base platform designed for team information sharing.\n",
|
||||
"\n",
|
||||
"This notebook shows how to obtain langchain Documents from your Outline collections."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Overview\n",
|
||||
"The [Outline Document Loader](https://github.com/10Pines/langchain-outline) can be used to load Outline collections as LangChain Documents for integration into Retrieval-Augmented Generation (RAG) workflows.\n",
|
||||
"\n",
|
||||
"This example demonstrates:\n",
|
||||
"\n",
|
||||
"* Setting up a Document Loader to load all documents from an Outline instance."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Setup\n",
|
||||
"Before starting, ensure you have the following environment variables set:\n",
|
||||
"\n",
|
||||
"* OUTLINE_API_KEY: Your API key for authenticating with your Outline instance (https://www.getoutline.com/developers#section/Authentication).\n",
|
||||
"* OUTLINE_INSTANCE_URL: The URL (including protocol) of your Outline instance."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"import os\n",
|
||||
"\n",
|
||||
"os.environ[\"OUTLINE_API_KEY\"] = \"ol_api_xyz123\"\n",
|
||||
"os.environ[\"OUTLINE_INSTANCE_URL\"] = \"https://app.getoutline.com\""
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Initialization\n",
|
||||
"To initialize the OutlineLoader, you need the following parameters:\n",
|
||||
"\n",
|
||||
"* outline_base_url: The URL of your outline instance (or it will be taken from the environment variable).\n",
|
||||
"* outline_api_key: Your API key for authenticating with your Outline instance (or it will be taken from the environment variable).\n",
|
||||
"* outline_collection_id_list: List of collection ids to be retrieved. If None all will be retrieved.\n",
|
||||
"* page_size: Because the Outline API uses paginated results you can configure how many results (documents) per page will be retrieved per API request. If this is not specified a default will be used."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Instantiation"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Option 1: Using environment variables (ensure they are set)\n",
|
||||
"from langchain_outline.document_loaders.outline import OutlineLoader\n",
|
||||
"\n",
|
||||
"loader = OutlineLoader()\n",
|
||||
"\n",
|
||||
"# Option 2: Passing parameters directly\n",
|
||||
"loader = OutlineLoader(\n",
|
||||
" outline_base_url=\"YOUR_OUTLINE_URL\", outline_api_key=\"YOUR_API_KEY\"\n",
|
||||
")"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Load\n",
|
||||
"To load and return all documents available in the Outline instance"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"loader.load()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Lazy Load\n",
|
||||
"The lazy_load method allows you to iteratively load documents from the Outline collection, yielding each document as it is fetched:"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"loader.lazy_load()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## API reference\n",
|
||||
"\n",
|
||||
"For detailed documentation of all `Outline` features and configurations head to the API reference: https://www.getoutline.com/developers"
|
||||
]
|
||||
}
|
||||
],
|
||||
"metadata": {
|
||||
"kernelspec": {
|
||||
"display_name": ".venv",
|
||||
"language": "python",
|
||||
"name": "python3"
|
||||
},
|
||||
"language_info": {
|
||||
"codemirror_mode": {
|
||||
"name": "ipython",
|
||||
"version": 3
|
||||
},
|
||||
"file_extension": ".py",
|
||||
"mimetype": "text/x-python",
|
||||
"name": "python",
|
||||
"nbconvert_exporter": "python",
|
||||
"pygments_lexer": "ipython3",
|
||||
"version": "3.11.9"
|
||||
}
|
||||
},
|
||||
"nbformat": 4,
|
||||
"nbformat_minor": 2
|
||||
}
|
||||
@@ -1,25 +1,12 @@
|
||||
{
|
||||
"cells": [
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"# Rockset\n",
|
||||
"\n",
|
||||
"⚠️ **Deprecation Notice: Rockset Integration Disabled**\n",
|
||||
"> \n",
|
||||
"> As of June 2024, Rockset has been [acquired by OpenAI](https://openai.com/index/openai-acquires-rockset/) and **shut down its public services**.\n",
|
||||
"> \n",
|
||||
"> Rockset was a real-time analytics database known for world-class indexing and retrieval. Now, its core team and technology are being integrated into OpenAI's infrastructure to power future AI products.\n",
|
||||
"> \n",
|
||||
"> This LangChain integration is no longer functional and is preserved **for archival purposes only**."
|
||||
]
|
||||
},
|
||||
{
|
||||
"attachments": {},
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"# Rockset\n",
|
||||
"\n",
|
||||
"> Rockset is a real-time analytics database which enables queries on massive, semi-structured data without operational burden. With Rockset, ingested data is queryable within one second and analytical queries against that data typically execute in milliseconds. Rockset is compute optimized, making it suitable for serving high concurrency applications in the sub-100TB range (or larger than 100s of TBs with rollups).\n",
|
||||
"\n",
|
||||
"This notebook demonstrates how to use Rockset as a document loader in langchain. To get started, make sure you have a Rockset account and an API key available.\n",
Some files were not shown because too many files have changed in this diff