docs: standard tests to markdown, load templates from files (#28603)

Erick Friis 2024-12-06 17:37:21 -08:00 committed by GitHub
parent 9b848491c8
commit dd0085a9ff
4 changed files with 408 additions and 619 deletions

View File

@@ -48,7 +48,7 @@ import CodeBlock from '@theme/CodeBlock';
 import ChatModelSource from '../../../../src/theme/integration_template/integration_template/chat_models.py';
-<CodeBlock language="jsx" title="langchain_parrot_link/chat_models.py">
+<CodeBlock language="python" title="langchain_parrot_link/chat_models.py">
 {
   ChatModelSource.replaceAll('__ModuleName__', 'ParrotLink')
     .replaceAll('__package_name__', 'langchain-parrot-link')
@@ -101,7 +101,7 @@ import ChatModelSource from '../../../../src/theme/integration_template/integrat
 import VectorstoreSource from '../../../../src/theme/integration_template/integration_template/vectorstores.py';
-<CodeBlock language="jsx" title="langchain_parrot_link/vectorstores.py">
+<CodeBlock language="python" title="langchain_parrot_link/vectorstores.py">
 {
   VectorstoreSource.replaceAll('__ModuleName__', 'ParrotLink')
     .replaceAll('__package_name__', 'langchain-parrot-link')
@@ -161,7 +161,7 @@ For convenience, we also include the code below.
 import EmbeddingsSource from '/src/theme/integration_template/integration_template/embeddings.py';
-<CodeBlock language="jsx" title="langchain_parrot_link/embeddings.py">
+<CodeBlock language="python" title="langchain_parrot_link/embeddings.py">
 {
   EmbeddingsSource.replaceAll('__ModuleName__', 'ParrotLink')
     .replaceAll('__package_name__', 'langchain-parrot-link')
@@ -234,7 +234,7 @@ For convenience, we also include the code below.
 import ToolSource from '/src/theme/integration_template/integration_template/tools.py';
-<CodeBlock language="jsx" title="langchain_parrot_link/tools.py">
+<CodeBlock language="python" title="langchain_parrot_link/tools.py">
 {
   ToolSource.replaceAll('__ModuleName__', 'ParrotLink')
     .replaceAll('__package_name__', 'langchain-parrot-link')
@@ -297,7 +297,7 @@ For convenience, we also include the code below.
 import RetrieverSource from '/src/theme/integration_template/integration_template/retrievers.py';
-<CodeBlock language="jsx" title="langchain_parrot_link/retrievers.py">
+<CodeBlock language="python" title="langchain_parrot_link/retrievers.py">
 {
   RetrieverSource.replaceAll('__ModuleName__', 'ParrotLink')
     .replaceAll('__package_name__', 'langchain-parrot-link')

View File

@@ -1,613 +0,0 @@
{
"cells": [
{
"cell_type": "raw",
"metadata": {
"vscode": {
"languageId": "raw"
}
},
"source": [
"---\n",
"pagination_next: contributing/how_to/integrations/publish\n",
"pagination_prev: contributing/how_to/integrations/package\n",
"---"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# How to add standard tests to an integration\n",
"\n",
"When creating either a custom class for yourself or to publish in a LangChain integration, it is important to add standard tests to ensure it works as expected. This guide will show you how to add standard tests to a custom chat model, and you can **[Skip to the test templates](#standard-test-templates-per-component)** for implementing tests for each integration type.\n",
"\n",
"## Setup\n",
"\n",
"If you're coming from the [previous guide](../package), you have already installed these dependencies, and you can skip this section.\n",
"\n",
"First, let's install 2 dependencies:\n",
"\n",
"- `langchain-core` will define the interfaces we want to import to define our custom tool.\n",
"- `langchain-tests` will provide the standard tests we want to use. Recommended to pin to the latest version: <img src=\"https://img.shields.io/pypi/v/langchain-tests\" style={{position:\"relative\",top:4,left:3}} />\n",
"\n",
":::note\n",
"\n",
"Because added tests in new versions of `langchain-tests` can break your CI/CD pipelines, we recommend pinning the \n",
"version of `langchain-tests` to avoid unexpected changes.\n",
"\n",
":::\n",
"\n",
"import Tabs from '@theme/Tabs';\n",
"import TabItem from '@theme/TabItem';\n",
"\n",
"<Tabs>\n",
" <TabItem value=\"poetry\" label=\"Poetry\" default>\n",
"If you followed the [previous guide](../package), you should already have these dependencies installed!\n",
"\n",
"```bash\n",
"poetry add langchain-core\n",
"poetry add --group test pytest pytest-socket pytest-asyncio langchain-tests==<latest_version>\n",
"poetry install --with test\n",
"```\n",
" </TabItem>\n",
" <TabItem value=\"pip\" label=\"Pip\">\n",
"```bash\n",
"pip install -U langchain-core pytest pytest-socket pytest-asyncio langchain-tests\n",
"\n",
"# install current package in editable mode\n",
"pip install --editable .\n",
"```\n",
" </TabItem>\n",
"</Tabs>"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Let's say we're publishing a package, `langchain_parrot_link`, that exposes the chat model from the [guide on implementing the package](../package). We can add the standard tests to the package by following the steps below."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"And we'll assume you've structured your package the same way as the main LangChain\n",
"packages:\n",
"\n",
"```plaintext\n",
"langchain-parrot-link/\n",
"├── langchain_parrot_link/\n",
"│ ├── __init__.py\n",
"│ └── chat_models.py\n",
"├── tests/\n",
"│ ├── __init__.py\n",
"│ └── test_chat_models.py\n",
"├── pyproject.toml\n",
"└── README.md\n",
"```\n",
"\n",
"## Add and configure standard tests\n",
"\n",
"There are 2 namespaces in the `langchain-tests` package: \n",
"\n",
"- [unit tests](../../../concepts/testing.mdx#unit-tests) (`langchain_tests.unit_tests`): designed to be used to test the component in isolation and without access to external services\n",
"- [integration tests](../../../concepts/testing.mdx#unit-tests) (`langchain_tests.integration_tests`): designed to be used to test the component with access to external services (in particular, the external service that the component is designed to interact with).\n",
"\n",
"Both types of tests are implemented as [`pytest` class-based test suites](https://docs.pytest.org/en/7.1.x/getting-started.html#group-multiple-tests-in-a-class).\n",
"\n",
"By subclassing the base classes for each type of standard test (see below), you get all of the standard tests for that type, and you\n",
"can override the properties that the test suite uses to configure the tests.\n",
"\n",
"### Standard chat model tests\n",
"\n",
"Here's how you would configure the standard unit tests for the custom chat model:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# title=\"tests/unit_tests/test_chat_models.py\"\n",
"from typing import Tuple, Type\n",
"\n",
"from langchain_parrot_link.chat_models import ChatParrotLink\n",
"from langchain_tests.unit_tests import ChatModelUnitTests\n",
"\n",
"\n",
"class TestChatParrotLinkUnit(ChatModelUnitTests):\n",
" @property\n",
" def chat_model_class(self) -> Type[ChatParrotLink]:\n",
" return ChatParrotLink\n",
"\n",
" @property\n",
" def chat_model_params(self) -> dict:\n",
" return {\n",
" \"model\": \"bird-brain-001\",\n",
" \"temperature\": 0,\n",
" \"parrot_buffer_length\": 50,\n",
" }"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# title=\"tests/integration_tests/test_chat_models.py\"\n",
"from typing import Type\n",
"\n",
"from langchain_parrot_link.chat_models import ChatParrotLink\n",
"from langchain_tests.integration_tests import ChatModelIntegrationTests\n",
"\n",
"\n",
"class TestChatParrotLinkIntegration(ChatModelIntegrationTests):\n",
" @property\n",
" def chat_model_class(self) -> Type[ChatParrotLink]:\n",
" return ChatParrotLink\n",
"\n",
" @property\n",
" def chat_model_params(self) -> dict:\n",
" return {\n",
" \"model\": \"bird-brain-001\",\n",
" \"temperature\": 0,\n",
" \"parrot_buffer_length\": 50,\n",
" }"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"and you would run these with the following commands from your project root\n",
"\n",
"<Tabs>\n",
" <TabItem value=\"poetry\" label=\"Poetry\" default>\n",
"\n",
"```bash\n",
"# run unit tests without network access\n",
"poetry run pytest --disable-socket --allow-unix-socket --asyncio-mode=auto tests/unit_tests\n",
"\n",
"# run integration tests\n",
"poetry run pytest --asyncio-mode=auto tests/integration_tests\n",
"```\n",
"\n",
" </TabItem>\n",
" <TabItem value=\"pip\" label=\"Pip\">\n",
"\n",
"```bash\n",
"# run unit tests without network access\n",
"pytest --disable-socket --allow-unix-socket --asyncio-mode=auto tests/unit_tests\n",
"\n",
"# run integration tests\n",
"pytest --asyncio-mode=auto tests/integration_tests\n",
"```\n",
"\n",
" </TabItem>\n",
"</Tabs>"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Test suite information and troubleshooting\n",
"\n",
"For a full list of the standard test suites that are available, as well as\n",
"information on which tests are included and how to troubleshoot common issues,\n",
"see the [Standard Tests API Reference](https://python.langchain.com/api_reference/standard_tests/index.html).\n",
"\n",
"An increasing number of troubleshooting guides are being added to this documentation,\n",
"and if you're interested in contributing, feel free to add docstrings to tests in \n",
"[Github](https://github.com/langchain-ai/langchain/tree/master/libs/standard-tests/langchain_tests)!"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Standard test templates per component:\n",
"\n",
"Above, we implement the **unit** and **integration** standard tests for a tool. Below are the templates for implementing the standard tests for each component:\n",
"\n",
"<details>\n",
" <summary>Chat Models</summary>\n",
" <p>Note: The standard tests for chat models are implemented in the example in the main body of this guide too.</p>"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Chat model standard tests test a range of behaviors, from the most basic requirements (generating a response to a query) to optional capabilities like multi-modal support and tool-calling. For a test run to be successful:\n",
"\n",
"1. If a feature is intended to be supported by the model, it should pass;\n",
"2. If a feature is not intended to be supported by the model, it should be skipped.\n",
"\n",
"Tests for \"optional\" capabilities are controlled via a set of properties that can be overridden on the test model subclass.\n",
"\n",
"You can see the entire list of properties in the API references for\n",
"[unit tests](https://python.langchain.com/api_reference/standard_tests/unit_tests/langchain_tests.unit_tests.chat_models.ChatModelUnitTests.html)\n",
"and [integration tests](https://python.langchain.com/api_reference/standard_tests/integration_tests/langchain_tests.integration_tests.chat_models.ChatModelIntegrationTests.html).\n",
"\n",
"For example, to enable integration tests for image inputs, we can implement\n",
"\n",
"```python\n",
"@property\n",
"def supports_image_inputs(self) -> bool:\n",
" return True\n",
"```\n",
"\n",
"on the integration test class.\n",
"\n",
":::note\n",
"\n",
"Details on what tests are run, how each test can be skipped, and troubleshooting tips for each test can be found in the API references. See details:\n",
"\n",
"- [Unit tests API reference](https://python.langchain.com/api_reference/standard_tests/unit_tests/langchain_tests.unit_tests.chat_models.ChatModelUnitTests.html)\n",
"- [Integration tests API reference](https://python.langchain.com/api_reference/standard_tests/integration_tests/langchain_tests.integration_tests.chat_models.ChatModelIntegrationTests.html)\n",
"\n",
":::\n",
"\n",
"Unit test example:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# title=\"tests/unit_tests/test_chat_models.py\"\n",
"from typing import Type\n",
"\n",
"from langchain_parrot_link.chat_models import ChatParrotLink\n",
"from langchain_tests.unit_tests import ChatModelUnitTests\n",
"\n",
"\n",
"class TestChatParrotLinkUnit(ChatModelUnitTests):\n",
" @property\n",
" def chat_model_class(self) -> Type[ChatParrotLink]:\n",
" return ChatParrotLink\n",
"\n",
" @property\n",
" def chat_model_params(self) -> dict:\n",
" return {\n",
" \"model\": \"bird-brain-001\",\n",
" \"temperature\": 0,\n",
" \"parrot_buffer_length\": 50,\n",
" }"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Integration test example:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# title=\"tests/integration_tests/test_chat_models.py\"\n",
"from typing import Type\n",
"\n",
"from langchain_parrot_link.chat_models import ChatParrotLink\n",
"from langchain_tests.integration_tests import ChatModelIntegrationTests\n",
"\n",
"\n",
"class TestChatParrotLinkIntegration(ChatModelIntegrationTests):\n",
" @property\n",
" def chat_model_class(self) -> Type[ChatParrotLink]:\n",
" return ChatParrotLink\n",
"\n",
" @property\n",
" def chat_model_params(self) -> dict:\n",
" return {\n",
" \"model\": \"bird-brain-001\",\n",
" \"temperature\": 0,\n",
" \"parrot_buffer_length\": 50,\n",
" }"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"</details>\n",
"<details>\n",
" <summary>Embedding Models</summary>"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"To configure standard tests for an embeddings model, we subclass\n",
"`EmbeddingsUnitTests` and `EmbeddingsIntegrationTests`. On each subclass, we\n",
"implement the `embeddings_class` property to specify the embeddings model to be\n",
"tested. We can also override the embedding_model_params property to specify\n",
"initialization parameters. See examples below.\n",
"\n",
":::note\n",
"\n",
"Details on what tests are run, how each test can be skipped, and troubleshooting tips for each test can be found in the API references. See details:\n",
"\n",
"- [Unit tests API reference](https://python.langchain.com/api_reference/standard_tests/unit_tests/langchain_tests.unit_tests.embeddings.EmbeddingsUnitTests.html)\n",
"- [Integration tests API reference](https://python.langchain.com/api_reference/standard_tests/integration_tests/langchain_tests.integration_tests.embeddings.EmbeddingsIntegrationTests.html)\n",
"\n",
":::\n",
"\n",
"Unit test example:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# title=\"tests/unit_tests/test_embeddings.py\"\n",
"from typing import Tuple, Type\n",
"\n",
"from langchain_parrot_link.embeddings import ParrotLinkEmbeddings\n",
"from langchain_tests.unit_tests import EmbeddingsUnitTests\n",
"\n",
"\n",
"class TestParrotLinkEmbeddingsUnit(EmbeddingsUnitTests):\n",
" @property\n",
" def embeddings_class(self) -> Type[ParrotLinkEmbeddings]:\n",
" return ParrotLinkEmbeddings\n",
"\n",
" @property\n",
" def embedding_model_params(self) -> dict:\n",
" return {\"model\": \"nest-embed-001\", \"temperature\": 0}"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Integration test example:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# title=\"tests/integration_tests/test_embeddings.py\"\n",
"from typing import Type\n",
"\n",
"from langchain_parrot_link.embeddings import ParrotLinkEmbeddings\n",
"from langchain_tests.integration_tests import EmbeddingsIntegrationTests\n",
"\n",
"\n",
"class TestParrotLinkEmbeddingsIntegration(EmbeddingsIntegrationTests):\n",
" @property\n",
" def embeddings_class(self) -> Type[ParrotLinkEmbeddings]:\n",
" return ParrotLinkEmbeddings\n",
"\n",
" @property\n",
" def embedding_model_params(self) -> dict:\n",
" return {\"model\": \"nest-embed-001\"}"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"</details>\n",
"<details>\n",
" <summary>Tools/Toolkits</summary>"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# title=\"tests/unit_tests/test_tools.py\"\n",
"from typing import Type\n",
"\n",
"from langchain_parrot_link.tools import ParrotMultiplyTool\n",
"from langchain_tests.unit_tests import ToolsUnitTests\n",
"\n",
"\n",
"class TestParrotMultiplyToolUnit(ToolsUnitTests):\n",
" @property\n",
" def tool_constructor(self) -> Type[ParrotMultiplyTool]:\n",
" return ParrotMultiplyTool\n",
"\n",
" @property\n",
" def tool_constructor_params(self) -> dict:\n",
" # if your tool constructor instead required initialization arguments like\n",
" # `def __init__(self, some_arg: int):`, you would return those here\n",
" # as a dictionary, e.g.: `return {'some_arg': 42}`\n",
" return {}\n",
"\n",
" @property\n",
" def tool_invoke_params_example(self) -> dict:\n",
" \"\"\"\n",
" Returns a dictionary representing the \"args\" of an example tool call.\n",
"\n",
" This should NOT be a ToolCall dict - i.e. it should not\n",
" have {\"name\", \"id\", \"args\"} keys.\n",
" \"\"\"\n",
" return {\"a\": 2, \"b\": 3}"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# title=\"tests/integration_tests/test_tools.py\"\n",
"from typing import Type\n",
"\n",
"from langchain_parrot_link.tools import ParrotMultiplyTool\n",
"from langchain_tests.integration_tests import ToolsIntegrationTests\n",
"\n",
"\n",
"class TestParrotMultiplyToolIntegration(ToolsIntegrationTests):\n",
" @property\n",
" def tool_constructor(self) -> Type[ParrotMultiplyTool]:\n",
" return ParrotMultiplyTool\n",
"\n",
" @property\n",
" def tool_constructor_params(self) -> dict:\n",
" # if your tool constructor instead required initialization arguments like\n",
" # `def __init__(self, some_arg: int):`, you would return those here\n",
" # as a dictionary, e.g.: `return {'some_arg': 42}`\n",
" return {}\n",
"\n",
" @property\n",
" def tool_invoke_params_example(self) -> dict:\n",
" \"\"\"\n",
" Returns a dictionary representing the \"args\" of an example tool call.\n",
"\n",
" This should NOT be a ToolCall dict - i.e. it should not\n",
" have {\"name\", \"id\", \"args\"} keys.\n",
" \"\"\"\n",
" return {\"a\": 2, \"b\": 3}"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"</details>\n",
"<details>\n",
" <summary>Vector Stores</summary>"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Here's how you would configure the standard tests for a typical vector store (using\n",
"`ParrotVectorStore` as a placeholder):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# title=\"tests/integration_tests/test_vectorstores_sync.py\"\n",
"\n",
"from typing import Generator\n",
"\n",
"import pytest\n",
"from langchain_core.vectorstores import VectorStore\n",
"from langchain_parrot_link.vectorstores import ParrotVectorStore\n",
"from langchain_standard_tests.integration_tests.vectorstores import (\n",
" VectorStoreIntegrationTests,\n",
")\n",
"\n",
"\n",
"class TestParrotVectorStore(VectorStoreIntegrationTests):\n",
" @pytest.fixture()\n",
" def vectorstore(self) -> Generator[VectorStore, None, None]: # type: ignore\n",
" \"\"\"Get an empty vectorstore for unit tests.\"\"\"\n",
" store = ParrotVectorStore(self.get_embeddings())\n",
" # note: store should be EMPTY at this point\n",
" # if you need to delete data, you may do so here\n",
" try:\n",
" yield store\n",
" finally:\n",
" # cleanup operations, or deleting data\n",
" pass"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Configuring the tests consists of implementing pytest fixtures for setting up an\n",
"empty vector store and tearing down the vector store after the test run ends.\n",
"\n",
"For example, below is the `VectorStoreIntegrationTests` class for the [Chroma](https://python.langchain.com/docs/integrations/vectorstores/chroma/)\n",
"integration:\n",
"\n",
"```python\n",
"from typing import Generator\n",
"\n",
"import pytest\n",
"from langchain_core.vectorstores import VectorStore\n",
"from langchain_tests.integration_tests.vectorstores import VectorStoreIntegrationTests\n",
"\n",
"from langchain_chroma import Chroma\n",
"\n",
"\n",
"class TestChromaStandard(VectorStoreIntegrationTests):\n",
" @pytest.fixture()\n",
" def vectorstore(self) -> Generator[VectorStore, None, None]: # type: ignore\n",
" \"\"\"Get an empty vectorstore for unit tests.\"\"\"\n",
" store = Chroma(embedding_function=self.get_embeddings())\n",
" try:\n",
" yield store\n",
" finally:\n",
" store.delete_collection()\n",
" pass\n",
"\n",
"```\n",
"\n",
"Note that before the initial `yield`, we instantiate the vector store with an\n",
"[embeddings](/docs/concepts/embedding_models/) object. This is a pre-defined\n",
"[\"fake\" embeddings model](https://python.langchain.com/api_reference/standard_tests/integration_tests/langchain_tests.integration_tests.vectorstores.VectorStoreIntegrationTests.html#langchain_tests.integration_tests.vectorstores.VectorStoreIntegrationTests.get_embeddings)\n",
"that will generate short, arbitrary vectors for documents. You can use a different\n",
"embeddings object if desired.\n",
"\n",
"In the `finally` block, we call whatever integration-specific logic is needed to\n",
"bring the vector store to a clean state. This logic is executed in between each test\n",
"(e.g., even if tests fail).\n",
"\n",
":::note\n",
"\n",
"Details on what tests are run and troubleshooting tips for each test can be found in the [API reference](https://python.langchain.com/api_reference/standard_tests/integration_tests/langchain_tests.integration_tests.vectorstores.VectorStoreIntegrationTests.html).\n",
"\n",
":::"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"</details>"
]
}
],
"metadata": {
"kernelspec": {
"display_name": ".venv",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.4"
}
},
"nbformat": 4,
"nbformat_minor": 2
}

View File

@@ -0,0 +1,402 @@
---
pagination_next: contributing/how_to/integrations/publish
pagination_prev: contributing/how_to/integrations/package
---
# How to add standard tests to an integration
Whether you're creating a custom class for your own use or publishing it as a LangChain integration, it is important to add standard tests to ensure it works as expected. This guide shows you how to add standard tests for each integration type.
## Setup
If you're coming from the [previous guide](../package), you have already installed these dependencies, and you can skip this section.
First, let's install 2 dependencies:
- `langchain-core` will define the interfaces we want to import to define our custom components.
- `langchain-tests` will provide the standard tests we want to use. We recommend pinning to the latest version: <img src="https://img.shields.io/pypi/v/langchain-tests" style={{position:"relative",top:4,left:3}} />
:::note
Because added tests in new versions of `langchain-tests` can break your CI/CD pipelines, we recommend pinning the
version of `langchain-tests` to avoid unexpected changes.
:::
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
<Tabs>
<TabItem value="poetry" label="Poetry" default>
If you followed the [previous guide](../package), you should already have these dependencies installed!
```bash
poetry add langchain-core
poetry add --group test pytest pytest-socket pytest-asyncio langchain-tests==<latest_version>
poetry install --with test
```
</TabItem>
<TabItem value="pip" label="Pip">
```bash
pip install -U langchain-core pytest pytest-socket pytest-asyncio langchain-tests
# install current package in editable mode
pip install --editable .
```
</TabItem>
</Tabs>
Let's say we're publishing a package, `langchain_parrot_link`, that exposes the chat model from the [guide on implementing the package](../package). We can add the standard tests to the package by following the steps below.
We'll also assume you've structured your package the same way as the main LangChain packages:
```plaintext
langchain-parrot-link/
├── langchain_parrot_link/
│   ├── __init__.py
│   └── chat_models.py
├── tests/
│   ├── __init__.py
│   ├── unit_tests/
│   │   ├── __init__.py
│   │   └── test_chat_models.py
│   └── integration_tests/
│       ├── __init__.py
│       └── test_chat_models.py
├── pyproject.toml
└── README.md
```
## Add and configure standard tests
There are 2 namespaces in the `langchain-tests` package:
- [unit tests](../../../concepts/testing.mdx#unit-tests) (`langchain_tests.unit_tests`): designed to be used to test the component in isolation and without access to external services
- [integration tests](../../../concepts/testing.mdx#integration-tests) (`langchain_tests.integration_tests`): designed to be used to test the component with access to external services (in particular, the external service that the component is designed to interact with).
Both types of tests are implemented as [`pytest` class-based test suites](https://docs.pytest.org/en/7.1.x/getting-started.html#group-multiple-tests-in-a-class).
By subclassing the base classes for each type of standard test (see below), you get all of the standard tests for that type, and you
can override the properties that the test suite uses to configure the tests.
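For example, a unit test suite for the chat model from the package guide is just a subclass with two overridden properties. Here is a minimal sketch (the full, template-generated versions appear in the tabs below):
```python
# tests/unit_tests/test_chat_models.py (sketch)
from typing import Type

from langchain_parrot_link.chat_models import ChatParrotLink
from langchain_tests.unit_tests import ChatModelUnitTests


class TestChatParrotLinkUnit(ChatModelUnitTests):
    @property
    def chat_model_class(self) -> Type[ChatParrotLink]:
        # The class under test; the suite instantiates it in each test.
        return ChatParrotLink

    @property
    def chat_model_params(self) -> dict:
        # Constructor kwargs used to instantiate the class above.
        return {
            "model": "bird-brain-001",
            "temperature": 0,
            "parrot_buffer_length": 50,
        }
```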
### Implementing standard tests
import CodeBlock from '@theme/CodeBlock';
In the following tabs, we show how to implement the standard tests for
each component type:
<Tabs>
<TabItem value="chat_models" label="Chat models">
Here's how you would configure the standard tests for a custom chat model (the full unit and integration test examples appear below).
Chat model standard tests test a range of behaviors, from the most basic requirements (generating a response to a query) to optional capabilities like multi-modal support and tool-calling. For a test run to be successful:
1. If a feature is intended to be supported by the model, it should pass;
2. If a feature is not intended to be supported by the model, it should be skipped.
Tests for "optional" capabilities are controlled via a set of properties that can be overridden on the test model subclass.
You can see the entire list of properties in the API references for
[unit tests](https://python.langchain.com/api_reference/standard_tests/unit_tests/langchain_tests.unit_tests.chat_models.ChatModelUnitTests.html)
and [integration tests](https://python.langchain.com/api_reference/standard_tests/integration_tests/langchain_tests.integration_tests.chat_models.ChatModelIntegrationTests.html).
For example, to enable integration tests for image inputs, we can implement
```python
@property
def supports_image_inputs(self) -> bool:
    return True
```
on the integration test class.
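For illustration, here is a sketch of how that property sits alongside the required properties on the integration test class (parameter values are taken from the `ChatParrotLink` example in the package guide; adjust them for your model):
```python
# tests/integration_tests/test_chat_models.py (sketch)
from typing import Type

from langchain_parrot_link.chat_models import ChatParrotLink
from langchain_tests.integration_tests import ChatModelIntegrationTests


class TestChatParrotLinkIntegration(ChatModelIntegrationTests):
    @property
    def chat_model_class(self) -> Type[ChatParrotLink]:
        return ChatParrotLink

    @property
    def chat_model_params(self) -> dict:
        return {
            "model": "bird-brain-001",
            "temperature": 0,
            "parrot_buffer_length": 50,
        }

    @property
    def supports_image_inputs(self) -> bool:
        # Opt in to the image-input integration tests.
        return True
```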
:::note
Details on what tests are run, how each test can be skipped, and troubleshooting tips for each test can be found in the API references. See details:
- [Unit tests API reference](https://python.langchain.com/api_reference/standard_tests/unit_tests/langchain_tests.unit_tests.chat_models.ChatModelUnitTests.html)
- [Integration tests API reference](https://python.langchain.com/api_reference/standard_tests/integration_tests/langchain_tests.integration_tests.chat_models.ChatModelIntegrationTests.html)
:::
Unit test example:
import ChatUnitSource from '../../../../src/theme/integration_template/tests/unit_tests/test_chat_models.py';
<CodeBlock language="python" title="tests/unit_tests/test_chat_models.py">
{
ChatUnitSource.replaceAll('__ModuleName__', 'ParrotLink')
.replaceAll('__package_name__', 'langchain-parrot-link')
.replaceAll('__MODULE_NAME__', 'PARROT_LINK')
.replaceAll('__module_name__', 'langchain_parrot_link')
}
</CodeBlock>
Integration test example:
import ChatIntegrationSource from '../../../../src/theme/integration_template/tests/integration_tests/test_chat_models.py';
<CodeBlock language="python" title="tests/integration_tests/test_chat_models.py">
{
ChatIntegrationSource.replaceAll('__ModuleName__', 'ParrotLink')
.replaceAll('__package_name__', 'langchain-parrot-link')
.replaceAll('__MODULE_NAME__', 'PARROT_LINK')
.replaceAll('__module_name__', 'langchain_parrot_link')
}
</CodeBlock>
</TabItem>
<TabItem value="vector_stores" label="Vector stores">
Here's how you would configure the standard tests for a typical vector store (using
`ParrotVectorStore` as a placeholder):
Note that unlike chat models, vector stores do not have optional capabilities that
can be enabled or disabled at this time.
import VectorStoreIntegrationSource from '../../../../src/theme/integration_template/tests/integration_tests/test_vectorstores.py';
<CodeBlock language="python" title="tests/integration_tests/test_vectorstores.py">
{
VectorStoreIntegrationSource.replaceAll('__ModuleName__', 'Parrot')
.replaceAll('__package_name__', 'langchain-parrot-link')
.replaceAll('__MODULE_NAME__', 'PARROT')
.replaceAll('__module_name__', 'langchain_parrot_link')
}
</CodeBlock>
Configuring the tests consists of implementing pytest fixtures for setting up an
empty vector store and tearing down the vector store after the test run ends.
For example, below is the `VectorStoreIntegrationTests` class for the [Chroma](https://python.langchain.com/docs/integrations/vectorstores/chroma/)
integration:
```python
from typing import Generator

import pytest
from langchain_core.vectorstores import VectorStore
from langchain_tests.integration_tests.vectorstores import VectorStoreIntegrationTests

from langchain_chroma import Chroma


class TestChromaStandard(VectorStoreIntegrationTests):
    @pytest.fixture()
    def vectorstore(self) -> Generator[VectorStore, None, None]:  # type: ignore
        """Get an empty vectorstore for unit tests."""
        store = Chroma(embedding_function=self.get_embeddings())
        try:
            yield store
        finally:
            store.delete_collection()
```
Note that before the initial `yield`, we instantiate the vector store with an
[embeddings](/docs/concepts/embedding_models/) object. This is a pre-defined
["fake" embeddings model](https://python.langchain.com/api_reference/standard_tests/integration_tests/langchain_tests.integration_tests.vectorstores.VectorStoreIntegrationTests.html#langchain_tests.integration_tests.vectorstores.VectorStoreIntegrationTests.get_embeddings)
that will generate short, arbitrary vectors for documents. You can use a different
embeddings object if desired.
In the `finally` block, we call whatever integration-specific logic is needed to
bring the vector store to a clean state. This logic is executed in between each test
(e.g., even if tests fail).
:::note
Details on what tests are run and troubleshooting tips for each test can be found in the [API reference](https://python.langchain.com/api_reference/standard_tests/integration_tests/langchain_tests.integration_tests.vectorstores.VectorStoreIntegrationTests.html).
:::
</TabItem>
<TabItem value="embeddings" label="Embeddings">
To configure standard tests for an embeddings model, we subclass
`EmbeddingsUnitTests` and `EmbeddingsIntegrationTests`. On each subclass, we
implement the `embeddings_class` property to specify the embeddings model to be
tested. We can also override the `embedding_model_params` property to specify
initialization parameters. See examples below.
:::note
Details on what tests are run, how each test can be skipped, and troubleshooting tips for each test can be found in the API references. See details:
- [Unit tests API reference](https://python.langchain.com/api_reference/standard_tests/unit_tests/langchain_tests.unit_tests.embeddings.EmbeddingsUnitTests.html)
- [Integration tests API reference](https://python.langchain.com/api_reference/standard_tests/integration_tests/langchain_tests.integration_tests.embeddings.EmbeddingsIntegrationTests.html)
:::
Unit test example:
import EmbeddingsUnitSource from '../../../../src/theme/integration_template/tests/unit_tests/test_embeddings.py';
<CodeBlock language="python" title="tests/unit_tests/test_embeddings.py">
{
EmbeddingsUnitSource.replaceAll('__ModuleName__', 'ParrotLink')
.replaceAll('__package_name__', 'langchain-parrot-link')
.replaceAll('__MODULE_NAME__', 'PARROT_LINK')
.replaceAll('__module_name__', 'langchain_parrot_link')
}
</CodeBlock>
Integration test example:
```python title="tests/integration_tests/test_embeddings.py"
from typing import Type
from langchain_parrot_link.embeddings import ParrotLinkEmbeddings
from langchain_tests.integration_tests import EmbeddingsIntegrationTests
class TestParrotLinkEmbeddingsIntegration(EmbeddingsIntegrationTests):
@property
def embeddings_class(self) -> Type[ParrotLinkEmbeddings]:
return ParrotLinkEmbeddings
@property
def embedding_model_params(self) -> dict:
return {"model": "nest-embed-001"}
```
import EmbeddingsIntegrationSource from '../../../../src/theme/integration_template/tests/integration_tests/test_embeddings.py';
<CodeBlock language="python" title="tests/integration_tests/test_embeddings.py">
{
EmbeddingsIntegrationSource.replaceAll('__ModuleName__', 'ParrotLink')
.replaceAll('__package_name__', 'langchain-parrot-link')
.replaceAll('__MODULE_NAME__', 'PARROT_LINK')
.replaceAll('__module_name__', 'langchain_parrot_link')
}
</CodeBlock>
</TabItem>
<TabItem value="tools" label="Tools">
To configure standard tests for a tool, we subclass `ToolsUnitTests` and
`ToolsIntegrationTests`. On each subclass, we override the following `@property` methods
to specify the tool to be tested and the tool's configuration:
| Property | Description |
| --- | --- |
| `tool_constructor` | The constructor for the tool to be tested, or an instantiated tool. |
| `tool_constructor_params` | The parameters to pass to the tool (optional). |
| `tool_invoke_params_example` | An example of the parameters to pass to the tool's `invoke` method. |
If you are testing a tool class and pass a class like `MyTool` to `tool_constructor`, you can pass the parameters to the constructor in `tool_constructor_params`.
If you are testing an already-instantiated tool, you can pass the instance to `tool_constructor` and should not
override `tool_constructor_params`.
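As a sketch, testing an already-instantiated tool might look like the following (whether `ParrotMultiplyTool` can be constructed without arguments depends on your implementation; this is illustrative, not part of the template):
```python
# tests/unit_tests/test_tools.py (sketch, instantiated-tool variant)
from langchain_parrot_link.tools import ParrotMultiplyTool
from langchain_tests.unit_tests import ToolsUnitTests


class TestParrotMultiplyToolUnit(ToolsUnitTests):
    @property
    def tool_constructor(self) -> ParrotMultiplyTool:
        # Return an already-instantiated tool instead of the class;
        # tool_constructor_params is not overridden in this case.
        return ParrotMultiplyTool()

    @property
    def tool_invoke_params_example(self) -> dict:
        # The "args" of an example tool call (not a full ToolCall dict).
        return {"a": 2, "b": 3}
```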
:::note
Details on what tests are run, how each test can be skipped, and troubleshooting tips for each test can be found in the API references. See details:
- [Unit tests API reference](https://python.langchain.com/api_reference/standard_tests/unit_tests/langchain_tests.unit_tests.tools.ToolsUnitTests.html)
- [Integration tests API reference](https://python.langchain.com/api_reference/standard_tests/integration_tests/langchain_tests.integration_tests.tools.ToolsIntegrationTests.html)
:::
import ToolsUnitSource from '../../../../src/theme/integration_template/tests/unit_tests/test_tools.py';
<CodeBlock language="python" title="tests/unit_tests/test_tools.py">
{
ToolsUnitSource.replaceAll('__ModuleName__', 'Parrot')
.replaceAll('__package_name__', 'langchain-parrot-link')
.replaceAll('__MODULE_NAME__', 'PARROT')
.replaceAll('__module_name__', 'langchain_parrot_link')
}
</CodeBlock>
import ToolsIntegrationSource from '../../../../src/theme/integration_template/tests/integration_tests/test_tools.py';
<CodeBlock language="python" title="tests/integration_tests/test_tools.py">
{
ToolsIntegrationSource.replaceAll('__ModuleName__', 'Parrot')
.replaceAll('__package_name__', 'langchain-parrot-link')
.replaceAll('__MODULE_NAME__', 'PARROT')
.replaceAll('__module_name__', 'langchain_parrot_link')
}
</CodeBlock>
</TabItem>
<TabItem value="retrievers" label="Retrievers">
To configure standard tests for a retriever, we subclass `RetrieversUnitTests` and
`RetrieversIntegrationTests`. On each subclass, we override the following `@property` methods:
| Property | Description |
| --- | --- |
| `retriever_constructor` | The class for the retriever to be tested |
| `retriever_constructor_params` | The parameters to pass to the retriever's constructor |
| `retriever_query_example` | An example of the query to pass to the retriever's `invoke` method |
:::note
Details on what tests are run and troubleshooting tips for each test can be found in the [API reference](https://python.langchain.com/api_reference/standard_tests/integration_tests/langchain_tests.integration_tests.retrievers.RetrieversIntegrationTests.html).
:::
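As an illustrative sketch (the `ParrotRetriever` class and its `k` parameter are placeholders here, not part of the template; the full template is rendered below):
```python
# tests/integration_tests/test_retrievers.py (sketch)
from typing import Type

from langchain_parrot_link.retrievers import ParrotRetriever  # placeholder class
from langchain_tests.integration_tests.retrievers import RetrieversIntegrationTests


class TestParrotRetriever(RetrieversIntegrationTests):
    @property
    def retriever_constructor(self) -> Type[ParrotRetriever]:
        return ParrotRetriever

    @property
    def retriever_constructor_params(self) -> dict:
        # Constructor kwargs for the retriever; `k` is only an example here.
        return {"k": 2}

    @property
    def retriever_query_example(self) -> str:
        return "example query"
```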
import RetrieverIntegrationSource from '../../../../src/theme/integration_template/tests/integration_tests/test_retrievers.py';
<CodeBlock language="python" title="tests/integration_tests/test_retrievers.py">
{
RetrieverIntegrationSource.replaceAll('__ModuleName__', 'Parrot')
.replaceAll('__package_name__', 'langchain-parrot-link')
.replaceAll('__MODULE_NAME__', 'PARROT')
.replaceAll('__module_name__', 'langchain_parrot_link')
}
</CodeBlock>
</TabItem>
</Tabs>
---
### Running the tests
You can run these with the following commands from your project root:
<Tabs>
<TabItem value="poetry" label="Poetry" default>
```bash
# run unit tests without network access
poetry run pytest --disable-socket --allow-unix-socket --asyncio-mode=auto tests/unit_tests
# run integration tests
poetry run pytest --asyncio-mode=auto tests/integration_tests
```
</TabItem>
<TabItem value="pip" label="Pip">
```bash
# run unit tests without network access
pytest --disable-socket --allow-unix-socket --asyncio-mode=auto tests/unit_tests
# run integration tests
pytest --asyncio-mode=auto tests/integration_tests
```
</TabItem>
</Tabs>
## Test suite information and troubleshooting
For a full list of the standard test suites that are available, as well as
information on which tests are included and how to troubleshoot common issues,
see the [Standard Tests API Reference](https://python.langchain.com/api_reference/standard_tests/index.html).
You can see troubleshooting guides under the individual test suites listed in that API Reference. For example,
[here is the guide for `ChatModelIntegrationTests.test_usage_metadata`](https://python.langchain.com/api_reference/standard_tests/integration_tests/langchain_tests.integration_tests.chat_models.ChatModelIntegrationTests.html#langchain_tests.integration_tests.chat_models.ChatModelIntegrationTests.test_usage_metadata).

View File

@@ -19,6 +19,6 @@ class Test__ModuleName__Retriever(RetrieversIntegrationTests):
     @property
     def retriever_query_example(self) -> str:
         """
-        Returns a dictionary representing the "args" of an example retriever call.
+        Returns a str representing the "query" of an example retriever call.
         """
         return "example query"