- `ActionAgent` has a property called `allowed_tools`, which is declared as a `List`. It stores all provided tools that are available for use during an agent action.
- This collection should not allow duplicates, so `List` is not the right datatype: each tool should be unique. Even if variants are introduced in the future, they would be given distinct names in `load_tools`.

Test:

- Confirm the functionality by initializing an agent with a list of two tools and verifying that everything works, as in the example below.

```python3
def test_agent_chain_chat_bot():
    from langchain.agents import load_tools
    from langchain.agents import initialize_agent
    from langchain.agents import AgentType
    from langchain.chat_models import ChatOpenAI
    from langchain.llms import OpenAI
    from langchain.utilities.duckduckgo_search import DuckDuckGoSearchAPIWrapper

    chat = ChatOpenAI(temperature=0)
    llm = OpenAI(temperature=0)
    tools = load_tools(["ddg-search", "llm-math"], llm=llm)

    agent = initialize_agent(
        tools, chat, agent=AgentType.CHAT_ZERO_SHOT_REACT_DESCRIPTION, verbose=True
    )
    agent.run(
        "Who is Olivia Wilde's boyfriend? What is his current age raised to the 0.23 power?"
    )

test_agent_chain_chat_bot()
```

Result:

<img width="863" alt="Screenshot 2023-05-01 at 7 58 11 PM" src="https://user-images.githubusercontent.com/62768671/235572157-0937594c-ddfb-4760-acb2-aea4cacacd89.png">
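For reference, here is a minimal sketch of the idea behind the change, assuming the tool names are backed by a set; the class and method names below are illustrative only and are not the actual LangChain implementation.

```python3
from typing import Iterable, Set


class ActionAgentSketch:
    """Illustrative stand-in for an agent that tracks which tools it may use."""

    def __init__(self, tool_names: Iterable[str]):
        # A set drops duplicates automatically and makes the
        # "is this tool allowed?" membership check O(1).
        self.allowed_tools: Set[str] = set(tool_names)

    def can_use(self, tool_name: str) -> bool:
        return tool_name in self.allowed_tools


agent = ActionAgentSketch(["ddg-search", "llm-math", "ddg-search"])
assert agent.allowed_tools == {"ddg-search", "llm-math"}
assert agent.can_use("llm-math")
```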
# Readme tests (draft)
## Integration Tests
### Prepare
This repository contains functional tests for several search engines and databases. The tests aim to verify the correct behavior of the engines and databases according to their specifications and requirements.
To run some integration tests, such as the tests located in `tests/integration_tests/vectorstores/`, you will need to install the following software:

- Docker
- Python 3.8.1 or later
We have an optional group `test_integration` in the `pyproject.toml` file. This group should contain dependencies for the integration tests and can be installed with:

```bash
poetry install --with test_integration
```

Any new dependencies should be added by running:

```bash
# add the package and install it after adding:
poetry add tiktoken@latest --group "test_integration" && poetry install --with test_integration
```
Before running any tests, you should start a specific Docker container that has all the necessary dependencies installed. For instance, we use the `elasticsearch.yml` container for `test_elasticsearch.py`:

```bash
cd tests/integration_tests/vectorstores/docker-compose
docker-compose -f elasticsearch.yml up
```
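If the tests start before the container is fully up, they can fail with connection errors. As a rough sketch of a readiness check, assuming Elasticsearch is exposed on its default local port 9200 (the function name here is hypothetical):

```python3
import time

import requests


def wait_for_elasticsearch(url: str = "http://localhost:9200", timeout: int = 60) -> None:
    """Poll the local Elasticsearch container until it responds or the timeout expires."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            if requests.get(url, timeout=2).status_code == 200:
                return
        except requests.ConnectionError:
            pass
        time.sleep(2)
    raise RuntimeError(f"Elasticsearch did not become ready at {url}")


wait_for_elasticsearch()
```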
Prepare environment variables for local testing:

- copy `tests/.env.example` to `tests/.env`
- set variables in the `tests/.env` file, e.g. `OPENAI_API_KEY`
Additionally, it's important to note that some integration tests may require certain environment variables to be set, such as `OPENAI_API_KEY`. Be sure to set any required environment variables before running the tests to ensure they run correctly.
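One common way to guard such tests is to skip them when a required variable is missing; the sketch below is a generic pytest example, not a test from this repository.

```python3
import os

import pytest


# Skip the test cleanly when the key is not configured instead of failing with an auth error.
@pytest.mark.skipif("OPENAI_API_KEY" not in os.environ, reason="OPENAI_API_KEY is not set")
def test_openai_completion():
    from langchain.llms import OpenAI

    llm = OpenAI(temperature=0)
    assert llm("Say 'hello'")
```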
### Recording HTTP interactions with pytest-vcr
Some of the integration tests in this repository involve making HTTP requests to external services. To prevent these requests from being made every time the tests are run, we use pytest-vcr to record and replay HTTP interactions.
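As a rough illustration of how a test opts into this behavior, pytest-vcr provides a `vcr` marker; the test name and request below are illustrative, not taken from this repository. On the first run the HTTP exchange is recorded to a cassette, and later runs replay it instead of hitting the network.

```python3
import pytest
import requests


@pytest.mark.vcr()
def test_httpbin_get():
    # Recorded to a YAML cassette on the first run, replayed on subsequent runs.
    response = requests.get("https://httpbin.org/get", params={"q": "langchain"})
    assert response.status_code == 200
```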
When running tests in a CI/CD pipeline, you may not want to modify the existing cassettes. You can use the `--vcr-record=none` command-line option to disable recording new cassettes. Here's an example:
```bash
pytest --log-cli-level=10 tests/integration_tests/vectorstores/test_pinecone.py --vcr-record=none
pytest tests/integration_tests/vectorstores/test_elasticsearch.py --vcr-record=none
```
Run some tests with coverage:

```bash
pytest tests/integration_tests/vectorstores/test_elasticsearch.py --cov=langchain --cov-report=html
start "" htmlcov/index.html || open htmlcov/index.html
```