docs: Add langchain-prolog documentation (#29788)

I want to add documentation for a new integration with SWI-Prolog.

@hwchase17 check this out:

https://github.com/apisani1/langchain-prolog/tree/main/examples/travel_agent

---------

Co-authored-by: Chester Curme <chester.curme@gmail.com>
Antonio Pisani 2025-02-20 11:50:28 -05:00 committed by GitHub
parent be7fa920fa
commit 2c403a3ea9
3 changed files with 404 additions and 0 deletions


@@ -0,0 +1,23 @@
# SWI-Prolog
SWI-Prolog offers a comprehensive free Prolog environment.
## Installation and Setup
Install langchain-prolog using pip:
```bash
pip install langchain-prolog
```
## Tools
The `PrologTool` class allows the generation of LangChain tools that use Prolog rules to generate answers.
```python
from langchain_prolog import PrologConfig, PrologTool
```
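For example, a tool can be built from a Prolog rules file (a minimal sketch mirroring the notebook guide; `family.pl` and the tool description are illustrative):
```python
from langchain_prolog import PrologConfig, PrologRunnable, PrologTool

# Argument schema and configuration for the rules in family.pl (illustrative)
schema = PrologRunnable.create_schema("parent", ["men", "women", "child"])
config = PrologConfig(rules_file="family.pl", query_schema=schema)

prolog_tool = PrologTool(
    prolog_config=config,
    name="family_query",
    description="Query family relationships using Prolog, e.g. 'parent(john, X, Y)'.",
)
```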
See a [usage example](/docs/integrations/tools/prolog_tool).
See the same guide for usage examples of `PrologRunnable`, which allows the generation
of LangChain runnables that use Prolog rules to generate answers.
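A minimal `PrologRunnable` sketch (assuming it accepts the same `prolog_config` argument as `PrologTool` and can be invoked with a Prolog query string; see the guide above for the supported options):
```python
from langchain_prolog import PrologConfig, PrologRunnable

schema = PrologRunnable.create_schema("parent", ["men", "women", "child"])
config = PrologConfig(rules_file="family.pl", query_schema=schema)

# Assumption: PrologRunnable is constructed from a PrologConfig and invoked with a query string
prolog_runnable = PrologRunnable(prolog_config=config)
results = prolog_runnable.invoke("parent(john, X, Y)")
```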


@@ -0,0 +1,376 @@
{
"cells": [
{
"cell_type": "markdown",
"id": "bf0c1ab2",
"metadata": {},
"source": [
"# Prolog\n",
"\n",
"LangChain tools that use Prolog rules to generate answers.\n",
"\n",
"## Overview\n",
"\n",
"The PrologTool class allows the generation of langchain tools that use Prolog rules to generate answers.\n",
"\n",
"## Setup\n",
"\n",
"Let's use the following Prolog rules in the file family.pl:\n",
"\n",
"parent(john, bianca, mary).\\\n",
"parent(john, bianca, michael).\\\n",
"parent(peter, patricia, jennifer).\\\n",
"partner(X, Y) :- parent(X, Y, _)."
]
},
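{
"cell_type": "markdown",
"id": "a3f9c2d1",
"metadata": {},
"source": [
"(Optional helper, not part of the original guide: the next cell writes these rules to `family.pl`, assuming the working directory is writable, so the notebook can run without creating the file by hand.)"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "c8e4b7a0",
"metadata": {},
"outputs": [],
"source": [
"# Write the rules shown above to family.pl (assumes a writable working directory)\n",
"rules = \"\\n\".join(\n",
"    [\n",
"        \"parent(john, bianca, mary).\",\n",
"        \"parent(john, bianca, michael).\",\n",
"        \"parent(peter, patricia, jennifer).\",\n",
"        \"partner(X, Y) :- parent(X, Y, _).\",\n",
"    ]\n",
")\n",
"with open(\"family.pl\", \"w\") as f:\n",
"    f.write(rules + \"\\n\")"
]
},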
{
"cell_type": "code",
"execution_count": 1,
"id": "60f0e3d4",
"metadata": {},
"outputs": [],
"source": [
"#!pip install langchain-prolog\n",
"\n",
"from langchain_prolog import PrologConfig, PrologRunnable, PrologTool\n",
"\n",
"TEST_SCRIPT = \"family.pl\""
]
},
{
"cell_type": "markdown",
"id": "b50700d4",
"metadata": {},
"source": [
"## Instantiation\n",
"\n",
"First create the Prolog tool:"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "9b9a2015",
"metadata": {},
"outputs": [],
"source": [
"schema = PrologRunnable.create_schema(\"parent\", [\"men\", \"women\", \"child\"])\n",
"config = PrologConfig(\n",
" rules_file=TEST_SCRIPT,\n",
" query_schema=schema,\n",
")\n",
"prolog_tool = PrologTool(\n",
" prolog_config=config,\n",
" name=\"family_query\",\n",
" description=\"\"\"\n",
" Query family relationships using Prolog.\n",
" parent(X, Y, Z) implies only that Z is a child of X and Y.\n",
" Input can be a query string like 'parent(john, X, Y)' or 'john, X, Y'\"\n",
" You have to specify 3 parameters: men, woman, child. Do not use quotes.\n",
" \"\"\",\n",
")"
]
},
{
"cell_type": "markdown",
"id": "c3363f7d",
"metadata": {},
"source": [
"## Invocation\n",
"\n",
"### Using a Prolog tool with an LLM and function calling"
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "68709aa2",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"True"
]
},
"execution_count": 3,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"#!pip install python-dotenv\n",
"\n",
"from dotenv import find_dotenv, load_dotenv\n",
"\n",
"load_dotenv(find_dotenv(), override=True)"
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "c171c1ef",
"metadata": {},
"outputs": [],
"source": [
"#!pip install langchain-openai\n",
"\n",
"from langchain_core.messages import HumanMessage\n",
"from langchain_openai import ChatOpenAI"
]
},
{
"cell_type": "markdown",
"id": "9d23b2c5",
"metadata": {},
"source": [
"To use the tool, bind it to the LLM model and query the model:"
]
},
{
"cell_type": "code",
"execution_count": 5,
"id": "3ce4479d",
"metadata": {},
"outputs": [],
"source": [
"llm = ChatOpenAI(model=\"gpt-4o-mini\")\n",
"llm_with_tools = llm.bind_tools([prolog_tool])\n",
"messages = [HumanMessage(\"Who are John's children?\")]\n",
"response = llm_with_tools.invoke(messages)"
]
},
{
"cell_type": "markdown",
"id": "50154355",
"metadata": {},
"source": [
"The LLM will respond with a tool call request:"
]
},
{
"cell_type": "code",
"execution_count": 6,
"id": "5676fe7e",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"{'name': 'family_query',\n",
" 'args': {'men': 'john', 'women': None, 'child': None},\n",
" 'id': 'call_3VazWUstCGlY8zczi05TaduU',\n",
" 'type': 'tool_call'}"
]
},
"execution_count": 6,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"messages.append(response)\n",
"response.tool_calls[0]"
]
},
{
"cell_type": "markdown",
"id": "f774d582",
"metadata": {},
"source": [
"The tool takes this request and queries the Prolog database:"
]
},
{
"cell_type": "code",
"execution_count": 7,
"id": "66516c4e",
"metadata": {},
"outputs": [],
"source": [
"tool_msg = prolog_tool.invoke(response.tool_calls[0])"
]
},
{
"cell_type": "markdown",
"id": "0fca8ec2",
"metadata": {},
"source": [
"The tool returns a list with all the solutions for the query:"
]
},
{
"cell_type": "code",
"execution_count": 8,
"id": "33f779bc",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"ToolMessage(content='[{\"Women\": \"bianca\", \"Child\": \"mary\"}, {\"Women\": \"bianca\", \"Child\": \"michael\"}]', name='family_query', tool_call_id='call_3VazWUstCGlY8zczi05TaduU')"
]
},
"execution_count": 8,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"messages.append(tool_msg)\n",
"tool_msg"
]
},
{
"cell_type": "markdown",
"id": "b520745e",
"metadata": {},
"source": [
"That we then pass to the LLM, and the LLM answers the original query using the tool response:"
]
},
{
"cell_type": "code",
"execution_count": 9,
"id": "f997186a",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"John has two children: Mary and Michael. Their mother is Bianca.\n"
]
}
],
"source": [
"answer = llm_with_tools.invoke(messages)\n",
"print(answer.content)"
]
},
{
"cell_type": "markdown",
"id": "0f7ce937",
"metadata": {},
"source": [
"## Chaining\n",
"\n",
"### Using a Prolog Tool with an agent"
]
},
{
"cell_type": "markdown",
"id": "37cd7ef8",
"metadata": {},
"source": [
"To use the prolog tool with an agent, pass it to the agent's constructor:"
]
},
{
"cell_type": "code",
"execution_count": 10,
"id": "b7bbfd79",
"metadata": {},
"outputs": [],
"source": [
"from langchain.agents import AgentExecutor, create_tool_calling_agent\n",
"from langchain_core.prompts import ChatPromptTemplate\n",
"\n",
"llm = ChatOpenAI(model=\"gpt-4o-mini\")\n",
"\n",
"prompt = ChatPromptTemplate.from_messages(\n",
" [\n",
" (\"system\", \"You are a helpful assistant\"),\n",
" (\"human\", \"{input}\"),\n",
" (\"placeholder\", \"{agent_scratchpad}\"),\n",
" ]\n",
")\n",
"\n",
"tools = [prolog_tool]\n",
"\n",
"agent = create_tool_calling_agent(llm, tools, prompt)\n",
"\n",
"agent_executor = AgentExecutor(agent=agent, tools=tools)"
]
},
{
"cell_type": "markdown",
"id": "3e866895",
"metadata": {},
"source": [
"The agent takes the query and use the Prolog tool if needed:"
]
},
{
"cell_type": "code",
"execution_count": 11,
"id": "acad8d33",
"metadata": {
"scrolled": true
},
"outputs": [],
"source": [
"answer = agent_executor.invoke({\"input\": \"Who are John's children?\"})"
]
},
{
"cell_type": "markdown",
"id": "92e71ffa",
"metadata": {},
"source": [
"Then the agent recieves the tool response as part of the `\"agent_scratchpad`\" placeholder and generates the answer:"
]
},
{
"cell_type": "code",
"execution_count": 12,
"id": "36a464f4",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"John has two children: Mary and Michael, with Bianca as their mother.\n"
]
}
],
"source": [
"print(answer[\"output\"])"
]
},
{
"cell_type": "markdown",
"id": "baee74e3",
"metadata": {},
"source": [
"## API reference\n",
"\n",
"(TODO: update with API reference once built.)\n",
"\n",
"See https://github.com/apisani1/langchain-prolog/tree/main for detail."
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3.9 (test-env)",
"language": "python",
"name": "test-env"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.21"
}
},
"nbformat": 4,
"nbformat_minor": 5
}


@@ -445,3 +445,8 @@ packages:
repo: Shikenso-Analytics/langchain-discord
downloads: 1
downloads_updated_at: '2025-02-15T16:00:00.000000+00:00'
- name: langchain-prolog
path: .
repo: apisani1/langchain-prolog
downloads: 0
downloads_updated_at: '2025-02-15T16:00:00.000000+00:00'