Compare commits

...

10 Commits

Author SHA1 Message Date
Bagatur
5fb687ef99 readme 2023-10-18 14:55:06 -07:00
Bagatur
922b11c6d7 Merge branch 'master' into cogniswitch_chains 2023-10-18 14:53:21 -07:00
Bagatur
9b53654939 cr 2023-10-18 14:53:11 -07:00
Bagatur
fa00ec1ca0 wip 2023-10-18 12:21:52 -07:00
Bagatur
ff5d243fcb merge 2023-10-18 10:50:01 -07:00
CogniJT
6611122082 Merge branch 'master' into cogniswitch_chains 2023-09-21 15:07:14 +05:30
CogniJT
8d1140a0d4 Merge branch 'master' into cogniswitch_chains 2023-09-21 03:57:27 +05:30
CogniJT
bdd6460794 Merge branch 'langchain-ai:master' into cogniswitch_chains 2023-09-21 02:49:59 +05:30
JT
e6946ac621 Merge branch 'cogniswitch_chains' of https://github.com/CogniSwitch/langchain into cogniswitch_chains 2023-09-21 02:40:01 +05:30
JT
979c6b6a1a Adding CogniSwitch.ai calls as a chain
- Jupyter Notebook
- Init to import
- Store Chain
- Answer Chain
- Tests for the chains
2023-09-21 02:39:20 +05:30
7 changed files with 600 additions and 0 deletions

View File

@@ -15,6 +15,7 @@ Notebook | Description
[camel_role_playing.ipynb](https://github.com/langchain-ai/langchain/tree/master/cookbook/camel_role_playing.ipynb) | Implement the camel framework for creating autonomous cooperative agents in large scale language models, using role-playing and inception prompting to guide chat agents towards task completion.
[causal_program_aided_language_...](https://github.com/langchain-ai/langchain/tree/master/cookbook/causal_program_aided_language_model.ipynb) | Implement the causal program-aided language (cpal) chain, which improves upon the program-aided language (pal) by incorporating causal structure to prevent hallucination in language models, particularly when dealing with complex narratives and math problems with nested dependencies.
[code-analysis-deeplake.ipynb](https://github.com/langchain-ai/langchain/tree/master/cookbook/code-analysis-deeplake.ipynb) | Analyze its own code base with the help of gpt and activeloop's deep lake.
[cogniswitch_chain_usage.ipynb](https://github.com/langchain-ai/langchain/tree/master/cookbook/cogniswitch_chain_usage.ipynb) | Use cogniswitch to chat with your knowledge in just two steps.
[custom_agent_with_plugin_retri...](https://github.com/langchain-ai/langchain/tree/master/cookbook/custom_agent_with_plugin_retrieval.ipynb) | Build a custom agent that can interact with ai plugins by retrieving tools and creating natural language wrappers around openapi endpoints.
[custom_agent_with_plugin_retri...](https://github.com/langchain-ai/langchain/tree/master/cookbook/custom_agent_with_plugin_retrieval_using_plugnplai.ipynb) | Build a custom agent with plugin retrieval functionality, utilizing ai plugins from the `plugnplai` directory.
[databricks_sql_db.ipynb](https://github.com/langchain-ai/langchain/tree/master/cookbook/databricks_sql_db.ipynb) | Connect to databricks runtimes and databricks sql.

View File

@@ -0,0 +1,210 @@
{
"cells": [
{
"cell_type": "markdown",
"id": "19062701",
"metadata": {},
"source": [
"## CogniSwitch Chains\n",
"\n",
"**Use CogniSwitch to chat with your knowledge in just two steps.**\n",
"\n",
"Visit https://console.cogniswitch.ai:8443 to register.<br>\n",
"\n",
"**Registration steps:**\n",
"- Sign up with your email and verify your registration.\n",
"- You will receive an email with a platform token and an OAuth token for using the services.\n",
"\n",
"\n",
"**Step 1: CogniSwitch Store Chain**<br>\n",
"- Run the chain with your CogniSwitch token, OpenAI API key, OAuth token, and your file or URL.<br>\n",
"- The content will be processed and stored in your knowledge store.<br>\n",
"- You can check the status of document processing in the CogniSwitch console.<br>\n",
"\n",
"**Step 2: CogniSwitch Answer Chain**<br>\n",
"- Run the chain with your CogniSwitch token, OpenAI API key, OAuth token, and your question.<br>\n",
"- You will get the answer from your knowledge as the response.<br>\n"
]
},
{
"cell_type": "markdown",
"id": "1435b193",
"metadata": {},
"source": [
"## Import necessary libraries"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "05df13dc",
"metadata": {},
"outputs": [],
"source": [
"from langchain.chains.cogniswitch import CogniswitchAnswerChain\n",
"from langchain.chains.cogniswitch import CogniswitchStoreChain"
]
},
{
"cell_type": "markdown",
"id": "3476857f",
"metadata": {},
"source": [
"## Cogniswitch Store Chain"
]
},
{
"cell_type": "markdown",
"id": "320e02fc",
"metadata": {},
"source": [
"### Instantiate the Store chain"
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "89f58167",
"metadata": {},
"outputs": [],
"source": [
"cs_store = CogniswitchStoreChain()"
]
},
{
"cell_type": "markdown",
"id": "42c9890e",
"metadata": {},
"source": [
"### Run the chain with the CogniSwitch token, OpenAI API key, OAuth token, and either a URL or a file path"
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "5c6410e9",
"metadata": {},
"outputs": [],
"source": [
"response = cs_store.run({\"cs_token\": \"<your cs token here>\",\n",
"                         \"OAI_token\": \"<your OpenAI API key here>\",\n",
"                         \"url\": \"https://cogniswitch.ai\",\n",
"                         \"apiKey\": \"<your oauth token here>\",\n",
"                         })"
]
},
{
"cell_type": "code",
"execution_count": 5,
"id": "794b4fba",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"{'data': {'knowledgeSourceId': 20, 'sourceType': 'https://cogniswitch.ai', 'sourceURL': None, 'sourceFileName': None, 'sourceName': None, 'sourceDescription': None, 'status': 'UPLOADED'}, 'list': None, 'message': \"We're processing your content & will send you an email on completion, hang tight!\", 'statusCode': 1000}\n"
]
}
],
"source": [
"print(response)"
]
},
{
"cell_type": "markdown",
"id": "b5e9ca94",
"metadata": {},
"source": [
"## Cogniswitch Answer Chain"
]
},
{
"cell_type": "markdown",
"id": "c7d32067",
"metadata": {},
"source": [
"### Instantiate the answer chain"
]
},
{
"cell_type": "code",
"execution_count": 6,
"id": "1365953e",
"metadata": {},
"outputs": [],
"source": [
"cs_answer = CogniswitchAnswerChain()"
]
},
{
"cell_type": "markdown",
"id": "0ba9aca9",
"metadata": {},
"source": [
"### Run the chain with the CogniSwitch token, OpenAI API key, OAuth token, and the query"
]
},
{
"cell_type": "code",
"execution_count": 9,
"id": "eb059787",
"metadata": {},
"outputs": [],
"source": [
"resp = cs_answer.run({\"cs_token\": \"<your cs token here>\",\n",
"                      \"query\": \"what is cogniswitch\",\n",
"                      \"OAI_token\": \"<your OpenAI API key here>\",\n",
"                      \"apiKey\": \"<your oauth token here>\",\n",
"                      })"
]
},
{
"cell_type": "code",
"execution_count": 10,
"id": "e73e963f",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"{'data': {'answer': 'CogniSwitch is a technology developed by CogniSwitch Inc. It is designed to make Generative AI (Artificial Intelligence) reliable by utilizing enterprise knowledge. It auto-gathers and organizes knowledge from various sources, allowing experts to curate and visualize it before publication. The CogniSwitch API enables Gen AI applications to access this knowledge on-demand, ensuring reliability and eliminating hallucinations and bias.'}, 'list': None, 'message': None, 'statusCode': 1000}\n"
]
}
],
"source": [
"print(resp)"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "fc5027e3",
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "langchain_repo",
"language": "python",
"name": "langchain_repo"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.17"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
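For quick reference, the input dictionaries the two chains expect (per the notebook cells above) look like this. All values below are hypothetical placeholders; real tokens come from the CogniSwitch console and the OpenAI dashboard:

```python
# Hypothetical placeholder values for illustration only.
store_inputs = {
    "cs_token": "<platform token>",   # CogniSwitch platform token
    "OAI_token": "<OpenAI API key>",  # OpenAI API key
    "apiKey": "<oauth token>",        # CogniSwitch OAuth token
    "url": "https://cogniswitch.ai",  # or pass "file" with a local path instead
}

answer_inputs = {
    "cs_token": "<platform token>",
    "OAI_token": "<OpenAI API key>",
    "apiKey": "<oauth token>",
    "query": "what is cogniswitch",
}
```

Each dict is passed as the single argument to the corresponding chain's `run` call.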

View File

@@ -0,0 +1,6 @@
"""Cogniswitch chains."""
from langchain.chains.cogniswitch.cogniswitch_answer import CogniswitchAnswerChain
from langchain.chains.cogniswitch.cogniswitch_loader import CogniswitchStoreChain

__all__ = ["CogniswitchAnswerChain", "CogniswitchStoreChain"]

View File

@@ -0,0 +1,96 @@
from typing import Any, Dict, List, Optional

import requests

from langchain.callbacks.manager import CallbackManagerForChainRun
from langchain.chains.base import Chain


class CogniswitchAnswerChain(Chain):
    """
    A chain class for interacting with the Cogniswitch service to answer questions.
    """

    @property
    def input_keys(self) -> List[str]:
        """
        List of expected input keys for the chain.

        Returns:
            List[str]: A list of input keys.
        """
        return ["cs_token", "OAI_token", "query", "apiKey"]

    @property
    def output_keys(self) -> List[str]:
        """
        List of output keys produced by the chain.

        Returns:
            List[str]: A list of output keys.
        """
        return ["response"]

    def _call(
        self,
        inputs: Dict[str, Any],
        run_manager: Optional[CallbackManagerForChainRun] = None,
    ) -> Dict[str, Any]:
        """
        Execute the chain to answer a query.

        Args:
            inputs (Dict[str, Any]): Input dictionary containing
                'cs_token', 'OAI_token', 'query', and 'apiKey'.
            run_manager (Optional[CallbackManagerForChainRun]):
                Manager for chain run callbacks.

        Returns:
            Dict[str, Any]: Output dictionary containing
                the 'response' from the service.
        """
        cs_token = inputs["cs_token"]
        OAI_token = inputs["OAI_token"]
        query = inputs["query"]
        apiKey = inputs["apiKey"]
        response = self.answer_cs(cs_token, OAI_token, query, apiKey)
        return {"response": response}

    def answer_cs(self, cs_token: str, OAI_token: str, query: str, apiKey: str) -> dict:
        """
        Send a query to the Cogniswitch service and retrieve the response.

        Args:
            cs_token (str): Cogniswitch platform token.
            OAI_token (str): OpenAI token.
            query (str): Query to be answered.
            apiKey (str): Cogniswitch OAuth token.

        Returns:
            dict: Response JSON from the Cogniswitch service.
        """
        api_url = "https://api.cogniswitch.ai:8243/cs-api/0.0.1/cs/knowledgeRequest"
        headers = {
            "apiKey": apiKey,
            "platformToken": cs_token,
            "openAIToken": OAI_token,
        }
        data = {"query": query}
        response = requests.post(api_url, headers=headers, verify=False, data=data)
        return response.json()

    def _validate_inputs(self, inputs: Dict[str, Any]) -> None:
        """
        Validate if all required input keys are provided.

        Args:
            inputs (Dict[str, Any]): Input dictionary containing the provided keys.

        Raises:
            ValueError: If any required input key is missing.
        """
        missing_keys = set(self.input_keys).difference(inputs)
        if missing_keys:
            raise ValueError(f"Missing {missing_keys}")
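The answer chain's HTTP call boils down to assembling three headers and a one-field form payload. A minimal sketch of that construction as a pure helper (the function name `build_answer_request` and the sample values are ours, for illustration; the endpoint and header names are taken from the file above):

```python
from typing import Dict, Tuple

# Endpoint used by CogniswitchAnswerChain.answer_cs above.
ANSWER_URL = "https://api.cogniswitch.ai:8243/cs-api/0.0.1/cs/knowledgeRequest"


def build_answer_request(
    cs_token: str, oai_token: str, query: str, api_key: str
) -> Tuple[Dict[str, str], Dict[str, str]]:
    """Build the headers and form payload for the knowledgeRequest call."""
    headers = {
        "apiKey": api_key,          # CogniSwitch OAuth token
        "platformToken": cs_token,  # CogniSwitch platform token
        "openAIToken": oai_token,   # OpenAI API key
    }
    data = {"query": query}
    return headers, data
```

Pass the results straight to `requests.post(ANSWER_URL, headers=headers, data=data)`; separating construction from the network call is also what makes the mocked tests later in this PR straightforward.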

View File

@@ -0,0 +1,193 @@
from typing import Any, Dict, List, Optional

import requests

from langchain.callbacks.manager import CallbackManagerForChainRun
from langchain.chains.base import Chain


class CogniswitchStoreChain(Chain):
    """
    A chain class for storing data using the Cogniswitch service.
    """

    cs_token: str = "cs_token"
    OAI_token: str = "OAI_token"
    url: str = "url"
    file: str = "file"
    apiKey: str = "apiKey"
    document_name: str = "document_name"
    document_description: str = "document_description"
    input_variables = [
        cs_token,
        OAI_token,
        url,
        file,
        document_name,
        document_description,
        apiKey,
    ]
    output_key: str = "response"

    @property
    def input_keys(self) -> List[str]:
        """
        List of expected input keys for the chain.

        Returns:
            List[str]: A list of input keys.
        """
        return [
            self.cs_token,
            self.OAI_token,
            self.file,
            self.url,
            self.document_name,
            self.document_description,
            self.apiKey,
        ]

    @property
    def output_keys(self) -> List[str]:
        """
        List of output keys produced by the chain.

        Returns:
            List[str]: A list of output keys.
        """
        return [self.output_key]

    def _call(
        self,
        inputs: Dict[str, Any],
        run_manager: Optional[CallbackManagerForChainRun] = None,
    ) -> Dict[str, Any]:
        """
        Execute the chain to store data.

        Args:
            inputs (Dict[str, Any]): Input dictionary containing
                'cs_token', 'OAI_token', 'apiKey', and either 'url' or 'file'.
            run_manager (Optional[CallbackManagerForChainRun]):
                Manager for chain run callbacks.

        Returns:
            Dict[str, Any]: Output dictionary containing the response from the service.
        """
        cs_token = inputs["cs_token"]
        OAI_token = inputs["OAI_token"]
        url = inputs.get("url")
        file = inputs.get("file")
        document_name = inputs.get("document_name")
        document_description = inputs.get("document_description")
        apiKey = inputs["apiKey"]
        response = self.store_data(
            cs_token, OAI_token, url, file, apiKey, document_name, document_description
        )
        return {"response": response}

    def store_data(
        self,
        cs_token: str,
        OAI_token: str,
        url: Optional[str],
        file: Optional[str],
        apiKey: str,
        document_name: Optional[str],
        document_description: Optional[str],
    ) -> dict:
        """
        Store data using the Cogniswitch service.

        Args:
            cs_token (str): Cogniswitch platform token.
            OAI_token (str): OpenAI token.
            url (Optional[str]): URL link.
            file (Optional[str]): Path of the file to upload.
                Currently supported file types are
                .txt, .pdf, .docx, .doc, and .html.
            apiKey (str): Cogniswitch OAuth token.
            document_name (Optional[str]): Name of the document you are uploading.
            document_description (Optional[str]): Description of the document.

        Returns:
            dict: Response JSON from the Cogniswitch service.
        """
        if not document_name:
            document_name = None
        if not document_description:
            document_description = None
        headers = {
            "apiKey": apiKey,
            "openAIToken": OAI_token,
            "platformToken": cs_token,
        }
        if not file:
            api_url = (
                "https://api.cogniswitch.ai:8243/cs-api/0.0.1/cs/knowledgeSource/url"
            )
            data = {"url": url}
            response = requests.post(
                api_url, headers=headers, verify=False, data=data, files=None
            )
        else:
            api_url = (
                "https://api.cogniswitch.ai:8243/cs-api/0.0.1/cs/knowledgeSource/file"
            )
            files = {"file": open(file, "rb")}
            data = {
                "url": url,
                "documentName": document_name,
                "documentDescription": document_description,
            }
            response = requests.post(
                api_url, headers=headers, verify=False, data=data, files=files
            )
        if response.status_code == 200:
            return response.json()
        else:
            return {
                "message": "Bad Request",
            }

    def _validate_inputs(self, inputs: Dict[str, Any]) -> None:
        """
        Validate if all required input keys are provided.

        Args:
            inputs (Dict[str, Any]): Input dictionary containing the provided keys.

        Raises:
            ValueError: If any required input key is missing.
        """
        required_keys = {
            self.cs_token,
            self.OAI_token,
            self.apiKey,
            # Either 'file' or 'url' is required.
            self.file if not inputs.get("url") else None,
            self.url if not inputs.get("file") else None,
        }
        missing_keys = required_keys.difference(inputs)
        missing_keys.discard(None)
        if missing_keys:
            raise ValueError(f"Missing: {missing_keys}")
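The store chain's validation rule — three fixed keys plus at least one of `url` or `file` — can be isolated as a small standalone check. A sketch under that reading of `_validate_inputs` above (the function name `missing_store_inputs` is ours):

```python
from typing import Any, Dict, Set


def missing_store_inputs(inputs: Dict[str, Any]) -> Set[str]:
    """Return the names of required store-chain inputs that are absent."""
    # cs_token, OAI_token, and apiKey are always required.
    missing = {"cs_token", "OAI_token", "apiKey"} - inputs.keys()
    # Exactly one content source is needed: a URL or a local file path.
    if not inputs.get("url") and not inputs.get("file"):
        missing.add("url or file")
    return missing
```

An empty return set means the inputs are valid; a non-empty set is what the chain would report in its `ValueError`.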

View File

@@ -0,0 +1,51 @@
import unittest
from unittest.mock import MagicMock, patch

from langchain.chains.cogniswitch import CogniswitchAnswerChain


class TestCogniswitchAnswerChain(unittest.TestCase):
    @patch("requests.post")
    def test_answer_cs(self, mock_post: MagicMock) -> None:
        chain = CogniswitchAnswerChain()
        cs_token = "cs_token"
        OAI_token = "OAI_token"
        apiKey = "apiKey"
        query = "test query"
        expected_response = "Test answer"
        mock_response = MagicMock()
        mock_response.json.return_value = expected_response
        mock_post.return_value = mock_response

        response = chain.answer_cs(cs_token, OAI_token, query, apiKey)

        self.assertEqual(response, expected_response)
        mock_post.assert_called_once_with(
            "https://api.cogniswitch.ai:8243/cs-api/0.0.1/cs/knowledgeRequest",
            headers={
                "apiKey": apiKey,
                "platformToken": cs_token,
                "openAIToken": OAI_token,
            },
            verify=False,
            data={"query": query},
        )

    def test_validate_inputs_missing_cs_token(self) -> None:
        chain = CogniswitchAnswerChain()
        inputs = {"query": "test query", "apiKey": "apiKey"}
        with self.assertRaises(ValueError):
            chain._validate_inputs(inputs)

    def test_validate_inputs_missing_query(self) -> None:
        chain = CogniswitchAnswerChain()
        inputs = {"cs_token": "cs_token", "apiKey": "apiKey"}
        with self.assertRaises(ValueError):
            chain._validate_inputs(inputs)

    def test_validate_inputs_missing_keys(self) -> None:
        chain = CogniswitchAnswerChain()
        inputs: dict = {}
        with self.assertRaises(ValueError):
            chain._validate_inputs(inputs)

View File

@@ -0,0 +1,43 @@
import unittest
from unittest.mock import MagicMock, patch

from langchain.chains.cogniswitch import CogniswitchStoreChain


class TestCogniswitchStoreChain(unittest.TestCase):
    @patch("requests.post")
    def test_store_data_successful(self, mock_post: MagicMock) -> None:
        mock_response = MagicMock()
        mock_response.status_code = 200
        mock_response.json.return_value = {"message": "Data stored successfully"}
        mock_post.return_value = mock_response

        chain = CogniswitchStoreChain()
        result = chain.store_data(
            "cs_token", "OAI_token", "http://example.com", None, "apiKey", None, None
        )

        self.assertEqual(result, {"message": "Data stored successfully"})

    @patch("requests.post")
    def test_store_data_failure(self, mock_post: MagicMock) -> None:
        mock_response = MagicMock()
        mock_response.status_code = 400
        mock_response.json.return_value = {"message": "Bad Request"}
        mock_post.return_value = mock_response

        chain = CogniswitchStoreChain()
        result = chain.store_data(
            "cs_token",
            "OAI_token",
            None,
            None,
            "apiKey",
            "document_name",
            "document_description",
        )

        self.assertEqual(
            result,
            {
                "message": "Bad Request",
            },
        )