docs: add seekrflow chat model integration docs (#30596)
### **PR title**

`docs: add SeekrFlow integration notebook`

---

### 💬 **PR message**

- **Description:** This PR adds an integration notebook for [`ChatSeekrFlow`](https://pypi.org/project/langchain-seekrflow/) under `docs/docs/integrations/chat/`. Per LangChain’s guidance, SeekrFlow has been published as a standalone OSS package (`langchain-seekrflow`) rather than as a direct community integration. This notebook ensures discoverability, demonstration, and testability of the integration within LangChain’s documentation structure.
- **Issue:** N/A – this is a new integration contribution aligned with LangChain’s external package policy.
- **Dependencies:**
  - [`langchain-seekrflow`](https://pypi.org/project/langchain-seekrflow) (published to PyPI)
  - [`seekrai`](https://pypi.org/project/seekrai/) (SeekrFlow client SDK)
- **Twitter handle (optional):** @seekrtechnology

---------

Co-authored-by: Ben Faircloth <bfaircloth@seekr.com>
Co-authored-by: Eugene Yurtsev <eyurtsev@gmail.com>
This commit is contained in:
parent 768e4f695a
commit 6896c863e8

362 docs/docs/integrations/chat/seekrflow.ipynb (new file)

@@ -0,0 +1,362 @@
{
"cells": [
{
"cell_type": "markdown",
"id": "62d5a1ea",
"metadata": {},
"source": [
"# ChatSeekrFlow\n",
"\n",
"> [Seekr](https://www.seekr.com/) provides AI-powered solutions for structured, explainable, and transparent AI interactions.\n",
"\n",
"This notebook provides a quick overview for getting started with Seekr [chat models](/docs/concepts/chat_models). For detailed documentation of all `ChatSeekrFlow` features and configurations, head to the [API reference](https://python.langchain.com/api_reference/community/chat_models/langchain_community.chat_models.seekrflow.ChatSeekrFlow.html).\n",
"\n",
"## Overview\n",
"\n",
"The `ChatSeekrFlow` class wraps a chat model endpoint hosted on SeekrFlow, enabling seamless integration with LangChain applications.\n",
"\n",
"### Integration Details\n",
"\n",
"| Class | Package | Local | Serializable | Package downloads | Package latest |\n",
"| :--- | :--- | :---: | :---: | :---: | :---: |\n",
"| [ChatSeekrFlow](https://python.langchain.com/api_reference/community/chat_models/langchain_community.chat_models.seekrflow.ChatSeekrFlow.html) | [seekrai](https://python.langchain.com/docs/integrations/providers/seekr/) | ❌ | beta |  |  |\n",
"\n",
"### Model Features\n",
"\n",
"| [Tool calling](/docs/how_to/tool_calling/) | [Structured output](/docs/how_to/structured_output/) | JSON mode | [Image input](/docs/how_to/multimodal_inputs/) | Audio input | Video input | [Token-level streaming](/docs/how_to/chat_streaming/) | Native async | [Token usage](/docs/how_to/chat_token_usage_tracking/) | [Logprobs](/docs/how_to/logprobs/) |\n",
"| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |\n",
"| ✅ | ✅ | ✅ | ❌ | ❌ | ❌ | ✅ | ❌ | ✅ | ❌ |\n",
"\n",
"### Supported Methods\n",
"\n",
"`ChatSeekrFlow` supports all methods of `ChatModel`, **except async APIs**.\n",
"\n",
"### Endpoint Requirements\n",
"\n",
"The serving endpoint `ChatSeekrFlow` wraps **must** have an OpenAI-compatible chat input/output format. It can be used for:\n",
"\n",
"1. **Fine-tuned Seekr models**\n",
"2. **Custom SeekrFlow models**\n",
"3. **RAG-enabled models using Seekr's retrieval system**\n",
"\n",
"For async usage, please refer to `AsyncChatSeekrFlow` (coming soon).\n"
]
},
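{
"cell_type": "markdown",
"id": "7c1e9a40",
"metadata": {},
"source": [
"As a quick, illustrative sketch of what \"OpenAI-compatible\" means here: the endpoint should accept a chat-completions style request (a model name plus a list of role/content messages) and return the reply under `choices[0][\"message\"][\"content\"]` (the same shape the mock client in the error-handling section below returns). The next cell only builds example payloads; it does not call a live SeekrFlow endpoint.\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "b2d4e6f8",
"metadata": {},
"outputs": [],
"source": [
"# Illustrative sketch only: the OpenAI-compatible chat request shape a\n",
"# SeekrFlow serving endpoint is expected to accept.\n",
"example_request = {\n",
"    \"model\": \"meta-llama/Meta-Llama-3-8B-Instruct\",\n",
"    \"messages\": [\n",
"        {\"role\": \"system\", \"content\": \"You are a helpful assistant.\"},\n",
"        {\"role\": \"user\", \"content\": \"Hello, Seekr!\"},\n",
"    ],\n",
"}\n",
"\n",
"# A compatible endpoint returns the reply under choices[0][\"message\"][\"content\"].\n",
"example_response = {\"choices\": [{\"message\": {\"role\": \"assistant\", \"content\": \"Hi there!\"}}]}\n",
"\n",
"print(example_response[\"choices\"][0][\"message\"][\"content\"])"
]
},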
{
"cell_type": "markdown",
"id": "93fea471",
"metadata": {},
"source": [
"# Getting Started with ChatSeekrFlow in LangChain\n",
"\n",
"This notebook covers how to use SeekrFlow as a chat model in LangChain."
]
},
{
"cell_type": "markdown",
"id": "2f320c17",
"metadata": {},
"source": [
"## Setup\n",
"\n",
"Ensure you have the necessary dependencies installed:\n",
"\n",
"```bash\n",
"pip install seekrai langchain langchain-seekrflow\n",
"```\n",
"\n",
"You must also have an API key from Seekr to authenticate requests.\n"
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "911ca53c",
"metadata": {},
"outputs": [],
"source": [
"# Standard library\n",
"import getpass\n",
"import os\n",
"\n",
"# Third-party\n",
"from langchain.prompts import ChatPromptTemplate\n",
"from langchain.schema import HumanMessage\n",
"from langchain_core.runnables import RunnableSequence\n",
"\n",
"# OSS SeekrFlow integration\n",
"from langchain_seekrflow import ChatSeekrFlow\n",
"from seekrai import SeekrFlow"
]
},
{
"cell_type": "markdown",
"id": "150461cb",
"metadata": {},
"source": [
"## API Key Setup\n",
"\n",
"You'll need to set your API key as an environment variable to authenticate requests.\n",
"\n",
"Run the cell below, or manually assign it before running queries:\n",
"\n",
"```python\n",
"SEEKR_API_KEY = \"your-api-key-here\"\n",
"```\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "38afcd6e",
"metadata": {},
"outputs": [],
"source": [
"os.environ[\"SEEKR_API_KEY\"] = getpass.getpass(\"Enter your Seekr API key:\")"
]
},
{
"cell_type": "markdown",
"id": "82d83c0e",
"metadata": {},
"source": [
"## Instantiation"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "71b14751",
"metadata": {},
"outputs": [],
"source": [
"# Create the SeekrFlow client using the API key set above\n",
"seekr_client = SeekrFlow(api_key=os.environ[\"SEEKR_API_KEY\"])\n",
"\n",
"llm = ChatSeekrFlow(\n",
"    client=seekr_client, model_name=\"meta-llama/Meta-Llama-3-8B-Instruct\"\n",
")"
]
},
{
"cell_type": "markdown",
"id": "1046e86c",
"metadata": {},
"source": [
"## Invocation"
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "f61a60f6",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Hello there! I'm Seekr, nice to meet you! What brings you here today? Do you have a question, or are you looking for some help with something? I'm all ears (or rather, all text)!\n"
]
}
],
"source": [
"response = llm.invoke([HumanMessage(content=\"Hello, Seekr!\")])\n",
"print(response.content)"
]
},
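{
"cell_type": "markdown",
"id": "9f2b7c11",
"metadata": {},
"source": [
"The feature table above lists token usage as supported. On recent `langchain-core` versions, chat model responses expose token counts on the message's `usage_metadata` attribute; the cell below is a minimal sketch of reading it, assuming the `langchain-seekrflow` release populates that field (it falls back to printing `response_metadata` if not).\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "4d8e2a90",
"metadata": {},
"outputs": [],
"source": [
"# Token usage sketch: usage_metadata is the standard LangChain field for token counts.\n",
"usage_response = llm.invoke([HumanMessage(content=\"Hello, Seekr!\")])\n",
"print(usage_response.usage_metadata or usage_response.response_metadata)"
]
},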
{
"cell_type": "markdown",
"id": "853b0349",
"metadata": {},
"source": [
"## Chaining"
]
},
{
"cell_type": "code",
"execution_count": 5,
"id": "35fca3ec",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"content='The translation of \"Good morning\" in French is:\\n\\n\"Bonne journée\"' additional_kwargs={} response_metadata={}\n"
]
}
],
"source": [
"prompt = ChatPromptTemplate.from_template(\"Translate to French: {text}\")\n",
"\n",
"chain: RunnableSequence = prompt | llm\n",
"result = chain.invoke({\"text\": \"Good morning\"})\n",
"print(result)"
]
},
{
"cell_type": "code",
"execution_count": 6,
"id": "a7b28b8d",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"🔹 Testing Sync `stream()` (Streaming)...\n",
"Here is a haiku:\n",
"\n",
"Golden sunset fades\n",
"Ripples on the quiet lake\n",
"Peaceful evening sky"
]
}
],
"source": [
"def test_stream():\n",
"    \"\"\"Test synchronous invocation in streaming mode.\"\"\"\n",
"    print(\"\\n🔹 Testing Sync `stream()` (Streaming)...\")\n",
"\n",
"    for chunk in llm.stream([HumanMessage(content=\"Write me a haiku.\")]):\n",
"        print(chunk.content, end=\"\", flush=True)\n",
"\n",
"\n",
"# ✅ Ensure streaming is enabled\n",
"llm = ChatSeekrFlow(\n",
"    client=seekr_client,\n",
"    model_name=\"meta-llama/Meta-Llama-3-8B-Instruct\",\n",
"    streaming=True,  # ✅ Enable streaming\n",
")\n",
"\n",
"# ✅ Run sync streaming test\n",
"test_stream()"
]
},
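{
"cell_type": "markdown",
"id": "c5a1d2e3",
"metadata": {},
"source": [
"The feature table above also marks tool calling and structured output as supported, although they are not demonstrated in this notebook. The sketch below assumes `ChatSeekrFlow` supports the standard LangChain chat-model helper `with_structured_output` (which is backed by tool calling); treat it as illustrative and check the `langchain-seekrflow` documentation for the exact supported surface.\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "e7f3b8c2",
"metadata": {},
"outputs": [],
"source": [
"from pydantic import BaseModel, Field\n",
"\n",
"\n",
"class Translation(BaseModel):\n",
"    \"\"\"A translation of a phrase into French.\"\"\"\n",
"\n",
"    french: str = Field(description=\"The French translation\")\n",
"\n",
"\n",
"# Assumes the standard BaseChatModel structured-output helper works for this model.\n",
"structured_llm = llm.with_structured_output(Translation)\n",
"print(structured_llm.invoke(\"Translate to French: Good morning\"))"
]
},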
{
"cell_type": "markdown",
"id": "b3847b34",
"metadata": {},
"source": [
"## Error Handling & Debugging"
]
},
{
"cell_type": "code",
"execution_count": 8,
"id": "6bc38b48",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Running test: Missing Client\n",
"✅ Expected Error: SeekrFlow client cannot be None.\n",
"Running test: Missing Model Name\n",
"✅ Expected Error: A valid model name must be provided.\n"
]
}
],
"source": [
"# Define a minimal mock SeekrFlow client\n",
"class MockSeekrClient:\n",
"    \"\"\"Mock SeekrFlow API client that mimics the real API structure.\"\"\"\n",
"\n",
"    class MockChat:\n",
"        \"\"\"Mock Chat object with a completions method.\"\"\"\n",
"\n",
"        class MockCompletions:\n",
"            \"\"\"Mock Completions object with a create method.\"\"\"\n",
"\n",
"            def create(self, *args, **kwargs):\n",
"                return {\n",
"                    \"choices\": [{\"message\": {\"content\": \"Mock response\"}}]\n",
"                }  # Mimic API response\n",
"\n",
"        completions = MockCompletions()\n",
"\n",
"    chat = MockChat()\n",
"\n",
"\n",
"def test_initialization_errors():\n",
"    \"\"\"Test that invalid ChatSeekrFlow initializations raise expected errors.\"\"\"\n",
"\n",
"    test_cases = [\n",
"        {\n",
"            \"name\": \"Missing Client\",\n",
"            \"args\": {\"client\": None, \"model_name\": \"seekrflow-model\"},\n",
"            \"expected_error\": \"SeekrFlow client cannot be None.\",\n",
"        },\n",
"        {\n",
"            \"name\": \"Missing Model Name\",\n",
"            \"args\": {\"client\": MockSeekrClient(), \"model_name\": \"\"},\n",
"            \"expected_error\": \"A valid model name must be provided.\",\n",
"        },\n",
"    ]\n",
"\n",
"    for test in test_cases:\n",
"        try:\n",
"            print(f\"Running test: {test['name']}\")\n",
"            faulty_llm = ChatSeekrFlow(**test[\"args\"])\n",
"\n",
"            # If no error is raised, fail the test\n",
"            print(f\"❌ Test '{test['name']}' failed: No error was raised!\")\n",
"        except Exception as e:\n",
"            error_msg = str(e)\n",
"            assert test[\"expected_error\"] in error_msg, f\"Unexpected error: {error_msg}\"\n",
"            print(f\"✅ Expected Error: {error_msg}\")\n",
"\n",
"\n",
"# Run test\n",
"test_initialization_errors()"
]
},
{
"cell_type": "markdown",
"id": "d1c9ddf3",
"metadata": {},
"source": [
"## API reference"
]
},
{
"cell_type": "markdown",
"id": "411a8bea",
"metadata": {},
"source": [
"- `ChatSeekrFlow` class: [`langchain_seekrflow.ChatSeekrFlow`](https://github.com/benfaircloth/langchain-seekrflow/blob/main/langchain_seekrflow/seekrflow.py)\n",
"- PyPI package: [`langchain-seekrflow`](https://pypi.org/project/langchain-seekrflow/)\n"
]
},
{
"cell_type": "markdown",
"id": "3ef00a51",
"metadata": {},
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": ".venv",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.12"
}
},
"nbformat": 4,
"nbformat_minor": 5
}