mirror of https://github.com/hwchase17/langchain.git, synced 2026-02-21 14:43:07 +00:00

Merge branch 'wip-v0.4' into standard_outputs_copy
@@ -89,7 +89,7 @@ Please see the [API reference for @tool](https://python.langchain.com/api_refere

## Tool artifacts

-**Tools** are utilities that can be called by a model, and whose outputs are designed to be fed back to a model. Sometimes, however, there are artifacts of a tool's execution that we want to make accessible to downstream components in our chain or agent, but that we don't want to expose to the model itself. For example if a tool returns a custom object, a dataframe or an image, we may want to pass some metadata about this output to the model without passing the actual output to the model. At the same time, we may want to be able to access this full output elsewhere, for example in downstream tools.
+**Tools** are utilities that can be called by a model, and whose outputs are designed to be fed back to a model. Sometimes, however, there are artifacts of a tool's execution that we want to make accessible to downstream components in our chain or agent, but that we don't want to expose to the model itself. For example if a tool returns a custom object, a dataframe or an image, we may want to pass some metadata about this output to the model without passing the actual output. At the same time, we may want to be able to access this full output elsewhere, for example in downstream tools.

```python
@tool(response_format="content_and_artifact")
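# Not part of this diff: a minimal sketch of a complete "content_and_artifact" tool,
# assuming langchain_core is installed. The tool returns a (content, artifact) tuple:
# the `content` string is sent back to the model, while the raw `artifact` (here, the
# generated list) stays available to downstream components in the chain or agent.
import random
from typing import List, Tuple

from langchain_core.tools import tool


@tool(response_format="content_and_artifact")
def generate_random_ints(min: int, max: int, size: int) -> Tuple[str, List[int]]:
    """Generate `size` random ints in the range [min, max]."""
    array = [random.randint(min, max) for _ in range(size)]
    content = f"Successfully generated array of {size} random ints in [{min}, {max}]."
    return content, array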
@@ -45,7 +45,7 @@

"\n",
"### Credentials\n",
"\n",
- "To use the integration you must:\n",
+ "To use the integration you must either:\n",
"- Have credentials configured for your environment (gcloud, workload identity, etc...)\n",
"- Store the path to a service account JSON file as the GOOGLE_APPLICATION_CREDENTIALS environment variable\n",
"\n",
298 docs/docs/integrations/chat/gradientai.ipynb Normal file
@@ -0,0 +1,298 @@
{
"cells": [
{
"cell_type": "raw",
"id": "afaf8039",
"metadata": {
"vscode": {
"languageId": "raw"
}
},
"source": [
"---\n",
"sidebar_label: DigitalOcean Gradient\n",
"---"
]
},
{
"cell_type": "markdown",
"id": "e49f1e0d",
"metadata": {},
"source": [
"# ChatGradient\n",
"\n",
"This will help you get started with DigitalOcean Gradient Chat Models.\n",
"\n",
"## Overview\n",
"### Integration details\n",
"\n",
"| Class | Package | Package downloads | Package latest |\n",
"| :--- | :--- | :---: | :---: |\n",
"| [DigitalOcean Gradient](https://python.langchain.com/docs/api_reference/llms/langchain_gradient.llms.LangchainGradient/) | [langchain-gradient](https://python.langchain.com/docs/api_reference/langchain-gradient_api_reference/) |  |  |\n",
"\n",
"\n",
"## Setup\n",
"\n",
"langchain-gradient uses the DigitalOcean Gradient Platform.\n",
"\n",
"Create an account on DigitalOcean, acquire a `DIGITALOCEAN_INFERENCE_KEY` API key from the Gradient Platform, and install the `langchain-gradient` integration package.\n",
"\n",
"### Credentials\n",
"\n",
"Head to [DigitalOcean Login](https://cloud.digitalocean.com/login)\n",
"\n",
"1. Sign up/Login to DigitalOcean Cloud Console\n",
"2. Go to the Gradient Platform and navigate to Serverless Inference.\n",
"3. Click on Create model access key, enter a name, and create the key.\n",
"\n",
"Once you've done this, set the `DIGITALOCEAN_INFERENCE_KEY` environment variable:"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "433e8d2b-9519-4b49-b2c4-7ab65b046c94",
"metadata": {},
"outputs": [],
"source": [
"import getpass\n",
"import os\n",
"\n",
"if not os.getenv(\"DIGITALOCEAN_INFERENCE_KEY\"):\n",
"    os.environ[\"DIGITALOCEAN_INFERENCE_KEY\"] = getpass.getpass(\n",
"        \"Enter your DIGITALOCEAN_INFERENCE_KEY API key: \"\n",
"    )"
]
},
{
"cell_type": "markdown",
"id": "72ee0c4b-9764-423a-9dbf-95129e185210",
"metadata": {},
"source": [
"If you want to get automated tracing of your model calls, you can also set your [LangSmith](https://docs.smith.langchain.com/) API key by uncommenting below:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "a15d341e-3e26-4ca3-830b-5aab30ed66de",
"metadata": {},
"outputs": [],
"source": [
"# os.environ[\"LANGSMITH_TRACING\"] = \"true\"\n",
"# os.environ[\"LANGSMITH_API_KEY\"] = getpass.getpass(\"Enter your LangSmith API key: \")"
]
},
{
"cell_type": "markdown",
"id": "0730d6a1-c893-4840-9817-5e5251676d5d",
"metadata": {},
"source": [
"### Installation\n",
"\n",
"The DigitalOcean Gradient integration lives in the `langchain-gradient` package:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "652d6238-1f87-422a-b135-f5abbb8652fc",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"\u001b[1m[\u001b[0m\u001b[34;49mnotice\u001b[0m\u001b[1;39;49m]\u001b[0m\u001b[39;49m A new release of pip is available: \u001b[0m\u001b[31;49m24.0\u001b[0m\u001b[39;49m -> \u001b[0m\u001b[32;49m25.1.1\u001b[0m\n",
"\u001b[1m[\u001b[0m\u001b[34;49mnotice\u001b[0m\u001b[1;39;49m]\u001b[0m\u001b[39;49m To update, run: \u001b[0m\u001b[32;49mpip3.12 install --upgrade pip\u001b[0m\n",
"Note: you may need to restart the kernel to use updated packages.\n"
]
}
],
"source": [
"%pip install -qU langchain-gradient"
]
},
{
"cell_type": "markdown",
"id": "a38cde65-254d-4219-a441-068766c0d4b5",
"metadata": {},
"source": [
"## Instantiation\n",
"\n",
"Now we can instantiate our model object and generate chat completions:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "cb09c344-1836-4e0c-acf8-11d13ac1dbae",
"metadata": {},
"outputs": [],
"source": [
"from langchain_gradient import ChatGradient\n",
"\n",
"llm = ChatGradient(\n",
"    model=\"llama3.3-70b-instruct\",\n",
"    # other params...\n",
")"
]
},
{
"cell_type": "markdown",
"id": "2b4f3e15",
"metadata": {},
"source": [
"## Invocation"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "62e0dbc3",
"metadata": {
"tags": []
},
"outputs": [
{
"data": {
"text/plain": [
"AIMessage(content=\"...that had been hidden away for centuries, nestled amongst the twisted roots of an ancient tree. As soon as Mira's fingers made contact with the stone, she felt an sudden surge of energy course through her veins, like a river bursting its banks. The stone, which had been dull and lifeless just moments before, now pulsed with a soft, ethereal light, as if it had been awakened by Mira's touch.\\n\\nIntrigued, Mira turned the stone over in her hand, studying it from every angle. The light emanating from it cast eerie shadows on the trees around her, making her feel as though she was standing at the threshold of a secret world. As she gazed deeper into the stone, she began to notice that the glow was not just a random color, but a deep, rich blue that seemed to be calling to her.\\n\\nWithout thinking, Mira felt an overwhelming urge to follow the stone's gentle glow, which seemed to be leading her deeper into the mysterious forest. The trees loomed above her, their branches creaking and swaying in the wind, as if they too were urging her onward. The air was filled with the sweet scent of wildflowers and the soft hooting of owls, creating a sense of enchantment that was both exhilarating and unsettling.\\n\\nAs Mira wandered deeper into the forest, the stone's light grew brighter, illuminating a winding path that was all but invisible in the fading light of day. The trees grew taller and closer together here, forming a tunnel of foliage that seemed to be guiding her towards a hidden destination. Mira's heart pounded with excitement and a hint of fear, as she realized that she was being drawn into a world that was both magical and unknown.\\n\\nSuddenly, the trees parted, and Mira found herself standing at the edge of a clearing, surrounded by a ring of towering mushrooms that glowed with a soft, luminescent light. The air was filled with a faint humming noise, like the buzzing of a thousand bees, and the stone in her hand pulsed with an otherworldly energy. In the center of the clearing stood an enormous tree, its trunk twisted and gnarled with age, its branches reaching up towards the stars like a Nature's own cathedral.\\n\\nMira felt a sense of awe wash over her, as she approached the tree, the stone still clutched in her hand. She could feel the magic of the forest pulsing through her, calling to her, drawing her closer to the heart of the mystery. And as she reached out to touch the trunk of the tree, the stone's glow surged to a brilliant intensity, illuminating a doorway that had been hidden in the trunk all along...\", additional_kwargs={}, response_metadata={'finish_reason': 'stop'}, id='run--593a6940-4c76-413b-bed9-1fd94f91c6c1-0', usage_metadata={'input_tokens': 82, 'output_tokens': 555, 'total_tokens': 637})"
]
},
"execution_count": 4,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"messages = [\n",
"    (\n",
"        \"system\",\n",
"        \"You are a creative storyteller. Continue any story prompt you receive in an engaging and imaginative way.\",\n",
"    ),\n",
"    (\n",
"        \"human\",\n",
"        \"Once upon a time, in a village at the edge of a mysterious forest, a young girl named Mira found a glowing stone...\",\n",
"    ),\n",
"]\n",
"ai_msg = llm.invoke(messages)\n",
"ai_msg"
]
},
{
"cell_type": "code",
"execution_count": 5,
"id": "d86145b3-bfef-46e8-b227-4dda5c9c2705",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"...that had been hidden away for centuries, nestled amongst the twisted roots of an ancient tree. As soon as Mira's fingers made contact with the stone, she felt an sudden surge of energy course through her veins, like a river bursting its banks. The stone, which had been dull and lifeless just moments before, now pulsed with a soft, ethereal light, as if it had been awakened by Mira's touch.\n",
"\n",
"Intrigued, Mira turned the stone over in her hand, studying it from every angle. The light emanating from it cast eerie shadows on the trees around her, making her feel as though she was standing at the threshold of a secret world. As she gazed deeper into the stone, she began to notice that the glow was not just a random color, but a deep, rich blue that seemed to be calling to her.\n",
"\n",
"Without thinking, Mira felt an overwhelming urge to follow the stone's gentle glow, which seemed to be leading her deeper into the mysterious forest. The trees loomed above her, their branches creaking and swaying in the wind, as if they too were urging her onward. The air was filled with the sweet scent of wildflowers and the soft hooting of owls, creating a sense of enchantment that was both exhilarating and unsettling.\n",
"\n",
"As Mira wandered deeper into the forest, the stone's light grew brighter, illuminating a winding path that was all but invisible in the fading light of day. The trees grew taller and closer together here, forming a tunnel of foliage that seemed to be guiding her towards a hidden destination. Mira's heart pounded with excitement and a hint of fear, as she realized that she was being drawn into a world that was both magical and unknown.\n",
"\n",
"Suddenly, the trees parted, and Mira found herself standing at the edge of a clearing, surrounded by a ring of towering mushrooms that glowed with a soft, luminescent light. The air was filled with a faint humming noise, like the buzzing of a thousand bees, and the stone in her hand pulsed with an otherworldly energy. In the center of the clearing stood an enormous tree, its trunk twisted and gnarled with age, its branches reaching up towards the stars like a Nature's own cathedral.\n",
"\n",
"Mira felt a sense of awe wash over her, as she approached the tree, the stone still clutched in her hand. She could feel the magic of the forest pulsing through her, calling to her, drawing her closer to the heart of the mystery. And as she reached out to touch the trunk of the tree, the stone's glow surged to a brilliant intensity, illuminating a doorway that had been hidden in the trunk all along...\n"
]
}
],
"source": [
"print(ai_msg.content)"
]
},
{
"cell_type": "markdown",
"id": "18e2bfc0-7e78-4528-a73f-499ac150dca8",
"metadata": {},
"source": [
"## Chaining\n",
"\n",
"We can chain our model with a prompt template like so:\n"
]
},
{
"cell_type": "code",
"execution_count": 6,
"id": "e197d1d7-a070-4c96-9f8a-a0e86d046e0b",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"AIMessage(content='The Eiffel Tower was designed by Gustave Eiffel\\'s engineering company and was completed in 1889. (Sentence: \"It was designed by Gustave Eiffel\\'s engineering company. The tower is one of the most recognizable structures in the world. ... The Eiffel Tower is located in Paris and was completed in 1889.\")', additional_kwargs={}, response_metadata={'finish_reason': 'stop'}, id='run--c23ffab6-06ae-4130-87b1-d5b2e7744906-0', usage_metadata={'input_tokens': 153, 'output_tokens': 74, 'total_tokens': 227})"
]
},
"execution_count": 6,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"from langchain_core.prompts import ChatPromptTemplate\n",
"\n",
"prompt = ChatPromptTemplate(\n",
"    [\n",
"        (\n",
"            \"system\",\n",
"            'You are a knowledgeable assistant. Carefully read the provided context and answer the user\\'s question. If the answer is present in the context, cite the relevant sentence. If not, reply with \"Not found in context.\"',\n",
" ),\n",
|
||||
" (\"human\", \"Context: {context}\\nQuestion: {question}\"),\n",
|
||||
" ]\n",
|
||||
")\n",
|
||||
"\n",
|
||||
"chain = prompt | llm\n",
|
||||
"chain.invoke(\n",
|
||||
" {\n",
|
||||
" \"context\": (\n",
|
||||
" \"The Eiffel Tower is located in Paris and was completed in 1889. \"\n",
|
||||
" \"It was designed by Gustave Eiffel's engineering company. \"\n",
|
||||
" \"The tower is one of the most recognizable structures in the world. \"\n",
|
||||
" \"The Statue of Liberty was a gift from France to the United States.\"\n",
|
||||
" ),\n",
|
||||
" \"question\": \"Who designed the Eiffel Tower and when was it completed?\",\n",
|
||||
" }\n",
|
||||
")"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "8a6660e4",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## API reference\n",
|
||||
"\n",
|
||||
"For detailed documentation of all ChatGradient features and configurations head to the API reference."
|
||||
]
|
||||
}
|
||||
],
|
||||
"metadata": {
|
||||
"kernelspec": {
|
||||
"display_name": "Python 3",
|
||||
"language": "python",
|
||||
"name": "python3"
|
||||
},
|
||||
"language_info": {
|
||||
"codemirror_mode": {
|
||||
"name": "ipython",
|
||||
"version": 3
|
||||
},
|
||||
"file_extension": ".py",
|
||||
"mimetype": "text/x-python",
|
||||
"name": "python",
|
||||
"nbconvert_exporter": "python",
|
||||
"pygments_lexer": "ipython3",
|
||||
"version": "3.12.2"
|
||||
}
|
||||
},
|
||||
"nbformat": 4,
|
||||
"nbformat_minor": 5
|
||||
}
|
||||
95 docs/docs/integrations/providers/gradientai.mdx Normal file
@@ -0,0 +1,95 @@

# ChatGradient

This will help you get started with DigitalOcean Gradient [chat models](/docs/concepts/chat_models).

## Overview
### Integration details

| Class | Package | Package downloads | Package latest |
| :--- | :--- | :---: | :---: |
| [ChatGradient](https://python.langchain.com/api_reference/langchain-gradient/chat_models/langchain_gradient.chat_models.ChatGradient.html) | [langchain-gradient](https://python.langchain.com/api_reference/langchain-gradient/) |  |  |

## Setup

langchain-gradient uses the DigitalOcean Gradient Platform.

Create an account on DigitalOcean, acquire a `DIGITALOCEAN_INFERENCE_KEY` API key from the Gradient Platform, and install the `langchain-gradient` integration package.

### Credentials

Head to [DigitalOcean Gradient](https://www.digitalocean.com/products/gradient)

1. Sign up/Login to DigitalOcean Cloud Console
2. Go to the Gradient Platform and navigate to Serverless Inference.
3. Click on Create model access key, enter a name, and create the key.

Once you've done this, set the `DIGITALOCEAN_INFERENCE_KEY` environment variable:

```python
import os
os.environ["DIGITALOCEAN_INFERENCE_KEY"] = "your-api-key"
```

### Installation

The LangChain Gradient integration is in the `langchain-gradient` package:

```bash
pip install -qU langchain-gradient
```

## Instantiation

```python
from langchain_gradient import ChatGradient

llm = ChatGradient(
    model="llama3.3-70b-instruct",
    api_key=os.environ.get("DIGITALOCEAN_INFERENCE_KEY")
)
```

## Invocation

```python
messages = [
    (
        "system",
        "You are a creative storyteller. Continue any story prompt you receive in an engaging and imaginative way.",
    ),
    ("human", "Once upon a time, in a village at the edge of a mysterious forest, a young girl named Mira found a glowing stone..."),
]
ai_msg = llm.invoke(messages)
ai_msg
print(ai_msg.content)
```
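
Not part of this diff: if `ChatGradient` follows the standard LangChain chat model interface (an assumption, since the excerpt above only shows `invoke`), streaming works the same way as with other integrations:

```python
# Hedged sketch: stream the reply token by token instead of waiting for the full message.
# Assumes `llm` and `messages` are defined as above and that the integration supports
# the standard BaseChatModel streaming interface.
for chunk in llm.stream(messages):
    print(chunk.content, end="", flush=True)
```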

## Chaining

```python
from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate(
    [
        (
            "system",
            "You are a knowledgeable assistant. Carefully read the provided context and answer the user's question. If the answer is present in the context, cite the relevant sentence. If not, reply with \"Not found in context.\"",
        ),
        ("human", "Context: {context}\nQuestion: {question}"),
    ]
)

chain = prompt | llm
chain.invoke(
    {
        "context": (
            "The Eiffel Tower is located in Paris and was completed in 1889. "
            "It was designed by Gustave Eiffel's engineering company. "
            "The tower is one of the most recognizable structures in the world. "
            "The Statue of Liberty was a gift from France to the United States."
        ),
        "question": "Who designed the Eiffel Tower and when was it completed?"
    }
)
```
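
As an aside (not part of this diff), `chain.invoke` returns an `AIMessage`; when only the text is needed, the usual LCEL pattern is to append an output parser:

```python
# Hedged sketch: return a plain string instead of an AIMessage.
from langchain_core.output_parsers import StrOutputParser

chain = prompt | llm | StrOutputParser()
answer = chain.invoke(
    {
        "context": "The Eiffel Tower is located in Paris and was completed in 1889.",
        "question": "When was the Eiffel Tower completed?",
    }
)
print(answer)
```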

@@ -235,7 +235,7 @@

"\n",
"**Retries**\n",
"\n",
- "Automatically reprocess any unsuccessful API requests **`upto 5`** times. Uses an **`exponential backoff`** strategy, which spaces out retry attempts to prevent network overload.[Docs](https://portkey.ai/docs/product/ai-gateway-streamline-llm-integrations)\n",
+ "Automatically reprocess any unsuccessful API requests **`upto 5`** times. Uses an **`exponential backoff`** strategy, which spaces out retry attempts to prevent network overload. [Docs](https://portkey.ai/docs/product/ai-gateway-streamline-llm-integrations)\n",
"\n",
"**Tagging**\n",
"\n",
@@ -9,7 +9,9 @@

"source": [
"# Alpha Vantage\n",
"\n",
- ">[Alpha Vantage](https://www.alphavantage.co) Alpha Vantage provides realtime and historical financial market data through a set of powerful and developer-friendly data APIs and spreadsheets. \n",
+ ">[Alpha Vantage](https://www.alphavantage.co) Alpha Vantage provides realtime and historical financial market data through a set of powerful and developer-friendly data APIs and spreadsheets.\n",
"\n",
"Generate the `ALPHAVANTAGE_API_KEY` [at their website](https://www.alphavantage.co/support/#api-key).\n",
"\n",
"Use the ``AlphaVantageAPIWrapper`` to get currency exchange rates."
]
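
A minimal sketch of that last step, not shown in the notebook excerpt above; the exact helper name on `AlphaVantageAPIWrapper` is an assumption here, not confirmed by this diff:

```python
# Hedged sketch: querying an exchange rate via the community wrapper.
# Assumes ALPHAVANTAGE_API_KEY is set in the environment and that the wrapper
# exposes an exchange-rate helper under this name.
from langchain_community.utilities.alpha_vantage import AlphaVantageAPIWrapper

alpha_vantage = AlphaVantageAPIWrapper()
print(alpha_vantage._get_exchange_rate("USD", "JPY"))
```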
@@ -25,7 +25,7 @@

"- [VertexAIImageEditorChat](#image-editing) : Edit an entire uploaded or generated image with a text prompt.\n",
"- [VertexAIImageCaptioning](#image-captioning) : Get text descriptions of images with visual captioning.\n",
"- [VertexAIVisualQnAChat](#visual-question-answering-vqa) : Get answers to a question about an image with Visual Question Answering (VQA).\n",
- " * NOTE : Currently we support only only single-turn chat for Visual QnA (VQA)"
+ " * NOTE : Currently we support only single-turn chat for Visual QnA (VQA)"
]
},
{
@@ -48,11 +48,11 @@

},
{
"cell_type": "code",
- "execution_count": 3,
+ "execution_count": null,
"metadata": {},
"outputs": [],
"source": [
- "# Create Image Gentation model Object\n",
+ "# Create Image Generation model Object\n",
"generator = VertexAIImageGeneratorChat()"
]
},
@@ -140,11 +140,11 @@

},
{
"cell_type": "code",
- "execution_count": 11,
+ "execution_count": null,
"metadata": {},
"outputs": [],
"source": [
- "# Create Image Gentation model Object\n",
+ "# Create Image Generation model Object\n",
"generator = VertexAIImageGeneratorChat()\n",
"\n",
"# Provide a text input for image\n",
@@ -244,7 +244,7 @@

},
{
"cell_type": "code",
- "execution_count": 19,
+ "execution_count": null,
"metadata": {},
"outputs": [
{
@@ -268,10 +268,10 @@

}
],
"source": [
- "# use image egenarted in Image Generation Section\n",
+ "# use image generated in Image Generation Section\n",
"img_base64 = generated_image[\"image_url\"][\"url\"]\n",
"response = model.invoke(img_base64)\n",
- "print(f\"Generated Cpation : {response}\")\n",
+ "print(f\"Generated Caption : {response}\")\n",
"\n",
"# Convert base64 string to Image\n",
"img = Image.open(\n",
@@ -30,49 +30,50 @@ The following table shows information on all available key-value stores.

KV_STORE_FEAT_TABLE = {
"AstraDBByteStore": {
- "class": "[AstraDBByteStore](https://python.langchain.com/api_reference/astradb/storage/langchain_astradb.storage.AstraDBByteStore.html)",
+ "class": "[AstraDBByteStore](https://python.langchain.com/docs/integrations/stores/astradb/)",
"local": False,
- "package": "[langchain_astradb](https://python.langchain.com/api_reference/astradb/)",
+ "package": "[langchain-astradb](https://python.langchain.com/api_reference/astradb/storage/langchain_astradb.storage.AstraDBByteStore.html)",
"downloads": "",
},
"CassandraByteStore": {
- "class": "[CassandraByteStore](https://python.langchain.com/api_reference/community/storage/langchain_community.storage.cassandra.CassandraByteStore.html)",
+ "class": "[CassandraByteStore](https://python.langchain.com/docs/integrations/stores/cassandra/)",
"local": False,
- "package": "[langchain_community](https://python.langchain.com/api_reference/community/)",
+ "package": "[langchain-community](https://python.langchain.com/api_reference/community/storage/langchain_community.storage.cassandra.CassandraByteStore.html)",
"downloads": "",
},
"ElasticsearchEmbeddingsCache": {
- "class": "[ElasticsearchEmbeddingsCache](https://python.langchain.com/api_reference/elasticsearch/cache/langchain_elasticsearch.cache.ElasticsearchEmbeddingsCache.html)",
+ "class": "[ElasticsearchEmbeddingsCache](https://python.langchain.com/docs/integrations/stores/elasticsearch/)",
"local": True,
- "package": "[langchain_elasticsearch](https://python.langchain.com/api_reference/elasticsearch/)",
+ "package": "[langchain-elasticsearch](https://python.langchain.com/api_reference/elasticsearch/cache/langchain_elasticsearch.cache.ElasticsearchEmbeddingsCache.html)",
"downloads": "",
},
"LocalFileStore": {
- "class": "[LocalFileStore](https://python.langchain.com/api_reference/storage/langchain.storage.file_system.LocalFileStore.html)",
+ "class": "[LocalFileStore](https://python.langchain.com/docs/integrations/stores/file_system/)",
"local": True,
- "package": "[langchain](https://python.langchain.com/api_reference/langchain/)",
+ "package": "[langchain](https://python.langchain.com/api_reference/langchain/storage/langchain.storage.file_system.LocalFileStore.html)",
"downloads": "",
},
"InMemoryByteStore": {
- "class": "[InMemoryByteStore](https://python.langchain.com/api_reference/core/stores/langchain_core.stores.InMemoryByteStore.html)",
+ "class": "[InMemoryByteStore](https://python.langchain.com/docs/integrations/stores/in_memory/)",
"local": True,
- "package": "[langchain_core](https://python.langchain.com/api_reference/core/)",
+ "package": "[langchain-core](https://python.langchain.com/api_reference/core/stores/langchain_core.stores.InMemoryByteStore.html)",
"downloads": "",
},
"RedisStore": {
- "class": "[RedisStore](https://python.langchain.com/api_reference/community/storage/langchain_community.storage.redis.RedisStore.html)",
+ "class": "[RedisStore](https://python.langchain.com/docs/integrations/stores/redis/)",
"local": True,
- "package": "[langchain_community](https://python.langchain.com/api_reference/community/)",
+ "package": "[langchain-community](https://python.langchain.com/api_reference/community/storage/langchain_community.storage.redis.RedisStore.html)",
"downloads": "",
},
"UpstashRedisByteStore": {
- "class": "[UpstashRedisByteStore](https://python.langchain.com/api_reference/community/storage/langchain_community.storage.upstash_redis.UpstashRedisByteStore.html)",
+ "class": "[UpstashRedisByteStore](https://python.langchain.com/docs/integrations/stores/upstash_redis/)",
"local": False,
- "package": "[langchain_community](https://python.langchain.com/api_reference/community/)",
+ "package": "[langchain-community](https://python.langchain.com/api_reference/community/storage/langchain_community.storage.upstash_redis.UpstashRedisByteStore.html)",
"downloads": "",
},
}

DEPRECATED = []
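
Not part of this diff: given the entry shape above, the docs build presumably turns each entry into a markdown table row. A rough, illustrative sketch only; the real script's helper names and column order are not shown here:

```python
# Illustrative only: render one feature-table row from an entry of KV_STORE_FEAT_TABLE.
def render_row(entry: dict) -> str:
    local = "✅" if entry["local"] else "❌"
    return f"|{entry['class']}|{local}|{entry['package']}|{entry['downloads']}|"

# for name, entry in KV_STORE_FEAT_TABLE.items():
#     print(render_row(entry))
```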
@@ -435,7 +435,7 @@ const FEATURE_TABLES = {

selfHost: false,
cloudOffering: true,
apiLink: "https://python.langchain.com/api_reference/aws/retrievers/langchain_aws.retrievers.bedrock.AmazonKnowledgeBasesRetriever.html",
- package: "langchain_aws"
+ package: "langchain-aws"
},
{
name: "AzureAISearchRetriever",

@@ -443,7 +443,7 @@ const FEATURE_TABLES = {

selfHost: false,
cloudOffering: true,
apiLink: "https://python.langchain.com/api_reference/community/retrievers/langchain_community.retrievers.azure_ai_search.AzureAISearchRetriever.html",
- package: "langchain_community"
+ package: "langchain-community"
},
{
name: "ElasticsearchRetriever",

@@ -451,7 +451,7 @@ const FEATURE_TABLES = {

selfHost: true,
cloudOffering: true,
apiLink: "https://python.langchain.com/api_reference/elasticsearch/retrievers/langchain_elasticsearch.retrievers.ElasticsearchRetriever.html",
- package: "langchain_elasticsearch"
+ package: "langchain-elasticsearch"
},
{
name: "VertexAISearchRetriever",

@@ -459,7 +459,7 @@ const FEATURE_TABLES = {

selfHost: false,
cloudOffering: true,
apiLink: "https://python.langchain.com/api_reference/google_community/vertex_ai_search/langchain_google_community.vertex_ai_search.VertexAISearchRetriever.html",
- package: "langchain_google_community"
+ package: "langchain-google-community"
}
],
},

@@ -484,21 +484,21 @@ const FEATURE_TABLES = {

link: "arxiv",
source: (<>Scholarly articles on <a href="https://arxiv.org/">arxiv.org</a></>),
apiLink: "https://python.langchain.com/api_reference/community/retrievers/langchain_community.retrievers.arxiv.ArxivRetriever.html",
- package: "langchain_community"
+ package: "langchain-community"
},
{
name: "TavilySearchAPIRetriever",
link: "tavily",
source: "Internet search",
apiLink: "https://python.langchain.com/api_reference/community/retrievers/langchain_community.retrievers.tavily_search_api.TavilySearchAPIRetriever.html",
- package: "langchain_community"
+ package: "langchain-community"
},
{
name: "WikipediaRetriever",
link: "wikipedia",
source: (<><a href="https://www.wikipedia.org/">Wikipedia</a> articles</>),
apiLink: "https://python.langchain.com/api_reference/community/retrievers/langchain_community.retrievers.wikipedia.WikipediaRetriever.html",
- package: "langchain_community"
+ package: "langchain-community"
}
]