Mirror of https://github.com/hwchase17/langchain.git (synced 2025-06-25 16:13:25 +00:00)

docs: Updating documentation for Konko provider (#16953)

**Description:** A small update to the Konko provider documentation.

Co-authored-by: Shivani Modi <shivanimodi@Shivanis-MacBook-Pro.local>

parent 973ba0d84b
commit fcb875629d
@@ -15,39 +15,23 @@
 "source": [
 "# ChatKonko\n",
 "\n",
+ "# Konko\n",
+ "\n",
 ">[Konko](https://www.konko.ai/) API is a fully managed Web API designed to help application developers:\n",
 "\n",
- "Konko API is a fully managed API designed to help application developers:\n",
 "\n",
- "1. Select the right LLM(s) for their application\n",
- "2. Prototype with various open-source and proprietary LLMs\n",
- "3. Access Fine Tuning for open-source LLMs to get industry-leading performance at a fraction of the cost\n",
- "4. Setup low-cost production APIs according to security, privacy, throughput, latency SLAs without infrastructure set-up or administration using Konko AI's SOC 2 compliant, multi-cloud infrastructure\n",
+ "1. **Select** the right open source or proprietary LLMs for their application\n",
+ "2. **Build** applications faster with integrations to leading application frameworks and fully managed APIs\n",
+ "3. **Fine tune** smaller open-source LLMs to achieve industry-leading performance at a fraction of the cost\n",
+ "4. **Deploy production-scale APIs** that meet security, privacy, throughput, and latency SLAs without infrastructure set-up or administration using Konko AI's SOC 2 compliant, multi-cloud infrastructure\n",
 "\n",
- "### Steps to Access Models\n",
- "1. **Explore Available Models:** Start by browsing through the [available models](https://docs.konko.ai/docs/list-of-models) on Konko. Each model caters to different use cases and capabilities.\n",
- "\n",
- "2. **Identify Suitable Endpoints:** Determine which [endpoint](https://docs.konko.ai/docs/list-of-models#list-of-available-models) (ChatCompletion or Completion) supports your selected model.\n",
- "\n",
- "3. **Selecting a Model:** [Choose a model](https://docs.konko.ai/docs/list-of-models#list-of-available-models) based on its metadata and how well it fits your use case.\n",
- "\n",
- "4. **Prompting Guidelines:** Once a model is selected, refer to the [prompting guidelines](https://docs.konko.ai/docs/prompting) to effectively communicate with it.\n",
- "\n",
- "5. **Using the API:** Finally, use the appropriate Konko [API endpoint](https://docs.konko.ai/docs/quickstart-for-completion-and-chat-completion-endpoint) to call the model and receive responses.\n",
- "\n",
- "To run this notebook, you'll need Konko API key. You can create one by signing up on [Konko](https://www.konko.ai/).\n",
 "\n",
 "This example goes over how to use LangChain to interact with `Konko` ChatCompletion [models](https://docs.konko.ai/docs/list-of-models#konko-hosted-models-for-chatcompletion)\n",
+ "\n",
+ "To run this notebook, you'll need Konko API key. Sign in to our web app to [create an API key](https://platform.konko.ai/settings/api-keys) to access models\n",
 "\n"
 ]
 },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "To run this notebook, you'll need Konko API key. You can create one by signing up on [Konko](https://www.konko.ai/)."
- ]
- },
 {
 "cell_type": "code",
 "execution_count": 1,
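The removed "Explore Available Models" step points readers at the model list. If you want to check your key and browse models from Python, a minimal sketch reusing the `konko.Model.list()` call that appears (and is removed) further down in this diff would be:

```python
# Quick setup check, assuming `pip install konko` and KONKO_API_KEY is set.
import konko

# Lists the models available on the Konko instance.
print(konko.Model.list())
```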
@@ -64,11 +48,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
- "## 2. Set API Keys\n",
- "\n",
- "<br />\n",
- "\n",
- "### Option 1: Set Environment Variables\n",
+ "#### Set Environment Variables\n",
 "\n",
 "1. You can set environment variables for \n",
 " 1. KONKO_API_KEY (Required)\n",
@@ -78,18 +58,7 @@
 "```shell\n",
 "export KONKO_API_KEY={your_KONKO_API_KEY_here}\n",
 "export OPENAI_API_KEY={your_OPENAI_API_KEY_here} #Optional\n",
- "```\n",
- "\n",
- "Alternatively, you can add the above lines directly to your shell startup script (such as .bashrc or .bash_profile for Bash shell and .zshrc for Zsh shell) to have them set automatically every time a new shell session starts.\n",
- "\n",
- "### Option 2: Set API Keys Programmatically\n",
- "\n",
- "If you prefer to set your API keys directly within your Python script or Jupyter notebook, you can use the following commands:\n",
- "\n",
- "```python\n",
- "konko.set_api_key('your_KONKO_API_KEY_here') \n",
- "konko.set_openai_api_key('your_OPENAI_API_KEY_here') # Optional\n",
- "```\n"
+ "```"
 ]
 },
 {
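This hunk drops the programmatic `konko.set_api_key(...)` option in favour of shell exports. If you would rather set the keys from inside the notebook, a plain-Python sketch (standard `os.environ`, nothing Konko-specific) is:

```python
import os
from getpass import getpass

# Prompt for the key rather than hard-coding it in the notebook.
os.environ["KONKO_API_KEY"] = getpass("KONKO_API_KEY: ")
# Optional; only needed if you also use OpenAI models through Konko.
os.environ["OPENAI_API_KEY"] = getpass("OPENAI_API_KEY (optional): ")
```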
@@ -98,7 +67,7 @@
 "source": [
 "## Calling a model\n",
 "\n",
- "Find a model on the [Konko overview page](https://docs.konko.ai/v0.5.0/docs/list-of-models)\n",
+ "Find a model on the [Konko overview page](https://docs.konko.ai/docs/list-of-models)\n",
 "\n",
 "Another way to find the list of models running on the Konko instance is through this [endpoint](https://docs.konko.ai/reference/get-models).\n",
 "\n",
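For reference, the model-calling flow this hunk documents looks roughly like the ChatCompletion example added to the provider page later in this commit. A sketch, assuming the environment variables above are set and using the model id quoted in that example:

```python
from langchain.schema import HumanMessage
from langchain_community.chat_models import ChatKonko

# Any chat-capable model from the Konko model list works here.
chat = ChatKonko(max_tokens=10, model="mistralai/mistral-7b-instruct-v0.1")
response = chat([HumanMessage(content="Hi")])
print(response.content)
```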
@@ -1,20 +1,27 @@
 {
 "cells": [
+ {
+ "cell_type": "raw",
+ "metadata": {},
+ "source": [
+ "---\n",
+ "sidebar_label: Konko\n",
+ "---"
+ ]
+ },
 {
 "cell_type": "markdown",
 "id": "136d9ba6-c42a-435b-9e19-77ebcc7a3145",
 "metadata": {},
 "source": [
- "# ChatKonko\n",
+ "# Konko\n",
 "\n",
 ">[Konko](https://www.konko.ai/) API is a fully managed Web API designed to help application developers:\n",
 "\n",
- "Konko API is a fully managed API designed to help application developers:\n",
- "\n",
- "1. Select the right LLM(s) for their application\n",
- "2. Prototype with various open-source and proprietary LLMs\n",
- "3. Access Fine Tuning for open-source LLMs to get industry-leading performance at a fraction of the cost\n",
- "4. Setup low-cost production APIs according to security, privacy, throughput, latency SLAs without infrastructure set-up or administration using Konko AI's SOC 2 compliant, multi-cloud infrastructure\n"
+ "1. **Select** the right open source or proprietary LLMs for their application\n",
+ "2. **Build** applications faster with integrations to leading application frameworks and fully managed APIs\n",
+ "3. **Fine tune** smaller open-source LLMs to achieve industry-leading performance at a fraction of the cost\n",
+ "4. **Deploy production-scale APIs** that meet security, privacy, throughput, and latency SLAs without infrastructure set-up or administration using Konko AI's SOC 2 compliant, multi-cloud infrastructure\n"
 ]
 },
 {
@@ -22,25 +29,44 @@
 "id": "0d896d07-82b4-4f38-8c37-f0bc8b0e4fe1",
 "metadata": {},
 "source": [
- "### Steps to Access Models\n",
- "1. **Explore Available Models:** Start by browsing through the [available models](https://docs.konko.ai/docs/list-of-models) on Konko. Each model caters to different use cases and capabilities.\n",
- "\n",
- "2. **Identify Suitable Endpoints:** Determine which [endpoint](https://docs.konko.ai/docs/list-of-models#list-of-available-models) (ChatCompletion or Completion) supports your selected model.\n",
- "\n",
- "3. **Selecting a Model:** [Choose a model](https://docs.konko.ai/docs/list-of-models#list-of-available-models) based on its metadata and how well it fits your use case.\n",
- "\n",
- "4. **Prompting Guidelines:** Once a model is selected, refer to the [prompting guidelines](https://docs.konko.ai/docs/prompting) to effectively communicate with it.\n",
- "\n",
- "5. **Using the API:** Finally, use the appropriate Konko [API endpoint](https://docs.konko.ai/docs/quickstart-for-completion-and-chat-completion-endpoint) to call the model and receive responses.\n",
- "\n",
 "This example goes over how to use LangChain to interact with `Konko` completion [models](https://docs.konko.ai/docs/list-of-models#konko-hosted-models-for-completion)\n",
 "\n",
- "To run this notebook, you'll need Konko API key. You can create one by signing up on [Konko](https://www.konko.ai/)."
+ "To run this notebook, you'll need Konko API key. Sign in to our web app to [create an API key](https://platform.konko.ai/settings/api-keys) to access models"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "#### Set Environment Variables\n",
+ "\n",
+ "1. You can set environment variables for \n",
+ " 1. KONKO_API_KEY (Required)\n",
+ " 2. OPENAI_API_KEY (Optional)\n",
+ "2. In your current shell session, use the export command:\n",
+ "\n",
+ "```shell\n",
+ "export KONKO_API_KEY={your_KONKO_API_KEY_here}\n",
+ "export OPENAI_API_KEY={your_OPENAI_API_KEY_here} #Optional\n",
+ "```"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Calling a model\n",
+ "\n",
+ "Find a model on the [Konko overview page](https://docs.konko.ai/docs/list-of-models)\n",
+ "\n",
+ "Another way to find the list of models running on the Konko instance is through this [endpoint](https://docs.konko.ai/reference/get-models).\n",
+ "\n",
+ "From here, we can initialize our model:"
 ]
 },
 {
 "cell_type": "code",
- "execution_count": 1,
+ "execution_count": null,
 "id": "dd70bccb-7a65-42d0-a3f2-8116f3549da7",
 "metadata": {},
 "outputs": [
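The cell added above ends with "From here, we can initialize our model:"; the initialization it leads into mirrors the completion example on the provider page below. A sketch, assuming the API keys are set (the prompt string is only an illustration):

```python
from langchain.llms import Konko

# Any completion-capable model from the Konko model list works here.
llm = Konko(max_tokens=800, model="mistralai/Mistral-7B-v0.1")
print(llm("Write a one-sentence product description for a smartwatch."))
```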
@@ -1,81 +1,45 @@
 # Konko
- This page covers how to run models on Konko within LangChain.
+ All functionality related to Konko

- Konko API is a fully managed API designed to help application developers:
+ >[Konko AI](https://www.konko.ai/) provides a fully managed API to help application developers

- Select the right LLM(s) for their application
- Prototype with various open-source and proprietary LLMs
- Move to production in-line with their security, privacy, throughput, latency SLAs without infrastructure set-up or administration using Konko AI's SOC 2 compliant infrastructure
+ >1. **Select** the right open source or proprietary LLMs for their application
+ >2. **Build** applications faster with integrations to leading application frameworks and fully managed APIs
+ >3. **Fine tune** smaller open-source LLMs to achieve industry-leading performance at a fraction of the cost
+ >4. **Deploy production-scale APIs** that meet security, privacy, throughput, and latency SLAs without infrastructure set-up or administration using Konko AI's SOC 2 compliant, multi-cloud infrastructure

 ## Installation and Setup

- ### First you'll need an API key
- You can request it by messaging [support@konko.ai](mailto:support@konko.ai)
+ 1. Sign in to our web app to [create an API key](https://platform.konko.ai/settings/api-keys) to access models via our endpoints for [chat completions](https://docs.konko.ai/reference/post-chat-completions) and [completions](https://docs.konko.ai/reference/post-completions).
+ 2. Enable a Python3.8+ environment
+ 3. Install the SDK

- ### Install Konko AI's Python SDK
+ ```bash
+ pip install konko
+ ```

- #### 1. Enable a Python3.8+ environment
+ 4. Set API Keys as environment variables(`KONKO_API_KEY`,`OPENAI_API_KEY`)

- #### 2. Set API Keys
+ ```bash

- ##### Option 1: Set Environment Variables

- 1. You can set environment variables for
- 1. KONKO_API_KEY (Required)
- 2. OPENAI_API_KEY (Optional)

- 2. In your current shell session, use the export command:

- ```shell
 export KONKO_API_KEY={your_KONKO_API_KEY_here}
 export OPENAI_API_KEY={your_OPENAI_API_KEY_here} #Optional
 ```

- Alternatively, you can add the above lines directly to your shell startup script (such as .bashrc or .bash_profile for Bash shell and .zshrc for Zsh shell) to have them set automatically every time a new shell session starts.
+ Please see [the Konko docs](https://docs.konko.ai/docs/getting-started) for more details.

- ##### Option 2: Set API Keys Programmatically

- If you prefer to set your API keys directly within your Python script or Jupyter notebook, you can use the following commands:

- ```python
- konko.set_api_key('your_KONKO_API_KEY_here')
- konko.set_openai_api_key('your_OPENAI_API_KEY_here') # Optional
- ```

- #### 3. Install the SDK

- ```shell
- pip install konko
- ```
+ ## LLM

- #### 4. Verify Installation & Authentication
+ **Explore Available Models:** Start by browsing through the [available models](https://docs.konko.ai/docs/list-of-models) on Konko. Each model caters to different use cases and capabilities.

- ```python
- #Confirm konko has installed successfully
- import konko
- #Confirm API keys from Konko and OpenAI are set properly
- konko.Model.list()
- ```
+ Another way to find the list of models running on the Konko instance is through this [endpoint](https://docs.konko.ai/reference/get-models).

- ## Calling a model
+ See a usage [example](/docs/integrations/llms/konko).

- Find a model on the [Konko Introduction page](https://docs.konko.ai/docs/list-of-models)
+ ### Examples of Endpoint Usage

- Another way to find the list of models running on the Konko instance is through this [endpoint](https://docs.konko.ai/reference/listmodels).

- ## Examples of Endpoint Usage

- - **ChatCompletion with Mistral-7B:**
- ```python
- chat_instance = ChatKonko(max_tokens=10, model = 'mistralai/mistral-7b-instruct-v0.1')
- msg = HumanMessage(content="Hi")
- chat_response = chat_instance([msg])
- ```

 - **Completion with mistralai/Mistral-7B-v0.1:**

 ```python
 from langchain.llms import Konko
 llm = Konko(max_tokens=800, model='mistralai/Mistral-7B-v0.1')
@@ -83,4 +47,19 @@ Another way to find the list of models running on the Konko instance is through
 response = llm(prompt)
 ```

+ ## Chat Models

+ See a usage [example](/docs/integrations/chat/konko).

+ - **ChatCompletion with Mistral-7B:**

+ ```python
+ from langchain.schema import HumanMessage
+ from langchain_community.chat_models import ChatKonko
+ chat_instance = ChatKonko(max_tokens=10, model = 'mistralai/mistral-7b-instruct-v0.1')
+ msg = HumanMessage(content="Hi")
+ chat_response = chat_instance([msg])
+ ```

 For further assistance, contact [support@konko.ai](mailto:support@konko.ai) or join our [Discord](https://discord.gg/TXV2s3z7RZ).