langchain/docs/docs/integrations/llms/koboldai.ipynb
{
"cells": [
{
"cell_type": "markdown",
"metadata": {
"id": "FPF4vhdZyJ7S"
},
"source": [
"# KoboldAI API\n",
"\n",
"[KoboldAI](https://github.com/KoboldAI/KoboldAI-Client) is a \"a browser-based front-end for AI-assisted writing with multiple local & remote AI models...\". It has a public and local API that is able to be used in langchain.\n",
"\n",
"This example goes over how to use LangChain with that API.\n",
"\n",
"Documentation can be found in the browser adding /api to the end of your endpoint (i.e http://127.0.0.1/:5000/api).\n"
]
},
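{
"cell_type": "markdown",
"metadata": {},
"source": [
"Before wiring the API into LangChain, you can optionally confirm that it is reachable. The cell below is a minimal sketch, assuming a local instance at http://localhost:5000 and the /api/v1/model route exposed by the KoboldAI API; adjust both for your setup.\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Optional sanity check (sketch): confirm the KoboldAI API is reachable.\n",
"# Assumes a local instance at http://localhost:5000 and the /api/v1/model route.\n",
"import requests\n",
"\n",
"kobold_endpoint = \"http://localhost:5000\"  # replace with your endpoint\n",
"\n",
"resp = requests.get(f\"{kobold_endpoint}/api/v1/model\", timeout=10)\n",
"resp.raise_for_status()\n",
"print(resp.json())  # e.g. {\"result\": \"<name of the loaded model>\"}"
]
},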
{
"cell_type": "code",
"execution_count": 1,
"metadata": {
"id": "lyzOsRRTf_Vr"
},
"outputs": [],
"source": [
"from langchain_community.llms import KoboldApiLLM"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "1a_H7mvfy51O"
},
"source": [
"Replace the endpoint seen below with the one shown in the output after starting the webui with --api or --public-api\n",
"\n",
"Optionally, you can pass in parameters like temperature or max_length"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {
"id": "g3vGebq8f_Vr"
},
"outputs": [],
"source": [
"llm = KoboldApiLLM(endpoint=\"http://192.168.1.144:5000\", max_length=80)"
]
},
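{
"cell_type": "markdown",
"metadata": {},
"source": [
"The next cell is an optional sketch showing a few extra sampling parameters. It assumes the wrapper accepts temperature and top_p alongside max_length; check the API documentation for the full list supported by your KoboldAI version.\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Sketch: pass additional sampling parameters (assumed to be supported by the wrapper).\n",
"llm = KoboldApiLLM(\n",
"    endpoint=\"http://192.168.1.144:5000\",\n",
"    max_length=80,\n",
"    temperature=0.7,\n",
"    top_p=0.9,\n",
")"
]
},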
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "sPxNGGiDf_Vr",
"outputId": "024a1d62-3cd7-49a8-c6a8-5278224d02ef"
},
"outputs": [],
"source": [
"response = llm(\"### Instruction:\\nWhat is the first book of the bible?\\n### Response:\")"
]
}
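,
{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a follow-up sketch, the LLM can also be composed with a prompt template via LCEL. The template text below mirrors the instruction format used above and is only an illustration.\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Sketch: compose the LLM with a prompt template and invoke the chain.\n",
"from langchain_core.prompts import PromptTemplate\n",
"\n",
"prompt = PromptTemplate.from_template(\n",
"    \"### Instruction:\\n{question}\\n### Response:\"\n",
")\n",
"chain = prompt | llm\n",
"print(chain.invoke({\"question\": \"What is the first book of the bible?\"}))"
]
}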
],
"metadata": {
"colab": {
"provenance": []
},
"kernelspec": {
"display_name": "venv",
"language": "python",
"name": "venv"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.3"
}
},
"nbformat": 4,
"nbformat_minor": 1
}