mirror of
https://github.com/csunny/DB-GPT.git
synced 2025-09-10 21:39:33 +00:00
doc:proxy api
@@ -77,7 +77,7 @@ macos:brew install git-lfs
```
##### Download LLM Model and Embedding Model
If you use OpenAI llm service, see [LLM Use FAQ](https://db-gpt.readthedocs.io/en/latest/getting_started/faq/llm/llm_faq.html)
If you use OpenAI llm service, see [How to Use LLM REST API](https://db-gpt.readthedocs.io/en/latest/getting_started/faq/llm/proxyllm/proxyllm.html)
```{tip}
If you use an OpenAI, Azure, or Tongyi LLM API service, you don't need to download an LLM model.
```
@@ -28,6 +28,7 @@ Multi LLMs Support, Supports multiple large language models, currently supportin
:name: llama_cpp
:hidden:

./proxyllm/proxyllm.md
./llama/llama_cpp.md
./quantization/quantization.md
./vllm/vllm.md
74 docs/getting_started/install/llm/proxyllm/proxyllm.md Normal file
@@ -0,0 +1,74 @@
Proxy LLM API
==================================
DB-GPT now supports connecting to LLM services through a proxy REST API.

The following LLM REST APIs are currently supported:
```{note}
* OpenAI
* Azure
* Aliyun Tongyi
* Baidu Wenxin
* Zhipu
* Baichuan
* Bard
```
### How do I integrate an LLM REST API, such as the OpenAI, Azure, Tongyi, or Wenxin LLM API service?
Update your `.env` file:
```commandline
# OpenAI
LLM_MODEL=chatgpt_proxyllm
PROXY_API_KEY={your-openai-sk}
PROXY_SERVER_URL=https://api.openai.com/v1/chat/completions

# Azure
LLM_MODEL=chatgpt_proxyllm
PROXY_API_KEY={your-openai-sk}
PROXY_SERVER_URL=https://xx.openai.azure.com/v1/chat/completions

# Aliyun Tongyi
LLM_MODEL=tongyi_proxyllm
TONGYI_PROXY_API_KEY={your-tongyi-sk}
PROXY_SERVER_URL={your_service_url}

# Baidu Wenxin
LLM_MODEL=wenxin_proxyllm
PROXY_SERVER_URL={your_service_url}
WEN_XIN_MODEL_VERSION={version}
WEN_XIN_API_KEY={your-wenxin-sk}
WEN_XIN_SECRET_KEY={your-wenxin-sct}

# Zhipu
LLM_MODEL=zhipu_proxyllm
PROXY_SERVER_URL={your_service_url}
ZHIPU_MODEL_VERSION={version}
ZHIPU_PROXY_API_KEY={your-zhipu-sk}

# Baichuan
LLM_MODEL=bc_proxyllm
PROXY_SERVER_URL={your_service_url}
BAICHUN_MODEL_NAME={version}
BAICHUAN_PROXY_API_KEY={your-baichuan-sk}
BAICHUAN_PROXY_API_SECRET={your-baichuan-sct}

# Bard
LLM_MODEL=bard_proxyllm
PROXY_SERVER_URL={your_service_url}
# Get the token from https://bard.google.com/: open DevTools (F12) -> Application -> Cookies -> __Secure-1PSID
BARD_PROXY_API_KEY={your-bard-token}
```
```{tip}
Make sure your `.env` configuration is not overwritten.
```
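As a quick sanity check of the `PROXY_API_KEY` and `PROXY_SERVER_URL` settings above, the OpenAI-style chat completions endpoint can be called directly. The sketch below is illustrative and not part of DB-GPT itself; the helper name, the `gpt-3.5-turbo` model name, and the example prompt are assumptions.

```python
import json
import os
import urllib.request


def build_chat_request(server_url: str, api_key: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completions request from the .env settings."""
    payload = {
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        server_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )


# Build a request from the environment (falls back to the OpenAI URL).
req = build_chat_request(
    os.environ.get("PROXY_SERVER_URL", "https://api.openai.com/v1/chat/completions"),
    os.environ.get("PROXY_API_KEY", "{your-openai-sk}"),
    "Say hello",
)
# With a real key configured, the request could then be sent:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["choices"][0]["message"]["content"])
```

Azure and the other OpenAI-compatible proxies differ only in `server_url` and key, which is why DB-GPT can route them all through `chatgpt_proxyllm`.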
### How do I integrate an embedding REST API, such as the OpenAI or Azure API service?
```commandline
# OpenAI embedding model, see /pilot/model/parameter.py
EMBEDDING_MODEL=proxy_openai
proxy_openai_proxy_server_url=https://api.openai.com/v1
proxy_openai_proxy_api_key={your-openai-sk}
proxy_openai_proxy_backend=text-embedding-ada-002
```
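The embedding call is also OpenAI-compatible: the backend name from `proxy_openai_proxy_backend` is sent as the `model` field of a POST to `{server_url}/embeddings`. A minimal sketch, assuming an OpenAI-style endpoint (the helper name is illustrative, not a DB-GPT API):

```python
import json
import urllib.request


def build_embedding_request(server_url: str, api_key: str, backend: str, texts):
    """Build a POST request for an OpenAI-style /embeddings endpoint."""
    payload = {"model": backend, "input": list(texts)}
    return urllib.request.Request(
        server_url.rstrip("/") + "/embeddings",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )


req = build_embedding_request(
    "https://api.openai.com/v1",
    "{your-openai-sk}",
    "text-embedding-ada-002",
    ["What is DB-GPT?"],
)
# In the JSON response, data[i]["embedding"] holds the vector for input i.
```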