refactor: Modify default webserver port to 5670 (#1410)

Fangyin Cheng
2024-04-12 11:47:24 +08:00
committed by GitHub
parent aea575e0b4
commit c3ae1915d2
26 changed files with 58 additions and 62 deletions

View File

@@ -30,7 +30,7 @@ import TabItem from '@theme/TabItem';
DBGPT_API_KEY=dbgpt
APP_ID={YOUR_APP_ID}
curl -X POST "http://localhost:5000/api/v2/chat/completions" \
curl -X POST "http://localhost:5670/api/v2/chat/completions" \
-H "Authorization: Bearer $DBGPT_API_KEY" \
-H "accept: application/json" \
-H "Content-Type: application/json" \
@@ -87,7 +87,7 @@ GET /api/v2/serve/apps/{app_id}
```shell
DBGPT_API_KEY=dbgpt
APP_ID={YOUR_APP_ID}
curl -X GET "http://localhost:5000/api/v2/serve/apps/$APP_ID" -H "Authorization: Bearer $DBGPT_API_KEY"
curl -X GET "http://localhost:5670/api/v2/serve/apps/$APP_ID" -H "Authorization: Bearer $DBGPT_API_KEY"
```
</TabItem>
@@ -139,7 +139,7 @@ GET /api/v2/serve/apps
```shell
DBGPT_API_KEY=dbgpt
-curl -X GET 'http://localhost:5000/api/v2/serve/apps' -H "Authorization: Bearer $DBGPT_API_KEY"
+curl -X GET 'http://localhost:5670/api/v2/serve/apps' -H "Authorization: Bearer $DBGPT_API_KEY"
```
</TabItem>

View File

@@ -30,7 +30,7 @@ import TabItem from '@theme/TabItem';
```shell
DBGPT_API_KEY="dbgpt"
curl -X POST "http://localhost:5000/api/v2/chat/completions" \
curl -X POST "http://localhost:5670/api/v2/chat/completions" \
-H "Authorization: Bearer $DBGPT_API_KEY" \
-H "accept: application/json" \
-H "Content-Type: application/json" \
@@ -94,7 +94,7 @@ data: [DONE]
```shell
DBGPT_API_KEY="dbgpt"
curl -X POST "http://localhost:5000/api/v2/chat/completions" \
curl -X POST "http://localhost:5670/api/v2/chat/completions" \
-H "Authorization: Bearer $DBGPT_API_KEY" \
-H "accept: application/json" \
-H "Content-Type: application/json" \

View File

@@ -30,7 +30,7 @@ import TabItem from '@theme/TabItem';
DBGPT_API_KEY=dbgpt
DB_NAME="{your_db_name}"
curl -X POST "http://localhost:5000/api/v2/chat/completions" \
curl -X POST "http://localhost:5670/api/v2/chat/completions" \
-H "Authorization: Bearer $DBGPT_API_KEY" \
-H "accept: application/json" \
-H "Content-Type: application/json" \
@@ -126,7 +126,7 @@ DELETE /api/v2/serve/datasources
DBGPT_API_KEY=dbgpt
DATASOURCE_ID={YOUR_DATASOURCE_ID}
curl -X DELETE "http://localhost:5000/api/v2/serve/datasources/$DATASOURCE_ID" \
curl -X DELETE "http://localhost:5670/api/v2/serve/datasources/$DATASOURCE_ID" \
-H "Authorization: Bearer $DBGPT_API_KEY" \
```
@@ -180,7 +180,7 @@ GET /api/v2/serve/datasources/{datasource_id}
DBGPT_API_KEY=dbgpt
DATASOURCE_ID={YOUR_DATASOURCE_ID}
curl -X GET "http://localhost:5000/api/v2/serve/datasources/$DATASOURCE_ID" -H "Authorization: Bearer $DBGPT_API_KEY"
curl -X GET "http://localhost:5670/api/v2/serve/datasources/$DATASOURCE_ID" -H "Authorization: Bearer $DBGPT_API_KEY"
```
</TabItem>
@@ -234,7 +234,7 @@ GET /api/v2/serve/datasources
```shell
DBGPT_API_KEY=dbgpt
curl -X GET "http://localhost:5000/api/v2/serve/datasources" -H "Authorization: Bearer $DBGPT_API_KEY"
curl -X GET "http://localhost:5670/api/v2/serve/datasources" -H "Authorization: Bearer $DBGPT_API_KEY"
```
</TabItem>

View File

@@ -30,7 +30,7 @@ import TabItem from '@theme/TabItem';
DBGPT_API_KEY=dbgpt
FLOW_ID={YOUR_FLOW_ID}
curl -X POST "http://localhost:5000/api/v2/chat/completions" \
curl -X POST "http://localhost:5670/api/v2/chat/completions" \
-H "Authorization: Bearer $DBGPT_API_KEY" \
-H "accept: application/json" \
-H "Content-Type: application/json" \
@@ -107,7 +107,7 @@ DELETE /api/v2/serve/awel/flows
DBGPT_API_KEY=dbgpt
FLOW_ID={YOUR_FLOW_ID}
curl -X DELETE "http://localhost:5000/api/v2/serve/awel/flows/$FLOW_ID" \
curl -X DELETE "http://localhost:5670/api/v2/serve/awel/flows/$FLOW_ID" \
-H "Authorization: Bearer $DBGPT_API_KEY" \
```
@@ -161,7 +161,7 @@ GET /api/v2/serve/awel/flows/{flow_id}
DBGPT_API_KEY=dbgpt
FLOW_ID={YOUR_FLOW_ID}
curl -X GET "http://localhost:5000/api/v2/serve/awel/flows/$FLOW_ID" -H "Authorization: Bearer $DBGPT_API_KEY"
curl -X GET "http://localhost:5670/api/v2/serve/awel/flows/$FLOW_ID" -H "Authorization: Bearer $DBGPT_API_KEY"
```
</TabItem>
@@ -215,7 +215,7 @@ GET /api/v2/serve/awel/flows
```shell
DBGPT_API_KEY=dbgpt
curl -X GET "http://localhost:5000/api/v2/serve/awel/flows" -H "Authorization: Bearer $DBGPT_API_KEY"
curl -X GET "http://localhost:5670/api/v2/serve/awel/flows" -H "Authorization: Bearer $DBGPT_API_KEY"
```
</TabItem>
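
The same pattern covers the destructive calls; a minimal Python sketch of the flow DELETE shown above, with the flow id left as a placeholder (as in the curl examples):

```python
# Minimal sketch: DELETE /api/v2/serve/awel/flows/{flow_id} on the new default port.
# FLOW_ID is a placeholder, as in the curl examples above.
import requests

DBGPT_API_KEY = "dbgpt"
FLOW_ID = "YOUR_FLOW_ID"

resp = requests.delete(
    f"http://localhost:5670/api/v2/serve/awel/flows/{FLOW_ID}",
    headers={"Authorization": f"Bearer {DBGPT_API_KEY}"},
    timeout=30,
)
print(resp.status_code)
```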

View File

@@ -15,7 +15,7 @@ All API requests should include your API key in an Authorization HTTP header as
Example with the DB-GPT API curl command:
```bash
curl "http://localhost:5000/api/v2/chat/completions" \
curl "http://localhost:5670/api/v2/chat/completions" \
-H "Authorization: Bearer $DBGPT_API_KEY" \
```
Example with the DB-GPT Client Python package:
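
The client-package example itself lies outside this hunk. As a hedged stand-in, here is a minimal sketch with plain `requests` that sends the same Authorization header to the new default port; the endpoint is taken from the serve/apps examples earlier in this commit, and the key value is the placeholder used throughout these docs.

```python
# Minimal sketch (not the dbgpt client package): send the Authorization header
# to the new default port 5670 and list applications.
import requests

DBGPT_API_KEY = "dbgpt"  # placeholder key from the docs

resp = requests.get(
    "http://localhost:5670/api/v2/serve/apps",
    headers={"Authorization": f"Bearer {DBGPT_API_KEY}"},
    timeout=30,
)
print(resp.status_code)
print(resp.json())
```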

View File

@@ -30,7 +30,7 @@ import TabItem from '@theme/TabItem';
DBGPT_API_KEY=dbgpt
SPACE_NAME={YOUR_SPACE_NAME}
curl -X POST "http://localhost:5000/api/v2/chat/completions" \
curl -X POST "http://localhost:5670/api/v2/chat/completions" \
-H "Authorization: Bearer $DBGPT_API_KEY" \
-H "accept: application/json" \
-H "Content-Type: application/json" \
@@ -334,7 +334,7 @@ POST /api/v2/serve/knowledge/spaces
```shell
DBGPT_API_KEY="dbgpt"
-curl --location --request POST 'http://localhost:5000/api/v2/serve/knowledge/spaces' \
+curl --location --request POST 'http://localhost:5670/api/v2/serve/knowledge/spaces' \
--header 'Authorization: Bearer $DBGPT_API_KEY' \
--header 'Content-Type: application/json' \
--data-raw '{"desc": "for client space desc", "name": "test_space_2", "owner": "dbgpt", "vector_type": "Chroma"
@@ -410,7 +410,7 @@ PUT /api/v2/serve/knowledge/spaces
```shell
DBGPT_API_KEY="dbgpt"
-curl --location --request PUT 'http://localhost:5000/api/v2/serve/knowledge/spaces' \
+curl --location --request PUT 'http://localhost:5670/api/v2/serve/knowledge/spaces' \
--header 'Authorization: Bearer $DBGPT_API_KEY' \
--header 'Content-Type: application/json' \
--data-raw '{"desc": "for client space desc v2", "id": "49", "name": "test_space_2", "owner": "dbgpt", "vector_type": "Chroma"
@@ -493,7 +493,7 @@ DELETE /api/v2/serve/knowledge/spaces
DBGPT_API_KEY=dbgpt
SPACE_ID={YOUR_SPACE_ID}
curl -X DELETE "http://localhost:5000/api/v2/serve/knowledge/spaces/$SPACE_ID" \
curl -X DELETE "http://localhost:5670/api/v2/serve/knowledge/spaces/$SPACE_ID" \
-H "Authorization: Bearer $DBGPT_API_KEY" \
-H "accept: application/json" \
-H "Content-Type: application/json" \
@@ -548,7 +548,7 @@ GET /api/v2/serve/knowledge/spaces/{space_id}
```shell
DBGPT_API_KEY=dbgpt
SPACE_ID={YOUR_SPACE_ID}
curl -X GET "http://localhost:5000/api/v2/serve/knowledge/spaces/$SPACE_ID" -H "Authorization: Bearer $DBGPT_API_KEY"
curl -X GET "http://localhost:5670/api/v2/serve/knowledge/spaces/$SPACE_ID" -H "Authorization: Bearer $DBGPT_API_KEY"
```
</TabItem>
@@ -600,7 +600,7 @@ GET /api/v2/serve/knowledge/spaces
```shell
DBGPT_API_KEY=dbgpt
-curl -X GET 'http://localhost:5000/api/v2/serve/knowledge/spaces' -H "Authorization: Bearer $DBGPT_API_KEY"
+curl -X GET 'http://localhost:5670/api/v2/serve/knowledge/spaces' -H "Authorization: Bearer $DBGPT_API_KEY"
```
</TabItem>
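
Since the knowledge-space hunks above include the full JSON payload, a Python equivalent of the create-space call is easy to sketch. This mirrors the `--data-raw` body shown in the POST example (the closing brace, truncated in the hunk, is assumed), with the same placeholder key:

```python
# Minimal sketch of POST /api/v2/serve/knowledge/spaces on the new default port,
# mirroring the curl --data-raw payload shown above.
import requests

DBGPT_API_KEY = "dbgpt"

resp = requests.post(
    "http://localhost:5670/api/v2/serve/knowledge/spaces",
    headers={
        "Authorization": f"Bearer {DBGPT_API_KEY}",
        "Content-Type": "application/json",
    },
    json={
        "desc": "for client space desc",
        "name": "test_space_2",
        "owner": "dbgpt",
        "vector_type": "Chroma",
    },
    timeout=30,
)
print(resp.status_code, resp.text)
```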

View File

@@ -27,7 +27,7 @@ print(completion.choices[0].message.content)
## Application service layer API
-The service layer API refers to the API exposed on port 5000 after starting the webserver, which is mainly focused on the application layer. It can be divided into the following parts according to categories
+The service layer API refers to the API exposed on port 5670 after starting the webserver, which is mainly focused on the application layer. It can be divided into the following parts according to categories
- Chat API
- Editor API
@@ -37,7 +37,7 @@ The service layer API refers to the API exposed on port 5000 after starting the
- Model API
:::info
-Note: After starting the webserver, open http://127.0.0.1:5000/docs to view details
+Note: After starting the webserver, open http://127.0.0.1:5670/docs to view details
Regarding the service layer API, in terms of strategy in the early days, we maintained the principle of minimum availability and openness. APIs that are stably exposed to the outside world will carry version information, such as
- /api/v1/
@@ -164,5 +164,5 @@ Currently, due to frequent changes in Knowledge and Prompt, the relevant APIs ar
:::
-More detailed interface parameters can be viewed at `http://127.0.0.1:5000/docs`
+More detailed interface parameters can be viewed at `http://127.0.0.1:5670/docs`
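
The overview above points readers at the interactive docs on the new port; a minimal Python sketch can confirm the webserver is actually listening on 5670, assuming a locally running server:

```python
# Quick sanity check: the interactive API docs mentioned above should answer on port 5670.
import requests

resp = requests.get("http://127.0.0.1:5670/docs", timeout=10)
print("webserver reachable:", resp.status_code == 200)
```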

View File

@@ -52,7 +52,7 @@ Usage: dbgpt knowledge [OPTIONS] COMMAND [ARGS]...
Options:
--address TEXT Address of the Api server(If not set, try to read from
environment variable: API_ADDRESS). [default:
-http://127.0.0.1:5000]
+http://127.0.0.1:5670]
--help Show this message and exit.
Commands:
@@ -374,7 +374,7 @@ INFO: Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
```
#### Webserver command
-The front-end service can be started through `dbgpt start webserver`, the default port is 5000, and can be accessed through `http://127.0.0.1:5000`
+The front-end service can be started through `dbgpt start webserver`, the default port is 5670, and can be accessed through `http://127.0.0.1:5670`
```python
~ dbgpt start webserver --help

View File

@@ -58,7 +58,7 @@ with DAG("simple_dag_example") as dag:
Before performing access verification, the project needs to be started first: `python dbgpt/app/dbgpt_server.py`
```bash
-% curl -X GET http://127.0.0.1:5000/api/v1/awel/trigger/examples/hello\?name\=zhangsan
+% curl -X GET http://127.0.0.1:5670/api/v1/awel/trigger/examples/hello\?name\=zhangsan
"Hello, zhangsan, your age is 18"
```

View File

@@ -15,7 +15,7 @@ You can try to use gradio's [network](https://github.com/gradio-app/gradio/blob/
import secrets
from gradio import networking
token=secrets.token_urlsafe(32)
-local_port=5000
+local_port=5670
url = networking.setup_tunnel('0.0.0.0', local_port, token)
print(f'Public url: {url}')
time.sleep(60 * 60 * 24)

View File

@@ -44,7 +44,7 @@ You can view the specific usage through the command `bash docker/build_all_image
```python
docker run --ipc host --gpus all -d \
--p 5000:5000 \
+-p 5670:5670 \
-e LOCAL_DB_TYPE=sqlite \
-e LOCAL_DB_PATH=data/default_sqlite.db \
-e LLM_MODEL=vicuna-13b-v1.5 \
@@ -53,7 +53,7 @@ docker run --ipc host --gpus all -d \
--name dbgpt \
eosphorosai/dbgpt
```
-Open the browser and visit [http://localhost:5000](http://localhost:5000)
+Open the browser and visit [http://localhost:5670](http://localhost:5670)
- `-e LLM_MODEL=vicuna-13b-v1.5`, which means the base model uses `vicuna-13b-v1.5`. For more model usage, you can view the configuration in `/pilot/configs/model_config.LLM_MODEL_CONFIG`.
- `-v /data/models:/app/models`, specifies the model file to be mounted. The directory `/data/models` is mounted in `/app/models` of the container. Of course, it can be replaced with other paths.
@@ -67,7 +67,7 @@ docker logs dbgpt -f
```python
docker run --ipc host --gpus all -d -p 3306:3306 \
--p 5000:5000 \
+-p 5670:5670 \
-e LOCAL_DB_HOST=127.0.0.1 \
-e LOCAL_DB_PASSWORD=aa123456 \
-e MYSQL_ROOT_PASSWORD=aa123456 \
@@ -77,7 +77,7 @@ docker run --ipc host --gpus all -d -p 3306:3306 \
--name db-gpt-allinone \
db-gpt-allinone
```
-Open the browser and visit [http://localhost:5000](http://localhost:5000)
+Open the browser and visit [http://localhost:5670](http://localhost:5670)
- `-e LLM_MODEL=vicuna-13b-v1.5`, which means the base model uses `vicuna-13b-v1.5`. For more model usage, you can view the configuration in `/pilot/configs/model_config.LLM_MODEL_CONFIG`.
- `-v /data/models:/app/models`, specifies the model file to be mounted. The directory `/data/models` is mounted in `/app/models` of the container. Of course, it can be replaced with other paths.
@@ -92,7 +92,7 @@ docker logs db-gpt-allinone -f
PROXY_API_KEY="You api key"
PROXY_SERVER_URL="https://api.openai.com/v1/chat/completions"
docker run --gpus all -d -p 3306:3306 \
--p 5000:5000 \
+-p 5670:5670 \
-e LOCAL_DB_HOST=127.0.0.1 \
-e LOCAL_DB_PASSWORD=aa123456 \
-e MYSQL_ROOT_PASSWORD=aa123456 \
@@ -107,6 +107,6 @@ db-gpt-allinone
- `-e LLM_MODEL=proxyllm`, set the model to serve the third-party model service API, which can be openai or fastchat interface.
- `-v /data/models/text2vec-large-chinese:/app/models/text2vec-large-chinese`, sets the knowledge base embedding model to `text2vec`
-Open the browser and visit [http://localhost:5000](http://localhost:5000)
+Open the browser and visit [http://localhost:5670](http://localhost:5670)
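
Because the container may need some time to load the model before the webserver answers, a short wait loop can be handier than refreshing the browser. A minimal sketch, assuming the container was started with `-p 5670:5670` as in the `docker run` examples above:

```python
# Poll the published port until the DB-GPT webserver inside the container responds.
# Assumes the container was started with -p 5670:5670 as in the docker run examples above.
import time
import requests

for _ in range(60):
    try:
        if requests.get("http://localhost:5670", timeout=3).status_code < 500:
            print("DB-GPT webserver is up on port 5670")
            break
    except requests.exceptions.RequestException:
        pass
    time.sleep(5)
else:
    print("Webserver did not come up; check `docker logs dbgpt -f`")
```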

View File

@@ -23,4 +23,4 @@ For more configuration content, you can view the `docker-compose.yml` file
## Visit
-Open the browser and visit [http://localhost:5000](http://localhost:5000)
+Open the browser and visit [http://localhost:5670](http://localhost:5670)

View File

@@ -458,4 +458,4 @@ python pilot/server/dbgpt_server.py
:::
## Visit website
-Open the browser and visit [`http://localhost:5000`](http://localhost:5000)
+Open the browser and visit [`http://localhost:5670`](http://localhost:5670)

View File

@@ -168,13 +168,13 @@ python pilot/server/dbgpt_server.py
## Visit website
#### 1. Production model:
-Open the browser and visit [`http://localhost:5000`](http://localhost:5000)
+Open the browser and visit [`http://localhost:5670`](http://localhost:5670)
#### 2. Development mode:
```
cd web & npm install
cp .env.template .env
-// set the API_BASE_URL to your DB-GPT server address, it usually is http://localhost:5000
+// set the API_BASE_URL to your DB-GPT server address, it usually is http://localhost:5670
npm run dev
```
Open the browser and visit [`http://localhost:3000`](http://localhost:3000)