mirror of
https://github.com/csunny/DB-GPT.git
synced 2025-08-17 15:58:25 +00:00
Merge branch 'main' into TY_07_DEV
This commit is contained in:
commit
0ed51b5ee4
@ -31,6 +31,9 @@ QUANTIZE_QLORA=True
## FAST_LLM_MODEL - Fast language model (Default: chatglm-6b)
# SMART_LLM_MODEL=vicuna-13b
# FAST_LLM_MODEL=chatglm-6b
## Proxy LLM backend. This configuration is only valid when "LLM_MODEL=proxyllm". When we use a REST API provided by a deployment framework such as FastChat as the proxy LLM,
## "PROXYLLM_BACKEND" is the model it actually deploys. We can use "PROXYLLM_BACKEND" to load the prompt of the corresponding scene.
# PROXYLLM_BACKEND=

#*******************************************************************#
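The comment above can be made concrete with a short sketch. This is a minimal illustration of how the proxy backend setting could be resolved from environment variables; the helper name `resolve_proxyllm_backend` is hypothetical, and the real logic lives in the project's `Config` class:

```python
import os
from typing import Optional

def resolve_proxyllm_backend() -> Optional[str]:
    """Return the actually deployed model name when running behind a proxy LLM."""
    llm_model = os.getenv("LLM_MODEL", "vicuna-13b")
    if llm_model == "proxyllm":
        # The model deployed behind the REST API (e.g. by FastChat); it is used
        # to load the prompt of the corresponding scene.
        return os.getenv("PROXYLLM_BACKEND")
    return None
```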
56 README.md
@ -1,5 +1,10 @@
# DB-GPT: Revolutionizing Database Interactions with Private LLM Technology

<p align="left">
  <img src="./assets/LOGO.png" width="100%" />
</p>

<div align="center">
  <p>
    <a href="https://github.com/csunny/DB-GPT">
@ -55,39 +60,6 @@ https://github.com/csunny/DB-GPT/assets/13723926/55f31781-1d49-4757-b96e-7ef6d3d
<source id="mp4" src="https://github.com/csunny/DB-GPT/assets/17919400/654b5a49-5ea4-4c02-b5b2-72d089dcc1f0" type="video/mp4">
</videos> -->

#### Chat with data, and generate charts.

<p align="left">
  <img src="./assets/dashboard.png" width="800px" />
</p>

#### Text2SQL: generate SQL from chat.

<p align="left">
  <img src="./assets/chatSQL.png" width="800px" />
</p>

#### Chat with database meta information.

<p align="left">
  <img src="./assets/chatdb.png" width="800px" />
</p>

#### Chat with data, and execute the results.

<p align="left">
  <img src="./assets/chatdata.png" width="800px" />
</p>

#### Knowledge space to manage docs.

<p align="left">
  <img src="./assets/ks.png" width="800px" />
</p>

#### Chat with knowledge, such as txt, pdf, csv, word, etc.

<p align="left">
  <img src="./assets/chat_knowledge.png" width="800px" />
</p>

## Features

Currently, we have released multiple key features, which are listed below to demonstrate our current capabilities:
@ -99,6 +71,9 @@ Currently, we have released multiple key features, which are listed below to dem
- Knowledge Management (We currently support many document formats: txt, pdf, md, html, doc, ppt, and url.)
  - Database knowledge Q&A
  - Knowledge Embedding

- ChatDB
- ChatDashboard
- Plugins
  - Support custom plugin execution tasks and natively support the Auto-GPT plugin, such as:
    - Automatic execution of SQL and retrieval of query results
@ -106,13 +81,14 @@ Currently, we have released multiple key features, which are listed below to dem
  - Unified vector storage/indexing of knowledge base
  - Support for unstructured data such as PDF, TXT, Markdown, CSV, DOC, PPT, and WebURL

- Multi LLMs Support
  - Supports multiple large language models, currently supporting Vicuna (7b, 13b), ChatGLM-6b (int4, int8), guanaco (7b, 13b, 33b), Gorilla (7b, 13b), 🔥 llama-2 (7b, 13b, 70b)
  - TODO: baichuan (7b, 13b)

[](https://star-history.com/#csunny/DB-GPT)

- Multi LLMs Support: supports multiple large language models, currently including
  - 🔥 llama-2 (7b, 13b, 70b)
  - Vicuna (7b, 13b)
  - ChatGLM-6b (int4, int8)
  - ChatGLM2-6b (int4, int8)
  - guanaco (7b, 13b, 33b)
  - Gorilla (7b, 13b)
  - baichuan (7b, 13b)
## Introduction

DB-GPT creates a vast model operating system using [FastChat](https://github.com/lm-sys/FastChat) and offers a large language model powered by [Vicuna](https://huggingface.co/Tribbiani/vicuna-7b). In addition, we provide private domain knowledge base question-answering capability. Furthermore, we also provide support for additional plugins, and our design natively supports the Auto-GPT plugin. Our vision is to make it easier and more convenient to build applications around databases and LLMs.
23 README.zh.md
@ -1,4 +1,10 @@
# DB-GPT: Defining the Next-Generation Interaction for Databases with Private LLM Technology

<p align="left">
  <img src="./assets/LOGO.png" width="100%" />
</p>

<div align="center">
  <p>
    <a href="https://github.com/csunny/DB-GPT">
@ -50,7 +56,6 @@ DB-GPT is an open-source GPT experiment project based on databases, using local

The demo runs on an RTX 4090 GPU.

https://github.com/csunny/DB-GPT/assets/13723926/55f31781-1d49-4757-b96e-7ef6d3dbcf80

#### Generate analysis charts from natural-language conversation
@ -96,6 +101,8 @@ https://github.com/csunny/DB-GPT/assets/13723926/55f31781-1d49-4757-b96e-7ef6d3d
- Knowledge base management (currently supports txt, pdf, md, html, doc, ppt, and url)
  - Database knowledge Q&A
  - Data processing

- Database conversation
- Chat2Dashboard
- Plugin model
  - Supports custom plugin execution tasks and natively supports the Auto-GPT plugin, such as:
    - Automatic SQL execution and retrieval of query results
@ -104,11 +111,13 @@ https://github.com/csunny/DB-GPT/assets/13723926/55f31781-1d49-4757-b96e-7ef6d3d
  - Unstructured data support, including PDF, Markdown, CSV, and WebURL

- Multi-model support
  - Supports multiple large language models; currently supported: Vicuna (7b, 13b), ChatGLM-6b (int4, int8), guanaco (7b, 13b, 33b), Gorilla (7b, 13b), 🔥 llama-2 (7b, 13b, 70b)
  - TODO: baichuan (7b, 13b)

[](https://star-history.com/#csunny/DB-GPT)

  - Supports multiple large language models; the following models are currently supported:
    - Vicuna (7b, 13b)
    - ChatGLM-6b (int4, int8)
    - guanaco (7b, 13b, 33b)
    - Gorilla (7b, 13b)
    - 🔥 llama-2 (7b, 13b, 70b)
    - baichuan (7b, 13b)

## Architecture

DB-GPT builds the large-model runtime environment on [FastChat](https://github.com/lm-sys/FastChat) and provides Vicuna as the base large language model. In addition, we provide private-domain knowledge base question answering through LangChain. We also support a plugin mode, and the design natively supports the Auto-GPT plugin. Our vision is to make building applications around databases and LLMs simpler and more convenient.
@ -195,5 +204,3 @@ The MIT License (MIT)
  <img src="./assets/wechat.jpg" width="300px" />
</p>
BIN assets/LOGO.png Normal file
Binary file not shown. After: 130 KiB
Binary file not shown. Before: 248 KiB, After: 221 KiB
@ -2,57 +2,45 @@ version: '3.10'

services:
  db:
    image: mysql:8.0.33
    image: mysql/mysql-server
    environment:
      MYSQL_DATABASE: 'db'
      MYSQL_USER: 'user'
      MYSQL_PASSWORD: 'password'
      MYSQL_ROOT_PASSWORD: 'aa123456'
    ports:
      - 3306:3306
    volumes:
      - my-db:/var/lib/mysql
      - dbgpt-myql-db:/var/lib/mysql
      - ./docker/examples/my.cnf:/etc/my.cnf
      - ./docker/examples/sqls:/docker-entrypoint-initdb.d
      - ./assets/schema/knowledge_management.sql:/docker-entrypoint-initdb.d/knowledge_management.sql
    restart: unless-stopped
    networks:
      - dbgptnet
  webserver:
    build:
      context: .
      dockerfile: Dockerfile
    command: python3 pilot/server/webserver.py
    image: db-gpt:latest
    command: python3 pilot/server/dbgpt_server.py
    environment:
      - MODEL_SERVER=http://llmserver:8000
      - LOCAL_DB_HOST=db
      - WEB_SERVER_PORT=7860
      - LOCAL_DB_PASSWORD=aa123456
      - ALLOWLISTED_PLUGINS=db_dashboard
    depends_on:
      - db
      - llmserver
    volumes:
      - ./models:/app/models
      - ./plugins:/app/plugins
      - data:/app/pilot/data
    env_file:
      - .env.template
    ports:
      - 7860:7860/tcp
    expose:
      - 7860/tcp
    restart: unless-stopped
  llmserver:
    build:
      context: .
      dockerfile: Dockerfile
    command: python3 pilot/server/llmserver.py
    environment:
      - LOCAL_DB_HOST=db
      - LLM_MODEL=vicuna-13b
    depends_on:
      - db
    volumes:
      - ./models:/app/models
      - /data:/data
      # Please modify it to your own model directory
      - /data/models:/app/models
      - dbgpt-data:/app/pilot/data
      - dbgpt-message:/app/pilot/message
    env_file:
      - .env.template
    ports:
      - 8000:8000
      - 5000:5000/tcp
    # The webserver may fail to start; it must wait until all SQL scripts in /docker-entrypoint-initdb.d finish executing.
    restart: unless-stopped
    networks:
      - dbgptnet
    deploy:
      resources:
        reservations:
@ -60,17 +48,11 @@ services:
          - driver: nvidia
            device_ids: ['0']
            capabilities: [gpu]
  tunnel:
    image: cloudflare/cloudflared:latest
    container_name: cloudflared-tunnel
    restart: unless-stopped
    environment:
      - TUNNEL_URL=http://webserver:7860
    command: tunnel --no-autoupdate
    depends_on:
      - webserver

volumes:
  my-db:
  data:
  dbgpt-myql-db:
  dbgpt-data:
  dbgpt-message:
networks:
  dbgptnet:
    driver: bridge
    name: dbgptnet
30 docker/allinone/Dockerfile Normal file
@ -0,0 +1,30 @@
ARG BASE_IMAGE="db-gpt:latest"

FROM ${BASE_IMAGE}

RUN apt-get update && apt-get install -y wget gnupg lsb-release net-tools

RUN apt-key adv --keyserver keyserver.ubuntu.com --recv-keys 467B942D3A79BD29

RUN wget https://dev.mysql.com/get/mysql-apt-config_0.8.17-1_all.deb
RUN dpkg -i mysql-apt-config_0.8.17-1_all.deb

RUN apt-get update && apt-get install -y mysql-server && apt-get clean

# Remote access
RUN sed -i 's/bind-address\s*=\s*127.0.0.1/bind-address = 0.0.0.0/g' /etc/mysql/mysql.conf.d/mysqld.cnf \
    && echo "[mysqld]\ncharacter_set_server=utf8mb4\ncollation-server=utf8mb4_unicode_ci\ninit_connect='SET NAMES utf8mb4'\n[mysql]\ndefault-character-set=utf8mb4\n[client]\ndefault-character-set=utf8mb4\n" >> /etc/mysql/my.cnf

# Init sql
RUN mkdir /docker-entrypoint-initdb.d \
    && echo "USE mysql;\nUPDATE user SET Host='%' WHERE User='root';\nFLUSH PRIVILEGES;" > /docker-entrypoint-initdb.d/init.sql

ENV MYSQL_ROOT_PASSWORD=aa123456
ENV LOCAL_DB_PASSWORD="$MYSQL_ROOT_PASSWORD"

RUN cp /app/assets/schema/knowledge_management.sql /docker-entrypoint-initdb.d/

COPY docker/allinone/allinone-entrypoint.sh /usr/local/bin/allinone-entrypoint.sh
COPY docker/examples/sqls/ /docker-entrypoint-initdb.d/

ENTRYPOINT ["/usr/local/bin/allinone-entrypoint.sh"]
17 docker/allinone/allinone-entrypoint.sh Executable file
@ -0,0 +1,17 @@
#!/bin/bash

service mysql start

# Execute all MySQL init scripts
for file in /docker-entrypoint-initdb.d/*.sql
do
    echo "execute sql file: $file"
    mysql -u root -p${MYSQL_ROOT_PASSWORD} < "$file"
done

mysql -u root -p${MYSQL_ROOT_PASSWORD} -e "
ALTER USER 'root'@'%' IDENTIFIED WITH mysql_native_password BY '$MYSQL_ROOT_PASSWORD';
FLUSH PRIVILEGES;
"

python3 pilot/server/dbgpt_server.py
9 docker/allinone/build_image.sh Executable file
@ -0,0 +1,9 @@
#!/bin/bash

SCRIPT_LOCATION=$0
cd "$(dirname "$SCRIPT_LOCATION")"
WORK_DIR=$(pwd)

IMAGE_NAME="db-gpt-allinone"

docker build -f Dockerfile -t $IMAGE_NAME $WORK_DIR/../../
13 docker/allinone/run.sh Executable file
@ -0,0 +1,13 @@
#!/bin/bash

docker run --gpus "device=0" -d -p 3306:3306 \
    -p 5000:5000 \
    -e LOCAL_DB_HOST=127.0.0.1 \
    -e LOCAL_DB_PASSWORD=aa123456 \
    -e MYSQL_ROOT_PASSWORD=aa123456 \
    -e LLM_MODEL=vicuna-13b \
    -e LANGUAGE=zh \
    -v /data:/data \
    -v /data/models:/app/models \
    --name db-gpt-allinone \
    db-gpt-allinone
19 docker/allinone/run_proxyllm.sh Executable file
@ -0,0 +1,19 @@
#!/bin/bash

# Your API key
PROXY_API_KEY="$PROXY_API_KEY"
PROXY_SERVER_URL="${PROXY_SERVER_URL-https://api.openai.com/v1/chat/completions}"

docker run --gpus "device=0" -d -p 3306:3306 \
    -p 5000:5000 \
    -e LOCAL_DB_HOST=127.0.0.1 \
    -e LOCAL_DB_PASSWORD=aa123456 \
    -e MYSQL_ROOT_PASSWORD=aa123456 \
    -e LLM_MODEL=proxyllm \
    -e PROXY_API_KEY=$PROXY_API_KEY \
    -e PROXY_SERVER_URL=$PROXY_SERVER_URL \
    -e LANGUAGE=zh \
    -v /data:/data \
    -v /data/models:/app/models \
    --name db-gpt-allinone \
    db-gpt-allinone
25 docker/base/Dockerfile Normal file
@ -0,0 +1,25 @@
FROM nvidia/cuda:11.8.0-devel-ubuntu22.04

RUN apt-get update && apt-get install -y git python3 pip wget \
    && apt-get clean

# Download code from GitHub: https://github.com/csunny/DB-GPT
# ENV DBGPT_VERSION="v0.3.3"
# RUN wget https://github.com/csunny/DB-GPT/archive/refs/tags/$DBGPT_VERSION.zip

# Clone the latest code and rename it to /app
RUN git clone https://github.com/csunny/DB-GPT.git /app

WORKDIR /app

RUN pip3 install --upgrade pip \
    && pip3 install --no-cache-dir -r requirements.txt \
    && pip3 install seaborn mpld3 \
    && wget https://github.com/explosion/spacy-models/releases/download/zh_core_web_sm-3.5.0/zh_core_web_sm-3.5.0-py3-none-any.whl -O /tmp/zh_core_web_sm-3.5.0-py3-none-any.whl \
    && pip3 install /tmp/zh_core_web_sm-3.5.0-py3-none-any.whl \
    && rm /tmp/zh_core_web_sm-3.5.0-py3-none-any.whl \
    && rm -rf `pip3 cache dir`

# RUN python3 -m spacy download zh_core_web_sm

EXPOSE 5000
8 docker/base/build_image.sh Executable file
@ -0,0 +1,8 @@
#!/bin/bash

SCRIPT_LOCATION=$0
cd "$(dirname "$SCRIPT_LOCATION")"
WORK_DIR=$(pwd)

IMAGE_NAME="db-gpt"
docker build -f Dockerfile -t $IMAGE_NAME $WORK_DIR/../../
9 docker/build_all_images.sh Executable file
@ -0,0 +1,9 @@
#!/bin/bash

SCRIPT_LOCATION=$0
cd "$(dirname "$SCRIPT_LOCATION")"
WORK_DIR=$(pwd)

bash $WORK_DIR/base/build_image.sh

bash $WORK_DIR/allinone/build_image.sh
44 docker/examples/my.cnf Normal file
@ -0,0 +1,44 @@
# For advice on how to change settings please see
# http://dev.mysql.com/doc/refman/8.0/en/server-configuration-defaults.html

[mysqld]
#
# Remove leading # and set to the amount of RAM for the most important data
# cache in MySQL. Start at 70% of total RAM for dedicated server, else 10%.
# innodb_buffer_pool_size = 128M
#
# Remove leading # to turn on a very important data integrity option: logging
# changes to the binary log between backups.
# log_bin
#
# Remove leading # to set options mainly useful for reporting servers.
# The server defaults are faster for transactions and fast SELECTs.
# Adjust sizes as needed, experiment to find the optimal values.
# join_buffer_size = 128M
# sort_buffer_size = 2M
# read_rnd_buffer_size = 2M

# Remove leading # to revert to previous value for default_authentication_plugin,
# this will increase compatibility with older clients. For background, see:
# https://dev.mysql.com/doc/refman/8.0/en/server-system-variables.html#sysvar_default_authentication_plugin
# default-authentication-plugin=mysql_native_password
skip-host-cache
skip-name-resolve
datadir=/var/lib/mysql
socket=/var/lib/mysql/mysql.sock
secure-file-priv=/var/lib/mysql-files
user=mysql

pid-file=/var/run/mysqld/mysqld.pid

# add example config

default-authentication-plugin=mysql_native_password
character_set_server=utf8mb4
collation-server=utf8mb4_unicode_ci
init_connect='SET NAMES utf8mb4'

[mysql]
default-character-set=utf8mb4
[client]
default-character-set=utf8mb4
63 docker/examples/sqls/case_1_student_manager.sql Normal file
@ -0,0 +1,63 @@
create database case_1_student_manager character set utf8;
use case_1_student_manager;

CREATE TABLE students (
    student_id INT PRIMARY KEY,
    student_name VARCHAR(100) COMMENT '学生姓名',
    major VARCHAR(100) COMMENT '专业',
    year_of_enrollment INT COMMENT '入学年份',
    student_age INT COMMENT '学生年龄'
) COMMENT '学生信息表';

CREATE TABLE courses (
    course_id INT PRIMARY KEY,
    course_name VARCHAR(100) COMMENT '课程名称',
    credit FLOAT COMMENT '学分'
) COMMENT '课程信息表';

CREATE TABLE scores (
    student_id INT,
    course_id INT,
    score INT COMMENT '得分',
    semester VARCHAR(50) COMMENT '学期',
    PRIMARY KEY (student_id, course_id),
    FOREIGN KEY (student_id) REFERENCES students(student_id),
    FOREIGN KEY (course_id) REFERENCES courses(course_id)
) COMMENT '学生成绩表';

INSERT INTO students (student_id, student_name, major, year_of_enrollment, student_age) VALUES
(1, '张三', '计算机科学', 2020, 20),
(2, '李四', '计算机科学', 2021, 19),
(3, '王五', '物理学', 2020, 21),
(4, '赵六', '数学', 2021, 19),
(5, '周七', '计算机科学', 2022, 18),
(6, '吴八', '物理学', 2020, 21),
(7, '郑九', '数学', 2021, 19),
(8, '孙十', '计算机科学', 2022, 18),
(9, '刘十一', '物理学', 2020, 21),
(10, '陈十二', '数学', 2021, 19);

INSERT INTO courses (course_id, course_name, credit) VALUES
(1, '计算机基础', 3),
(2, '数据结构', 4),
(3, '高等物理', 3),
(4, '线性代数', 4),
(5, '微积分', 5),
(6, '编程语言', 4),
(7, '量子力学', 3),
(8, '概率论', 4),
(9, '数据库系统', 4),
(10, '计算机网络', 4);

INSERT INTO scores (student_id, course_id, score, semester) VALUES
(1, 1, 90, '2020年秋季'),
(1, 2, 85, '2021年春季'),
(2, 1, 88, '2021年秋季'),
(2, 2, 90, '2022年春季'),
(3, 3, 92, '2020年秋季'),
(3, 4, 85, '2021年春季'),
(4, 3, 88, '2021年秋季'),
(4, 4, 86, '2022年春季'),
(5, 1, 90, '2022年秋季'),
(5, 2, 87, '2023年春季');
63 docker/examples/sqls/case_2_ecom.sql Normal file
@ -0,0 +1,63 @@
create database case_2_ecom character set utf8;
use case_2_ecom;

CREATE TABLE users (
    user_id INT PRIMARY KEY,
    user_name VARCHAR(100) COMMENT '用户名',
    user_email VARCHAR(100) COMMENT '用户邮箱',
    registration_date DATE COMMENT '注册日期',
    user_country VARCHAR(100) COMMENT '用户国家'
) COMMENT '用户信息表';

CREATE TABLE products (
    product_id INT PRIMARY KEY,
    product_name VARCHAR(100) COMMENT '商品名称',
    product_price FLOAT COMMENT '商品价格'
) COMMENT '商品信息表';

CREATE TABLE orders (
    order_id INT PRIMARY KEY,
    user_id INT,
    product_id INT,
    quantity INT COMMENT '数量',
    order_date DATE COMMENT '订单日期',
    FOREIGN KEY (user_id) REFERENCES users(user_id),
    FOREIGN KEY (product_id) REFERENCES products(product_id)
) COMMENT '订单信息表';

INSERT INTO users (user_id, user_name, user_email, registration_date, user_country) VALUES
(1, 'John', 'john@gmail.com', '2020-01-01', 'USA'),
(2, 'Mary', 'mary@gmail.com', '2021-01-01', 'UK'),
(3, 'Bob', 'bob@gmail.com', '2020-01-01', 'USA'),
(4, 'Alice', 'alice@gmail.com', '2021-01-01', 'UK'),
(5, 'Charlie', 'charlie@gmail.com', '2020-01-01', 'USA'),
(6, 'David', 'david@gmail.com', '2021-01-01', 'UK'),
(7, 'Eve', 'eve@gmail.com', '2020-01-01', 'USA'),
(8, 'Frank', 'frank@gmail.com', '2021-01-01', 'UK'),
(9, 'Grace', 'grace@gmail.com', '2020-01-01', 'USA'),
(10, 'Helen', 'helen@gmail.com', '2021-01-01', 'UK');

INSERT INTO products (product_id, product_name, product_price) VALUES
(1, 'iPhone', 699),
(2, 'Samsung Galaxy', 599),
(3, 'iPad', 329),
(4, 'Macbook', 1299),
(5, 'Apple Watch', 399),
(6, 'AirPods', 159),
(7, 'Echo', 99),
(8, 'Kindle', 89),
(9, 'Fire TV Stick', 39),
(10, 'Echo Dot', 49);

INSERT INTO orders (order_id, user_id, product_id, quantity, order_date) VALUES
(1, 1, 1, 1, '2022-01-01'),
(2, 1, 2, 1, '2022-02-01'),
(3, 2, 3, 2, '2022-03-01'),
(4, 2, 4, 1, '2022-04-01'),
(5, 3, 5, 2, '2022-05-01'),
(6, 3, 6, 3, '2022-06-01'),
(7, 4, 7, 2, '2022-07-01'),
(8, 4, 8, 1, '2022-08-01'),
(9, 5, 9, 2, '2022-09-01'),
(10, 5, 10, 3, '2022-10-01');
87 docker/examples/sqls/test_case.md Normal file
@ -0,0 +1,87 @@

# Test Questions

## Scenario 1

A school management system, mainly testing the SQL assistant's join queries, conditional queries, and sorting.

Our database has three tables: students, courses, and scores. We want to test whether the SQL assistant can handle complex SQL queries, including joining multiple tables, filtering data by conditions, and sorting the results.

### Q1

Query all students' names, majors, and scores, sorted by score in descending order.

SQL:
```sql
SELECT students.student_name, students.major, scores.score
FROM students
JOIN scores ON students.student_id = scores.student_id
ORDER BY scores.score DESC;
```

### Q2

Query the average score of students majoring in "计算机科学" (computer science).

SQL:
```sql
SELECT AVG(scores.score) as avg_score
FROM students
JOIN scores ON students.student_id = scores.student_id
WHERE students.major = '计算机科学';
```

### Q3

Query which students have a total of more than 2 course credits in the "2023年春季" (spring 2023) semester.

```sql
SELECT students.student_name
FROM students
JOIN scores ON students.student_id = scores.student_id
JOIN courses ON scores.course_id = courses.course_id
WHERE scores.semester = '2023年春季'
GROUP BY students.student_id
HAVING SUM(courses.credit) > 2;
```

## Scenario 2: An e-commerce system, mainly testing the SQL assistant's data aggregation and grouping.

Our database has three tables: users, products, and orders. We want to test whether the SQL assistant can handle complex SQL queries, including aggregating and grouping data.

### Q1

Query the total number of orders for each user.

SQL:

```sql
SELECT users.user_name, COUNT(orders.order_id) as order_count
FROM users
JOIN orders ON users.user_id = orders.user_id
GROUP BY users.user_id;
```

### Q2

Query the total sales for each product.

```sql
SELECT products.product_name, SUM(products.product_price * orders.quantity) as total_sales
FROM products
JOIN orders ON products.product_id = orders.product_id
GROUP BY products.product_id;
```
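The Q2 aggregation (total sales = sum of price × quantity, grouped by product) can be mirrored in plain Python to make the grouping explicit. This is a sketch over hypothetical in-memory rows, not the test data above:

```python
from collections import defaultdict

# Hypothetical stand-ins for the products and orders tables:
# product_id -> (name, price), and (order_id, product_id, quantity) rows.
products = {1: ("iPhone", 699.0), 2: ("Samsung Galaxy", 599.0)}
orders = [(1, 1, 1), (2, 1, 2), (3, 2, 2)]

def total_sales(products, orders):
    """GROUP BY product: sum price * quantity per product name."""
    sales = defaultdict(float)
    for _order_id, product_id, quantity in orders:
        name, price = products[product_id]
        sales[name] += price * quantity
    return dict(sales)
```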
### Q3

Query the most popular product in 2023 (the product with the most orders).

```sql
SELECT products.product_name
FROM products
JOIN orders ON products.product_id = orders.product_id
WHERE YEAR(orders.order_date) = 2023
GROUP BY products.product_id
ORDER BY COUNT(orders.order_id) DESC
LIMIT 1;
```
19 docker/examples/sqls/test_case_info.sql Normal file
@ -0,0 +1,19 @@
create database test_case_info character set utf8;
use test_case_info;

CREATE TABLE test_cases (
    case_id INT AUTO_INCREMENT PRIMARY KEY,
    scenario_name VARCHAR(100) COMMENT '场景名称',
    scenario_description TEXT COMMENT '场景描述',
    test_question VARCHAR(500) COMMENT '测试问题',
    expected_sql TEXT COMMENT '预期SQL',
    correct_output TEXT COMMENT '正确输出'
) COMMENT '测试用例表';

INSERT INTO test_cases (scenario_name, scenario_description, test_question, expected_sql, correct_output) VALUES
('学校管理系统', '测试SQL助手的联合查询,条件查询和排序功能', '查询所有学生的姓名,专业和成绩,按成绩降序排序', 'SELECT students.student_name, students.major, scores.score FROM students JOIN scores ON students.student_id = scores.student_id ORDER BY scores.score DESC;', '返回所有学生的姓名,专业和成绩,按成绩降序排序的结果'),
('学校管理系统', '测试SQL助手的联合查询,条件查询和排序功能', '查询计算机科学专业的学生的平均成绩', 'SELECT AVG(scores.score) as avg_score FROM students JOIN scores ON students.student_id = scores.student_id WHERE students.major = ''计算机科学'';', '返回计算机科学专业学生的平均成绩'),
('学校管理系统', '测试SQL助手的联合查询,条件查询和排序功能', '查询哪些学生在2023年秋季学期的课程学分总和超过15', 'SELECT students.student_name FROM students JOIN scores ON students.student_id = scores.student_id JOIN courses ON scores.course_id = courses.course_id WHERE scores.semester = ''2023年秋季'' GROUP BY students.student_id HAVING SUM(courses.credit) > 15;', '返回在2023年秋季学期的课程学分总和超过15的学生的姓名'),
('电商系统', '测试SQL助手的数据聚合和分组功能', '查询每个用户的总订单数量', 'SELECT users.user_name, COUNT(orders.order_id) as order_count FROM users JOIN orders ON users.user_id = orders.user_id GROUP BY users.user_id;', '返回每个用户的总订单数量'),
('电商系统', '测试SQL助手的数据聚合和分组功能', '查询每种商品的总销售额', 'SELECT products.product_name, SUM(products.product_price * orders.quantity) as total_sales FROM products JOIN orders ON products.product_id = orders.product_id GROUP BY products.product_id;', '返回每种商品的总销售额'),
('电商系统', '测试SQL助手的数据聚合和分组功能', '查询2023年最受欢迎的商品(订单数量最多的商品)', 'SELECT products.product_name FROM products JOIN orders ON products.product_id = orders.product_id WHERE YEAR(orders.order_date) = 2023 GROUP BY products.product_id ORDER BY COUNT(orders.order_id) DESC LIMIT 1;', '返回2023年最受欢迎的商品(订单数量最多的商品)的名称');
3 docker/examples/sqls/user_config.sql Normal file
@ -0,0 +1,3 @@
USE mysql;
UPDATE user SET Host='%' WHERE User='root';
FLUSH PRIVILEGES;
@ -86,3 +86,98 @@ $ python pilot/server/dbgpt_server.py --light

If you want to learn about dbgpt-webui, read https://github.com/csunny/DB-GPT/tree/new-page-framework/datacenter

### 4. Docker (Experimental)

#### 4.1 Building the Docker images

```bash
$ bash docker/build_all_images.sh
```

Review the images by listing them:

```bash
$ docker images | grep db-gpt
```

The output should look something like the following:

```
db-gpt-allinone    latest    e1ffd20b85ac    45 minutes ago    14.5GB
db-gpt             latest    e36fb0cca5d9    3 hours ago       14GB
```

#### 4.2. Run the all-in-one docker container

**Run with a local model**

```bash
$ docker run --gpus "device=0" -d -p 3306:3306 \
    -p 5000:5000 \
    -e LOCAL_DB_HOST=127.0.0.1 \
    -e LOCAL_DB_PASSWORD=aa123456 \
    -e MYSQL_ROOT_PASSWORD=aa123456 \
    -e LLM_MODEL=vicuna-13b \
    -e LANGUAGE=zh \
    -v /data/models:/app/models \
    --name db-gpt-allinone \
    db-gpt-allinone
```

Open http://localhost:5000 with your browser to see the product.

- `-e LLM_MODEL=vicuna-13b` means we use vicuna-13b as the LLM; see /pilot/configs/model_config.LLM_MODEL_CONFIG
- `-v /data/models:/app/models` means we mount the local model file directory `/data/models` to the docker container directory `/app/models`; please replace it with your model file directory.

You can view the logs with:

```bash
$ docker logs db-gpt-allinone -f
```

**Run with the OpenAI interface**

```bash
$ PROXY_API_KEY="Your api key"
$ PROXY_SERVER_URL="https://api.openai.com/v1/chat/completions"
$ docker run --gpus "device=0" -d -p 3306:3306 \
    -p 5000:5000 \
    -e LOCAL_DB_HOST=127.0.0.1 \
    -e LOCAL_DB_PASSWORD=aa123456 \
    -e MYSQL_ROOT_PASSWORD=aa123456 \
    -e LLM_MODEL=proxyllm \
    -e PROXY_API_KEY=$PROXY_API_KEY \
    -e PROXY_SERVER_URL=$PROXY_SERVER_URL \
    -e LANGUAGE=zh \
    -v /data/models/text2vec-large-chinese:/app/models/text2vec-large-chinese \
    --name db-gpt-allinone \
    db-gpt-allinone
```

- `-e LLM_MODEL=proxyllm` means we use a proxy LLM (OpenAI interface, FastChat interface, ...)
- `-v /data/models/text2vec-large-chinese:/app/models/text2vec-large-chinese` means we mount the local text2vec model into the docker container.

#### 4.3. Run with docker compose

```bash
$ docker compose up -d
```

The output should look something like the following:

```
[+] Building 0.0s (0/0)
[+] Running 2/2
 ✔ Container db-gpt-db-1         Started    0.4s
 ✔ Container db-gpt-webserver-1  Started
```

You can view the logs with:

```bash
$ docker logs db-gpt-webserver-1 -f
```

Open http://localhost:5000 with your browser to see the product.

You can open docker-compose.yml in the project root directory to see more details.
@ -11,7 +11,7 @@ cp .env.template .env
LLM_MODEL=vicuna-13b
MODEL_SERVER=http://127.0.0.1:8000
```
Now we support the models vicuna-13b, vicuna-7b, chatglm-6b, flan-t5-base, guanaco-33b-merged, falcon-40b, gorilla-7b, llama-2-7b, llama-2-13b.
Now we support the models vicuna-13b, vicuna-7b, chatglm-6b, flan-t5-base, guanaco-33b-merged, falcon-40b, gorilla-7b, llama-2-7b, llama-2-13b, baichuan-7b, baichuan-13b.

If you want to use another model, such as chatglm-6b, you just need to update the .env config file.
```
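Switching models is just a key/value edit. As an illustration, a minimal hand-rolled sketch of reading such a `.env`-style snippet (the project itself loads these values via environment variables, so this parser is purely illustrative):

```python
def parse_env(text: str) -> dict:
    """Parse simple KEY=VALUE lines, skipping blanks and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#") and "=" in line:
            # Split on the first '=' only, so values may contain '='.
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip()
    return env

cfg = parse_env("# model selection\nLLM_MODEL=chatglm-6b\nMODEL_SERVER=http://127.0.0.1:8000")
```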
@ -36,7 +36,19 @@ class StrictFormatter(Formatter):
        super().format(format_string, **dummy_inputs)


class NoStrictFormatter(StrictFormatter):
    def check_unused_args(
        self,
        used_args: Sequence[Union[int, str]],
        args: Sequence,
        kwargs: Mapping[str, Any],
    ) -> None:
        """Do not check unused args"""
        pass


formatter = StrictFormatter()
no_strict_formatter = NoStrictFormatter()
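The difference between the two formatters can be seen in a self-contained sketch. This is a re-creation from the stdlib `string.Formatter` (not the project's exact code): `vformat` calls `check_unused_args` after formatting, so a strict variant can reject extra kwargs while the non-strict subclass ignores them.

```python
from string import Formatter

class StrictFormatter(Formatter):
    """Reject keyword arguments that the format string never uses."""
    def check_unused_args(self, used_args, args, kwargs):
        unused = set(kwargs) - set(used_args)
        if unused:
            raise KeyError(f"unused arguments: {sorted(unused)}")

class NoStrictFormatter(StrictFormatter):
    """Silently ignore unused keyword arguments."""
    def check_unused_args(self, used_args, args, kwargs):
        pass

strict = StrictFormatter()
lenient = NoStrictFormatter()
```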
class MyEncoder(json.JSONEncoder):
@ -131,6 +131,13 @@ class Config(metaclass=Singleton):

        ### LLM Model Service Configuration
        self.LLM_MODEL = os.getenv("LLM_MODEL", "vicuna-13b")
        ### Proxy llm backend, this configuration is only valid when "LLM_MODEL=proxyllm"
        ### When we use the rest API provided by deployment frameworks like fastchat as a proxyllm, "PROXYLLM_BACKEND" is the model they actually deploy.
        ### We need to use "PROXYLLM_BACKEND" to load the prompt of the corresponding scene.
        self.PROXYLLM_BACKEND = None
        if self.LLM_MODEL == "proxyllm":
            self.PROXYLLM_BACKEND = os.getenv("PROXYLLM_BACKEND")

        self.LIMIT_MODEL_CONCURRENCY = int(os.getenv("LIMIT_MODEL_CONCURRENCY", 5))
        self.MAX_POSITION_EMBEDDINGS = int(os.getenv("MAX_POSITION_EMBEDDINGS", 4096))
        self.MODEL_PORT = os.getenv("MODEL_PORT", 8000)
@ -50,6 +50,9 @@ LLM_MODEL_CONFIG = {
    "llama-2-7b": os.path.join(MODEL_PATH, "Llama-2-7b-chat-hf"),
    "llama-2-13b": os.path.join(MODEL_PATH, "Llama-2-13b-chat-hf"),
    "llama-2-70b": os.path.join(MODEL_PATH, "Llama-2-70b-chat-hf"),
    "baichuan-13b": os.path.join(MODEL_PATH, "Baichuan-13B-Chat"),
    # please rename "fireballoon/baichuan-vicuna-chinese-7b" to "baichuan-7b"
    "baichuan-7b": os.path.join(MODEL_PATH, "baichuan-7b"),
}

# Load model config
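The mapping above resolves a model name to a local weights directory. A small sketch of the lookup; the `MODEL_PATH` value and the `model_path` helper are assumptions for illustration, not the project's API:

```python
import os

MODEL_PATH = "/app/models"  # assumed location, mirroring the container mount
LLM_MODEL_CONFIG = {
    "baichuan-13b": os.path.join(MODEL_PATH, "Baichuan-13B-Chat"),
    "baichuan-7b": os.path.join(MODEL_PATH, "baichuan-7b"),
}

def model_path(name: str) -> str:
    """Look up the local weights directory for a configured model name."""
    try:
        return LLM_MODEL_CONFIG[name]
    except KeyError:
        raise ValueError(f"unknown model: {name}")
```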
28
pilot/embedding_engine/docx_loader.py
Normal file
@@ -0,0 +1,28 @@
from typing import List, Optional

from langchain.docstore.document import Document
from langchain.document_loaders.base import BaseLoader
import docx


class DocxLoader(BaseLoader):
    """Load docx files."""

    def __init__(self, file_path: str, encoding: Optional[str] = None):
        """Initialize with file path."""
        self.file_path = file_path
        self.encoding = encoding

    def load(self) -> List[Document]:
        """Load from file path."""
        docs = []
        doc = docx.Document(self.file_path)
        content = []
        for i in range(len(doc.paragraphs)):
            para = doc.paragraphs[i]
            text = para.text
            content.append(text)
        docs.append(
            Document(page_content="".join(content), metadata={"source": self.file_path})
        )
        return docs
@@ -2,7 +2,6 @@
# -*- coding: utf-8 -*-
from typing import List, Optional

from langchain.document_loaders import UnstructuredPowerPointLoader
from langchain.schema import Document
from langchain.text_splitter import (
    SpacyTextSplitter,
@@ -11,6 +10,7 @@ from langchain.text_splitter import (
)

from pilot.embedding_engine import SourceEmbedding, register
from pilot.embedding_engine.ppt_loader import PPTLoader


class PPTEmbedding(SourceEmbedding):
@@ -36,7 +36,7 @@ class PPTEmbedding(SourceEmbedding):
    def read(self):
        """Load from ppt path."""
        if self.source_reader is None:
            self.source_reader = UnstructuredPowerPointLoader(self.file_path)
            self.source_reader = PPTLoader(self.file_path)
        if self.text_splitter is None:
            try:
                self.text_splitter = SpacyTextSplitter(
28
pilot/embedding_engine/ppt_loader.py
Normal file
@@ -0,0 +1,28 @@
from typing import List, Optional

from langchain.docstore.document import Document
from langchain.document_loaders.base import BaseLoader
from pptx import Presentation


class PPTLoader(BaseLoader):
    """Load PPT files."""

    def __init__(self, file_path: str, encoding: Optional[str] = None):
        """Initialize with file path."""
        self.file_path = file_path
        self.encoding = encoding

    def load(self) -> List[Document]:
        """Load from file path."""
        pr = Presentation(self.file_path)
        docs = []
        for slide in pr.slides:
            for shape in slide.shapes:
                if hasattr(shape, "text") and shape.text != "":
                    docs.append(
                        Document(
                            page_content=shape.text, metadata={"source": slide.slide_id}
                        )
                    )
        return docs
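One note on the emptiness check in `PPTLoader`: `shape.text is not ""` compares object identity rather than value, and recent CPython versions emit a `SyntaxWarning` for `is` against a literal; value equality (or plain truthiness) is the reliable test:

```python
# What the corrected check does: keep shapes whose text is non-empty.
texts = ["Title", "", "Body", ""]
non_empty = [t for t in texts if t != ""]  # equivalently: if t
```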
@@ -2,7 +2,6 @@
# -*- coding: utf-8 -*-
from typing import List, Optional

from langchain.document_loaders import UnstructuredWordDocumentLoader
from langchain.schema import Document
from langchain.text_splitter import (
    SpacyTextSplitter,
@@ -11,6 +10,7 @@ from langchain.text_splitter import (
)

from pilot.embedding_engine import SourceEmbedding, register
from pilot.embedding_engine.docx_loader import DocxLoader


class WordEmbedding(SourceEmbedding):
@@ -36,7 +36,7 @@ class WordEmbedding(SourceEmbedding):
    def read(self):
        """Load from word path."""
        if self.source_reader is None:
            self.source_reader = UnstructuredWordDocumentLoader(self.file_path)
            self.source_reader = DocxLoader(self.file_path)
        if self.text_splitter is None:
            try:
                self.text_splitter = SpacyTextSplitter(
@@ -276,6 +276,25 @@ class Llama2Adapter(BaseLLMAdaper):
        return model, tokenizer


class BaichuanAdapter(BaseLLMAdaper):
    """The model adapter for Baichuan models (e.g., baichuan-inc/Baichuan-13B-Chat)"""

    def match(self, model_path: str):
        return "baichuan" in model_path.lower()

    def loader(self, model_path: str, from_pretrained_kwargs: dict):
        tokenizer = AutoTokenizer.from_pretrained(
            model_path, trust_remote_code=True, use_fast=False
        )
        model = AutoModelForCausalLM.from_pretrained(
            model_path,
            trust_remote_code=True,
            low_cpu_mem_usage=True,
            **from_pretrained_kwargs,
        )
        return model, tokenizer


register_llm_model_adapters(VicunaLLMAdapater)
register_llm_model_adapters(ChatGLMAdapater)
register_llm_model_adapters(GuanacoAdapter)
@@ -283,6 +302,7 @@ register_llm_model_adapters(FalconAdapater)
register_llm_model_adapters(GorillaAdapter)
register_llm_model_adapters(GPT4AllAdapter)
register_llm_model_adapters(Llama2Adapter)
register_llm_model_adapters(BaichuanAdapter)
# TODO Vicuna is supported by default; other models still need testing and evaluation

# just for test_py, remove this later
@@ -284,6 +284,21 @@ def get_conv_template(name: str) -> Conversation:
    return conv_templates[name].copy()


# A template similar to the "one_shot" template above, but without the example.
register_conv_template(
    Conversation(
        name="zero_shot",
        system="A chat between a curious human and an artificial intelligence assistant. "
        "The assistant gives helpful, detailed, and polite answers to the human's questions.",
        roles=("Human", "Assistant"),
        messages=(),
        offset=0,
        sep_style=SeparatorStyle.ADD_COLON_SINGLE,
        sep="\n### ",
        stop_str="###",
    )
)

# llama2 template
# reference: https://github.com/facebookresearch/llama/blob/cfc3fc8c1968d390eb830e65c63865e980873a06/llama/generation.py#L212
register_conv_template(
@@ -305,4 +320,21 @@ register_conv_template(
    )
)

# Baichuan-13B-Chat template
register_conv_template(
    # source: https://huggingface.co/baichuan-inc/Baichuan-13B-Chat/blob/f5f47be2adbbdceb784f334d6fa1ca2c73e65097/modeling_baichuan.py#L507
    # https://huggingface.co/baichuan-inc/Baichuan-13B-Chat/blob/main/generation_config.json
    Conversation(
        name="baichuan-chat",
        system="",
        roles=(" <reserved_102> ", " <reserved_103> "),
        messages=(),
        offset=0,
        sep_style=SeparatorStyle.NO_COLON_TWO,
        sep="",
        sep2="</s>",
        stop_token_ids=[2, 195],
    )
)

# TODO Support other models' conversation templates
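For reference, FastChat-style `NO_COLON_TWO` templates like the `baichuan-chat` one above build the prompt by concatenating role tags without colons, alternating `sep` and `sep2` between turns. A minimal sketch with illustrative messages:

```python
# Mimics FastChat's SeparatorStyle.NO_COLON_TWO assembly:
# ret = system; for each (role, message): ret += role + message + seps[i % 2]
system = ""
roles = (" <reserved_102> ", " <reserved_103> ")
sep, sep2 = "", "</s>"
messages = [(roles[0], "Hello"), (roles[1], "Hi!")]

prompt = system
seps = [sep, sep2]
for i, (role, message) in enumerate(messages):
    prompt += role + message + seps[i % 2]
```

This is why the template's `sep2="</s>"` places the end-of-sequence token after each assistant turn.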
@@ -54,17 +54,19 @@ class BaseOutputParser(ABC):
        """TODO: multi-mode output handler; rewrite this for multi-model, use adapter mode.
        """
        model_context = data.get("model_context")
        has_echo = True
        if model_context and "prompt_echo_len_char" in model_context:
            prompt_echo_len_char = int(model_context.get("prompt_echo_len_char", -1))
            has_echo = bool(model_context.get("echo", True))
            if prompt_echo_len_char != -1:
                skip_echo_len = prompt_echo_len_char

        if data.get("error_code", 0) == 0:
            if "vicuna" in CFG.LLM_MODEL or "llama-2" in CFG.LLM_MODEL:
            if has_echo and ("vicuna" in CFG.LLM_MODEL or "llama-2" in CFG.LLM_MODEL):
                # TODO Judging from model_context
                # output = data["text"][skip_echo_len + 11:].strip()
                output = data["text"][skip_echo_len:].strip()
            elif "guanaco" in CFG.LLM_MODEL:
            elif has_echo and "guanaco" in CFG.LLM_MODEL:
                # NO stream output
                # output = data["text"][skip_echo_len + 2:].replace("<s>", "").strip()
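The echo handling above reduces to a small rule: when the backend echoes the prompt, skip the first `prompt_echo_len_char` characters of the generated text. A simplified sketch (helper name assumed, not from the source):

```python
def strip_echo(text: str, prompt_echo_len_char: int, has_echo: bool) -> str:
    """Drop the echoed prompt prefix when the backend echoes its input."""
    if has_echo and prompt_echo_len_char != -1:
        return text[prompt_echo_len_char:].strip()
    return text.strip()


prompt = "USER: hi ASSISTANT:"
raw = prompt + " Hello!"            # backend output with the prompt echoed back
out_echo = strip_echo(raw, len(prompt), True)
out_no_echo = strip_echo("Hello!", -1, False)
```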
@@ -4,7 +4,7 @@ from typing import Any, Callable, Dict, List, Mapping, Optional, Set, Union
from pydantic import BaseModel, Extra, Field, root_validator


from pilot.common.formatting import formatter
from pilot.common.formatting import formatter, no_strict_formatter
from pilot.out_parser.base import BaseOutputParser
from pilot.common.schema import SeparatorStyle
from pilot.prompts.example_base import ExampleSelector
@@ -24,8 +24,10 @@ def jinja2_formatter(template: str, **kwargs: Any) -> str:


DEFAULT_FORMATTER_MAPPING: Dict[str, Callable] = {
    "f-string": formatter.format,
    "jinja2": jinja2_formatter,
    "f-string": lambda is_strict: formatter.format
    if is_strict
    else no_strict_formatter.format,
    "jinja2": lambda is_strict: jinja2_formatter,
}


@@ -38,6 +40,8 @@ class PromptTemplate(BaseModel, ABC):
    template: Optional[str]
    """The prompt template."""
    template_format: str = "f-string"
    """A strict template will check the template args."""
    template_is_strict: bool = True
    """The format of the prompt template. Options are: 'f-string', 'jinja2'."""
    response_format: Optional[str]
    """Default: use stream out."""
@@ -68,10 +72,12 @@ class PromptTemplate(BaseModel, ABC):
        """Format the prompt with the inputs."""
        if self.template:
            if self.response_format:
                kwargs["response"] = json.dumps(self.response_format, indent=4)
                kwargs["response"] = json.dumps(
                    self.response_format, ensure_ascii=False, indent=4
                )
            return DEFAULT_FORMATTER_MAPPING[self.template_format](
                self.template, **kwargs
            )
                self.template_is_strict
            )(self.template, **kwargs)

    def add_goals(self, goal: str) -> None:
        self.goals.append(goal)
@@ -58,13 +58,26 @@ class PromptTemplateRegistry:
            scene_registry, prompt_template, language, [_DEFAULT_MODEL_KEY]
        )

    def get_prompt_template(self, scene_name: str, language: str, model_name: str):
        """Get prompt template with scene name, language and model name"""
    def get_prompt_template(
        self,
        scene_name: str,
        language: str,
        model_name: str,
        proxyllm_backend: str = None,
    ):
        """Get prompt template with scene name, language and model name

        proxyllm_backend: see CFG.PROXYLLM_BACKEND
        """
        scene_registry = self.registry[scene_name]
        registry = scene_registry.get(model_name)

        print(
            f"Get prompt template of scene_name: {scene_name} with model_name: {model_name} language: {language}"
            f"Get prompt template of scene_name: {scene_name} with model_name: {model_name}, proxyllm_backend: {proxyllm_backend}, language: {language}"
        )
        registry = None
        if proxyllm_backend:
            registry = scene_registry.get(proxyllm_backend)
        if not registry:
            registry = scene_registry.get(model_name)
        if not registry:
            registry = scene_registry.get(_DEFAULT_MODEL_KEY)
        if not registry:
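The lookup order introduced here tries the proxy's real backend first, then the configured model, then the scene's default template. A sketch of that fallback chain (helper name and sentinel value are illustrative, not the project's identifiers):

```python
from typing import Optional

_DEFAULT_MODEL_KEY = "___default___"  # assumed sentinel name, illustrative only


def resolve_template(scene_registry: dict, model_name: str,
                     proxyllm_backend: Optional[str] = None):
    # Try the proxy's real backend first, then the configured model,
    # then fall back to the scene's default template.
    for key in (proxyllm_backend, model_name, _DEFAULT_MODEL_KEY):
        if key and key in scene_registry:
            return scene_registry[key]
    return None


reg = {"baichuan-13b": "baichuan prompt", "___default___": "default prompt"}
chosen = resolve_template(reg, "proxyllm", proxyllm_backend="baichuan-13b")
fallback = resolve_template(reg, "vicuna-13b")
```

This is what lets a FastChat proxy serving `baichuan-13b` pick up the baichuan-specific prompt instead of the generic default.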
@@ -69,6 +69,7 @@ class BaseChat(ABC):
        self.chat_mode = chat_mode
        self.current_user_input: str = current_user_input
        self.llm_model = CFG.LLM_MODEL
        self.llm_echo = False
        ### configurable storage methods
        self.memory = DuckdbHistoryMemory(chat_session_id)

@@ -78,7 +79,10 @@ class BaseChat(ABC):
        # ]
        self.prompt_template: PromptTemplate = (
            CFG.prompt_template_registry.get_prompt_template(
                self.chat_mode.value(), language=CFG.LANGUAGE, model_name=CFG.LLM_MODEL
                self.chat_mode.value(),
                language=CFG.LANGUAGE,
                model_name=CFG.LLM_MODEL,
                proxyllm_backend=CFG.PROXYLLM_BACKEND,
            )
        )
        self.history_message: List[OnceConversation] = self.memory.messages()
@@ -128,6 +132,7 @@ class BaseChat(ABC):
            "temperature": float(self.prompt_template.temperature),
            "max_new_tokens": int(self.prompt_template.max_new_tokens),
            "stop": self.prompt_template.sep,
            "echo": self.llm_echo,
        }
        return payload
@@ -9,10 +9,7 @@ EXAMPLES = [
    {
        "type": "ai",
        "data": {
            "content": """{
    \"thoughts\": \"thought text\",
    \"sql\": \"SELECT city FROM user where user_name='test1'\",
}""",
            "content": """{\n\"thoughts\": \"直接查询用户表中用户名为'test1'的记录即可\",\n\"sql\": \"SELECT city FROM user where user_name='test1'\"}""",
            "example": True,
        },
    },
@@ -24,10 +21,7 @@ EXAMPLES = [
    {
        "type": "ai",
        "data": {
            "content": """{
    \"thoughts\": \"thought text\",
    \"sql\": \"SELECT b.* FROM user a LEFT JOIN tran_order b ON a.user_name=b.user_name where a.city='成都'\",
}""",
            "content": """{\n\"thoughts\": \"根据订单表的用户名和用户表的用户名关联用户表和订单表,再通过用户表的城市为'成都'的过滤即可\",\n\"sql\": \"SELECT b.* FROM user a LEFT JOIN tran_order b ON a.user_name=b.user_name where a.city='成都'\"}""",
            "example": True,
        },
    },
@@ -43,7 +43,7 @@ PROMPT_TEMPERATURE = 0.5
prompt = PromptTemplate(
    template_scene=ChatScene.ChatWithDbExecute.value(),
    input_variables=["input", "table_info", "dialect", "top_k", "response"],
    response_format=json.dumps(RESPONSE_FORMAT_SIMPLE, indent=4),
    response_format=json.dumps(RESPONSE_FORMAT_SIMPLE, ensure_ascii=False, indent=4),
    template_define=PROMPT_SCENE_DEFINE,
    template=_DEFAULT_TEMPLATE,
    stream_out=PROMPT_NEED_NEED_STREAM_OUT,
@@ -54,3 +54,4 @@ prompt = PromptTemplate(
    temperature=PROMPT_TEMPERATURE,
)
CFG.prompt_template_registry.register(prompt, is_default=True)
from . import prompt_baichuan
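The `ensure_ascii=False` change matters because `json.dumps` escapes non-ASCII by default, which would embed `\uXXXX` sequences in the prompt instead of readable Chinese:

```python
import json

data = {"thoughts": "直接查询即可", "sql": "SELECT 1"}
escaped = json.dumps(data, indent=4)                       # non-ASCII becomes \uXXXX
readable = json.dumps(data, ensure_ascii=False, indent=4)  # keeps CJK text as-is
```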
66
pilot/scene/chat_db/auto_execute/prompt_baichuan.py
Normal file
@@ -0,0 +1,66 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-

import json
from pilot.prompts.prompt_new import PromptTemplate
from pilot.configs.config import Config
from pilot.scene.base import ChatScene
from pilot.scene.chat_db.auto_execute.out_parser import DbChatOutputParser, SqlAction
from pilot.common.schema import SeparatorStyle
from pilot.scene.chat_db.auto_execute.example import sql_data_example

CFG = Config()

PROMPT_SCENE_DEFINE = None

_DEFAULT_TEMPLATE = """
你是一个 SQL 专家,给你一个用户的问题,你会生成一条对应的 {dialect} 语法的 SQL 语句。

如果用户没有在问题中指定 sql 返回多少条数据,那么你生成的 sql 最多返回 {top_k} 条数据。
你应该尽可能少地使用表。

已知表结构信息如下:
{table_info}

注意:
1. 只能使用表结构信息中提供的表来生成 sql,如果无法根据提供的表结构中生成 sql ,请说:"提供的表结构信息不足以生成 sql 查询。" 禁止随意捏造信息。
2. 不要查询不存在的列,注意哪一列位于哪张表中。
3. 使用 json 格式回答,确保你的回答必须是正确的 json 格式,并且能被 python 语言的 `json.loads` 库解析, 格式如下:
{response}
"""

RESPONSE_FORMAT_SIMPLE = {
    "thoughts": "对用户说的想法摘要",
    "sql": "生成的将被执行的 SQL",
}

PROMPT_SEP = SeparatorStyle.SINGLE.value

PROMPT_NEED_NEED_STREAM_OUT = False

# Temperature is a configuration hyperparameter that controls the randomness of language model output.
# A high temperature produces more unpredictable and creative results, while a low temperature produces more common and conservative output.
# For example, if you adjust the temperature to 0.5, the model will usually generate text that is more predictable and less creative than if you set the temperature to 1.0.
PROMPT_TEMPERATURE = 0.5

prompt = PromptTemplate(
    template_scene=ChatScene.ChatWithDbExecute.value(),
    input_variables=["input", "table_info", "dialect", "top_k", "response"],
    response_format=json.dumps(RESPONSE_FORMAT_SIMPLE, ensure_ascii=False, indent=4),
    template_is_strict=False,
    template_define=PROMPT_SCENE_DEFINE,
    template=_DEFAULT_TEMPLATE,
    stream_out=PROMPT_NEED_NEED_STREAM_OUT,
    output_parser=DbChatOutputParser(
        sep=PROMPT_SEP, is_stream_out=PROMPT_NEED_NEED_STREAM_OUT
    ),
    # example_selector=sql_data_example,
    temperature=PROMPT_TEMPERATURE,
)

CFG.prompt_template_registry.register(
    prompt,
    language=CFG.LANGUAGE,
    is_default=False,
    model_names=["baichuan-13b", "baichuan-7b"],
)
@@ -19,17 +19,20 @@ class BaseChatAdpter:
        """Return the generate stream handler func"""
        pass

    def get_conv_template(self) -> Conversation:
    def get_conv_template(self, model_path: str) -> Conversation:
        return None

    def model_adaptation(self, params: Dict) -> Tuple[Dict, Dict]:
    def model_adaptation(self, params: Dict, model_path: str) -> Tuple[Dict, Dict]:
        """Params adaptation"""
        conv = self.get_conv_template()
        conv = self.get_conv_template(model_path)
        messages = params.get("messages")
        # Some model context passed back to the dbgpt server
        model_context = {"prompt_echo_len_char": -1}
        if not conv or not messages:
            # Nothing to do
            print(
                f"No conv from model_path {model_path} or no messages in params, {self}"
            )
            return params, model_context
        conv = conv.copy()
        system_messages = []
@@ -62,7 +65,12 @@ class BaseChatAdpter:
        # TODO remove bos token and eos token from tokenizer_config.json of model
        prompt_echo_len_char = len(new_prompt.replace("</s>", "").replace("<s>", ""))
        model_context["prompt_echo_len_char"] = prompt_echo_len_char
        model_context["echo"] = params.get("echo", True)
        params["prompt"] = new_prompt

        # Overwrite model params:
        params["stop"] = conv.stop_str

        return params, model_context


@@ -79,6 +87,7 @@ def get_llm_chat_adapter(model_path: str) -> BaseChatAdpter:
    """Get a chat generate func for a model"""
    for adapter in llm_model_chat_adapters:
        if adapter.match(model_path):
            print(f"Get model path: {model_path} adapter {adapter}")
            return adapter

    raise ValueError(f"Invalid model for chat adapter {model_path}")
@@ -186,7 +195,7 @@ class Llama2ChatAdapter(BaseChatAdpter):
    def match(self, model_path: str):
        return "llama-2" in model_path.lower()

    def get_conv_template(self) -> Conversation:
    def get_conv_template(self, model_path: str) -> Conversation:
        return get_conv_template("llama-2")

    def get_generate_stream_func(self):
@@ -195,6 +204,21 @@ class Llama2ChatAdapter(BaseChatAdpter):
        return generate_stream


class BaichuanChatAdapter(BaseChatAdpter):
    def match(self, model_path: str):
        return "baichuan" in model_path.lower()

    def get_conv_template(self, model_path: str) -> Conversation:
        if "chat" in model_path.lower():
            return get_conv_template("baichuan-chat")
        return get_conv_template("zero_shot")

    def get_generate_stream_func(self):
        from pilot.model.inference import generate_stream

        return generate_stream


register_llm_model_chat_adapter(VicunaChatAdapter)
register_llm_model_chat_adapter(ChatGLMChatAdapter)
register_llm_model_chat_adapter(GuanacoChatAdapter)
@@ -202,6 +226,7 @@ register_llm_model_chat_adapter(FalconChatAdapter)
register_llm_model_chat_adapter(GorillaChatAdapter)
register_llm_model_chat_adapter(GPT4AllChatAdapter)
register_llm_model_chat_adapter(Llama2ChatAdapter)
register_llm_model_chat_adapter(BaichuanChatAdapter)

# Proxy model for testing and development; it's cheap for us now.
register_llm_model_chat_adapter(ProxyllmChatAdapter)
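The registry pattern behind `get_llm_chat_adapter` is first-match-wins over the model path. A stripped-down sketch (class names mirror the diff, bodies simplified):

```python
class BaseAdapter:
    def match(self, model_path: str) -> bool:
        return False


class BaichuanChatAdapter(BaseAdapter):
    def match(self, model_path: str) -> bool:
        return "baichuan" in model_path.lower()


adapters = [BaichuanChatAdapter()]


def get_llm_chat_adapter(model_path: str) -> BaseAdapter:
    # First adapter whose match() accepts the path wins.
    for adapter in adapters:
        if adapter.match(model_path):
            return adapter
    raise ValueError(f"Invalid model for chat adapter {model_path}")


adapter = get_llm_chat_adapter("/models/Baichuan-13B-Chat")
```

Matching on the lowercased path is why the config keys use directory names like `Baichuan-13B-Chat`.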
@@ -78,7 +78,9 @@ class ModelWorker:
    def generate_stream_gate(self, params):
        try:
            # params adaptation
            params, model_context = self.llm_chat_adapter.model_adaptation(params)
            params, model_context = self.llm_chat_adapter.model_adaptation(
                params, self.ml.model_path
            )
            for output in self.generate_stream_func(
                self.model, self.tokenizer, params, DEVICE, CFG.MAX_POSITION_EMBEDDINGS
            ):
@@ -136,6 +138,7 @@ class PromptRequest(BaseModel):
    max_new_tokens: int
    model: str
    stop: str = None
    echo: bool = True


class StreamRequest(BaseModel):
@@ -178,6 +181,7 @@ def generate(prompt_request: PromptRequest) -> str:
        "temperature": prompt_request.temperature,
        "max_new_tokens": prompt_request.max_new_tokens,
        "stop": prompt_request.stop,
        "echo": prompt_request.echo,
    }

    rsp_str = ""
@@ -1 +1 @@
<!DOCTYPE html><html><head><meta charSet="utf-8"/><meta name="viewport" content="width=device-width"/><title>404: This page could not be found</title><meta name="next-head-count" content="3"/><noscript data-n-css=""></noscript><script defer="" nomodule="" src="/_next/static/chunks/polyfills-78c92fac7aa8fdd8.js"></script><script src="/_next/static/chunks/webpack-81b9e46a3f1e5c68.js" defer=""></script><script src="/_next/static/chunks/framework-43665103d101a22d.js" defer=""></script><script src="/_next/static/chunks/main-c6e90425c3eeb90a.js" defer=""></script><script src="/_next/static/chunks/pages/_app-1f2755172264764d.js" defer=""></script><script src="/_next/static/chunks/pages/_error-f5357f382422dd96.js" defer=""></script><script src="/_next/static/Cow4Dk3Cb5ywOteYPWBYm/_buildManifest.js" defer=""></script><script src="/_next/static/Cow4Dk3Cb5ywOteYPWBYm/_ssgManifest.js" defer=""></script></head><body><div id="__next"><div style="font-family:system-ui,"Segoe UI",Roboto,Helvetica,Arial,sans-serif,"Apple Color Emoji","Segoe UI Emoji";height:100vh;text-align:center;display:flex;flex-direction:column;align-items:center;justify-content:center"><div style="line-height:48px"><style>body{color:#000;background:#fff;margin:0}.next-error-h1{border-right:1px solid rgba(0,0,0,.3)}@media (prefers-color-scheme:dark){body{color:#fff;background:#000}.next-error-h1{border-right:1px solid rgba(255,255,255,.3)}}</style><h1 class="next-error-h1" style="display:inline-block;margin:0 20px 0 0;padding-right:23px;font-size:24px;font-weight:500;vertical-align:top">404</h1><div style="display:inline-block"><h2 style="font-size:14px;font-weight:400;line-height:28px">This page could not be found<!-- -->.</h2></div></div></div></div><script id="__NEXT_DATA__" type="application/json">{"props":{"pageProps":{"statusCode":404}},"page":"/_error","query":{},"buildId":"Cow4Dk3Cb5ywOteYPWBYm","nextExport":true,"isFallback":false,"gip":true,"scriptLoader":[]}</script></body></html>
<!DOCTYPE html><html><head><meta charSet="utf-8"/><meta name="viewport" content="width=device-width"/><title>404: This page could not be found</title><meta name="next-head-count" content="3"/><noscript data-n-css=""></noscript><script defer="" nomodule="" src="/_next/static/chunks/polyfills-78c92fac7aa8fdd8.js"></script><script src="/_next/static/chunks/webpack-e0b549c3ec4ce91b.js" defer=""></script><script src="/_next/static/chunks/framework-43665103d101a22d.js" defer=""></script><script src="/_next/static/chunks/main-c6e90425c3eeb90a.js" defer=""></script><script src="/_next/static/chunks/pages/_app-1f2755172264764d.js" defer=""></script><script src="/_next/static/chunks/pages/_error-f5357f382422dd96.js" defer=""></script><script src="/_next/static/AVF7sR15c1tF8wuv8mGBK/_buildManifest.js" defer=""></script><script src="/_next/static/AVF7sR15c1tF8wuv8mGBK/_ssgManifest.js" defer=""></script></head><body><div id="__next"><div style="font-family:system-ui,"Segoe UI",Roboto,Helvetica,Arial,sans-serif,"Apple Color Emoji","Segoe UI Emoji";height:100vh;text-align:center;display:flex;flex-direction:column;align-items:center;justify-content:center"><div style="line-height:48px"><style>body{color:#000;background:#fff;margin:0}.next-error-h1{border-right:1px solid rgba(0,0,0,.3)}@media (prefers-color-scheme:dark){body{color:#fff;background:#000}.next-error-h1{border-right:1px solid rgba(255,255,255,.3)}}</style><h1 class="next-error-h1" style="display:inline-block;margin:0 20px 0 0;padding-right:23px;font-size:24px;font-weight:500;vertical-align:top">404</h1><div style="display:inline-block"><h2 style="font-size:14px;font-weight:400;line-height:28px">This page could not be found<!-- -->.</h2></div></div></div></div><script id="__NEXT_DATA__" type="application/json">{"props":{"pageProps":{"statusCode":404}},"page":"/_error","query":{},"buildId":"AVF7sR15c1tF8wuv8mGBK","nextExport":true,"isFallback":false,"gip":true,"scriptLoader":[]}</script></body></html>
@@ -1 +1 @@
<!DOCTYPE html><html><head><meta charSet="utf-8"/><meta name="viewport" content="width=device-width"/><title>404: This page could not be found</title><meta name="next-head-count" content="3"/><noscript data-n-css=""></noscript><script defer="" nomodule="" src="/_next/static/chunks/polyfills-78c92fac7aa8fdd8.js"></script><script src="/_next/static/chunks/webpack-81b9e46a3f1e5c68.js" defer=""></script><script src="/_next/static/chunks/framework-43665103d101a22d.js" defer=""></script><script src="/_next/static/chunks/main-c6e90425c3eeb90a.js" defer=""></script><script src="/_next/static/chunks/pages/_app-1f2755172264764d.js" defer=""></script><script src="/_next/static/chunks/pages/_error-f5357f382422dd96.js" defer=""></script><script src="/_next/static/Cow4Dk3Cb5ywOteYPWBYm/_buildManifest.js" defer=""></script><script src="/_next/static/Cow4Dk3Cb5ywOteYPWBYm/_ssgManifest.js" defer=""></script></head><body><div id="__next"><div style="font-family:system-ui,"Segoe UI",Roboto,Helvetica,Arial,sans-serif,"Apple Color Emoji","Segoe UI Emoji";height:100vh;text-align:center;display:flex;flex-direction:column;align-items:center;justify-content:center"><div style="line-height:48px"><style>body{color:#000;background:#fff;margin:0}.next-error-h1{border-right:1px solid rgba(0,0,0,.3)}@media (prefers-color-scheme:dark){body{color:#fff;background:#000}.next-error-h1{border-right:1px solid rgba(255,255,255,.3)}}</style><h1 class="next-error-h1" style="display:inline-block;margin:0 20px 0 0;padding-right:23px;font-size:24px;font-weight:500;vertical-align:top">404</h1><div style="display:inline-block"><h2 style="font-size:14px;font-weight:400;line-height:28px">This page could not be found<!-- -->.</h2></div></div></div></div><script id="__NEXT_DATA__" type="application/json">{"props":{"pageProps":{"statusCode":404}},"page":"/_error","query":{},"buildId":"Cow4Dk3Cb5ywOteYPWBYm","nextExport":true,"isFallback":false,"gip":true,"scriptLoader":[]}</script></body></html>
<!DOCTYPE html><html><head><meta charSet="utf-8"/><meta name="viewport" content="width=device-width"/><title>404: This page could not be found</title><meta name="next-head-count" content="3"/><noscript data-n-css=""></noscript><script defer="" nomodule="" src="/_next/static/chunks/polyfills-78c92fac7aa8fdd8.js"></script><script src="/_next/static/chunks/webpack-e0b549c3ec4ce91b.js" defer=""></script><script src="/_next/static/chunks/framework-43665103d101a22d.js" defer=""></script><script src="/_next/static/chunks/main-c6e90425c3eeb90a.js" defer=""></script><script src="/_next/static/chunks/pages/_app-1f2755172264764d.js" defer=""></script><script src="/_next/static/chunks/pages/_error-f5357f382422dd96.js" defer=""></script><script src="/_next/static/AVF7sR15c1tF8wuv8mGBK/_buildManifest.js" defer=""></script><script src="/_next/static/AVF7sR15c1tF8wuv8mGBK/_ssgManifest.js" defer=""></script></head><body><div id="__next"><div style="font-family:system-ui,"Segoe UI",Roboto,Helvetica,Arial,sans-serif,"Apple Color Emoji","Segoe UI Emoji";height:100vh;text-align:center;display:flex;flex-direction:column;align-items:center;justify-content:center"><div style="line-height:48px"><style>body{color:#000;background:#fff;margin:0}.next-error-h1{border-right:1px solid rgba(0,0,0,.3)}@media (prefers-color-scheme:dark){body{color:#fff;background:#000}.next-error-h1{border-right:1px solid rgba(255,255,255,.3)}}</style><h1 class="next-error-h1" style="display:inline-block;margin:0 20px 0 0;padding-right:23px;font-size:24px;font-weight:500;vertical-align:top">404</h1><div style="display:inline-block"><h2 style="font-size:14px;font-weight:400;line-height:28px">This page could not be found<!-- -->.</h2></div></div></div></div><script id="__NEXT_DATA__" type="application/json">{"props":{"pageProps":{"statusCode":404}},"page":"/_error","query":{},"buildId":"AVF7sR15c1tF8wuv8mGBK","nextExport":true,"isFallback":false,"gip":true,"scriptLoader":[]}</script></body></html>
BIN
pilot/server/static/LOGO.png
Normal file
Binary file not shown.
After Width: | Height: | Size: 130 KiB |
BIN
pilot/server/static/LOGO_1.png
Normal file
Binary file not shown.
After Width: | Height: | Size: 9.7 KiB |
@@ -0,0 +1 @@
self.__BUILD_MANIFEST={__rewrites:{beforeFiles:[],afterFiles:[],fallback:[]},"/_error":["static/chunks/pages/_error-f5357f382422dd96.js"],sortedPages:["/_app","/_error"]},self.__BUILD_MANIFEST_CB&&self.__BUILD_MANIFEST_CB();
@@ -0,0 +1 @@
self.__SSG_MANIFEST=new Set([]);self.__SSG_MANIFEST_CB&&self.__SSG_MANIFEST_CB()
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
@@ -0,0 +1 @@
(self.webpackChunk_N_E=self.webpackChunk_N_E||[]).push([[538],{40687:function(e,t,n){Promise.resolve().then(n.bind(n,26257))},26257:function(e,t,n){"use strict";n.r(t);var r=n(9268),a=n(56008),i=n(86006),c=n(78635),s=n(80937),o=n(44334),l=n(311),d=n(22046),h=n(83192),u=n(23910),g=n(1031),f=n(78915);t.default=()=>{let e=(0,a.useRouter)(),{mode:t}=(0,c.tv)(),n=(0,a.useSearchParams)().get("spacename"),j=(0,a.useSearchParams)().get("documentid"),[m,p]=(0,i.useState)(0),[x,P]=(0,i.useState)(0),[S,_]=(0,i.useState)([]);return(0,i.useEffect)(()=>{(async function(){let e=await (0,f.PR)("/knowledge/".concat(n,"/chunk/list"),{document_id:j,page:1,page_size:20});e.success&&(_(e.data.data),p(e.data.total),P(e.data.page))})()},[]),(0,r.jsxs)("div",{className:"p-4",children:[(0,r.jsx)(s.Z,{direction:"row",justifyContent:"flex-start",alignItems:"center",sx:{marginBottom:"20px"},children:(0,r.jsxs)(o.Z,{"aria-label":"breadcrumbs",children:[(0,r.jsx)(l.Z,{onClick:()=>{e.push("/datastores")},underline:"hover",color:"neutral",fontSize:"inherit",children:"Knowledge Space"},"Knowledge Space"),(0,r.jsx)(l.Z,{onClick:()=>{e.push("/datastores/documents?name=".concat(n))},underline:"hover",color:"neutral",fontSize:"inherit",children:"Documents"},"Knowledge Space"),(0,r.jsx)(d.ZP,{fontSize:"inherit",children:"Chunks"})]})}),(0,r.jsx)("div",{className:"p-4",children:S.length?(0,r.jsxs)(r.Fragment,{children:[(0,r.jsxs)(h.Z,{color:"primary",variant:"plain",size:"lg",sx:{"& tbody tr: hover":{backgroundColor:"light"===t?"rgb(246, 246, 246)":"rgb(33, 33, 40)"},"& tbody tr: hover a":{textDecoration:"underline"}},children:[(0,r.jsx)("thead",{children:(0,r.jsxs)("tr",{children:[(0,r.jsx)("th",{children:"Name"}),(0,r.jsx)("th",{children:"Content"}),(0,r.jsx)("th",{children:"Meta 
Data"})]})}),(0,r.jsx)("tbody",{children:S.map(e=>(0,r.jsxs)("tr",{children:[(0,r.jsx)("td",{children:e.doc_name}),(0,r.jsx)("td",{children:(0,r.jsx)(u.Z,{content:e.content,trigger:"hover",children:e.content.length>10?"".concat(e.content.slice(0,10),"..."):e.content})}),(0,r.jsx)("td",{children:(0,r.jsx)(u.Z,{content:JSON.stringify(e.meta_info||"{}",null,2),trigger:"hover",children:e.meta_info.length>10?"".concat(e.meta_info.slice(0,10),"..."):e.meta_info})})]},e.id))})]}),(0,r.jsx)(s.Z,{direction:"row",justifyContent:"flex-end",sx:{marginTop:"20px"},children:(0,r.jsx)(g.Z,{defaultPageSize:20,showSizeChanger:!1,current:x,total:m,onChange:async e=>{let t=await (0,f.PR)("/knowledge/".concat(n,"/chunk/list"),{document_id:j,page:e,page_size:20});t.success&&(_(t.data.data),p(t.data.total),P(t.data.page))},hideOnSinglePage:!0})})]}):(0,r.jsx)(r.Fragment,{})})]})}},78915:function(e,t,n){"use strict";n.d(t,{Tk:function(){return d},Kw:function(){return h},PR:function(){return u},Ej:function(){return g}});var r=n(21628),a=n(24214),i=n(52040);let c=a.Z.create({baseURL:i.env.API_BASE_URL});c.defaults.timeout=1e4,c.interceptors.response.use(e=>e.data,e=>Promise.reject(e));var s=n(84835);let o={"content-type":"application/json"},l=e=>{if(!(0,s.isPlainObject)(e))return JSON.stringify(e);let t={...e};for(let e in t){let n=t[e];"string"==typeof n&&(t[e]=n.trim())}return JSON.stringify(t)},d=(e,t)=>{if(t){let n=Object.keys(t).filter(e=>void 0!==t[e]&&""!==t[e]).map(e=>"".concat(e,"=").concat(t[e])).join("&");n&&(e+="?".concat(n))}return c.get("/api"+e,{headers:o}).then(e=>e).catch(e=>{r.ZP.error(e),Promise.reject(e)})},h=(e,t)=>{let n=l(t);return c.post("/api"+e,{body:n,headers:o}).then(e=>e).catch(e=>{r.ZP.error(e),Promise.reject(e)})},u=(e,t)=>(l(t),c.post(e,t,{headers:o}).then(e=>e).catch(e=>{r.ZP.error(e),Promise.reject(e)})),g=(e,t)=>c.post(e,t).then(e=>e).catch(e=>{r.ZP.error(e),Promise.reject(e)})}},function(e){e.O(0,[180,110,160,679,144,767,957,253,769,744],function(){return 
e(e.s=40687)}),_N_E=e.O()}]);
@@ -0,0 +1 @@
(self.webpackChunk_N_E=self.webpackChunk_N_E||[]).push([[538],{68463:function(e,t,n){Promise.resolve().then(n.bind(n,26257))},26257:function(e,t,n){"use strict";n.r(t);var r=n(9268),a=n(56008),i=n(86006),c=n(78635),s=n(80937),o=n(44334),l=n(311),d=n(22046),h=n(83192),u=n(23910),g=n(1031),f=n(78915);t.default=()=>{let e=(0,a.useRouter)(),{mode:t}=(0,c.tv)(),n=(0,a.useSearchParams)().get("spacename"),j=(0,a.useSearchParams)().get("documentid"),[m,p]=(0,i.useState)(0),[x,P]=(0,i.useState)(0),[S,_]=(0,i.useState)([]);return(0,i.useEffect)(()=>{(async function(){let e=await (0,f.PR)("/knowledge/".concat(n,"/chunk/list"),{document_id:j,page:1,page_size:20});e.success&&(_(e.data.data),p(e.data.total),P(e.data.page))})()},[]),(0,r.jsxs)("div",{className:"p-4",children:[(0,r.jsx)(s.Z,{direction:"row",justifyContent:"flex-start",alignItems:"center",sx:{marginBottom:"20px"},children:(0,r.jsxs)(o.Z,{"aria-label":"breadcrumbs",children:[(0,r.jsx)(l.Z,{onClick:()=>{e.push("/datastores")},underline:"hover",color:"neutral",fontSize:"inherit",children:"Knowledge Space"},"Knowledge Space"),(0,r.jsx)(l.Z,{onClick:()=>{e.push("/datastores/documents?name=".concat(n))},underline:"hover",color:"neutral",fontSize:"inherit",children:"Documents"},"Knowledge Space"),(0,r.jsx)(d.ZP,{fontSize:"inherit",children:"Chunks"})]})}),(0,r.jsx)("div",{className:"p-4",children:S.length?(0,r.jsxs)(r.Fragment,{children:[(0,r.jsxs)(h.Z,{color:"primary",variant:"plain",size:"lg",sx:{"& tbody tr: hover":{backgroundColor:"light"===t?"rgb(246, 246, 246)":"rgb(33, 33, 40)"},"& tbody tr: hover a":{textDecoration:"underline"}},children:[(0,r.jsx)("thead",{children:(0,r.jsxs)("tr",{children:[(0,r.jsx)("th",{children:"Name"}),(0,r.jsx)("th",{children:"Content"}),(0,r.jsx)("th",{children:"Meta 
Data"})]})}),(0,r.jsx)("tbody",{children:S.map(e=>(0,r.jsxs)("tr",{children:[(0,r.jsx)("td",{children:e.doc_name}),(0,r.jsx)("td",{children:(0,r.jsx)(u.Z,{content:e.content,trigger:"hover",children:e.content.length>10?"".concat(e.content.slice(0,10),"..."):e.content})}),(0,r.jsx)("td",{children:(0,r.jsx)(u.Z,{content:JSON.stringify(e.meta_info||"{}",null,2),trigger:"hover",children:e.meta_info.length>10?"".concat(e.meta_info.slice(0,10),"..."):e.meta_info})})]},e.id))})]}),(0,r.jsx)(s.Z,{direction:"row",justifyContent:"flex-end",sx:{marginTop:"20px"},children:(0,r.jsx)(g.Z,{defaultPageSize:20,showSizeChanger:!1,current:x,total:m,onChange:async e=>{let t=await (0,f.PR)("/knowledge/".concat(n,"/chunk/list"),{document_id:j,page:e,page_size:20});t.success&&(_(t.data.data),p(t.data.total),P(t.data.page))},hideOnSinglePage:!0})})]}):(0,r.jsx)(r.Fragment,{})})]})}},78915:function(e,t,n){"use strict";n.d(t,{Tk:function(){return d},Kw:function(){return h},PR:function(){return u},Ej:function(){return g}});var r=n(21628),a=n(24214),i=n(52040);let c=a.Z.create({baseURL:i.env.API_BASE_URL});c.defaults.timeout=1e4,c.interceptors.response.use(e=>e.data,e=>Promise.reject(e));var s=n(84835);let o={"content-type":"application/json"},l=e=>{if(!(0,s.isPlainObject)(e))return JSON.stringify(e);let t={...e};for(let e in t){let n=t[e];"string"==typeof n&&(t[e]=n.trim())}return JSON.stringify(t)},d=(e,t)=>{if(t){let n=Object.keys(t).filter(e=>void 0!==t[e]&&""!==t[e]).map(e=>"".concat(e,"=").concat(t[e])).join("&");n&&(e+="?".concat(n))}return c.get("/api"+e,{headers:o}).then(e=>e).catch(e=>{r.ZP.error(e),Promise.reject(e)})},h=(e,t)=>{let n=l(t);return c.post("/api"+e,{body:n,headers:o}).then(e=>e).catch(e=>{r.ZP.error(e),Promise.reject(e)})},u=(e,t)=>(l(t),c.post(e,t,{headers:o}).then(e=>e).catch(e=>{r.ZP.error(e),Promise.reject(e)})),g=(e,t)=>c.post(e,t).then(e=>e).catch(e=>{r.ZP.error(e),Promise.reject(e)})}},function(e){e.O(0,[180,838,341,679,144,767,957,253,769,744],function(){return 
e(e.s=68463)}),_N_E=e.O()}]);
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
@@ -0,0 +1 @@
(self.webpackChunk_N_E=self.webpackChunk_N_E||[]).push([[744],{34577:function(e,n,t){Promise.resolve().then(t.t.bind(t,68802,23)),Promise.resolve().then(t.t.bind(t,13211,23)),Promise.resolve().then(t.t.bind(t,5767,23)),Promise.resolve().then(t.t.bind(t,14299,23)),Promise.resolve().then(t.t.bind(t,37396,23))}},function(e){var n=function(n){return e(e.s=n)};e.O(0,[253,769],function(){return n(29070),n(34577)}),_N_E=e.O()}]);
@@ -0,0 +1 @@
(self.webpackChunk_N_E=self.webpackChunk_N_E||[]).push([[744],{72656:function(e,n,t){Promise.resolve().then(t.t.bind(t,68802,23)),Promise.resolve().then(t.t.bind(t,13211,23)),Promise.resolve().then(t.t.bind(t,5767,23)),Promise.resolve().then(t.t.bind(t,14299,23)),Promise.resolve().then(t.t.bind(t,37396,23))}},function(e){var n=function(n){return e(e.s=n)};e.O(0,[253,769],function(){return n(29070),n(72656)}),_N_E=e.O()}]);
@@ -0,0 +1 @@
!function(){"use strict";var e,t,n,r,o,u,i,c,f,a={},l={};function d(e){var t=l[e];if(void 0!==t)return t.exports;var n=l[e]={id:e,loaded:!1,exports:{}},r=!0;try{a[e].call(n.exports,n,n.exports,d),r=!1}finally{r&&delete l[e]}return n.loaded=!0,n.exports}d.m=a,d.amdD=function(){throw Error("define cannot be used indirect")},e=[],d.O=function(t,n,r,o){if(n){o=o||0;for(var u=e.length;u>0&&e[u-1][2]>o;u--)e[u]=e[u-1];e[u]=[n,r,o];return}for(var i=1/0,u=0;u<e.length;u++){for(var n=e[u][0],r=e[u][1],o=e[u][2],c=!0,f=0;f<n.length;f++)i>=o&&Object.keys(d.O).every(function(e){return d.O[e](n[f])})?n.splice(f--,1):(c=!1,o<i&&(i=o));if(c){e.splice(u--,1);var a=r();void 0!==a&&(t=a)}}return t},d.n=function(e){var t=e&&e.__esModule?function(){return e.default}:function(){return e};return d.d(t,{a:t}),t},n=Object.getPrototypeOf?function(e){return Object.getPrototypeOf(e)}:function(e){return e.__proto__},d.t=function(e,r){if(1&r&&(e=this(e)),8&r||"object"==typeof e&&e&&(4&r&&e.__esModule||16&r&&"function"==typeof e.then))return e;var o=Object.create(null);d.r(o);var u={};t=t||[null,n({}),n([]),n(n)];for(var i=2&r&&e;"object"==typeof i&&!~t.indexOf(i);i=n(i))Object.getOwnPropertyNames(i).forEach(function(t){u[t]=function(){return e[t]}});return u.default=function(){return e},d.d(o,u),o},d.d=function(e,t){for(var n in t)d.o(t,n)&&!d.o(e,n)&&Object.defineProperty(e,n,{enumerable:!0,get:t[n]})},d.f={},d.e=function(e){return Promise.all(Object.keys(d.f).reduce(function(t,n){return d.f[n](e,t),t},[]))},d.u=function(e){},d.miniCssF=function(e){return"static/css/70a90cb7ce1e4b6d.css"},d.g=function(){if("object"==typeof globalThis)return globalThis;try{return this||Function("return this")()}catch(e){if("object"==typeof window)return window}}(),d.o=function(e,t){return Object.prototype.hasOwnProperty.call(e,t)},r={},o="_N_E:",d.l=function(e,t,n,u){if(r[e]){r[e].push(t);return}if(void 0!==n)for(var i,c,f=document.getElementsByTagName("script"),a=0;a<f.length;a++){var 
l=f[a];if(l.getAttribute("src")==e||l.getAttribute("data-webpack")==o+n){i=l;break}}i||(c=!0,(i=document.createElement("script")).charset="utf-8",i.timeout=120,d.nc&&i.setAttribute("nonce",d.nc),i.setAttribute("data-webpack",o+n),i.src=d.tu(e)),r[e]=[t];var s=function(t,n){i.onerror=i.onload=null,clearTimeout(p);var o=r[e];if(delete r[e],i.parentNode&&i.parentNode.removeChild(i),o&&o.forEach(function(e){return e(n)}),t)return t(n)},p=setTimeout(s.bind(null,void 0,{type:"timeout",target:i}),12e4);i.onerror=s.bind(null,i.onerror),i.onload=s.bind(null,i.onload),c&&document.head.appendChild(i)},d.r=function(e){"undefined"!=typeof Symbol&&Symbol.toStringTag&&Object.defineProperty(e,Symbol.toStringTag,{value:"Module"}),Object.defineProperty(e,"__esModule",{value:!0})},d.nmd=function(e){return e.paths=[],e.children||(e.children=[]),e},d.tt=function(){return void 0===u&&(u={createScriptURL:function(e){return e}},"undefined"!=typeof trustedTypes&&trustedTypes.createPolicy&&(u=trustedTypes.createPolicy("nextjs#bundler",u))),u},d.tu=function(e){return d.tt().createScriptURL(e)},d.p="/_next/",i={272:0},d.f.j=function(e,t){var n=d.o(i,e)?i[e]:void 0;if(0!==n){if(n)t.push(n[2]);else if(272!=e){var r=new Promise(function(t,r){n=i[e]=[t,r]});t.push(n[2]=r);var o=d.p+d.u(e),u=Error();d.l(o,function(t){if(d.o(i,e)&&(0!==(n=i[e])&&(i[e]=void 0),n)){var r=t&&("load"===t.type?"missing":t.type),o=t&&t.target&&t.target.src;u.message="Loading chunk "+e+" failed.\n("+r+": "+o+")",u.name="ChunkLoadError",u.type=r,u.request=o,n[1](u)}},"chunk-"+e,e)}else i[e]=0}},d.O.j=function(e){return 0===i[e]},c=function(e,t){var n,r,o=t[0],u=t[1],c=t[2],f=0;if(o.some(function(e){return 0!==i[e]})){for(n in u)d.o(u,n)&&(d.m[n]=u[n]);if(c)var a=c(d)}for(e&&e(t);f<o.length;f++)r=o[f],d.o(i,r)&&i[r]&&i[r][0](),i[r]=0;return d.O(a)},(f=self.webpackChunk_N_E=self.webpackChunk_N_E||[]).forEach(c.bind(null,0)),f.push=c.bind(null,f.push.bind(f))}();
File diff suppressed because one or more lines are too long
@@ -0,0 +1 @@
self.__BUILD_MANIFEST={__rewrites:{beforeFiles:[],afterFiles:[],fallback:[]},"/_error":["static/chunks/pages/_error-f5357f382422dd96.js"],sortedPages:["/_app","/_error"]},self.__BUILD_MANIFEST_CB&&self.__BUILD_MANIFEST_CB();
@@ -0,0 +1 @@
self.__SSG_MANIFEST=new Set([]);self.__SSG_MANIFEST_CB&&self.__SSG_MANIFEST_CB()
File diff suppressed because one or more lines are too long
@@ -1,9 +1,9 @@
1:HL["/_next/static/css/1c53d4eca82e2bb3.css",{"as":"style"}]
0:["Cow4Dk3Cb5ywOteYPWBYm",[[["",{"children":["chat",{"children":["__PAGE__",{}]}]},"$undefined","$undefined",true],"$L2",[[["$","link","0",{"rel":"stylesheet","href":"/_next/static/css/1c53d4eca82e2bb3.css","precedence":"next"}]],["$L3",null]]]]]
4:I{"id":"50902","chunks":["180:static/chunks/0e02fca3-615d0d51fa074d92.js","110:static/chunks/110-470e5d8a0cb4cf14.js","60:static/chunks/60-8ef99caef9fdf742.js","160:static/chunks/160-ba31b9436f6470d2.js","316:static/chunks/316-370750739484dff7.js","946:static/chunks/946-3a66ddfd20b8ad3d.js","144:static/chunks/144-8e8590698005aba2.js","751:static/chunks/751-30fee9a32c6e64a2.js","256:static/chunks/256-f82130fbef33c4d6.js","185:static/chunks/app/layout-34c784bda079f18d.js"],"name":"","async":false}
5:I{"id":"13211","chunks":["272:static/chunks/webpack-81b9e46a3f1e5c68.js","253:static/chunks/bce60fc1-18c9f145b45d8f36.js","769:static/chunks/769-76f7aafd375fdd6b.js"],"name":"","async":false}
6:I{"id":"5767","chunks":["272:static/chunks/webpack-81b9e46a3f1e5c68.js","253:static/chunks/bce60fc1-18c9f145b45d8f36.js","769:static/chunks/769-76f7aafd375fdd6b.js"],"name":"","async":false}
7:I{"id":"37396","chunks":["272:static/chunks/webpack-81b9e46a3f1e5c68.js","253:static/chunks/bce60fc1-18c9f145b45d8f36.js","769:static/chunks/769-76f7aafd375fdd6b.js"],"name":"","async":false}
8:I{"id":"65641","chunks":["180:static/chunks/0e02fca3-615d0d51fa074d92.js","757:static/chunks/f60284a2-6891068c9ea7ce77.js","282:static/chunks/7e4358a0-8f10c290d655cdf1.js","110:static/chunks/110-470e5d8a0cb4cf14.js","60:static/chunks/60-8ef99caef9fdf742.js","86:static/chunks/86-6193a530bd8e3ef4.js","316:static/chunks/316-370750739484dff7.js","790:static/chunks/790-97e6b769f5c791cb.js","259:static/chunks/259-2c3490a9eca2f411.js","767:static/chunks/767-b93280f4b5b5e975.js","751:static/chunks/751-30fee9a32c6e64a2.js","436:static/chunks/436-0a7be5b31482f8e8.js","929:static/chunks/app/chat/page-fa8f6230bc48190e.js"],"name":"","async":false}
1:HL["/_next/static/css/70a90cb7ce1e4b6d.css",{"as":"style"}]
0:["AVF7sR15c1tF8wuv8mGBK",[[["",{"children":["chat",{"children":["__PAGE__",{}]}]},"$undefined","$undefined",true],"$L2",[[["$","link","0",{"rel":"stylesheet","href":"/_next/static/css/70a90cb7ce1e4b6d.css","precedence":"next"}]],["$L3",null]]]]]
4:I{"id":"50902","chunks":["180:static/chunks/0e02fca3-615d0d51fa074d92.js","838:static/chunks/838-25c9b71d449c8910.js","60:static/chunks/60-8ef99caef9fdf742.js","341:static/chunks/341-c3312a204c5835b8.js","144:static/chunks/144-8e8590698005aba2.js","316:static/chunks/316-370750739484dff7.js","946:static/chunks/946-3a66ddfd20b8ad3d.js","394:static/chunks/394-0ffa189aa535d3eb.js","751:static/chunks/751-30fee9a32c6e64a2.js","256:static/chunks/256-f82130fbef33c4d6.js","185:static/chunks/app/layout-2a5db76cf415780f.js"],"name":"","async":false}
5:I{"id":"13211","chunks":["272:static/chunks/webpack-e0b549c3ec4ce91b.js","253:static/chunks/bce60fc1-18c9f145b45d8f36.js","769:static/chunks/769-76f7aafd375fdd6b.js"],"name":"","async":false}
6:I{"id":"5767","chunks":["272:static/chunks/webpack-e0b549c3ec4ce91b.js","253:static/chunks/bce60fc1-18c9f145b45d8f36.js","769:static/chunks/769-76f7aafd375fdd6b.js"],"name":"","async":false}
7:I{"id":"37396","chunks":["272:static/chunks/webpack-e0b549c3ec4ce91b.js","253:static/chunks/bce60fc1-18c9f145b45d8f36.js","769:static/chunks/769-76f7aafd375fdd6b.js"],"name":"","async":false}
8:I{"id":"65641","chunks":["180:static/chunks/0e02fca3-615d0d51fa074d92.js","757:static/chunks/f60284a2-6891068c9ea7ce77.js","282:static/chunks/7e4358a0-8f10c290d655cdf1.js","838:static/chunks/838-25c9b71d449c8910.js","60:static/chunks/60-8ef99caef9fdf742.js","86:static/chunks/86-6193a530bd8e3ef4.js","316:static/chunks/316-370750739484dff7.js","790:static/chunks/790-97e6b769f5c791cb.js","767:static/chunks/767-b93280f4b5b5e975.js","259:static/chunks/259-2c3490a9eca2f411.js","751:static/chunks/751-30fee9a32c6e64a2.js","992:static/chunks/992-f088fd7821baa330.js","929:static/chunks/app/chat/page-4a580c13b269a988.js"],"name":"","async":false}
2:[["$","$L4",null,{"children":["$","$L5",null,{"parallelRouterKey":"children","segmentPath":["children"],"error":"$undefined","errorStyles":"$undefined","loading":"$undefined","loadingStyles":"$undefined","hasLoading":false,"template":["$","$L6",null,{}],"templateStyles":"$undefined","notFound":"$undefined","notFoundStyles":"$undefined","childProp":{"current":["$","$L5",null,{"parallelRouterKey":"children","segmentPath":["children","chat","children"],"error":"$undefined","errorStyles":"$undefined","loading":"$undefined","loadingStyles":"$undefined","hasLoading":false,"template":["$","$L6",null,{}],"templateStyles":"$undefined","notFound":"$undefined","notFoundStyles":"$undefined","childProp":{"current":[["$","$L7",null,{"propsForComponent":{"params":{}},"Component":"$8"}],null],"segment":"__PAGE__"},"styles":[]}],"segment":"chat"},"styles":[]}],"params":{}}],null]
3:[["$","meta","0",{"charSet":"utf-8"}],["$","meta","1",{"name":"viewport","content":"width=device-width, initial-scale=1"}],["$","link","2",{"rel":"icon","href":"/favicon.ico","type":"image/x-icon","sizes":"any"}]]
File diff suppressed because one or more lines are too long
@@ -1,9 +1,9 @@
1:HL["/_next/static/css/1c53d4eca82e2bb3.css",{"as":"style"}]
0:["Cow4Dk3Cb5ywOteYPWBYm",[[["",{"children":["datastores",{"children":["documents",{"children":["chunklist",{"children":["__PAGE__",{}]}]}]}]},"$undefined","$undefined",true],"$L2",[[["$","link","0",{"rel":"stylesheet","href":"/_next/static/css/1c53d4eca82e2bb3.css","precedence":"next"}]],["$L3",null]]]]]
4:I{"id":"50902","chunks":["180:static/chunks/0e02fca3-615d0d51fa074d92.js","110:static/chunks/110-470e5d8a0cb4cf14.js","60:static/chunks/60-8ef99caef9fdf742.js","160:static/chunks/160-ba31b9436f6470d2.js","316:static/chunks/316-370750739484dff7.js","946:static/chunks/946-3a66ddfd20b8ad3d.js","144:static/chunks/144-8e8590698005aba2.js","751:static/chunks/751-30fee9a32c6e64a2.js","256:static/chunks/256-f82130fbef33c4d6.js","185:static/chunks/app/layout-34c784bda079f18d.js"],"name":"","async":false}
5:I{"id":"13211","chunks":["272:static/chunks/webpack-81b9e46a3f1e5c68.js","253:static/chunks/bce60fc1-18c9f145b45d8f36.js","769:static/chunks/769-76f7aafd375fdd6b.js"],"name":"","async":false}
6:I{"id":"5767","chunks":["272:static/chunks/webpack-81b9e46a3f1e5c68.js","253:static/chunks/bce60fc1-18c9f145b45d8f36.js","769:static/chunks/769-76f7aafd375fdd6b.js"],"name":"","async":false}
7:I{"id":"37396","chunks":["272:static/chunks/webpack-81b9e46a3f1e5c68.js","253:static/chunks/bce60fc1-18c9f145b45d8f36.js","769:static/chunks/769-76f7aafd375fdd6b.js"],"name":"","async":false}
8:I{"id":"26257","chunks":["180:static/chunks/0e02fca3-615d0d51fa074d92.js","110:static/chunks/110-470e5d8a0cb4cf14.js","160:static/chunks/160-ba31b9436f6470d2.js","679:static/chunks/679-2432e2fce32149a4.js","144:static/chunks/144-8e8590698005aba2.js","767:static/chunks/767-b93280f4b5b5e975.js","957:static/chunks/957-80662c0af3fc4d0d.js","538:static/chunks/app/datastores/documents/chunklist/page-1fa22911a9476f41.js"],"name":"","async":false}
1:HL["/_next/static/css/70a90cb7ce1e4b6d.css",{"as":"style"}]
0:["AVF7sR15c1tF8wuv8mGBK",[[["",{"children":["datastores",{"children":["documents",{"children":["chunklist",{"children":["__PAGE__",{}]}]}]}]},"$undefined","$undefined",true],"$L2",[[["$","link","0",{"rel":"stylesheet","href":"/_next/static/css/70a90cb7ce1e4b6d.css","precedence":"next"}]],["$L3",null]]]]]
4:I{"id":"50902","chunks":["180:static/chunks/0e02fca3-615d0d51fa074d92.js","838:static/chunks/838-25c9b71d449c8910.js","60:static/chunks/60-8ef99caef9fdf742.js","341:static/chunks/341-c3312a204c5835b8.js","144:static/chunks/144-8e8590698005aba2.js","316:static/chunks/316-370750739484dff7.js","946:static/chunks/946-3a66ddfd20b8ad3d.js","394:static/chunks/394-0ffa189aa535d3eb.js","751:static/chunks/751-30fee9a32c6e64a2.js","256:static/chunks/256-f82130fbef33c4d6.js","185:static/chunks/app/layout-2a5db76cf415780f.js"],"name":"","async":false}
5:I{"id":"13211","chunks":["272:static/chunks/webpack-e0b549c3ec4ce91b.js","253:static/chunks/bce60fc1-18c9f145b45d8f36.js","769:static/chunks/769-76f7aafd375fdd6b.js"],"name":"","async":false}
6:I{"id":"5767","chunks":["272:static/chunks/webpack-e0b549c3ec4ce91b.js","253:static/chunks/bce60fc1-18c9f145b45d8f36.js","769:static/chunks/769-76f7aafd375fdd6b.js"],"name":"","async":false}
7:I{"id":"37396","chunks":["272:static/chunks/webpack-e0b549c3ec4ce91b.js","253:static/chunks/bce60fc1-18c9f145b45d8f36.js","769:static/chunks/769-76f7aafd375fdd6b.js"],"name":"","async":false}
8:I{"id":"26257","chunks":["180:static/chunks/0e02fca3-615d0d51fa074d92.js","838:static/chunks/838-25c9b71d449c8910.js","341:static/chunks/341-c3312a204c5835b8.js","679:static/chunks/679-2432e2fce32149a4.js","144:static/chunks/144-8e8590698005aba2.js","767:static/chunks/767-b93280f4b5b5e975.js","957:static/chunks/957-80662c0af3fc4d0d.js","538:static/chunks/app/datastores/documents/chunklist/page-76d75e816f549f8a.js"],"name":"","async":false}
2:[["$","$L4",null,{"children":["$","$L5",null,{"parallelRouterKey":"children","segmentPath":["children"],"error":"$undefined","errorStyles":"$undefined","loading":"$undefined","loadingStyles":"$undefined","hasLoading":false,"template":["$","$L6",null,{}],"templateStyles":"$undefined","notFound":"$undefined","notFoundStyles":"$undefined","childProp":{"current":["$","$L5",null,{"parallelRouterKey":"children","segmentPath":["children","datastores","children"],"error":"$undefined","errorStyles":"$undefined","loading":"$undefined","loadingStyles":"$undefined","hasLoading":false,"template":["$","$L6",null,{}],"templateStyles":"$undefined","notFound":"$undefined","notFoundStyles":"$undefined","childProp":{"current":["$","$L5",null,{"parallelRouterKey":"children","segmentPath":["children","datastores","children","documents","children"],"error":"$undefined","errorStyles":"$undefined","loading":"$undefined","loadingStyles":"$undefined","hasLoading":false,"template":["$","$L6",null,{}],"templateStyles":"$undefined","notFound":"$undefined","notFoundStyles":"$undefined","childProp":{"current":["$","$L5",null,{"parallelRouterKey":"children","segmentPath":["children","datastores","children","documents","children","chunklist","children"],"error":"$undefined","errorStyles":"$undefined","loading":"$undefined","loadingStyles":"$undefined","hasLoading":false,"template":["$","$L6",null,{}],"templateStyles":"$undefined","notFound":"$undefined","notFoundStyles":"$undefined","childProp":{"current":[["$","$L7",null,{"propsForComponent":{"params":{}},"Component":"$8"}],null],"segment":"__PAGE__"},"styles":[]}],"segment":"chunklist"},"styles":[]}],"segment":"documents"},"styles":[]}],"segment":"datastores"},"styles":[]}],"params":{}}],null]
3:[["$","meta","0",{"charSet":"utf-8"}],["$","meta","1",{"name":"viewport","content":"width=device-width, initial-scale=1"}],["$","link","2",{"rel":"icon","href":"/favicon.ico","type":"image/x-icon","sizes":"any"}]]
File diff suppressed because one or more lines are too long
@@ -1,9 +1,9 @@
1:HL["/_next/static/css/1c53d4eca82e2bb3.css",{"as":"style"}]
0:["Cow4Dk3Cb5ywOteYPWBYm",[[["",{"children":["datastores",{"children":["documents",{"children":["__PAGE__",{}]}]}]},"$undefined","$undefined",true],"$L2",[[["$","link","0",{"rel":"stylesheet","href":"/_next/static/css/1c53d4eca82e2bb3.css","precedence":"next"}]],["$L3",null]]]]]
4:I{"id":"50902","chunks":["180:static/chunks/0e02fca3-615d0d51fa074d92.js","110:static/chunks/110-470e5d8a0cb4cf14.js","60:static/chunks/60-8ef99caef9fdf742.js","160:static/chunks/160-ba31b9436f6470d2.js","316:static/chunks/316-370750739484dff7.js","946:static/chunks/946-3a66ddfd20b8ad3d.js","144:static/chunks/144-8e8590698005aba2.js","751:static/chunks/751-30fee9a32c6e64a2.js","256:static/chunks/256-f82130fbef33c4d6.js","185:static/chunks/app/layout-34c784bda079f18d.js"],"name":"","async":false}
5:I{"id":"13211","chunks":["272:static/chunks/webpack-81b9e46a3f1e5c68.js","253:static/chunks/bce60fc1-18c9f145b45d8f36.js","769:static/chunks/769-76f7aafd375fdd6b.js"],"name":"","async":false}
6:I{"id":"5767","chunks":["272:static/chunks/webpack-81b9e46a3f1e5c68.js","253:static/chunks/bce60fc1-18c9f145b45d8f36.js","769:static/chunks/769-76f7aafd375fdd6b.js"],"name":"","async":false}
7:I{"id":"37396","chunks":["272:static/chunks/webpack-81b9e46a3f1e5c68.js","253:static/chunks/bce60fc1-18c9f145b45d8f36.js","769:static/chunks/769-76f7aafd375fdd6b.js"],"name":"","async":false}
8:I{"id":"42069","chunks":["180:static/chunks/0e02fca3-615d0d51fa074d92.js","110:static/chunks/110-470e5d8a0cb4cf14.js","60:static/chunks/60-8ef99caef9fdf742.js","160:static/chunks/160-ba31b9436f6470d2.js","86:static/chunks/86-6193a530bd8e3ef4.js","679:static/chunks/679-2432e2fce32149a4.js","790:static/chunks/790-97e6b769f5c791cb.js","946:static/chunks/946-3a66ddfd20b8ad3d.js","163:static/chunks/163-59f735b072797bdd.js","470:static/chunks/app/datastores/documents/page-7226571ba18444cc.js"],"name":"","async":false}
1:HL["/_next/static/css/70a90cb7ce1e4b6d.css",{"as":"style"}]
0:["AVF7sR15c1tF8wuv8mGBK",[[["",{"children":["datastores",{"children":["documents",{"children":["__PAGE__",{}]}]}]},"$undefined","$undefined",true],"$L2",[[["$","link","0",{"rel":"stylesheet","href":"/_next/static/css/70a90cb7ce1e4b6d.css","precedence":"next"}]],["$L3",null]]]]]
4:I{"id":"50902","chunks":["180:static/chunks/0e02fca3-615d0d51fa074d92.js","838:static/chunks/838-25c9b71d449c8910.js","60:static/chunks/60-8ef99caef9fdf742.js","341:static/chunks/341-c3312a204c5835b8.js","144:static/chunks/144-8e8590698005aba2.js","316:static/chunks/316-370750739484dff7.js","946:static/chunks/946-3a66ddfd20b8ad3d.js","394:static/chunks/394-0ffa189aa535d3eb.js","751:static/chunks/751-30fee9a32c6e64a2.js","256:static/chunks/256-f82130fbef33c4d6.js","185:static/chunks/app/layout-2a5db76cf415780f.js"],"name":"","async":false}
5:I{"id":"13211","chunks":["272:static/chunks/webpack-e0b549c3ec4ce91b.js","253:static/chunks/bce60fc1-18c9f145b45d8f36.js","769:static/chunks/769-76f7aafd375fdd6b.js"],"name":"","async":false}
6:I{"id":"5767","chunks":["272:static/chunks/webpack-e0b549c3ec4ce91b.js","253:static/chunks/bce60fc1-18c9f145b45d8f36.js","769:static/chunks/769-76f7aafd375fdd6b.js"],"name":"","async":false}
7:I{"id":"37396","chunks":["272:static/chunks/webpack-e0b549c3ec4ce91b.js","253:static/chunks/bce60fc1-18c9f145b45d8f36.js","769:static/chunks/769-76f7aafd375fdd6b.js"],"name":"","async":false}
8:I{"id":"16692","chunks":["180:static/chunks/0e02fca3-615d0d51fa074d92.js","550:static/chunks/925f3d25-1af7259455ef26bd.js","838:static/chunks/838-25c9b71d449c8910.js","60:static/chunks/60-8ef99caef9fdf742.js","341:static/chunks/341-c3312a204c5835b8.js","86:static/chunks/86-6193a530bd8e3ef4.js","679:static/chunks/679-2432e2fce32149a4.js","144:static/chunks/144-8e8590698005aba2.js","790:static/chunks/790-97e6b769f5c791cb.js","946:static/chunks/946-3a66ddfd20b8ad3d.js","767:static/chunks/767-b93280f4b5b5e975.js","957:static/chunks/957-80662c0af3fc4d0d.js","775:static/chunks/775-224c8c8f5ee3fd65.js","470:static/chunks/app/datastores/documents/page-5386a639d658c30c.js"],"name":"","async":false}
2:[["$","$L4",null,{"children":["$","$L5",null,{"parallelRouterKey":"children","segmentPath":["children"],"error":"$undefined","errorStyles":"$undefined","loading":"$undefined","loadingStyles":"$undefined","hasLoading":false,"template":["$","$L6",null,{}],"templateStyles":"$undefined","notFound":"$undefined","notFoundStyles":"$undefined","childProp":{"current":["$","$L5",null,{"parallelRouterKey":"children","segmentPath":["children","datastores","children"],"error":"$undefined","errorStyles":"$undefined","loading":"$undefined","loadingStyles":"$undefined","hasLoading":false,"template":["$","$L6",null,{}],"templateStyles":"$undefined","notFound":"$undefined","notFoundStyles":"$undefined","childProp":{"current":["$","$L5",null,{"parallelRouterKey":"children","segmentPath":["children","datastores","children","documents","children"],"error":"$undefined","errorStyles":"$undefined","loading":"$undefined","loadingStyles":"$undefined","hasLoading":false,"template":["$","$L6",null,{}],"templateStyles":"$undefined","notFound":"$undefined","notFoundStyles":"$undefined","childProp":{"current":[["$","$L7",null,{"propsForComponent":{"params":{}},"Component":"$8"}],null],"segment":"__PAGE__"},"styles":[]}],"segment":"documents"},"styles":[]}],"segment":"datastores"},"styles":[]}],"params":{}}],null]
3:[["$","meta","0",{"charSet":"utf-8"}],["$","meta","1",{"name":"viewport","content":"width=device-width, initial-scale=1"}],["$","link","2",{"rel":"icon","href":"/favicon.ico","type":"image/x-icon","sizes":"any"}]]
File diff suppressed because one or more lines are too long
@@ -1,9 +1,9 @@
1:HL["/_next/static/css/1c53d4eca82e2bb3.css",{"as":"style"}]
0:["Cow4Dk3Cb5ywOteYPWBYm",[[["",{"children":["datastores",{"children":["__PAGE__",{}]}]},"$undefined","$undefined",true],"$L2",[[["$","link","0",{"rel":"stylesheet","href":"/_next/static/css/1c53d4eca82e2bb3.css","precedence":"next"}]],["$L3",null]]]]]
4:I{"id":"50902","chunks":["180:static/chunks/0e02fca3-615d0d51fa074d92.js","110:static/chunks/110-470e5d8a0cb4cf14.js","60:static/chunks/60-8ef99caef9fdf742.js","160:static/chunks/160-ba31b9436f6470d2.js","316:static/chunks/316-370750739484dff7.js","946:static/chunks/946-3a66ddfd20b8ad3d.js","144:static/chunks/144-8e8590698005aba2.js","751:static/chunks/751-30fee9a32c6e64a2.js","256:static/chunks/256-f82130fbef33c4d6.js","185:static/chunks/app/layout-34c784bda079f18d.js"],"name":"","async":false}
5:I{"id":"13211","chunks":["272:static/chunks/webpack-81b9e46a3f1e5c68.js","253:static/chunks/bce60fc1-18c9f145b45d8f36.js","769:static/chunks/769-76f7aafd375fdd6b.js"],"name":"","async":false}
6:I{"id":"5767","chunks":["272:static/chunks/webpack-81b9e46a3f1e5c68.js","253:static/chunks/bce60fc1-18c9f145b45d8f36.js","769:static/chunks/769-76f7aafd375fdd6b.js"],"name":"","async":false}
7:I{"id":"37396","chunks":["272:static/chunks/webpack-81b9e46a3f1e5c68.js","253:static/chunks/bce60fc1-18c9f145b45d8f36.js","769:static/chunks/769-76f7aafd375fdd6b.js"],"name":"","async":false}
8:I{"id":"44323","chunks":["180:static/chunks/0e02fca3-615d0d51fa074d92.js","110:static/chunks/110-470e5d8a0cb4cf14.js","60:static/chunks/60-8ef99caef9fdf742.js","160:static/chunks/160-ba31b9436f6470d2.js","86:static/chunks/86-6193a530bd8e3ef4.js","679:static/chunks/679-2432e2fce32149a4.js","790:static/chunks/790-97e6b769f5c791cb.js","946:static/chunks/946-3a66ddfd20b8ad3d.js","163:static/chunks/163-59f735b072797bdd.js","43:static/chunks/app/datastores/page-643e5d19222b3bcd.js"],"name":"","async":false}
1:HL["/_next/static/css/70a90cb7ce1e4b6d.css",{"as":"style"}]
0:["AVF7sR15c1tF8wuv8mGBK",[[["",{"children":["datastores",{"children":["__PAGE__",{}]}]},"$undefined","$undefined",true],"$L2",[[["$","link","0",{"rel":"stylesheet","href":"/_next/static/css/70a90cb7ce1e4b6d.css","precedence":"next"}]],["$L3",null]]]]]
4:I{"id":"50902","chunks":["180:static/chunks/0e02fca3-615d0d51fa074d92.js","838:static/chunks/838-25c9b71d449c8910.js","60:static/chunks/60-8ef99caef9fdf742.js","341:static/chunks/341-c3312a204c5835b8.js","144:static/chunks/144-8e8590698005aba2.js","316:static/chunks/316-370750739484dff7.js","946:static/chunks/946-3a66ddfd20b8ad3d.js","394:static/chunks/394-0ffa189aa535d3eb.js","751:static/chunks/751-30fee9a32c6e64a2.js","256:static/chunks/256-f82130fbef33c4d6.js","185:static/chunks/app/layout-2a5db76cf415780f.js"],"name":"","async":false}
5:I{"id":"13211","chunks":["272:static/chunks/webpack-e0b549c3ec4ce91b.js","253:static/chunks/bce60fc1-18c9f145b45d8f36.js","769:static/chunks/769-76f7aafd375fdd6b.js"],"name":"","async":false}
6:I{"id":"5767","chunks":["272:static/chunks/webpack-e0b549c3ec4ce91b.js","253:static/chunks/bce60fc1-18c9f145b45d8f36.js","769:static/chunks/769-76f7aafd375fdd6b.js"],"name":"","async":false}
7:I{"id":"37396","chunks":["272:static/chunks/webpack-e0b549c3ec4ce91b.js","253:static/chunks/bce60fc1-18c9f145b45d8f36.js","769:static/chunks/769-76f7aafd375fdd6b.js"],"name":"","async":false}
8:I{"id":"44323","chunks":["180:static/chunks/0e02fca3-615d0d51fa074d92.js","838:static/chunks/838-25c9b71d449c8910.js","60:static/chunks/60-8ef99caef9fdf742.js","341:static/chunks/341-c3312a204c5835b8.js","86:static/chunks/86-6193a530bd8e3ef4.js","679:static/chunks/679-2432e2fce32149a4.js","790:static/chunks/790-97e6b769f5c791cb.js","946:static/chunks/946-3a66ddfd20b8ad3d.js","775:static/chunks/775-224c8c8f5ee3fd65.js","43:static/chunks/app/datastores/page-6193a6580da1c259.js"],"name":"","async":false}
2:[["$","$L4",null,{"children":["$","$L5",null,{"parallelRouterKey":"children","segmentPath":["children"],"error":"$undefined","errorStyles":"$undefined","loading":"$undefined","loadingStyles":"$undefined","hasLoading":false,"template":["$","$L6",null,{}],"templateStyles":"$undefined","notFound":"$undefined","notFoundStyles":"$undefined","childProp":{"current":["$","$L5",null,{"parallelRouterKey":"children","segmentPath":["children","datastores","children"],"error":"$undefined","errorStyles":"$undefined","loading":"$undefined","loadingStyles":"$undefined","hasLoading":false,"template":["$","$L6",null,{}],"templateStyles":"$undefined","notFound":"$undefined","notFoundStyles":"$undefined","childProp":{"current":[["$","$L7",null,{"propsForComponent":{"params":{}},"Component":"$8"}],null],"segment":"__PAGE__"},"styles":[]}],"segment":"datastores"},"styles":[]}],"params":{}}],null]
3:[["$","meta","0",{"charSet":"utf-8"}],["$","meta","1",{"name":"viewport","content":"width=device-width, initial-scale=1"}],["$","link","2",{"rel":"icon","href":"/favicon.ico","type":"image/x-icon","sizes":"any"}]]
File diff suppressed because one or more lines are too long
@ -1,9 +1,9 @@
1:HL["/_next/static/css/1c53d4eca82e2bb3.css",{"as":"style"}]
0:["Cow4Dk3Cb5ywOteYPWBYm",[[["",{"children":["__PAGE__",{}]},"$undefined","$undefined",true],"$L2",[[["$","link","0",{"rel":"stylesheet","href":"/_next/static/css/1c53d4eca82e2bb3.css","precedence":"next"}]],["$L3",null]]]]]
4:I{"id":"50902","chunks":["180:static/chunks/0e02fca3-615d0d51fa074d92.js","110:static/chunks/110-470e5d8a0cb4cf14.js","60:static/chunks/60-8ef99caef9fdf742.js","160:static/chunks/160-ba31b9436f6470d2.js","316:static/chunks/316-370750739484dff7.js","946:static/chunks/946-3a66ddfd20b8ad3d.js","144:static/chunks/144-8e8590698005aba2.js","751:static/chunks/751-30fee9a32c6e64a2.js","256:static/chunks/256-f82130fbef33c4d6.js","185:static/chunks/app/layout-34c784bda079f18d.js"],"name":"","async":false}
5:I{"id":"13211","chunks":["272:static/chunks/webpack-81b9e46a3f1e5c68.js","253:static/chunks/bce60fc1-18c9f145b45d8f36.js","769:static/chunks/769-76f7aafd375fdd6b.js"],"name":"","async":false}
6:I{"id":"5767","chunks":["272:static/chunks/webpack-81b9e46a3f1e5c68.js","253:static/chunks/bce60fc1-18c9f145b45d8f36.js","769:static/chunks/769-76f7aafd375fdd6b.js"],"name":"","async":false}
7:I{"id":"37396","chunks":["272:static/chunks/webpack-81b9e46a3f1e5c68.js","253:static/chunks/bce60fc1-18c9f145b45d8f36.js","769:static/chunks/769-76f7aafd375fdd6b.js"],"name":"","async":false}
8:I{"id":"26925","chunks":["180:static/chunks/0e02fca3-615d0d51fa074d92.js","110:static/chunks/110-470e5d8a0cb4cf14.js","60:static/chunks/60-8ef99caef9fdf742.js","86:static/chunks/86-6193a530bd8e3ef4.js","316:static/chunks/316-370750739484dff7.js","259:static/chunks/259-2c3490a9eca2f411.js","931:static/chunks/app/page-d81704e0a3437383.js"],"name":"","async":false}
1:HL["/_next/static/css/70a90cb7ce1e4b6d.css",{"as":"style"}]
0:["AVF7sR15c1tF8wuv8mGBK",[[["",{"children":["__PAGE__",{}]},"$undefined","$undefined",true],"$L2",[[["$","link","0",{"rel":"stylesheet","href":"/_next/static/css/70a90cb7ce1e4b6d.css","precedence":"next"}]],["$L3",null]]]]]
4:I{"id":"50902","chunks":["180:static/chunks/0e02fca3-615d0d51fa074d92.js","838:static/chunks/838-25c9b71d449c8910.js","60:static/chunks/60-8ef99caef9fdf742.js","341:static/chunks/341-c3312a204c5835b8.js","144:static/chunks/144-8e8590698005aba2.js","316:static/chunks/316-370750739484dff7.js","946:static/chunks/946-3a66ddfd20b8ad3d.js","394:static/chunks/394-0ffa189aa535d3eb.js","751:static/chunks/751-30fee9a32c6e64a2.js","256:static/chunks/256-f82130fbef33c4d6.js","185:static/chunks/app/layout-2a5db76cf415780f.js"],"name":"","async":false}
5:I{"id":"13211","chunks":["272:static/chunks/webpack-e0b549c3ec4ce91b.js","253:static/chunks/bce60fc1-18c9f145b45d8f36.js","769:static/chunks/769-76f7aafd375fdd6b.js"],"name":"","async":false}
6:I{"id":"5767","chunks":["272:static/chunks/webpack-e0b549c3ec4ce91b.js","253:static/chunks/bce60fc1-18c9f145b45d8f36.js","769:static/chunks/769-76f7aafd375fdd6b.js"],"name":"","async":false}
7:I{"id":"37396","chunks":["272:static/chunks/webpack-e0b549c3ec4ce91b.js","253:static/chunks/bce60fc1-18c9f145b45d8f36.js","769:static/chunks/769-76f7aafd375fdd6b.js"],"name":"","async":false}
8:I{"id":"26925","chunks":["180:static/chunks/0e02fca3-615d0d51fa074d92.js","838:static/chunks/838-25c9b71d449c8910.js","60:static/chunks/60-8ef99caef9fdf742.js","86:static/chunks/86-6193a530bd8e3ef4.js","316:static/chunks/316-370750739484dff7.js","259:static/chunks/259-2c3490a9eca2f411.js","394:static/chunks/394-0ffa189aa535d3eb.js","931:static/chunks/app/page-eda7ab88dcc52057.js"],"name":"","async":false}
2:[["$","$L4",null,{"children":["$","$L5",null,{"parallelRouterKey":"children","segmentPath":["children"],"error":"$undefined","errorStyles":"$undefined","loading":"$undefined","loadingStyles":"$undefined","hasLoading":false,"template":["$","$L6",null,{}],"templateStyles":"$undefined","notFound":"$undefined","notFoundStyles":"$undefined","childProp":{"current":[["$","$L7",null,{"propsForComponent":{"params":{}},"Component":"$8"}],null],"segment":"__PAGE__"},"styles":[]}],"params":{}}],null]
3:[["$","meta","0",{"charSet":"utf-8"}],["$","meta","1",{"name":"viewport","content":"width=device-width, initial-scale=1"}],["$","link","2",{"rel":"icon","href":"/favicon.ico","type":"image/x-icon","sizes":"any"}]]
@ -1,8 +1,9 @@
 from pilot.vector_store.chroma_store import ChromaStore
 from pilot.vector_store.milvus_store import MilvusStore
+from pilot.vector_store.weaviate_store import WeaviateStore

-connector = {"Chroma": ChromaStore, "Milvus": MilvusStore}
+connector = {"Chroma": ChromaStore, "Milvus": MilvusStore, "Weaviate": WeaviateStore}


 class VectorStoreConnector:
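The hunk above registers `WeaviateStore` in the `connector` lookup table, the registry-dispatch pattern DB-GPT uses to pick a vector store by name. A minimal, self-contained sketch of that pattern follows; the store classes and the `ctx` argument here are simplified stand-ins for illustration, not the real DB-GPT implementations:

```python
# Stand-in store classes; the real ones live in pilot/vector_store/.
class ChromaStore:
    def __init__(self, ctx):
        self.ctx = ctx

class MilvusStore:
    def __init__(self, ctx):
        self.ctx = ctx

class WeaviateStore:
    def __init__(self, ctx):
        self.ctx = ctx

# Registry mapping a store-type name to its implementing class,
# mirroring the dict changed in the diff above.
connector = {"Chroma": ChromaStore, "Milvus": MilvusStore, "Weaviate": WeaviateStore}

class VectorStoreConnector:
    """Dispatch to a concrete vector store implementation by name."""

    def __init__(self, vector_store_type, ctx):
        if vector_store_type not in connector:
            raise ValueError(f"unsupported vector store: {vector_store_type}")
        # Look the class up in the registry and instantiate it.
        self.client = connector[vector_store_type](ctx)

store = VectorStoreConnector("Weaviate", ctx={"host": "localhost"})
print(type(store.client).__name__)  # → WeaviateStore
```

Adding a new backend then only requires one import and one registry entry, with no changes to `VectorStoreConnector` itself.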
@ -28,6 +28,7 @@ pyyaml==6.0
 tokenizers==0.13.2
 tqdm==4.64.1
 transformers==4.30.0
+transformers_stream_generator
 timm==0.6.13
 spacy==3.5.3
 webdataset==0.2.48
@ -68,7 +69,7 @@ distro
 pypdf
 weaviate-client

-# databse
+# database

 pymysql
 duckdb