docs: SMMF introduction and usage (#878)
Co-authored-by: junewgl <1965259211@qq.com>
@@ -190,6 +190,13 @@ TONGYI_PROXY_API_KEY={your-tongyi-sk}
#BAICHUAN_PROXY_API_KEY={your-baichuan-sk}
#BAICHUAN_PROXY_API_SECRET={your-baichuan-sct}

# Xunfei Spark
#XUNFEI_SPARK_API_VERSION={version}
#XUNFEI_SPARK_APPID={your_app_id}
#XUNFEI_SPARK_API_KEY={your_api_key}
#XUNFEI_SPARK_API_SECRET={your_api_secret}

#*******************************************************************#
#** SUMMARY_CONFIG **#
.gitignore (vendored, 1 change)
@@ -9,7 +9,6 @@ __pycache__/
message/

.env
.idea
.vscode
.idea
.chroma
@@ -10,7 +10,8 @@ git clone https://github.com/<YOUR-GITHUB-USERNAME>/DB-GPT
```
3. Install the project requirements
```
pip install -r requirements/dev-requirements.txt
pip install -e ".[default]"
```
4. Install pre-commit hooks
```
@@ -1,3 +1,3 @@
include README.md
include LICENSE
include README.md
include requirements.txt
@@ -102,7 +102,11 @@ At present, we have introduced several key features to showcase our current capabilities

  We offer extensive model support, including dozens of large language models (LLMs) from both open-source projects and API providers, such as LLaMA/LLaMA2, Baichuan, ChatGLM, Wenxin, Tongyi, Zhipu, and many more.

  - [Current Supported LLMs](http://docs.dbgpt.site/docs/modules/smmf)
  - News
    - 🔥🔥🔥 [qwen-72b-chat](https://huggingface.co/Qwen/Qwen-72B-Chat)
    - 🔥🔥🔥 [Yi-34B-Chat](https://huggingface.co/01-ai/Yi-34B-Chat)
  - [More Supported LLMs](http://docs.dbgpt.site/docs/modules/smmf)

- **Privacy and Security**

  We ensure the privacy and security of data through various technologies, including privatized large models and proxy de-identification.
README.zh.md (50 changes)
@@ -114,27 +114,10 @@ DB-GPT是一个开源的数据库领域大模型框架。目的是构建大模
Extensive model support: dozens of large language models, both open-source and API proxies, such as LLaMA/LLaMA2, Baichuan, ChatGLM, Wenxin, Tongyi, Zhipu, and more. The following models are currently supported:

- [Vicuna](https://huggingface.co/Tribbiani/vicuna-13b)
- [vicuna-13b-v1.5](https://huggingface.co/lmsys/vicuna-13b-v1.5)
- [LLama2](https://huggingface.co/meta-llama/Llama-2-7b-chat-hf)
- [baichuan2-13b](https://huggingface.co/baichuan-inc/Baichuan2-13B-Chat)
- [baichuan2-7b](https://huggingface.co/baichuan-inc/Baichuan2-7B-Chat)
- [chatglm-6b](https://huggingface.co/THUDM/chatglm-6b)
- [chatglm2-6b](https://huggingface.co/THUDM/chatglm2-6b)
- [chatglm3-6b](https://huggingface.co/THUDM/chatglm3-6b)
- [falcon-40b](https://huggingface.co/tiiuae/falcon-40b)
- [internlm-chat-7b](https://huggingface.co/internlm/internlm-chat-7b)
- [internlm-chat-20b](https://huggingface.co/internlm/internlm-chat-20b)
- [qwen-7b-chat](https://huggingface.co/Qwen/Qwen-7B-Chat)
- [qwen-14b-chat](https://huggingface.co/Qwen/Qwen-14B-Chat)
- [qwen-72b-chat](https://huggingface.co/Qwen/Qwen-72B-Chat)
- [wizardlm-13b](https://huggingface.co/WizardLM/WizardLM-13B-V1.2)
- [orca-2-7b](https://huggingface.co/microsoft/Orca-2-7b)
- [orca-2-13b](https://huggingface.co/microsoft/Orca-2-13b)
- [openchat_3.5](https://huggingface.co/openchat/openchat_3.5)
- [zephyr-7b-alpha](https://huggingface.co/HuggingFaceH4/zephyr-7b-alpha)
- [mistral-7b-instruct-v0.1](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.1)
- [Yi-34B-Chat](https://huggingface.co/01-ai/Yi-34B-Chat)
- Newly supported models
  - 🔥🔥🔥 [qwen-72b-chat](https://huggingface.co/Qwen/Qwen-72B-Chat)
  - 🔥🔥🔥 [Yi-34B-Chat](https://huggingface.co/01-ai/Yi-34B-Chat)
- [More open-source models](https://www.yuque.com/eosphoros/dbgpt-docs/iqaaqwriwhp6zslc#qQktR)

- Supported online proxy models
  - [x] [OpenAI·ChatGPT](https://api.openai.com/)
@@ -148,28 +131,8 @@ DB-GPT是一个开源的数据库领域大模型框架。目的是构建大模

Data privacy and security are safeguarded through techniques such as privatized large models and proxy de-identification.

- Supported data sources
- [Supported data sources](https://www.yuque.com/eosphoros/dbgpt-docs/rc4r27ybmdwg9472)

| DataSource | support | Notes |
| ---------- | ------- | ----- |
| [MySQL](https://www.mysql.com/) | Yes | |
| [PostgreSQL](https://www.postgresql.org/) | Yes | |
| [Spark](https://github.com/apache/spark) | Yes | |
| [DuckDB](https://github.com/duckdb/duckdb) | Yes | |
| [Sqlite](https://github.com/sqlite/sqlite) | Yes | |
| [MSSQL](https://github.com/microsoft/mssql-jdbc) | Yes | |
| [ClickHouse](https://github.com/ClickHouse/ClickHouse) | Yes | |
| [Oracle](https://github.com/oracle) | No | TODO |
| [Redis](https://github.com/redis/redis) | No | TODO |
| [MongoDB](https://github.com/mongodb/mongo) | No | TODO |
| [HBase](https://github.com/apache/hbase) | No | TODO |
| [Doris](https://github.com/apache/doris) | No | TODO |
| [DB2](https://github.com/IBM/Db2) | No | TODO |
| [Couchbase](https://github.com/couchbase) | No | TODO |
| [Elasticsearch](https://github.com/elastic/elasticsearch) | No | TODO |
| [OceanBase](https://github.com/OceanBase) | No | TODO |
| [TiDB](https://github.com/pingcap/tidb) | No | TODO |
| [StarRocks](https://github.com/StarRocks/starrocks) | No | TODO |

## Architecture
The overall architecture of DB-GPT is shown in the figure below
@@ -266,6 +229,7 @@ The MIT License (MIT)
- [x] Sqlite
- [x] MSSQL
- [x] ClickHouse
- [x] StarRocks
- [ ] Oracle
- [ ] Redis
- [ ] MongoDB
@@ -276,7 +240,7 @@ The MIT License (MIT)
- [ ] Elasticsearch
- [ ] OceanBase
- [ ] TiDB
- [ ] StarRocks


### Multi-model management and inference optimization
- [x] [Cluster deployment](https://db-gpt.readthedocs.io/en/latest/getting_started/install/cluster/vms/index.html)

(Binary image changed: 197 KiB → 196 KiB)
@@ -1,67 +0,0 @@
csunny:
  name: csunny
  title: Owner
  url: https://github.com/csunny
  image_url: https://github.com/csunny.png

Aries-ckt:
  name: Aries-ckt
  title: Developer
  url: https://github.com/Aries-ckt
  image_url: https://github.com/Aries-ckt.png

yhjun1026:
  name: yhjun1026
  title: Developer
  url: https://github.com/yhjun1026
  image_url: https://github.com/yhjun1026.png

xuyuan23:
  name: Shinexy
  title: Developer
  url: https://github.com/xuyuan23
  image_url: https://github.com/xuyuan23.png

yihong0618:
  name: yihong0618
  title: Developer
  url: https://github.com/yihong0618
  image_url: https://github.com/yihong0618.png

zhanghy-sketchzh:
  name: zhanghy-sketchzh
  title: Developer
  url: https://github.com/zhanghy-sketchzh
  image_url: https://github.com/zhanghy-sketchzh.png

fangyinc:
  name: fangyinc
  title: Developer
  url: https://github.com/fangyinc
  image_url: https://github.com/fangyinc.png

wangzaistone:
  name: wangzaistone
  title: Developer
  url: https://github.com/wangzaistone
  image_url: https://github.com/wangzaistone.png

qutcat1997:
  name: qutcat1997
  title: Developer
  url: https://github.com/qutcat1997
  image_url: https://github.com/qutcat1997.png

Aralhi:
  name: Aralhi
  title: Developer
  url: https://github.com/Aralhi
  image_url: https://github.com/Aralhi.png

Ifffff:
  name: Ifffff
  title: Developer
  url: https://github.com/Ifffff
  image_url: https://github.com/Ifffff.png

@@ -1,8 +0,0 @@
---
slug: welcome
title: Welcome
authors: [csunny, Aries-ckt, yhjun1026, xuyuan23, yihong0618, zhanghy-sketchzh, fangyinc, wangzaistone, qutcat1997, Aralhi, Ifffff]
tags: [eosphoros-ai, DB-GPT, github]
---

[DB-GPT](https://github.com/eosphoros-ai/DB-GPT) is an experimental open-source project that uses localized GPT large models to interact with your data and environment. With this solution, you can be assured that there is no risk of data leakage, and your data is 100% private and secure.
docs/docs/application/advanced_tutorial/smmf.md (new file, 64 additions)
@@ -0,0 +1,64 @@
# SMMF

The DB-GPT project provides service-oriented multi-model management capabilities. Developers interested in the underlying design can read the [SMMF](/docs/modules/smmf) module documentation. Here we focus on how to use multiple LLMs.

This page mainly introduces usage through the web interface. Developers interested in the command line can refer to the [cluster deployment](/docs/installation/model_service/cluster) documentation. Open the DB-GPT-Web frontend service and click `Model Management` to enter the multi-model management interface.

## List Models
Opening the model management interface shows the list of currently deployed models:

<p align="left">
  <img src={'/img/module/model_list.png'} width="720px"/>
</p>

## Use Models
Once a model is deployed, you can switch to it and use it from the multi-model interface.

<p align="left">
  <img src={'/img/module/model_use.png'} width="720px"/>
</p>

## Stop Models
As shown in the figure below, click `Model Management` to open the model list. Select a specific model and click the red `Stop Model` button to stop it.

<p align="left">
  <img src={'/img/module/model_stop.png'} width="720px"/>
</p>

After the model is stopped, the display in the upper right corner changes accordingly.

<p align="left">
  <img src={'/img/module/model_stopped.png'} width="720px"/>
</p>

## Model Deployment

1. Open the web page, click the `model management` button on the left to enter the model list page, click `Create Model` in the upper left corner, and then select the name of the model you want to deploy in the pop-up dialog. Here we choose `vicuna-7b-v1.5`, as shown in the figure.

<p align="left">
  <img src={'/img/module/model_vicuna-7b-1.5.png'} width="720px"/>
</p>

2. Select the appropriate parameters for the model you are deploying (if you are not sure, the defaults are usually sufficient), click the `Submit` button at the bottom left of the dialog, and wait until the model is deployed successfully.

3. After the new model is deployed, it appears on the model page, as shown in the figure.

<p align="left">
  <img src={'/img/module/model_vicuna_deployed.png'} width="720px"/>
</p>

# Operations and Observability

Operations and observability are important components of a production system. In terms of operational capabilities, DB-GPT provides a command-line tool called `dbgpt` for operations and management, in addition to the common management functionality available on the web interface. The `dbgpt` command-line tool offers the following functionality:

- Starting and stopping various services
- Knowledge base management (batch import, custom import, viewing, and deleting knowledge base documents)
- Model management (viewing, starting, and stopping models, and holding conversations for debugging)
- Observability tools (viewing and analyzing observability logs)

We won't go into detail about the command-line tool here. You can use the `dbgpt --help` command to obtain its usage documentation, and you can check the documentation of individual subcommands; for example, `dbgpt start --help` shows the documentation for starting a service, as shown below. For more information, please refer to the following document.
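
For example (output omitted; the exact option list depends on the version you have installed):

```
# Show the top-level usage documentation
dbgpt --help

# Show the documentation of a single subcommand, e.g. starting a service
dbgpt start --help
```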

- [Debugging](/docs/application/advanced_tutorial/debugging)
@@ -1 +0,0 @@
# SMMF

@@ -1 +1,3 @@
# Documentation Description
# ChangeLog

Our version release information is maintained on GitHub. For more details, please visit [ReleaseNotes](https://github.com/eosphoros-ai/DB-GPT/releases)
@@ -73,7 +73,7 @@ import TabItem from '@theme/TabItem';
{label: 'Open AI', value: 'openai'},
{label: 'Qwen', value: 'qwen'},
{label: 'ChatGLM', value: 'chatglm'},
{label: 'ERNIE Bot', value: 'erniebot'},
{label: 'WenXin', value: 'erniebot'},
]}>
<TabItem value="openai" label="open ai">
Install dependencies
@@ -180,7 +180,7 @@ LLM_MODEL=wenxin_proxyllm
PROXY_SERVER_URL={your_service_url}
WEN_XIN_MODEL_VERSION={version}
WEN_XIN_API_KEY={your-wenxin-sk}
WEN_XIN_SECRET_KEY={your-wenxin-sct}
WEN_XIN_API_SECRET={your-wenxin-sct}
```
</TabItem>
</Tabs>
@@ -218,7 +218,7 @@ mkdir models and cd models

# embedding model
git clone https://huggingface.co/GanymedeNil/text2vec-large-chinese
或者
or
git clone https://huggingface.co/moka-ai/m3e-large

# llm model, if you use openai or Azure or tongyi llm api service, you don't need to download llm model
@@ -1,2 +1,25 @@
# Connections
The connections module supports connecting to various structured, semi-structured, and unstructured data storage engines, bringing multi-dimensional data into the framework and enabling interaction between natural language and multi-dimensional data.

The list of data sources we currently support is as follows.

| DataSource | support | Notes |
| ---------- | ------- | ----- |
| [MySQL](https://www.mysql.com/) | Yes | MySQL is the world's most popular open source database. |
| [PostgreSQL](https://www.postgresql.org/) | Yes | The World's Most Advanced Open Source Relational Database |
| [Spark](https://github.com/apache/spark) | Yes | Unified Engine for large-scale data analytics |
| [DuckDB](https://github.com/duckdb/duckdb) | Yes | DuckDB is an in-process SQL OLAP database management system |
| [Sqlite](https://github.com/sqlite/sqlite) | Yes | |
| [MSSQL](https://github.com/microsoft/mssql-jdbc) | Yes | |
| [ClickHouse](https://github.com/ClickHouse/ClickHouse) | Yes | ClickHouse is the fastest and most resource efficient open-source database for real-time apps and analytics. |
| [Oracle](https://github.com/oracle) | No | TODO |
| [Redis](https://github.com/redis/redis) | No | The Multi-model NoSQL Database |
| [MongoDB](https://github.com/mongodb/mongo) | No | MongoDB is a source-available cross-platform document-oriented database program |
| [HBase](https://github.com/apache/hbase) | No | Open-source, distributed, versioned, column-oriented store modeled after Google's Bigtable |
| [Doris](https://github.com/apache/doris) | No | Apache Doris is an easy-to-use, high performance and unified analytics database. |
| [DB2](https://github.com/IBM/Db2) | No | TODO |
| [Couchbase](https://github.com/couchbase) | No | TODO |
| [Elasticsearch](https://github.com/elastic/elasticsearch) | No | Free and Open, Distributed, RESTful Search Engine |
| [OceanBase](https://github.com/OceanBase) | No | OceanBase is a distributed relational database. |
| [TiDB](https://github.com/pingcap/tidb) | No | TODO |
| [StarRocks](https://github.com/StarRocks/starrocks) | Yes | StarRocks is a next-gen, high-performance analytical data warehouse |
@@ -1,7 +1,146 @@
# SMMF
Service-oriented Multi-model Management Framework (SMMF)

# Introduction

<p align="left">
  <img src={'/img/module/smmf.png'} width="720px" />
</p>
When exploring AIGC applications and bringing them to production, it is hard to avoid interfacing directly with model services. However, there is currently no de facto standard for deploying large models for inference: new models are released constantly and new training methods keep being proposed, so a lot of time is spent adapting to a changing underlying model environment, which to some extent restricts the exploration and adoption of AIGC applications.

# System Design
To simplify model adaptation and improve model deployment efficiency and performance, we propose the Service-oriented Multi-model Management Framework (SMMF).

<p align="center">
  <img src={'/img/module/smmf_layer.png'} width="360px" />
</p>

SMMF consists of two parts: the model inference layer and the model deployment layer. The inference layer corresponds to inference frameworks such as vLLM, TGI, and TensorRT. The deployment layer connects downward to the inference layer and provides model-serving capabilities upward. Built on top of the inference frameworks, the deployment framework provides capabilities such as multiple model instances, multiple inference frameworks, multi-cloud support, automatic scaling<sup>[1]</sup>, and observability<sup>[2]</sup>.

<p align="center">
  <img src={'/img/module/smmf.png'} width="600px" />
</p>

In DB-GPT, SMMF is organized as shown in the figure above. The top layer is the service and application layer (DB-GPT WebServer, the Agents system, applications, etc.). Below it is the model deployment framework layer, which includes the API Server and Model Handle that provide model services to the application layer, the Model Controller, which is the metadata management and control center of the entire deployment framework, and the Model Worker that interfaces directly with the inference framework and the underlying environment. The next layer is the inference framework layer, which includes vLLM, llama.cpp, and FastChat (since DB-GPT uses FastChat's inference interface directly, we also classify FastChat as an inference framework); large language models such as Vicuna, Llama, Baichuan, and ChatGLM are deployed in these inference frameworks. The bottom layer is the actual deployment environment: Kubernetes, Ray, AWS, Alibaba Cloud, private clouds, and so on.

## SMMF features
- Supports multiple models and multiple inference frameworks

- Scalability and stability

- High framework performance

- Manageable and monitorable

- Lightweight

### Multiple models and multiple inference frameworks
The large model field is evolving rapidly: new models are released constantly, and new approaches to model training and inference keep being proposed. We expect this situation to continue for some time.

For most users exploring and implementing AIGC application scenarios, this has both advantages and disadvantages. A typical drawback is being "led by the nose" by the models, having to constantly try out new models and new inference frameworks.

DB-GPT provides seamless support for FastChat, vLLM, and llama.cpp; in theory, it supports every model those frameworks support. If you have requirements on inference speed and performance, you can use vLLM directly; if you want good inference performance on a CPU or on Apple's M1/M2 chips, you can use llama.cpp. In addition, DB-GPT also supports proxy models such as OpenAI, Azure, Google Bard, Tongyi, Baichuan, Xunfei Spark, Baidu Wenxin, and Zhipu AI.
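
In practice, switching between a locally deployed model and a proxy model is just a configuration change. The snippet below is a minimal sketch using the `.env` keys that appear elsewhere in this PR (`LLM_MODEL`, `PROXY_SERVER_URL`, `WEN_XIN_*`); treat the concrete values as illustrative and check `.env.template` for the authoritative key names.

```
# Locally deployed open-source model (served via FastChat/vLLM/llama.cpp)
LLM_MODEL=vicuna-13b-v1.5

# ...or a proxy model, e.g. Baidu Wenxin (keys as in .env.template)
# LLM_MODEL=wenxin_proxyllm
# PROXY_SERVER_URL={your_service_url}
# WEN_XIN_MODEL_VERSION={version}
# WEN_XIN_API_KEY={your-wenxin-sk}
# WEN_XIN_API_SECRET={your-wenxin-sct}
```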

### Support LLMs
#### Open-source Models
- [Vicuna](https://huggingface.co/Tribbiani/vicuna-13b)
- [vicuna-13b-v1.5](https://huggingface.co/lmsys/vicuna-13b-v1.5)
- [LLama2](https://huggingface.co/meta-llama/Llama-2-7b-chat-hf)
- [baichuan2-13b](https://huggingface.co/baichuan-inc/Baichuan2-13B-Chat)
- [baichuan2-7b](https://huggingface.co/baichuan-inc/Baichuan2-7B-Chat)
- [chatglm-6b](https://huggingface.co/THUDM/chatglm-6b)
- [chatglm2-6b](https://huggingface.co/THUDM/chatglm2-6b)
- [chatglm3-6b](https://huggingface.co/THUDM/chatglm3-6b)
- [falcon-40b](https://huggingface.co/tiiuae/falcon-40b)
- [internlm-chat-7b](https://huggingface.co/internlm/internlm-chat-7b)
- [internlm-chat-20b](https://huggingface.co/internlm/internlm-chat-20b)
- [qwen-7b-chat](https://huggingface.co/Qwen/Qwen-7B-Chat)
- [qwen-14b-chat](https://huggingface.co/Qwen/Qwen-14B-Chat)
- [wizardlm-13b](https://huggingface.co/WizardLM/WizardLM-13B-V1.2)
- [orca-2-7b](https://huggingface.co/microsoft/Orca-2-7b)
- [orca-2-13b](https://huggingface.co/microsoft/Orca-2-13b)
- [openchat_3.5](https://huggingface.co/openchat/openchat_3.5)
- [zephyr-7b-alpha](https://huggingface.co/HuggingFaceH4/zephyr-7b-alpha)
- [mistral-7b-instruct-v0.1](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.1)
- [Yi-34B-Chat](https://huggingface.co/01-ai/Yi-34B-Chat)

#### Proxy Models
- [OpenAI·ChatGPT](https://api.openai.com/)
- [Alibaba·通义](https://www.aliyun.com/product/dashscope)
- [Google·Bard](https://bard.google.com/)
- [Baidu·文心](https://cloud.baidu.com/product/wenxinworkshop?track=dingbutonglan)
- [智谱·ChatGLM](http://open.bigmodel.cn/)
- [讯飞·星火](https://xinghuo.xfyun.cn/)

:::info
For more LLMs, please refer to the [source code](https://github.com/eosphoros-ai/DB-GPT/blob/main/pilot/configs/model_config.py)
:::

### Scalability and stability
The cloud-native field has addressed the core pain points of managing, controlling, scheduling, and utilizing massive computing resources, releasing the full value of computing and making large-scale computing a ubiquitous technology.

In the large model field, we likewise face explosive demand for computing resources during model inference. Multi-model management with the ability to schedule large-scale computing is therefore a focus of ours for production deployment. Given the outstanding results of compute-scheduling layers such as Kubernetes and Istio over the past few years, we draw heavily on their design concepts for multi-model management and control.

A reasonably complete model deployment framework requires several parts: a Model Worker that interfaces directly with the underlying inference framework, a Model Controller that manages and maintains the model components, and a Model API that exposes model-serving capabilities externally. The Model Worker must be extensible: it can be a worker dedicated to deploying large language models or one for deploying embedding models, and different workers can be chosen for different deployment environments, such as physical machines, Kubernetes, or the environments offered by specific cloud providers.

The Model Controller, which manages the metadata, also needs to be extensible, with different implementations for different deployment environments and management requirements. In addition, model serving has a lot in common with traditional microservices. In a microservice system, a service can have multiple instances, all instances are registered in a registry, and callers pull the instance list for a service name from the registry and then pick a specific instance according to a load-balancing policy.

A similar architecture can be adopted for model deployment: a model can have multiple instances, all instances register with a model registry, and the caller of a model service pulls the instance list for a model name from the registry, then calls a specific instance according to the model's load-balancing policy.

Here we introduce the model registry, which stores model instance metadata for the Model Controller. It can be implemented on top of an existing microservice registry (such as Nacos, Eureka, etcd, or Consul), which also allows the whole deployment system to achieve high availability.
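
To make the registry idea concrete, here is a minimal, self-contained Python sketch of the concept described above. It is illustrative only and does not reproduce DB-GPT's actual registry API; the class and method names are assumptions.

```python
import random
from dataclasses import dataclass, field


@dataclass
class ModelInstance:
    """Metadata for one deployed model instance."""
    model_name: str
    host: str
    port: int
    healthy: bool = True


@dataclass
class SimpleModelRegistry:
    """Toy in-memory registry; a production system would back this
    with Nacos/Eureka/etcd/Consul for high availability."""
    instances: dict = field(default_factory=dict)

    def register(self, instance: ModelInstance) -> None:
        self.instances.setdefault(instance.model_name, []).append(instance)

    def select(self, model_name: str) -> ModelInstance:
        # Load balancing: pick a random healthy instance of the model.
        candidates = [i for i in self.instances.get(model_name, []) if i.healthy]
        if not candidates:
            raise RuntimeError(f"no healthy instance for {model_name}")
        return random.choice(candidates)


registry = SimpleModelRegistry()
registry.register(ModelInstance("vicuna-13b-v1.5", "10.0.0.1", 8001))
registry.register(ModelInstance("vicuna-13b-v1.5", "10.0.0.2", 8001))
print(registry.select("vicuna-13b-v1.5"))
```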

### High framework performance
The framework layer should not be the bottleneck of model inference performance. In most cases the hardware and the inference framework determine the capability of a model service; deploying and optimizing model inference is already a complex project, and an inappropriate framework design can add to that complexity. In our view, there are two main concerns for a deployment framework that wants to avoid "dragging its feet" on performance:

- Avoid excessive encapsulation: the more layers of encapsulation and the longer the call chain, the harder it is to troubleshoot performance issues.
- High-performance communication design: there are many aspects to this that we will not expand on here. Since Python currently dominates AIGC applications, asynchronous interfaces are critical to service performance in Python. The model service layer therefore exposes only asynchronous interfaces toward the inference-framework adapter layer: if the inference framework provides asynchronous interfaces, they are used directly; otherwise synchronous calls are wrapped as asynchronous tasks.
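
As an illustration of the last point, the sketch below (not DB-GPT's actual code; the function names are assumptions) shows how a blocking inference call can be exposed behind an asynchronous interface by offloading it to a thread pool:

```python
import asyncio
from concurrent.futures import ThreadPoolExecutor

_executor = ThreadPoolExecutor(max_workers=8)


def sync_generate(prompt: str) -> str:
    """Placeholder for a blocking call into an inference framework."""
    return f"echo: {prompt}"


async def async_generate(prompt: str) -> str:
    """Async facade: use the framework's native async API if it has one,
    otherwise run the blocking call in a worker thread."""
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(_executor, sync_generate, prompt)


async def main() -> None:
    # Many concurrent requests can now share one event loop.
    results = await asyncio.gather(*(async_generate(f"q{i}") for i in range(3)))
    print(results)


asyncio.run(main())
```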

### Manageable and monitorable
Whether we are exploring AIGC applications or running them in production, the model deployment system needs certain management capabilities: model instances should be controllable through an API or the command line (bring online, take offline, restart, debug, and so on).

Observability is a very important capability of production systems, and we believe it is crucial in AIGC applications. Because the user experience and the interaction between user and system are more complex, in addition to traditional observation metrics we also care about the user's input, the contextual information of the corresponding scene, which model instance and which model parameters were called, the model's output and response time, user feedback, and so on.

From this information we can identify performance bottlenecks of the model service and gather user-experience data: What is the response latency? Does the answer solve the user's problem, and what level of satisfaction can be inferred from the user's messages? These are the basis for further optimization of the entire application.
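
The sketch below illustrates, purely as an assumption (it is not DB-GPT's logging schema), the kind of per-call record such observability implies:

```python
import time
from dataclasses import dataclass, asdict
from typing import Optional


@dataclass
class ModelCallRecord:
    """One observability record per model call (illustrative fields only)."""
    model_name: str
    model_instance: str           # e.g. "10.0.0.1:8001"
    prompt: str
    params: dict
    output: str
    latency_ms: float
    user_feedback: Optional[str] = None


start = time.perf_counter()
output = "..."  # whatever the selected model instance returned
record = ModelCallRecord(
    model_name="vicuna-13b-v1.5",
    model_instance="10.0.0.1:8001",
    prompt="How many orders were placed last week?",
    params={"temperature": 0.5, "max_new_tokens": 512},
    output=output,
    latency_ms=(time.perf_counter() - start) * 1000,
)
print(asdict(record))  # in practice, ship this to a logging/tracing backend
```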

### Lightweight
Given the number of supported models and inference frameworks, we work hard to avoid unnecessary dependencies so that users can install only what they need.

In DB-GPT, users can install dependencies on demand. The main optional dependency groups are:

- Install the most basic dependencies: `pip install -e .` or `pip install -e ".[core]"`

- Install the dependencies of the basic framework: `pip install -e ".[framework]"`

- Install the dependencies of the OpenAI proxy model: `pip install -e ".[openai]"`

- Install the default dependencies: `pip install -e ".[default]"`

- Install the dependencies of the vLLM inference framework: `pip install -e ".[vllm]"`

- Install the dependencies for quantized model deployment: `pip install -e ".[quantization]"`

- Install knowledge-base related dependencies: `pip install -e ".[knowledge]"`

- Install PyTorch dependencies: `pip install -e ".[torch]"`

- Install the dependencies of llama.cpp: `pip install -e ".[llama_cpp]"`

- Install vector database dependencies: `pip install -e ".[vstore]"`

- Install data source dependencies: `pip install -e ".[datasource]"`

## Implementation
For the multi-model implementation, please refer to the [source code](https://github.com/eosphoros-ai/DB-GPT/tree/main/pilot/model)

# Appendix
`[1]` `[2]` Capabilities such as automatic scaling and observability are still in incubation and have not yet been implemented.
@@ -78,7 +78,7 @@ Fine-tuning module for Text2SQL/Text2DSL
- [Connections](/docs/modules/connections)
Connect various data sources

- [Obvervablity](/docs/operation_manual/advanced_tutorial/debugging)
- [Obvervablity](/docs/operation/advanced_tutorial/debugging)
Observing & monitoring

- [Evaluation](/docs/modules/eval)

@@ -106,10 +106,10 @@ PROXY_SERVER_URL=https://api.openai.com/v1/chat/completions
#### Hardware requirements description
| Model        | Quantize | VRAM Size |
|:------------:|---------:|-----------|
| Baichuan-7b  | 4-bit    | 8GB       |
| Baichuan-7b  | 8-bit    | 12GB      |
| Baichuan-13b | 4-bit    | 12GB      |
| Baichuan-13b | 8-bit    | 20GB      |
| Vicuna-7b    | 4-bit    | 8GB       |
| Vicuna-7b    | 8-bit    | 12GB      |
| Vicuna-13b   | 4-bit    | 12GB      |
| Vicuna-13b   | 8-bit    | 20GB      |

#### Download LLM
@@ -92,7 +92,7 @@ const sidebars = {

{
type: "category",
label: "Application Manual",
label: "Application",
collapsed: false,
collapsible: false,
items: [
@@ -103,27 +103,27 @@ const sidebars = {
items: [
{
type: 'doc',
id: 'application_manual/started_tutorial/chat_knowledge',
id: 'application/started_tutorial/chat_knowledge',
},
{
type: 'doc',
id: 'application_manual/started_tutorial/chat_data',
id: 'application/started_tutorial/chat_data',
},
{
type: 'doc',
id: 'application_manual/started_tutorial/chat_excel',
id: 'application/started_tutorial/chat_excel',
},
{
type: 'doc',
id: 'application_manual/started_tutorial/chat_db',
id: 'application/started_tutorial/chat_db',
},
{
type: 'doc',
id: 'application_manual/started_tutorial/chat_dashboard',
id: 'application/started_tutorial/chat_dashboard',
},
{
type: 'doc',
id: 'application_manual/started_tutorial/agent',
id: 'application/started_tutorial/agent',
},
],
},
@@ -133,15 +133,15 @@ const sidebars = {
items: [
{
type: 'doc',
id: 'application_manual/advanced_tutorial/rag',
id: 'application/advanced_tutorial/rag',
},
{
type: 'doc',
id: 'application_manual/advanced_tutorial/smmf',
id: 'application/advanced_tutorial/smmf',
},
{
type: 'doc',
id: 'application_manual/advanced_tutorial/debugging',
id: 'application/advanced_tutorial/debugging',
},
],
},
@@ -151,7 +151,7 @@ const sidebars = {
items: [
{
type: 'doc',
id: 'application_manual/fine_tuning_manual/text_to_sql',
id: 'application/fine_tuning_manual/text_to_sql',
},
],
},
@ -1 +0,0 @@
|
||||
<?xml version="1.0" standalone="no"?><!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd"><svg t="1692782037832" class="icon" viewBox="0 0 1024 1024" version="1.1" xmlns="http://www.w3.org/2000/svg" p-id="15759" xmlns:xlink="http://www.w3.org/1999/xlink" width="200" height="200"><path d="M544.059897 959.266898h-64.949141c-228.633593 0-415.697442-187.063849-415.697442-415.697442v-64.949141c0-228.633593 187.063849-415.697442 415.697442-415.697442h64.949141c228.633593 0 415.697442 187.063849 415.697442 415.697442v64.949141C959.756315 772.203049 772.692466 959.266898 544.059897 959.266898z" fill="#2DC100" p-id="15760"></path><path d="M618.871102 424.812069c-52.6789 2.760395-98.49572 18.754507-135.696546 54.89766-37.587854 36.50356-54.743053 81.262707-50.047514 136.728622-20.586238-2.580191-39.34177-5.366183-58.19969-6.965492-6.552866-0.516038-14.292415 0.258019-19.786584 3.353224-18.316285 10.318716-35.858512 22.030941-56.703793 35.085479 3.818068-17.284208 6.294847-32.505287 10.680148-47.029101 3.173021-10.732366 1.702721-16.691379-8.152175-23.65687-63.256659-44.73355-89.905323-111.652647-69.963108-180.584703 18.470891-63.720479 63.798295-102.417201 125.376806-122.539619 84.100917-27.500536 178.52055 0.567232 229.651335 67.409538 18.733006 24.012159 30.112467 52.935895 32.763306 83.275665L618.871102 424.812069zM737.231222 753.7854c-16.691379-7.429312-31.989249-18.574304-48.241381-20.302622-16.252132-1.702721-33.330539 7.687331-50.305534 9.416673-51.724639 5.288368-98.0319-9.132033-136.263778-44.526725-72.646712-67.331723-62.275777-170.522981 21.799542-225.730878 74.736462-49.015438 184.324956-32.659894 237.003856 35.342474 45.971427 59.386373 40.55405 138.198922-15.55589 188.066232-16.252132 14.447022-22.108756 26.313853-11.686627 45.32638 1.909546 3.508855 2.140944 7.94535 3.250836 12.382869L737.231222 753.7854zM376.397651 403.348361c0.516038-12.640888-10.422129-23.991681-23.373254-24.353112-13.025869-0.533444-24.017278 9.593805-24.550722 22.619674-0.003072 0.078839-0.006143 0.158702-0.008191 0.237542-0.512967 12.869215 9.503704 23.719327 22.372918 24.232294 0.238565 0.009215 0.477131 0.015358 0.715696 0.017406C364.663926 426.584415 375.730078 416.448974 376.397651 403.348361zM502.909946 378.995249c-13.00232 0.258019-23.991681 11.350793-23.733662 23.99168 0.280545 13.104708 11.131681 23.50124 24.23639 23.220696 0.038908-0.001024 0.077815-0.002048 0.116723-0.003072 12.865119 0.104436 23.379398-10.239877 23.483834-23.104996 0.002048-0.278497 0-0.556994-0.008192-0.835491-0.109556-12.96546-10.708817-23.386565-23.673252-23.277009C503.191515 378.989105 503.050218 378.991153 502.909946 378.995249zM547.334283 569.640648c10.628954 0 19.348361-8.332379 19.760986-18.832323 0.384981-10.920761-8.15627-20.086582-19.077031-20.471563-0.176108-0.006143-0.352217-0.010239-0.529349-0.011262-11.041579 0.069624-19.937095 9.076743-19.867471 20.118322 0.001024 0.08703 0.002048 0.175084 0.003072 0.262115C528.092406 561.263219 536.764714 569.595597 547.334283 569.640648zM669.743869 530.351097c-10.452845 0.086006-19.011503 8.337498-19.477371 18.781128-0.570304 10.670933 7.617707 19.782488 18.28864 20.352793 0.310237 0.016382 0.620475 0.025597 0.930712 0.027645 10.654551 0 19.090342-8.07436 19.47737-18.703314 0.528325-10.772298-7.776409-19.934023-18.548706-20.462348-0.223207-0.011263-0.447438-0.01843-0.670645-0.021501V530.351097z" fill="#FFFFFF" p-id="15761"></path></svg>
|
Before Width: | Height: | Size: 3.4 KiB |
@ -1,156 +0,0 @@
|
||||
/*! Flickity v2.1.2
|
||||
https://flickity.metafizzy.co
|
||||
---------------------------------------------- */
|
||||
|
||||
.flickity-enabled {
|
||||
position: relative;
|
||||
background-color: var(--ifm-color-emphasis-0);
|
||||
border-bottom: 1px solid var(--ifm-color-emphasis-200);
|
||||
margin-bottom: 4rem;
|
||||
}
|
||||
|
||||
:root[data-theme='dark'] .flickity-enabled {
|
||||
background: linear-gradient(
|
||||
rgba(41, 56, 88, 0.3),
|
||||
rgba(19, 31, 55, 0.3)
|
||||
) !important;
|
||||
}
|
||||
|
||||
.flickity-enabled:focus {
|
||||
outline: none;
|
||||
}
|
||||
|
||||
.flickity-viewport {
|
||||
overflow: hidden;
|
||||
position: relative;
|
||||
height: 100%;
|
||||
}
|
||||
|
||||
.flickity-slider {
|
||||
position: absolute;
|
||||
width: 100%;
|
||||
height: 100%;
|
||||
}
|
||||
|
||||
/* draggable */
|
||||
|
||||
.flickity-enabled.is-draggable {
|
||||
-webkit-tap-highlight-color: transparent;
|
||||
tap-highlight-color: transparent;
|
||||
-webkit-user-select: none;
|
||||
-moz-user-select: none;
|
||||
-ms-user-select: none;
|
||||
user-select: none;
|
||||
}
|
||||
|
||||
.flickity-enabled.is-draggable .flickity-viewport {
|
||||
cursor: move;
|
||||
cursor: -webkit-grab;
|
||||
cursor: grab;
|
||||
}
|
||||
|
||||
.flickity-enabled.is-draggable .flickity-viewport.is-pointer-down {
|
||||
cursor: -webkit-grabbing;
|
||||
cursor: grabbing;
|
||||
}
|
||||
|
||||
/* ---- flickity-button ---- */
|
||||
|
||||
.flickity-button {
|
||||
position: absolute;
|
||||
background: hsla(0, 0%, 100%, 0.75);
|
||||
border: none;
|
||||
color: #333;
|
||||
}
|
||||
|
||||
.flickity-button:hover {
|
||||
background: white;
|
||||
cursor: pointer;
|
||||
}
|
||||
|
||||
.flickity-button:focus {
|
||||
outline: none;
|
||||
box-shadow: 0 0 0 5px #19f;
|
||||
}
|
||||
|
||||
.flickity-button:active {
|
||||
opacity: 0.6;
|
||||
}
|
||||
|
||||
.flickity-button:disabled {
|
||||
opacity: 0.3;
|
||||
cursor: auto;
|
||||
/* prevent disabled button from capturing pointer up event. #716 */
|
||||
pointer-events: none;
|
||||
}
|
||||
|
||||
.flickity-button-icon {
|
||||
fill: #333;
|
||||
}
|
||||
|
||||
/* ---- previous/next buttons ---- */
|
||||
|
||||
.flickity-prev-next-button {
|
||||
top: 50%;
|
||||
width: 44px;
|
||||
height: 44px;
|
||||
border-radius: 50%;
|
||||
/* vertically center */
|
||||
transform: translateY(-50%);
|
||||
}
|
||||
|
||||
.flickity-prev-next-button.previous {
|
||||
left: 10px;
|
||||
}
|
||||
.flickity-prev-next-button.next {
|
||||
right: 10px;
|
||||
}
|
||||
/* right to left */
|
||||
.flickity-rtl .flickity-prev-next-button.previous {
|
||||
left: auto;
|
||||
right: 10px;
|
||||
}
|
||||
.flickity-rtl .flickity-prev-next-button.next {
|
||||
right: auto;
|
||||
left: 10px;
|
||||
}
|
||||
|
||||
.flickity-prev-next-button .flickity-button-icon {
|
||||
position: absolute;
|
||||
left: 20%;
|
||||
top: 20%;
|
||||
width: 60%;
|
||||
height: 60%;
|
||||
}
|
||||
|
||||
/* ---- page dots ---- */
|
||||
|
||||
.flickity-page-dots {
|
||||
position: absolute;
|
||||
width: 100%;
|
||||
bottom: -25px;
|
||||
padding: 0;
|
||||
margin: 0;
|
||||
list-style: none;
|
||||
text-align: center;
|
||||
line-height: 1;
|
||||
}
|
||||
|
||||
.flickity-rtl .flickity-page-dots {
|
||||
direction: rtl;
|
||||
}
|
||||
|
||||
.flickity-page-dots .dot {
|
||||
display: inline-block;
|
||||
width: 10px;
|
||||
height: 10px;
|
||||
margin: 0 8px;
|
||||
background: var(--ifm-color-primary-lightest);
|
||||
border-radius: 50%;
|
||||
opacity: 0.25;
|
||||
cursor: pointer;
|
||||
}
|
||||
|
||||
.flickity-page-dots .dot.is-selected {
|
||||
opacity: 1;
|
||||
}
|
@ -1,79 +0,0 @@
|
||||
import React from 'react';
|
||||
import Flickity from 'react-flickity-component';
|
||||
import Link from '@docusaurus/Link';
|
||||
import FeaturedSlides from './slides.json';
|
||||
import './flickity.css';
|
||||
import clsx from 'clsx';
|
||||
import styles from './styles.module.css';
|
||||
|
||||
const flickityOptions = {
|
||||
initialIndex: 0,
|
||||
autoPlay: false,
|
||||
adaptiveHeight: true,
|
||||
wrapAround: true,
|
||||
groupCells: false,
|
||||
fade: true,
|
||||
pageDots: false,
|
||||
};
|
||||
|
||||
export default function FeaturedSlider() {
|
||||
const RenderSlides: () => void = () => {
|
||||
return FeaturedSlides.map((slide) => (
|
||||
<div key={slide.title} className={clsx(styles.slide__container)}>
|
||||
<div
|
||||
key={slide.title}
|
||||
className={clsx(styles.slide)}
|
||||
style={{
|
||||
backgroundColor: 'transparent',
|
||||
backgroundImage: ` url(${slide.imagePath})`,
|
||||
backgroundSize: '35%',
|
||||
backgroundRepeat: 'no-repeat',
|
||||
backgroundPosition: 'bottom right',
|
||||
}}
|
||||
>
|
||||
<div className={clsx(styles.slide__section)}>
|
||||
<h1 className={clsx(styles.slide__header)}>{slide.title}</h1>
|
||||
<p className={clsx(styles.slide__description)}>
|
||||
{slide.description}
|
||||
</p>
|
||||
<div className={clsx(styles.slide__buttons)}>
|
||||
{slide.outlinedButton && (
|
||||
<Link
|
||||
to={slide.outlinedButton.url}
|
||||
className={clsx(
|
||||
styles.slide__button,
|
||||
'button',
|
||||
'button--outline',
|
||||
'button--primary',
|
||||
)}
|
||||
>
|
||||
{slide.outlinedButton.buttonText}
|
||||
</Link>
|
||||
)}
|
||||
{slide.solidButton && (
|
||||
<Link
|
||||
to={slide.solidButton.url}
|
||||
className={clsx(
|
||||
styles.slide__button,
|
||||
'button',
|
||||
'button--primary',
|
||||
)}
|
||||
>
|
||||
{slide.solidButton.buttonText}
|
||||
</Link>
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
));
|
||||
};
|
||||
|
||||
return (
|
||||
<Flickity
|
||||
options={flickityOptions}
|
||||
>
|
||||
<RenderSlides />
|
||||
</Flickity>
|
||||
);
|
||||
}
|
@ -1,15 +0,0 @@
|
||||
[
|
||||
{
|
||||
"title": "DB-GPT",
|
||||
"description": "Revolutionizing Database Interactions with Private LLM Technology, a solution you can be assured that there is no risk of data leakage, and your data is 100% private and secure.",
|
||||
"outlinedButton": {
|
||||
"buttonText": "Learn About DB-GPT",
|
||||
"url": "/docs/getting_started"
|
||||
},
|
||||
"solidButton": {
|
||||
"buttonText": "Experience Now!",
|
||||
"url": "http://dev.dbgpt.site"
|
||||
},
|
||||
"imagePath": "/img/dbgpt_face_page.svg"
|
||||
}
|
||||
]
|
@ -1,60 +0,0 @@
|
||||
.slide__container {
|
||||
width: 100%;
|
||||
align-content: center;
|
||||
justify-content: space-between;
|
||||
row-gap: 1rem;
|
||||
padding: 0 8rem;
|
||||
display: flex;
|
||||
}
|
||||
|
||||
.slide {
|
||||
width: 100%;
|
||||
max-width: var(--ifm-container-width);
|
||||
margin: auto;
|
||||
}
|
||||
|
||||
.slide {
|
||||
background-color: var(--ifm-color-emphasis-0);
|
||||
}
|
||||
|
||||
@media (max-width: 996px) {
|
||||
.slide {
|
||||
display: flex;
|
||||
justify-content: space-between;
|
||||
align-items: flex-end;
|
||||
width: 100%;
|
||||
background-repeat: no-repeat;
|
||||
background-position: right bottom;
|
||||
}
|
||||
}
|
||||
@media (max-width: 720px) {
|
||||
.slide {
|
||||
background: none !important;
|
||||
}
|
||||
}
|
||||
|
||||
.slide__section {
|
||||
max-width: 25rem;
|
||||
margin: 6rem 0;
|
||||
}
|
||||
|
||||
.slide__header {
|
||||
font-size: 40px;
|
||||
}
|
||||
|
||||
.slide__buttons {
|
||||
display: flex;
|
||||
flex-wrap: wrap;
|
||||
gap: 1rem;
|
||||
}
|
||||
|
||||
.slide__description {
|
||||
/*height: 8rem;*/
|
||||
}
|
||||
|
||||
.slide__button {
|
||||
flex: 1 1 0;
|
||||
padding-top: 0.75rem;
|
||||
padding-bottom: 0.75rem;
|
||||
max-width: 12rem;
|
||||
}
|
@ -1,191 +0,0 @@
|
||||
import React, { FC } from 'react';
|
||||
import Layout from '@theme/Layout';
|
||||
import Link from '@docusaurus/Link';
|
||||
import useDocusaurusContext from '@docusaurus/useDocusaurusContext';
|
||||
import clsx from 'clsx';
|
||||
import './styles.css';
|
||||
import {
|
||||
Tool1,
|
||||
Tool2,
|
||||
Tool3,
|
||||
Tool4,
|
||||
Tool5,
|
||||
Tool6,
|
||||
ToolDev,
|
||||
ToolCloud,
|
||||
ToolData,
|
||||
ToolSecure,
|
||||
ToolPerson,
|
||||
ToolProduct,
|
||||
ToolSystem,
|
||||
CommGithub,
|
||||
CommWechat,
|
||||
CommDiscord,
|
||||
CommGithub2,
|
||||
} from '@site/src/common/icons';
|
||||
import FeaturedSlider from '@site/src/components/FeaturedSlider';
|
||||
|
||||
interface HomepageSectionProps {
|
||||
header?: string;
|
||||
description?: string;
|
||||
className?: string;
|
||||
}
|
||||
|
||||
const HomepageSection: FC<HomepageSectionProps> = (props) => {
|
||||
const toKebabCase = (header) =>
|
||||
header &&
|
||||
header
|
||||
.match(
|
||||
/[A-Z]{2,}(?=[A-Z][a-z]+[0-9]*|\b)|[A-Z]?[a-z]+[0-9]*|[A-Z]|[0-9]+/g,
|
||||
)
|
||||
.map((parts) => parts.toLowerCase())
|
||||
.join('-');
|
||||
|
||||
return (
|
||||
<div className={clsx('homepage__section', props.className)}>
|
||||
<div className='homepage__container'>
|
||||
{props.header && (
|
||||
<h2 className='homepage__header' id={toKebabCase(props.header)}>
|
||||
{props.header}
|
||||
</h2>
|
||||
)}
|
||||
{props.description && (
|
||||
<p className='homepage__description'>{props.description}</p>
|
||||
)}
|
||||
{props.children}
|
||||
</div>
|
||||
</div>
|
||||
);
|
||||
};
|
||||
|
||||
export default function HomeLayout() {
|
||||
const { siteConfig } = useDocusaurusContext();
|
||||
|
||||
return (
|
||||
<Layout description={siteConfig.tagline}>
|
||||
<div className='homepage'>
|
||||
<FeaturedSlider />
|
||||
|
||||
<HomepageSection header='Abilities' description='Introduction to Framework Capabilities'>
|
||||
<div className='about__cards'>
|
||||
<Link
|
||||
to='/docs/modules/llms' className='about__card'>
|
||||
<div className='about__section'>
|
||||
<div className='about__icon'>
|
||||
<ToolDev />
|
||||
</div>
|
||||
<h3 className='about__header'>Multi-Models</h3>
|
||||
<p className='about__description'>
|
||||
Support multiply LLMs, such as chatglm, vicuna, Qwen, and proxy of chatgpt and bard.
|
||||
</p>
|
||||
</div>
|
||||
</Link>
|
||||
<Link to='/docs/modules/vector/chroma' className='about__card'>
|
||||
<div className='about__section'>
|
||||
<div className='about__icon'>
|
||||
<ToolPerson />
|
||||
</div>
|
||||
<h3 className='about__header'>Embedding</h3>
|
||||
<p className='about__description'>
|
||||
Embed data as vectors and store them in vector databases, providing content similarity search.
|
||||
</p>
|
||||
</div>
|
||||
</Link>
|
||||
<Link to='/docs/getting_started/application/chatdb' className='about__card'>
|
||||
<div className='about__section'>
|
||||
<div className='about__icon'>
|
||||
<ToolData />
|
||||
</div>
|
||||
<h3 className='about__header'>BI</h3>
|
||||
<p className='about__description'>
|
||||
Support multiply scenes, chat to db, chat to dashboard, chat to knowledge and native chat with LLMs.
|
||||
</p>
|
||||
</div>
|
||||
</Link>
|
||||
<Link to='/docs/modules/knowledge/markdown/markdown_embedding' className='about__card'>
|
||||
<div className='about__section'>
|
||||
<div className='about__icon'>
|
||||
<ToolProduct />
|
||||
</div>
|
||||
<h3 className='about__header'>Knowledge Based QA</h3>
|
||||
<p className='about__description'>
|
||||
You can perform high-quality intelligent Q&A based on local documents such as pdf, word, excel and other data.
|
||||
</p>
|
||||
</div>
|
||||
</Link>
|
||||
<Link to='/docs/getting_started' className='about__card'>
|
||||
<div className='about__section'>
|
||||
<div className='about__icon'>
|
||||
<ToolSecure />
|
||||
</div>
|
||||
<h3 className='about__header'>Privacy & Secure</h3>
|
||||
<p className='about__description'>
|
||||
You can be assured that there is no risk of data leakage, and your data is 100% private and secure.
|
||||
</p>
|
||||
</div>
|
||||
</Link>
|
||||
<Link to='/docs/use_cases/tool_use_with_plugin' className='about__card'>
|
||||
<div className='about__section'>
|
||||
<div className='about__icon'>
|
||||
<ToolCloud />
|
||||
</div>
|
||||
<h3 className='about__header'>Agent & Plugins</h3>
|
||||
<p className='about__description'>
|
||||
Support AutoGPT plugins, and you can build your own plugins as well.
|
||||
</p>
|
||||
</div>
|
||||
</Link>
|
||||
</div>
|
||||
</HomepageSection>
|
||||
|
||||
<HomepageSection header='Framework' description='Introduction to Framework'>
|
||||
<img src="/img/framework_tt.svg" style={{height: 500}}/>
|
||||
</ HomepageSection>
|
||||
|
||||
<HomepageSection header='Contact us'>
|
||||
<div className='further__cards'>
|
||||
<Link to='https://discord.gg/erwfqcMP' className='further__card'>
|
||||
<div className='further__section'>
|
||||
<div className='further__icon'>
|
||||
<CommDiscord />
|
||||
</div>
|
||||
<h3 className='further__header'>Join Discord</h3>
|
||||
<p className='further__description'>
|
||||
Check out the DB-GPT community on Discord.
|
||||
</p>
|
||||
</div>
|
||||
</Link>
|
||||
<Link to='https://github.com/eosphoros-ai/DB-GPT/blob/main/assets/wechat.jpg' className='further__card'>
|
||||
<div className='further__section'>
|
||||
<div className='further__icon'>
|
||||
<CommWechat />
|
||||
</div>
|
||||
<h3 className='further__header'>
|
||||
Wechat Group
|
||||
</h3>
|
||||
<p className='further__description'>
|
||||
3000+ developers here to learn and communicate with you.
|
||||
</p>
|
||||
</div>
|
||||
</Link>
|
||||
<Link
|
||||
to='https://github.com/eosphoros-ai/DB-GPT'
|
||||
className='further__card'
|
||||
>
|
||||
<div className='further__section'>
|
||||
<div className='further__icon'>
|
||||
<CommGithub2 />
|
||||
</div>
|
||||
<h3 className='further__header'>Github</h3>
|
||||
<p className='further__description'>
|
||||
Welcome to join us on GitHub and contribute code together.
|
||||
</p>
|
||||
</div>
|
||||
</Link>
|
||||
</div>
|
||||
</HomepageSection>
|
||||
|
||||
</div>
|
||||
</Layout>
|
||||
);
|
||||
}
|
@ -1,608 +0,0 @@
|
||||
.homepage {
|
||||
letter-spacing: 0.75px;
|
||||
font-weight: 200;
|
||||
color: var(--ifm-color-emphasis-700);
|
||||
background-color: var(--ifm-color-emphasis-100);
|
||||
overflow: auto;
|
||||
}
|
||||
|
||||
:root[data-theme='dark'] .homepage {
|
||||
background-color: var(--ifm-color-emphasis-0);
|
||||
}
|
||||
|
||||
.homepage__section {
|
||||
display: flex;
|
||||
justify-content: center;
|
||||
margin-bottom: 8rem;
|
||||
}
|
||||
|
||||
.homepage__section--intro {
|
||||
background-color: var(--ifm-color-emphasis-0);
|
||||
border-bottom: 1px solid var(--ifm-color-emphasis-200);
|
||||
}
|
||||
|
||||
:root[data-theme='dark'] .homepage__section--intro {
|
||||
background: linear-gradient(rgba(41, 56, 88, 0.3), rgba(19, 31, 55, 0.3));
|
||||
}
|
||||
|
||||
.homepage__container {
|
||||
margin: 0 8rem;
|
||||
width: 100%;
|
||||
max-width: var(--ifm-container-width);
|
||||
display: flex;
|
||||
flex-direction: column;
|
||||
align-items: center;
|
||||
row-gap: 1rem;
|
||||
}
|
||||
|
||||
@media (max-width: 996px) {
|
||||
.homepage__container {
|
||||
margin: 0 2rem;
|
||||
}
|
||||
}
|
||||
|
||||
.homepage__description {
|
||||
width: 70%;
|
||||
text-align: center;
|
||||
}
|
||||
|
||||
@media (max-width: 996px) {
|
||||
.homepage__description {
|
||||
width: 90%;
|
||||
}
|
||||
}
|
||||
|
||||
.languages {
|
||||
flex: 1 1 0;
|
||||
display: flex;
|
||||
flex-wrap: wrap;
|
||||
align-items: flex-start;
|
||||
justify-content: flex-end;
|
||||
gap: 0.2rem;
|
||||
}
|
||||
|
||||
.language {
|
||||
padding: 0 0.4rem;
|
||||
border-radius: 0.2rem;
|
||||
font-size: calc(0.75rem * var(--ifm-button-size-multiplier));
|
||||
}
|
||||
|
||||
@media (max-width: 996px) {
|
||||
.intro {
|
||||
background: none;
|
||||
}
|
||||
}
|
||||
|
||||
.intro__section {
|
||||
max-width: 25rem;
|
||||
margin: 6rem 0;
|
||||
}
|
||||
|
||||
.intro__header {
|
||||
font-size: 40px;
|
||||
}
|
||||
|
||||
.intro__buttons {
|
||||
display: flex;
|
||||
flex-wrap: wrap;
|
||||
gap: 1rem;
|
||||
}
|
||||
|
||||
.intro__button {
|
||||
flex: 1 1 0;
|
||||
padding-top: 0.75rem;
|
||||
padding-bottom: 0.75rem;
|
||||
}
|
||||
|
||||
.about__cards {
|
||||
display: grid;
|
||||
grid-template-columns: repeat(auto-fill, minmax(18rem, 1fr));
|
||||
gap: 1rem;
|
||||
width: 100%;
|
||||
}
|
||||
|
||||
.about__card {
|
||||
aspect-ratio: 1/1;
|
||||
display: flex;
|
||||
justify-content: center;
|
||||
align-items: center;
|
||||
background-color: var(--ifm-color-emphasis-0);
|
||||
box-shadow: 0 0 5px 0 var(--ifm-color-emphasis-300);
|
||||
border: 1px solid var(--ifm-color-emphasis-200);
|
||||
border-radius: var(--ifm-global-radius);
|
||||
color: inherit;
|
||||
cursor: pointer;
|
||||
transition: border-color 200ms ease-in-out;
|
||||
}
|
||||
|
||||
.about__card:hover {
|
||||
color: inherit;
|
||||
border-color: var(--ifm-color-emphasis-400);
|
||||
}
|
||||
|
||||
:root[data-theme='dark'] .about__card {
|
||||
box-shadow: none;
|
||||
background-color: var(--ifm-color-emphasis-100);
|
||||
}
|
||||
|
||||
.about__section {
  display: flex;
  flex-direction: column;
  align-items: center;
  width: 16rem;
  height: 12rem;
}

.about__icon {
  display: flex;
  justify-content: center;
  align-items: center;
  background-color: var(--ifm-color-emphasis-200);
  border-radius: var(--ifm-global-radius);
  fill: var(--ifm-color-emphasis-900);

  width: 4rem;
  height: 4rem;
}

.about__icon > svg {
  width: 48px;
  height: 48px;
}

.about__header {
  margin-top: 1.5rem;
}

.about__description {
  text-align: center;
  margin: 0;
  font-size: 0.8rem;
}

.networks__cards {
  display: flex;
  flex-wrap: wrap;
  gap: 1px;
  width: 100%;
  background-color: var(--ifm-color-emphasis-200);
  border: 1px solid var(--ifm-color-emphasis-200);
  border-radius: var(--ifm-global-radius);
  color: inherit;
}

.networks__card {
  flex: 1 0 calc(50% - 1px);
  display: flex;
  flex-direction: column;
  justify-content: space-between;
  padding: 3rem 2rem;
  background-color: var(--ifm-color-emphasis-0);
}

:root[data-theme='dark'] .networks__card {
  background-color: var(--ifm-color-emphasis-100);
}

.networks__section {
  display: flex;
  flex-direction: column;
  align-items: center;
}

.networks__icon {
  display: flex;
  justify-content: center;
  align-items: center;
  border-radius: 50%;

  width: 5rem;
  height: 5rem;
}

.networks__icon > svg {
  width: 4rem;
  height: 4rem;
}

.networks__icon--shimmer {
  fill: white;
  background-color: #16d4bf;
}

.networks__label {
  font-weight: 800;
  font-size: 0.8rem;
  color: var(--ifm-color-emphasis-500);
  margin-top: 1rem;
  margin-bottom: 0.5rem;
}

.networks__description {
  text-align: center;
  width: 16rem;
}

.networks__features {
  list-style-type: none;
  padding: 0;
  margin-bottom: var(--ifm-paragraph-margin-bottom);
  white-space: nowrap;
}

.networks__feature {
  display: flex;
  align-items: center;
  margin: 0.5rem 0;
}

.networks__button {
  padding-top: 0.75rem;
  padding-bottom: 0.75rem;
}

.start-building__header {
  font-family: 'Metropolis Regular';
  display: flex;
  align-items: center;
}

.start-building__description {
  margin: 2rem 0;
  max-width: 38rem;
}

.start-building__buttons {
  display: flex;
  flex-wrap: wrap;
  gap: 1rem;
}

.start-building__button {
  padding: 0.75rem 2rem;
}

.libraries__cards {
  display: grid;
  grid-template-columns: repeat(auto-fill, minmax(20rem, 1fr));
  gap: 1rem;
  width: 100%;
}

@media (max-width: 996px) {
  .libraries__cards {
    width: unset;
    max-width: 100%;
    grid-template-columns: repeat(auto-fill, 20rem);
  }
}

.libraries__card {
  background-color: var(--ifm-color-emphasis-0);
  box-shadow: 0 0 5px 0 var(--ifm-color-emphasis-300);
  border: 1px solid var(--ifm-color-emphasis-200);
  border-radius: var(--ifm-global-radius);
  color: inherit;
  padding: 1.5rem;
  display: flex;
  gap: calc(4rem + 2px);
}

@media (min-width: 997px) {
  .libraries__card--wide {
    grid-column: span 2;
  }
}

:root[data-theme='dark'] .libraries__card {
  box-shadow: none;
  background-color: var(--ifm-color-emphasis-100);
}

.libraries__logo {
  flex: 1 1 0;
  aspect-ratio: 1/1;
  width: calc(50% - 2rem - 1px);
  border-radius: var(--ifm-global-radius);
}

@media (max-width: 996px) {
  .libraries__logo {
    display: none;
  }
}

.libraries__section {
  flex: 1 1 0;
  aspect-ratio: 1/1;
  display: flex;
  flex-direction: column;
}

.libraries__head {
  display: flex;
  gap: 1rem;
}

.libraries__icon {
  display: flex;
  justify-content: center;
  align-items: center;
  background-color: var(--ifm-color-emphasis-200);
  border-radius: var(--ifm-global-radius);

  width: 4rem;
  height: 4rem;

  fill: var(--ifm-color-emphasis-900);
}

.libraries__icon > svg {
  width: 3rem;
  height: 3rem;
}

.libraries__header {
  margin-top: 1.5rem;
  margin-bottom: 0rem;
  text-transform: uppercase;
}

.libraries__features {
  flex-grow: 1;
  margin: 0;
  padding-left: 1rem;
  font-size: 0.8rem;
  white-space: nowrap;
}

.libraries__feature {
  margin: 0.5rem 0;
}

.nodes__feature > a,
.libraries__feature > a {
  color: var(--ifm-color-emphasis-700);
}

.nodes__feature > a,
.libraries__feature:hover > a {
  color: var(--ifm-color-emphasis-700);
}
.libraries__button {
  padding-top: 0.75rem;
  padding-bottom: 0.75rem;
}

.nodes__cards {
  display: grid;
  grid-template-columns: repeat(auto-fill, minmax(30rem, 1fr));
  gap: 1rem;
  width: 100%;
}

@media (max-width: 996px) {
  .nodes__cards {
    grid-template-columns: repeat(auto-fill, minmax(20rem, 1fr));
  }
}

.nodes__card {
  background-color: var(--ifm-color-emphasis-0);
  box-shadow: 0 0 5px 0 var(--ifm-color-emphasis-300);
  border: 1px solid var(--ifm-color-emphasis-200);
  border-radius: var(--ifm-global-radius);
  color: inherit;
  padding: 1.5rem;
  display: flex;
  gap: 1.5rem;
}

:root[data-theme='dark'] .nodes__card {
  box-shadow: none;
  background-color: var(--ifm-color-emphasis-100);
}

.nodes__icon {
  flex: 0 0 30%;
  display: flex;
  justify-content: center;
  align-items: center;
  background-color: var(--ifm-color-emphasis-200);
  border-radius: var(--ifm-global-radius);
  fill: var(--ifm-color-emphasis-900);
}

.nodes__icon > svg {
  width: 6rem;
}

@media (max-width: 996px) {
  .nodes__icon {
    display: none;
  }
}

.nodes__section {
  flex: 1 1 0;
  display: flex;
  flex-direction: column;
}

.nodes__header {
  margin: 0;
  text-transform: uppercase;
}

.nodes__features {
  flex: 1 1 0;
  margin: 1rem 0;
  padding-left: 1rem;
  font-size: 0.8rem;
  white-space: nowrap;
}

.nodes__feature {
  margin: 0.5rem 0;
}

.nodes__button {
  padding-top: 0.75rem;
  padding-bottom: 0.75rem;
}

.resources__cards {
  display: grid;
  grid-template-columns: repeat(
    auto-fill,
    minmax(max(18rem, calc(50% - 0.75rem)), 1fr)
  );
  gap: 1rem;
  width: 100%;
}

.resources__card {
  background-color: var(--ifm-color-emphasis-0);
  box-shadow: 0 0 5px 0 var(--ifm-color-emphasis-300);
  border: 1px solid var(--ifm-color-emphasis-200);
  border-radius: var(--ifm-global-radius);
  color: inherit;
  padding: 1.5rem;
  display: flex;
  flex-direction: column;
  cursor: pointer;
  transition: border-color 200ms ease-in-out;
}

.resources__card:hover {
  color: inherit;
  border-color: var(--ifm-color-emphasis-400);
}

:root[data-theme='dark'] .resources__card {
  box-shadow: none;
  background-color: var(--ifm-color-emphasis-100);
}

.resources__card--logo:hover {
  color: inherit;
  border-color: var(--ifm-color-emphasis-200);
}

@media (max-width: 996px) {
  .resources__card--logo {
    display: none;
  }
}

.resources__icon {
  display: flex;
  justify-content: center;
  align-items: center;
  background-color: var(--ifm-color-emphasis-200);
  border-radius: var(--ifm-global-radius);
  fill: var(--ifm-color-emphasis-900);
  width: 3rem;
  height: 3rem;
}

.resources__icon > svg {
  width: 3rem;
}

.resources__header {
  margin-top: 1rem;
  margin-bottom: 0.5rem;
}

.resources__description {
  flex-grow: 1;
  margin: 0;
  font-size: 0.8rem;
}

.further__cards {
  display: grid;
  grid-template-columns: repeat(auto-fill, minmax(18rem, 1fr));
  gap: 1rem;
  width: 100%;
}

@media (max-width: 996px) {
  .further__cards {
    width: unset;
    max-width: 100%;
    grid-template-columns: repeat(auto-fill, 20rem);
  }
}

.further__card {
  aspect-ratio: 1/1;
  display: flex;
  justify-content: center;
  align-items: center;
  box-shadow: 0 0 5px 0 var(--ifm-color-emphasis-300);
  border: 1px solid var(--ifm-color-emphasis-200);
  border-radius: var(--ifm-global-radius);
  color: inherit;
  cursor: pointer;
  transition: border-color 200ms ease-in-out;
}

.further__card:hover {
  color: inherit;
  border-color: var(--ifm-color-emphasis-400);
}

:root[data-theme='dark'] .further__card {
  box-shadow: none;
}

.further__section {
  display: flex;
  flex-direction: column;
  align-items: center;
  width: 16rem;
  height: 12rem;
}

.further__icon {
  color: inherit;
  display: flex;
  justify-content: center;
  align-items: center;
  background-color: var(--ifm-color-emphasis-200);
  border-radius: var(--ifm-global-radius);
  fill: var(--ifm-color-emphasis-900);
  width: 4rem;
  height: 4rem;
}

.further__icon > svg {
  width: 3rem;
  height: 3rem;
}

.further__header {
  margin-top: 1.5rem;
  margin-bottom: 1rem;
  text-align: center;
}

.further__description {
  text-align: center;
  margin: 0;
  font-size: 0.8rem;
}

.about__card:hover,
.resources__card:hover,
.further__card:hover {
  text-decoration: none;
}

@media (max-width: 996px) {
  .spaceholder__card__img {
    display: none;
  }
}
@ -1,5 +1,4 @@
import React from 'react';
import HomeLayout from '@site/src/components/HomeLayout';

export default function Home() {
  return <p/>
BIN docs/static/img/module/model_list.png vendored (new file, 125 KiB)
BIN docs/static/img/module/model_stop.png vendored (new file, 144 KiB)
BIN docs/static/img/module/model_stopped.png vendored (new file, 142 KiB)
BIN docs/static/img/module/model_use.png vendored (new file, 441 KiB)
BIN docs/static/img/module/model_vicuna-7b-1.5.png vendored (new file, 105 KiB)
BIN docs/static/img/module/model_vicuna_deployed.png vendored (new file, 95 KiB)
BIN docs/static/img/module/smmf_layer.png vendored (new file, 53 KiB)
@ -0,0 +1 @@
PUT ZIP PLUGINs TO THIS DIR
@ -9,7 +9,7 @@ pytest-mock
pytest-recording
pytesseract==0.3.10
aioresponses
# python code format
# python code format, usage `black .`
black
# for git hooks
pre-commit