xuyuan23 | 572c336b81 | 2023-07-27 11:03:29 +08:00
feat: Adjusting the proxy framework.
Adjust the proxy framework: chatgpt_proxyllm is the default (proxyllm can be used instead), and any new LLM proxy should be configured in the 'xxx_proxyllm' format.

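Editor's note on the 'xxx_proxyllm' convention in the commit above: a minimal, hypothetical Python sketch of how a configured model name could be routed to a proxy endpoint under that naming scheme. The function name, registry, and endpoints below are illustrative assumptions, not DB-GPT's actual implementation.

# Hypothetical sketch of the 'xxx_proxyllm' naming convention; not DB-GPT's real code.
PROXY_ENDPOINTS = {
    "chatgpt": "https://api.openai.com/v1/chat/completions",
    "bard": "https://bard.example/api",  # placeholder endpoint for illustration
}

def resolve_proxy(llm_model: str) -> str:
    """Map a model name in 'xxx_proxyllm' format to its proxy endpoint."""
    if llm_model == "proxyllm":              # legacy name, treated as the chatgpt default
        provider = "chatgpt"
    elif llm_model.endswith("_proxyllm"):
        provider = llm_model[: -len("_proxyllm")]
    else:
        raise ValueError(f"{llm_model!r} does not follow the 'xxx_proxyllm' format")
    return PROXY_ENDPOINTS[provider]

print(resolve_proxy("chatgpt_proxyllm"))     # resolves to the chatgpt proxy endpoint
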
xuyuan23 | d95726c04a | 2023-07-26 16:36:26 +08:00
feat: add Bard LLM proxy
Extends the proxy framework by adding a proxy access method for the Bard LLM.
Closes #369

FangYin Cheng | f234b30f7a | 2023-07-21 20:51:53 +08:00
feat: Multi-model prompt adaptation

FangYin Cheng | 168c754a3f | 2023-07-20 21:44:40 +08:00
feat: Support llama-2 model

FangYin Cheng | accce56d49 | 2023-07-13 18:11:41 +08:00
fix pylint

qutcat1997 | bc08285356 | 2023-07-11 23:06:41 +08:00
Update config.py
Modify config

tuyang.yhj | e8c61c29e2 | 2023-07-04 16:50:49 +08:00
WEB API independent

aries_ckt | d95646d8b2 | 2023-06-30 15:39:54 +08:00
feat: knowledge management
1. merge main branch
2. rebuild knowledge management
3. static js

tuyang.yhj | caa1a41065 | 2023-06-28 11:40:22 +08:00
Merge branch 'llm_framework' into dev_ty_06_end
# Conflicts:
# pilot/memory/chat_history/duckdb_history.py
# pilot/openapi/api_v1/api_v1.py
# pilot/scene/base.py
# pilot/scene/base_chat.py
# pilot/scene/chat_execution/example.py

tuyang.yhj | 820214faa8 | 2023-06-28 11:34:40 +08:00
WEB API independent

aries_ckt | 682b1468d1 | 2023-06-27 22:20:21 +08:00
style: format code

magic.chen | aa4115ef67 | 2023-06-25 23:29:43 +08:00
Update model_config.py

csunny | 192b289480 | 2023-06-25 23:05:50 +08:00
feat: add chatglm2-6b support

tuyang.yhj | d372e73cd5 | 2023-06-25 14:46:46 +08:00
WEB API independent

aries-ckt | c32f3f1766 | 2023-06-19 16:58:24 +08:00
style: code format.

aries-ckt | 3af16cf792 | 2023-06-19 16:44:18 +08:00
fix: Weaviate document format.
1. similar search: docs format
2. conf SUMMARY_CONFIG

aries-ckt | 8299f5e0fa | 2023-06-19 09:56:54 +08:00
feat: integrate Weaviate vector database in DB-GPT.
1. Weaviate default schema update
2. Weaviate database config
3. requirement

LBYPatrick | 0b1d54afa6 | 2023-06-16 12:23:28 +08:00
chore: run black against modified code

LBYPatrick | 1580aaa8eb | 2023-06-16 11:45:42 +08:00
chore: bump requirement.txt for guanaco support

aries-ckt | d07d103db9 | 2023-06-14 22:22:27 +08:00
fix: format

yhjun1026 | ddd8e7a8c5 | 2023-06-14 21:53:08 +08:00
Merge branch 'dbgpt_doc' into ty_test
# Conflicts:
# pilot/common/plugins.py

yhjun1026 | d784e7e00a | 2023-06-14 21:40:08 +08:00
close auto load plugin

yhjun1026 | b2ec087322 | 2023-06-14 21:28:06 +08:00
close auto load plugin

xuyuan23 | 6a5b0170ba | 2023-06-14 17:55:32 +08:00
Merge branch 'dbgpt_doc' into feature-xuyuan-openai-proxy

xuyuan23 | 7b77360286 | 2023-06-14 17:24:31 +08:00
add use cases: tool_use_with_plugin, and how to write a plugin.

aries-ckt | 333aad7bc4 | 2023-06-14 12:37:45 +08:00
fix: default chunk size

csunny | 4f82cfde63 | 2023-06-14 10:17:53 +08:00
pylint: multi model for gpt4all (#138)

csunny | 0f4569e5c8 | 2023-06-14 10:14:13 +08:00
Merge branch 'dbgpt_doc' into llm_fxp

magic.chen | c67d0ab4d0 | 2023-06-13 23:54:19 +08:00
fix: default chunk size (#205)

aries-ckt | 61e71ed38b | 2023-06-13 23:07:26 +08:00
fix: default chunk size

csunny | 77d2abba5d | 2023-06-13 21:41:18 +08:00
fix: conflicts

yhjun1026 | 18bacbd7f7 | 2023-06-13 20:33:34 +08:00
Fix plugin mode bug; optimize the parsing logic for model responses

xuyuan23 | 6a8ee91834 | 2023-06-13 15:58:24 +08:00
add plugin_env file, define plugin config strategy.

yhjun1026 | 4e5ce4d98b | 2023-06-13 15:17:06 +08:00
Merge remote-tracking branch 'origin/feature-xuyuan-openai-proxy' into ty_test
# Conflicts:
# pilot/model/llm_out/proxy_llm.py

yhjun1026 | 4690ac26ea | 2023-06-13 15:14:50 +08:00
fix

xuyuan23 | 6e3b48c7c4 | 2023-06-12 21:29:16 +08:00
Support launching llmserver with multiple processes; add an OpenAI proxy API.

yhjun1026 | 85176111b9 | 2023-06-12 16:57:20 +08:00
Merge branch 'llm_fxp' into ty_test

ykgong | 7136aa748d | 2023-06-09 13:57:47 +08:00
fix model key

ykgong | 545c323216 | 2023-06-09 11:35:06 +08:00
add gpt4all

csunny | 716460460f | 2023-06-08 21:33:22 +08:00
feature: gorilla support (#173)

yhjun1026 | 58295a415a | 2023-06-08 20:55:19 +08:00
Merge branch 'dev' into ty_test

yhjun1026 | 8851ab9d45 | 2023-06-08 20:08:51 +08:00
default scene change

csunny | e918687705 | 2023-06-08 20:07:36 +08:00
fix: remove qlora config from model_config

csunny | 0948bc45bc | 2023-06-08 17:35:17 +08:00
fix: gorilla chat adapter and config

luchun | a644d3ac6c | 2023-06-08 15:00:19 +08:00
Merge branch 'llm_fxp' into falcon

zhanghy-sketchzh | bb9081e00f | 2023-06-08 13:37:48 +08:00
Add quantize_qlora support for falcon

zhanghy-sketchzh | b357fd9d0c | 2023-06-08 12:17:13 +08:00
Add support for falcon

zhanghy-sketchzh | c2bfab11e0 | 2023-06-07 12:05:33 +08:00
support gorilla

csunny | 4d3079055c | 2023-06-05 22:29:40 +08:00
fix: merge (#153)

aries-ckt | e29fa37cde | 2023-06-05 18:08:55 +08:00
update: knowledge env