Author | Commit | Message | Date
BurnCloud.com | d22023c148 | ✨ feat: add BurnCloud as a new AI model provider (#2890) | 2025-09-10 18:58:25 +08:00
  Co-authored-by: Claude <noreply@anthropic.com>
Fangyin Cheng | ea6cee49ab | chore: Fix tongyi config example (#2884) | 2025-08-21 18:09:28 +08:00
Dmitry | 845432fea0 | feat(model): AI/ML API integration (#2844) | 2025-07-15 13:17:14 +08:00
Fangyin Cheng | d9d4d4b6bc | feat(model): Support MLX inference (#2781) | 2025-06-19 09:30:58 +08:00
Fangyin Cheng | bb947b6af7 | feat(model): Support Qwen3 embeddings (#2772) | 2025-06-14 23:34:48 +08:00
Fangyin Cheng | 2abd68d6c0 | feat(model): Support Qwen3 models (#2664) | 2025-04-29 09:55:28 +08:00
paxionfruit | 445076b433 | feat: add model provider InfiniAI (#2653) | 2025-04-27 16:22:31 +08:00
  Co-authored-by: yaozhuyu <yaozhuyu@infini-ai.com>
ethan | 12170e2504 | feat(model): Add the siliconflow embedding proxy (#2603) | 2025-04-11 16:41:51 +08:00
  Co-authored-by: 9527 <9527@163.com>
  Co-authored-by: Fangyin Cheng <staneyffer@gmail.com>
Fangyin Cheng | 285c5e5d59 | chore: Add app config example | 2025-03-17 16:22:27 +08:00
Fangyin Cheng | 9f2e747698 | fix: Fix retrieve error | 2025-03-17 15:38:31 +08:00
Fangyin Cheng | b2dd66dc6d | feat(storage): Support OSS and S3 | 2025-03-17 15:38:31 +08:00
Aries-ckt | 0d48f37ed9 | deps(tongyi): fix tongyi dependencies and add tongyi proxy config (#2467) | 2025-03-17 15:17:12 +08:00
  # Description
  deps(tongyi): fix tongyi dependencies and add tongyi proxy config
  # How Has This Been Tested?
  ```bash
  uv sync --all-packages --frozen --extra "base" --extra "proxy_tongyi" --extra "rag" --extra "storage_chromadb" --extra "dbgpts"
  ```
  ```bash
  uv run python packages/dbgpt-app/src/dbgpt_app/dbgpt_server.py --config .\configs\dbgpt-proxy-tongyi.toml
  ```
  (A hedged sketch of the tongyi proxy config block referenced here appears after this commit list.)
yyhhyyyyyy | 2ea3521114 | fix: add tongyi api key config | 2025-03-17 14:52:32 +08:00
Aries-ckt | fc3fe6b725 | refactor: RAG storage refactor (#2434) | 2025-03-17 14:15:21 +08:00
yyhhyyyyyy | 6b8e4b539c | fix: update API provider from proxy/openai to proxy/tongyi in configuration file | 2025-03-17 08:50:42 +08:00
yyhhyyyyyy | 9483cba353 | deps(tongyi): fix tongyi dependencies and add tongyi proxy config | 2025-03-14 16:59:50 +08:00
yyhhyyyyyy | b25f6920d1 | deps(tongyi): fix tongyi dependencies and add tongyi proxy config | 2025-03-14 16:48:29 +08:00
Fangyin Cheng | 455cdd1612 | feat(build): Support Docker install | 2025-03-12 10:24:22 +08:00
yyhhyy | bb06e93215 | refactor(ollama): add ollama config and support ollama model output (#2411) | 2025-03-07 18:23:52 +08:00
SonglinLyu | 3bd75d8de2 | refactor(GraphRAG): refine config usage and fix some bugs (#2392) | 2025-03-06 03:09:10 +08:00
  Co-authored-by: 秉翟 <lyusonglin.lsl@antgroup.com>
  Co-authored-by: Fangyin Cheng <staneyffer@gmail.com>
  Co-authored-by: yyhhyyyyyy <95077259+yyhhyyyyyy@users.noreply.github.com>
  Co-authored-by: aries_ckt <916701291@qq.com>
Fangyin Cheng | c8e252c4de | fix(model): Fix reasoning output bug | 2025-03-04 17:51:13 +08:00
yyhhyy | e3a25de7f7 | docs: add vllm llama_cpp docs and standardize configs (#2386) | 2025-03-03 16:59:07 +08:00
Fangyin Cheng | 82bdc6fe94 | feat(model): Support reasoning model (#2375) | 2025-02-28 14:32:47 +08:00
  Co-authored-by: yyhhyyyyyy <95077259+yyhhyyyyyy@users.noreply.github.com>
Fangyin Cheng | 1e0674c54d | fix(model): Fix local embedding error (#2371) | 2025-02-26 23:51:25 +08:00
Aries-ckt | 22598ca79f | refactor: adapt RAG storage and add integration documents (#2361) | 2025-02-24 12:49:36 +08:00
Fangyin Cheng | e4b329ee21 | refactor(v0.7.0): restructure modules and config handling (#2358) | 2025-02-21 19:54:53 +08:00
  Co-authored-by: aries_ckt <916701291@qq.com>
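Several commits in this list revolve around `configs/dbgpt-proxy-tongyi.toml`: PR #2467 adds the tongyi proxy config, 2ea3521114 adds the API key field, and 6b8e4b539c switches the provider from proxy/openai to proxy/tongyi. For orientation only, here is a minimal, hypothetical sketch of what such a model block might look like; the model name, key placeholder, and surrounding table layout are illustrative assumptions rather than the repository's actual schema.

```toml
# Hypothetical sketch of a Tongyi proxy model entry for a config file like
# configs/dbgpt-proxy-tongyi.toml. The provider string "proxy/tongyi" and the
# presence of an api_key field come from the commits above; everything else
# is an illustrative assumption.
[models]

[[models.llms]]
name = "qwen-plus"              # illustrative model name, not from the commits
provider = "proxy/tongyi"       # provider string referenced in commit 6b8e4b539c
api_key = "your-tongyi-api-key" # key field added in commit 2ea3521114
```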