Mirror of https://github.com/hpcaitech/ColossalAI.git, synced 2026-01-17 16:08:30 +00:00
ColossalAI/extensions/pybind at commit 936dd96dbb6a9a4c1d167d795d4494b45edb4f5a
Latest commit: 7806842f2d by Steve Luo, "add paged-attetionv2: support seq length split across thread block" (#5707), 2024-05-14 12:46:54 +08:00
Directory contents:

cpu_adam         [Inference/Refactor] Refactor compilation mechanism and unified multi hw (#5613)   2024-04-24 14:17:54 +08:00
flash_attention  [Inference/Refactor] Refactor compilation mechanism and unified multi hw (#5613)   2024-04-24 14:17:54 +08:00
inference        add paged-attetionv2: support seq length split across thread block (#5707)         2024-05-14 12:46:54 +08:00
layernorm        [Inference/Refactor] Refactor compilation mechanism and unified multi hw (#5613)   2024-04-24 14:17:54 +08:00
moe              [Inference/Refactor] Refactor compilation mechanism and unified multi hw (#5613)   2024-04-24 14:17:54 +08:00
optimizer        [Inference/Refactor] Refactor compilation mechanism and unified multi hw (#5613)   2024-04-24 14:17:54 +08:00
softmax          [Inference/Refactor] Refactor compilation mechanism and unified multi hw (#5613)   2024-04-24 14:17:54 +08:00
__init__.py      [Inference/Refactor] Refactor compilation mechanism and unified multi hw (#5613)   2024-04-24 14:17:54 +08:00