Mirror of https://github.com/hpcaitech/ColossalAI.git (synced 2026-04-27 02:03:13 +00:00)
ColossalAI / colossalai / kernel / cuda_native @ eb5cf94332b2f6f230f486a6c9746fe79f771a96

History
Frank Lee · 918bc94b6b · [triton] added copyright information for flash attention (#2835) · 2023-02-21 11:25:57 +08:00
  * [triton] added copyright information for flash attention
  * polish code
Name                    Last commit                                                        Date
csrc                    [hotfix] fix error for torch 2.0 (#2243)                           2022-12-30 23:11:55 +08:00
__init__.py             [kernel] fixed repeated loading of kernels (#2549)                 2023-02-03 09:47:13 +08:00
flash_attention.py      [triton] added copyright information for flash attention (#2835)   2023-02-21 11:25:57 +08:00
layer_norm.py           [kernel] fixed repeated loading of kernels (#2549)                 2023-02-03 09:47:13 +08:00
multihead_attention.py  [setup] support pre-build and jit-build of cuda kernels (#2374)    2023-01-06 20:50:26 +08:00
scaled_softmax.py       [kernel] fixed repeated loading of kernels (#2549)                 2023-02-03 09:47:13 +08:00