github/ColossalAI
Mirror of https://github.com/hpcaitech/ColossalAI.git, synced 2026-01-15 23:23:11 +00:00
ColossalAI/colossalai/nn/layer at bdef9dfdbee1f7f2d4a8e830a6b4bfb791fc67e9
Latest commit: f8a7148dec by ver217, "[kernel] move all symlinks of kernel to colossalai._C (#1971)", 2022-11-17 13:42:33 +08:00
Name | Last commit | Last updated
colossalai_layer | added skip_bias_add for non-tp linear | 2022-11-09 15:41:08 +08:00
moe | [kernel] move all symlinks of kernel to colossalai._C (#1971) | 2022-11-17 13:42:33 +08:00
parallel_1d | [tensorparallel] fixed tp layers (#1938) | 2022-11-14 17:34:03 +08:00
parallel_2d | [utils] refactor parallel layers checkpoint and bcast model on loading checkpoint (#1548) | 2022-09-06 20:18:35 +08:00
parallel_2p5d | [utils] refactor parallel layers checkpoint and bcast model on loading checkpoint (#1548) | 2022-09-06 20:18:35 +08:00
parallel_3d | [tensorparallel] fixed tp layers (#1938) | 2022-11-14 17:34:03 +08:00
parallel_sequence | [NFC] polish colossalai/nn/layer/parallel_sequence/layers.py code style (#1280) | 2022-07-13 12:08:21 +08:00
utils | [NFC] polish colossalai/nn/layer/utils/common.py code style (#983) | 2022-05-17 10:25:06 +08:00
vanilla | added skip_bias_add for non-tp linear | 2022-11-09 15:41:08 +08:00
wrapper | [NFC] polish colossalai/nn/layer/wrapper/pipeline_wrapper.py code style (#1303) | 2022-07-13 19:01:07 +08:00
__init__.py | [MOE] changed parallelmode to dist process group (#460) | 2022-03-19 13:46:29 +08:00
base_layer.py | [utils] refactor parallel layers checkpoint and bcast model on loading checkpoint (#1548) | 2022-09-06 20:18:35 +08:00