mirror of https://github.com/hpcaitech/ColossalAI.git synced 2026-01-19 09:04:50 +00:00
ColossalAI/colossalai/nn
Commit: 20da6e48c8ebd7e17f68308b0bc45e977ff80471
Latest commit: Jiarui Fang 4a76084dc9 [tensor] add zero_like colo op, important for Optimizer (#1236), 2022-07-08 14:55:27 +08:00
Name          Last commit message                                                                  Date
_ops          [tensor] add zero_like colo op, important for Optimizer (#1236)                      2022-07-08 14:55:27 +08:00
graph         [refactor] move process group from _DistSpec to ColoTensor. (#1203)                  2022-07-06 16:15:16 +08:00
layer         [pipeline] refactor the pipeline module (#1087)                                      2022-06-10 11:27:38 +08:00
loss          [tensor] add unitest for colo_tensor 1DTP cross_entropy (#1230)                      2022-07-07 19:17:23 +08:00
lr_scheduler  [checkpoint] support generalized scheduler (#1222)                                   2022-07-07 18:16:38 +08:00
metric        [hotfix] Raise messages for indivisible batch sizes with tensor parallelism (#622)   2022-04-02 16:12:04 +08:00
optimizer     [optim] refactor fused sgd (#1134)                                                   2022-06-20 11:19:38 +08:00
parallel      [refactor] move process group from _DistSpec to ColoTensor. (#1203)                  2022-07-06 16:15:16 +08:00
__init__.py   [pipeline] refactor the pipeline module (#1087)                                      2022-06-10 11:27:38 +08:00
init.py       Refactored docstring to google style                                                 2022-03-29 17:17:47 +08:00