mirror of https://github.com/hpcaitech/ColossalAI.git synced 2026-01-22 21:24:31 +00:00
Files
Path: ColossalAI/colossalai/nn
Commit: 42ab36b7620f8a8b286f237327344239fb3b25a3
Latest commit: HELSON 42ab36b762 [tensor] add unitest for colo_tensor 1DTP cross_entropy (#1230) (2022-07-07 19:17:23 +08:00)
Name          Last commit                                                                          Date
_ops          [tensor] add unitest for colo_tensor 1DTP cross_entropy (#1230)                      2022-07-07 19:17:23 +08:00
graph         [refactor] move process group from _DistSpec to ColoTensor. (#1203)                  2022-07-06 16:15:16 +08:00
layer         [pipeline] refactor the pipeline module (#1087)                                      2022-06-10 11:27:38 +08:00
loss          [tensor] add unitest for colo_tensor 1DTP cross_entropy (#1230)                      2022-07-07 19:17:23 +08:00
lr_scheduler  [checkpoint]support generalized scheduler (#1222)                                    2022-07-07 18:16:38 +08:00
metric        [hotfix] Raise messages for indivisible batch sizes with tensor parallelism (#622)   2022-04-02 16:12:04 +08:00
optimizer     [optim] refactor fused sgd (#1134)                                                   2022-06-20 11:19:38 +08:00
parallel      [refactor] move process group from _DistSpec to ColoTensor. (#1203)                  2022-07-06 16:15:16 +08:00
__init__.py   [pipeline] refactor the pipeline module (#1087)                                      2022-06-10 11:27:38 +08:00
init.py       Refactored docstring to google style                                                 2022-03-29 17:17:47 +08:00