ColossalAI (mirror of https://github.com/hpcaitech/ColossalAI.git, synced 2026-04-27 02:03:13 +00:00)
Path: ColossalAI/colossalai/nn at commit 4da256a584b0d391be90b49691b071549df36637
Latest commit: dbe62c67b8 by ver217, "add an example of ViT-B/16 and remove w_norm clipping in LAMB (#29)", 2021-11-18 23:45:09 +08:00
Name                 Last commit message                                                  Last commit date
data                 Migrated project                                                     2021-10-28 18:21:23 +02:00
layer                Support TP-compatible Torch AMP and Update trainer API (#27)         2021-11-18 19:45:06 +08:00
loss                 Migrated project                                                     2021-10-28 18:21:23 +02:00
lr_scheduler         Support TP-compatible Torch AMP and Update trainer API (#27)         2021-11-18 19:45:06 +08:00
model                Migrated project                                                     2021-10-28 18:21:23 +02:00
multi_tensor_apply   Migrated project                                                     2021-10-28 18:21:23 +02:00
optimizer            add an example of ViT-B/16 and remove w_norm clipping in LAMB (#29)  2021-11-18 23:45:09 +08:00
__init__.py          Migrated project                                                     2021-10-28 18:21:23 +02:00