mirror of https://github.com/hpcaitech/ColossalAI.git synced 2025-05-06 23:48:26 +00:00
ColossalAI/colossalai/nn
FoolPlayer 21a3915c98 [shardformer] add Dropout layer support different dropout pattern ()
* add dropout layer, add dropout test

* modify seed manager as context manager

* add a copy of col_nn.layer

* add dist_crossentropy loss; separate module test

* polish the code

* fix dist crossentropy loss
2023-06-08 15:01:34 +08:00
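The commit notes above mention reworking the seed manager as a context manager so that the Dropout layer can reproduce (or deliberately vary) its dropout pattern per parallel rank. As a rough illustration of that pattern only — the names below are hypothetical and not ColossalAI's actual API — a seed-scoped context manager can swap the global RNG state in and restore it on exit:

```python
import random
from contextlib import contextmanager

@contextmanager
def seeded_rng(seed):
    """Hypothetical sketch: temporarily switch the global RNG to a
    given seed, restoring the previous RNG state on exit. This mirrors
    the 'seed manager as context manager' idea from the commit log,
    using Python's stdlib RNG rather than the real implementation."""
    saved = random.getstate()
    random.seed(seed)
    try:
        yield
    finally:
        random.setstate(saved)

def dropout_mask(n, p):
    # Bernoulli keep/drop mask drawn from the (seeded) global RNG.
    return [0 if random.random() < p else 1 for _ in range(n)]

# Ranks seeded differently get different dropout patterns,
# while re-entering with the same seed reproduces the same mask.
with seeded_rng(0):
    mask_rank0 = dropout_mask(16, 0.5)
with seeded_rng(1):
    mask_rank1 = dropout_mask(16, 0.5)
with seeded_rng(0):
    mask_rank0_again = dropout_mask(16, 0.5)
```

In a tensor-parallel setting the same mechanism lets each rank draw an independent mask (different seeds) or an identical one (shared seed), which is presumably what "support different dropout pattern" refers to.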
_ops          [doc] Fix typo under colossalai and doc ()                             2023-04-26 11:38:43 +08:00
layer         [shardformer] add Dropout layer support different dropout pattern ()   2023-06-08 15:01:34 +08:00
loss          [nfc] fix typo colossalai/nn ()                                        2023-06-05 16:04:27 +08:00
lr_scheduler  [NFC] polish colossalai/nn/lr_scheduler/linear.py code style ()        2022-10-19 12:20:51 +08:00
metric        [NFC] polish colossalai/nn/metric/_utils.py code style ()              2022-10-19 12:20:51 +08:00
optimizer     [nfc]fix typo colossalai/pipeline tensor nn ()                         2023-06-06 14:07:36 +08:00
parallel      [nfc] fix typo colossalai/nn ()                                        2023-06-05 16:04:27 +08:00
__init__.py   [kernel] added jit warmup ()                                           2022-11-08 16:22:23 +08:00
init.py       [NFC] polish colossalai/nn/init.py code style ()                       2022-07-13 10:51:55 +08:00