Mirror of https://github.com/hpcaitech/ColossalAI.git, synced 2025-05-03 22:18:23 +00:00
ColossalAI/colossalai/tensor
Latest commit: ccabcf6485 by Hongxin Liu, 2024-08-07 18:21:08 +08:00

[fp8] support fp8 amp for hybrid parallel plugin ()

* [fp8] support fp8 amp for hybrid parallel plugin
* [test] add fp8 hook test
* [fp8] fix fp8 linear compatibility
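For context on the fp8 AMP change above: the idea is to intercept parameter access so the matmul path sees fp8-precision values. The sketch below is a minimal, hypothetical illustration in plain PyTorch (float8 dtypes require PyTorch >= 2.1) that round-trips a Linear weight through `torch.float8_e4m3fn` in a forward pre-hook to emulate fp8 rounding; it is not ColossalAI's actual hook mechanism from `param_op_hook.py`.

```python
# Hypothetical sketch only -- NOT ColossalAI's param_op_hook API.
# Emulates fp8 AMP by round-tripping a Linear weight through
# float8_e4m3fn before each forward (compute itself stays in fp32).
import torch
import torch.nn as nn

def fp8_weight_pre_hook(module: nn.Linear, args):
    # Quantize the weight to fp8 and dequantize back, so the forward
    # sees the same rounding an fp8 matmul would introduce.
    with torch.no_grad():
        w8 = module.weight.to(torch.float8_e4m3fn)
        module.weight.copy_(w8.to(module.weight.dtype))

linear = nn.Linear(16, 16)
linear.register_forward_pre_hook(fp8_weight_pre_hook)
out = linear(torch.randn(4, 16))
print(out.shape)  # torch.Size([4, 16])
```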
Name                  Last commit                                          Last updated
d_tensor              [FP8] rebase main ()                                 2024-08-06 16:29:37 +08:00
moe_tensor            [MoE/ZeRO] Moe refactor with zero refactor ()        2024-06-28 14:00:08 +08:00
padded_tensor         [shardformer] refactor embedding resize ()           2024-04-18 16:10:18 +08:00
__init__.py           [misc] update pre-commit and run all files ()        2023-09-19 14:20:26 +08:00
colo_parameter.py     [fp8] support fp8 amp for hybrid parallel plugin ()  2024-08-07 18:21:08 +08:00
colo_tensor.py        [misc] update pre-commit and run all files ()        2023-09-19 14:20:26 +08:00
comm_spec.py          fix some typo ()                                     2024-01-25 13:56:27 +08:00
param_op_hook.py      [fp8] support fp8 amp for hybrid parallel plugin ()  2024-08-07 18:21:08 +08:00
shape_consistency.py  [misc] update pre-commit and run all files ()        2023-09-19 14:20:26 +08:00
sharding_spec.py      [FP8] rebase main ()                                 2024-08-06 16:29:37 +08:00
utils.py              [misc] update pre-commit and run all files ()        2023-09-19 14:20:26 +08:00