mirror of https://github.com/hpcaitech/ColossalAI.git synced 2025-05-04 22:48:15 +00:00
ColossalAI/tests/test_fp8
Latest commit: ccabcf6485 by Hongxin Liu, 2024-08-07 18:21:08 +08:00
[fp8] support fp8 amp for hybrid parallel plugin ()

* [fp8] support fp8 amp for hybrid parallel plugin
* [test] add fp8 hook test
* [fp8] fix fp8 linear compatibility
File                            Last commit message                                   Date
test_all_to_all_single.py       [fp8] support all2all fp8 ()                          2024-08-06 16:58:23 +08:00
test_fp8_all_to_all_single.py   [Feature] llama shardformer fp8 support ()            2024-08-05 10:05:47 +08:00
test_fp8_all_to_all.py          [Feature] llama shardformer fp8 support ()            2024-08-05 10:05:47 +08:00
test_fp8_allgather_flat.py      [Feature] llama shardformer fp8 support ()            2024-08-05 10:05:47 +08:00
test_fp8_allreduce.py           [Feature] llama shardformer fp8 support ()            2024-08-05 10:05:47 +08:00
test_fp8_gather.py              [fp8] support fp8 amp for hybrid parallel plugin ()   2024-08-05 10:05:47 +08:00
test_fp8_hook.py                [fp8] support fp8 amp for hybrid parallel plugin ()   2024-08-07 18:21:08 +08:00
test_fp8_linear.py              [fp8] add fp8 linear ()                               2024-08-07 15:41:49 +08:00
test_fp8_reduce_scatter.py      [Feature] llama shardformer fp8 support ()            2024-08-05 10:05:47 +08:00
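The tests above exercise fp8 (E4M3) quantization of tensors before collective communication (all-to-all, all-gather, all-reduce, reduce-scatter). As a rough illustration of the numeric format involved, here is a hedged, pure-Python sketch of per-tensor scaled E4M3 quantization; it is not ColossalAI code (which operates on CUDA tensors via PyTorch), and the function names are invented for this example:

```python
import math

# E4M3 "fn" variant, as used by fp8 kernels: 1 sign bit, 4 exponent bits,
# 3 mantissa bits, exponent bias 7, no infinities, max finite value 448.
# Pure-Python illustration only -- not the ColossalAI implementation.

E4M3_MAX = 448.0

def fp8_e4m3_decode(byte: int) -> float:
    """Decode one E4M3 byte into a Python float."""
    sign = -1.0 if byte & 0x80 else 1.0
    exp = (byte >> 3) & 0xF
    man = byte & 0x7
    if exp == 0:                       # subnormal range
        return sign * (man / 8.0) * 2.0 ** -6
    if exp == 0xF and man == 0x7:      # e4m3fn reserves only this code for NaN
        return float("nan")
    return sign * (1.0 + man / 8.0) * 2.0 ** (exp - 7)

def fp8_e4m3_encode(x: float) -> int:
    """Round x to the nearest representable E4M3 value (brute force; sketch only)."""
    x = max(-E4M3_MAX, min(E4M3_MAX, x))   # saturate: e4m3fn has no infinities
    return min((b for b in range(256) if not math.isnan(fp8_e4m3_decode(b))),
               key=lambda b: abs(fp8_e4m3_decode(b) - x))

def quantize(values):
    """Per-tensor scaling: fit the largest magnitude to E4M3_MAX, then encode."""
    amax = max((abs(v) for v in values), default=0.0)
    scale = amax / E4M3_MAX if amax > 0 else 1.0
    return [fp8_e4m3_encode(v / scale) for v in values], scale

def dequantize(encoded, scale):
    """Decode the bytes and undo the per-tensor scale."""
    return [fp8_e4m3_decode(b) * scale for b in encoded]
```

With 3 mantissa bits, a round trip carries at most about 6.25% relative error, which is why fp8 communication tests typically compare results against loose tolerances rather than exact equality.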