Mirror of https://github.com/hpcaitech/ColossalAI.git (synced 2026-04-25 09:12:10 +00:00)
Path: ColossalAI/colossalai/booster/mixed_precision
Commit: ac178ca5c17ca751ff9df38be81da2b4a005fc0d
Latest commit: b366f1d99f by Jianghai, 2023-07-26 14:12:57 +08:00
[NFC] Fix format for mixed precision (#4253)
* [NFC] polish colossalai/booster/mixed_precision/mixed_precision_base.py code style
File                     Last commit                                                             Date
__init__.py              [amp] Add naive amp demo (#3774)                                        2023-05-18 16:33:14 +08:00
bf16.py                  [booster] implemented mixed precision class (#3151)                     2023-03-17 11:00:15 +08:00
fp8.py                   [booster] implemented mixed precision class (#3151)                     2023-03-17 11:00:15 +08:00
fp16_apex.py             [API] add docstrings and initialization to apex amp, naive amp (#3783)  2023-05-23 15:17:24 +08:00
fp16_naive.py            [API] add docstrings and initialization to apex amp, naive amp (#3783)  2023-05-23 15:17:24 +08:00
fp16_torch.py            [booster] make optimizer argument optional for boost (#3993)            2023-06-15 17:38:42 +08:00
mixed_precision_base.py  [NFC] Fix format for mixed precision (#4253)                            2023-07-26 14:12:57 +08:00
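The file names and commit messages above suggest a common base class (`mixed_precision_base.py`) with one subclass per precision mode (fp16, bf16, fp8), where the optimizer argument became optional per #3993. The following is a minimal hypothetical sketch of such an interface; the class and method names here are assumptions for illustration, not ColossalAI's actual API.

```python
from abc import ABC, abstractmethod

class MixedPrecision(ABC):
    """Hypothetical base class for mixed-precision strategies.

    Each precision variant (fp16, bf16, fp8, ...) would wrap the model
    and, optionally, the optimizer and criterion for low-precision training.
    """

    @abstractmethod
    def configure(self, model, optimizer=None, criterion=None):
        """Return the wrapped (model, optimizer, criterion) triple."""
        raise NotImplementedError

class NaiveFP16(MixedPrecision):
    # Toy stand-in: records the requested precision instead of actually
    # casting tensors, since this sketch avoids any framework dependency.
    precision = "fp16"

    def configure(self, model, optimizer=None, criterion=None):
        # Optimizer and criterion are optional, mirroring the
        # "make optimizer argument optional for boost" change (#3993)
        # noted in the listing above.
        return model, optimizer, criterion

mp = NaiveFP16()
model, opt, crit = mp.configure({"weights": [1.0, 2.0]})
```

A caller can thus configure a model alone for inference, or pass an optimizer as well for training, without changing the interface.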