Mirror of https://github.com/hpcaitech/ColossalAI.git, synced 2026-04-30 11:53:09 +00:00
ColossalAI/colossalai/zero/gemini at commit 30a94431323d71c5ef06bd4b7f047aced3312fdf
Latest commit: a15ab139ad "[plugin] support get_grad_norm (#6115)" by Hongxin Liu, 2024-11-05 18:12:47 +08:00
chunk/               [fp8] Disable all_gather intranode. Disable Redundant all_gather fp8 (#6059)                 2024-09-14 10:40:01 +08:00
memory_tracer/       [misc] fit torch api upgradation and remove legecy import (#6093)                            2024-10-18 16:48:52 +08:00
__init__.py          [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088)  2023-11-28 16:54:42 +08:00
gemini_ddp.py        [checkpointio] fix hybrid plugin model save (#6106)                                          2024-10-31 17:04:53 +08:00
gemini_hook.py       [gemini] quick fix on possible async operation (#5803)                                       2024-06-13 10:35:17 +08:00
gemini_mgr.py        [chore] remove unnecessary assert since compute list might not be recorded                   2024-05-28 05:16:02 +00:00
gemini_optimizer.py  [plugin] support get_grad_norm (#6115)                                                       2024-11-05 18:12:47 +08:00
placement_policy.py  [misc] fit torch api upgradation and remove legecy import (#6093)                            2024-10-18 16:48:52 +08:00
utils.py             [npu] change device to accelerator api (#5239)                                               2024-01-09 10:20:05 +08:00