mirror of https://github.com/hpcaitech/ColossalAI.git synced 2025-05-06 23:48:26 +00:00
ColossalAI/colossalai/checkpoint_io
Hongxin Liu b5f9e37c70
[legacy] clean up legacy code ()
* [legacy] remove outdated codes of pipeline ()

* [legacy] remove cli of benchmark and update optim ()

* [legacy] remove cli of benchmark and update optim

* [doc] fix cli doc test

* [legacy] fix engine clip grad norm

* [legacy] remove outdated colo tensor ()

* [legacy] remove outdated colo tensor

* [test] fix test import

* [legacy] move outdated zero to legacy ()

* [legacy] clean up utils ()

* [legacy] clean up utils

* [example] update examples

* [legacy] clean up amp

* [legacy] fix amp module

* [legacy] clean up gpc ()

* [legacy] clean up context

* [legacy] clean core, constants and global vars

* [legacy] refactor initialize

* [example] fix examples ci

* [example] fix examples ci

* [legacy] fix tests

* [example] fix gpt example

* [example] fix examples ci

* [devops] fix ci installation

* [example] fix examples ci
2023-09-18 16:31:06 +08:00
__init__.py [hotfix] fix typo in hybrid parallel io () 2023-09-12 17:32:19 +08:00
checkpoint_io_base.py [checkpointio] Unsharded Optimizer Checkpoint for Gemini Plugin () 2023-07-07 16:33:06 +08:00
general_checkpoint_io.py Merge branch 'main' into feature/shardformer 2023-09-04 23:43:13 +08:00
hybrid_parallel_checkpoint_io.py [example] llama2 add fine-tune example () 2023-09-15 18:45:44 +08:00
index_file.py [checkpointio] General Checkpointing of Sharded Optimizers () 2023-06-15 15:21:26 +08:00
utils.py [legacy] clean up legacy code () 2023-09-18 16:31:06 +08:00