ColossalAI/colossalai/utils/__init__.py
Hongxin Liu d202cc28c0 [npu] change device to accelerator api (#5239)
* update accelerator

* fix timer

* fix amp

* update

* fix

* update bug

* add error raise

* fix autocast

* fix set device

* remove doc accelerator

* update doc

* update doc

* update doc

* use nullcontext

* update cpu

* update null context

* change time limit for example

* update

* update

* update

* update

* [npu] polish accelerator code

---------

Co-authored-by: Xuanlei Zhao <xuanlei.zhao@gmail.com>
Co-authored-by: zxl <43881818+oahzxl@users.noreply.github.com>
2024-01-09 10:20:05 +08:00

27 lines · 536 B · Python

from .common import (
_cast_float,
conditional_context,
disposable,
ensure_path_exists,
free_storage,
is_ddp_ignored,
set_seed,
)
from .multi_tensor_apply import multi_tensor_applier
from .tensor_detector import TensorDetector
from .timer import MultiTimer, Timer

__all__ = [
"conditional_context",
"Timer",
"MultiTimer",
"multi_tensor_applier",
"TensorDetector",
"ensure_path_exists",
"disposable",
"_cast_float",
"free_storage",
"set_seed",
"is_ddp_ignored",
]
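Among the exports above, `conditional_context` suggests a context manager that is entered only when a condition holds, and the commit's "use nullcontext" bullet points at `contextlib.nullcontext` as the standard-library way to express the disabled case. A minimal sketch of that pattern, assuming those semantics (the helper name `maybe`, the `enable` parameter, and the `Tracker` class are illustrative, not ColossalAI's actual API):

```python
from contextlib import nullcontext


class Tracker:
    """Toy context manager that records whether it was entered."""

    def __init__(self):
        self.entered = False

    def __enter__(self):
        self.entered = True
        return self

    def __exit__(self, *exc):
        return False


def maybe(ctx, enable):
    # Return the real context manager when enabled, a no-op otherwise.
    return ctx if enable else nullcontext()


t_on, t_off = Tracker(), Tracker()
with maybe(t_on, enable=True):
    pass
with maybe(t_off, enable=False):
    pass
print(t_on.entered, t_off.entered)  # True False
```

Returning `nullcontext()` instead of branching inside a custom context manager keeps the call site uniform: the `with` statement stays the same whether the feature (e.g. autocast or a profiler region) is active or not.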