Mirror of https://github.com/hpcaitech/ColossalAI.git (synced 2025-09-01 17:17:05 +00:00)
[nfc] fix typo colossalai/cli fx kernel (#3847)
* fix typo colossalai/autochunk auto_parallel amp
* fix typo colossalai/auto_parallel nn utils etc.
* fix typo colossalai/auto_parallel autochunk fx/passes etc.
* fix typo docs/
* change placememt_policy to placement_policy in docs/ and examples/
* fix typo colossalai/ applications/
* fix typo colossalai/cli fx kernel
@@ -43,7 +43,7 @@ def warmup_jit_fusion(batch_size: int,
                       seq_length: int = 512,
                       vocab_size: int = 32768,
                       dtype: torch.dtype = torch.float32):
-    """ Compilie JIT functions before the main training steps """
+    """ Compile JIT functions before the main training steps """
 
     embed = Embedding(vocab_size, hidden_size).to(get_current_device())
     linear_1 = Linear(hidden_size, hidden_size * 4, skip_bias_add=True).to(get_current_device())
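The function touched by this hunk, `warmup_jit_fusion`, follows a common pattern: run JIT-compiled kernels once on dummy inputs before the timed training loop, so that the one-time compilation cost is not paid during the first real step. A minimal, torch-free sketch of that pattern (all names here are illustrative stand-ins, not ColossalAI's actual API):

```python
class LazilyCompiledFn:
    """Stand-in for a JIT-fused kernel: compiles lazily on first call."""

    def __init__(self, fn):
        self.fn = fn
        self.compiled = False

    def __call__(self, *args):
        if not self.compiled:
            # a real JIT backend would pay its compilation cost here
            self.compiled = True
        return self.fn(*args)


def warmup_jit_fusion(fn, dummy_input):
    """Trigger compilation ahead of the main training loop."""
    fn(dummy_input)        # first call absorbs the compile latency
    return fn.compiled     # True once warmup has run


# toy "fused" op standing in for something like bias_gelu
bias_gelu = LazilyCompiledFn(lambda x: x * 0.5)
assert not bias_gelu.compiled
warmup_jit_fusion(bias_gelu, 1.0)
assert bias_gelu.compiled  # later training-step calls skip compilation
```

The real ColossalAI function does the same thing with `Embedding` and `Linear` modules plus a fake forward/backward pass; the sketch only captures the warm-once-then-reuse structure.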