[example] simplify the GPT2 huggingface example (#1826)

This commit is contained in:
Jiarui Fang
2022-11-08 16:14:07 +08:00
committed by GitHub
parent cd5a0d56fa
commit b1263d32ba
37 changed files with 177 additions and 27611 deletions


@@ -22,6 +22,9 @@ The following example of [Colossal-AI](https://github.com/hpcaitech/ColossalAI)
We use the pre-trained weights of the OPT model provided on the Hugging Face Hub and train on the raw WikiText-2 dataset (no tokens were replaced before
tokenization). This training script is adapted from the [HuggingFace Language Modelling examples](https://github.com/huggingface/transformers/tree/main/examples/pytorch/language-modeling).
## Our Modifications
We adapt the OPT training code to ColossalAI by leveraging Gemini and ZeRO DDP.
## Quick Start
You can launch training with the following bash script:
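A minimal launch sketch, assuming the ColossalAI CLI and the Hugging Face `run_clm.py` language-modeling script; the GPU count, model size, and dataset flags below are illustrative assumptions, not taken from this commit:

```shell
# Hypothetical single-node launch via the ColossalAI CLI.
# Adjust --nproc_per_node to the number of available GPUs;
# the model and dataset flags follow the HuggingFace run_clm.py conventions.
colossalai run --nproc_per_node 4 run_clm.py \
  --model_name_or_path facebook/opt-125m \
  --dataset_name wikitext \
  --dataset_config_name wikitext-2-raw-v1 \
  --output_dir ./output
```

Larger OPT checkpoints (e.g. `facebook/opt-1.3b`) can be substituted once Gemini and ZeRO DDP handle the memory placement.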