From 5e4bced0a3fdcb790cda3811aa445f6691e468b1 Mon Sep 17 00:00:00 2001
From: Ikko Eltociear Ashimine
Date: Fri, 6 Jan 2023 11:09:14 +0900
Subject: [PATCH] [NFC] Update roberta/README.md (#2350)

---
 examples/language/roberta/README.md | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/examples/language/roberta/README.md b/examples/language/roberta/README.md
index c119d23b5..a42b1935d 100644
--- a/examples/language/roberta/README.md
+++ b/examples/language/roberta/README.md
@@ -33,7 +33,7 @@ service ssh restart
 ```bash
 cd preprocessing
 ```
-following the `README.md`, preprocess orginal corpus to h5py+numpy
+following the `README.md`, preprocess original corpus to h5py+numpy
 
 ## 2. Pretrain
 
@@ -44,7 +44,7 @@ following the `README.md`, load the h5py generated by preprocess of step 1 to pr
 
 ## 3. Finetune
 
-The checkpoint produced by this repo can replace `pytorch_model.bin` from [hfl/chinese-roberta-wwm-ext-large](https://huggingface.co/hfl/chinese-roberta-wwm-ext-large/tree/main) directly. Then use transfomers from HuggingFace to finetune downstream application.
+The checkpoint produced by this repo can replace `pytorch_model.bin` from [hfl/chinese-roberta-wwm-ext-large](https://huggingface.co/hfl/chinese-roberta-wwm-ext-large/tree/main) directly. Then use transformers from Hugging Face to finetune downstream applications.
 
 ## Contributors
 The repo is contributed by AI team from [Moore Threads](https://www.mthreads.com/). If you find any problems for pretraining, please file an issue or send an email to yehua.zhang@mthreads.com. At last, welcome any form of contribution!
@@ -55,4 +55,4 @@
 author={Yehua Zhang, Chen Zhang},
 year={2022}
 }
-```
\ No newline at end of file
+```