diff --git a/README.md b/README.md
index 0ddcdab74..25d3b8f83 100644
--- a/README.md
+++ b/README.md
@@ -224,7 +224,7 @@ Acceleration of [AlphaFold Protein Structure](https://alphafold.ebi.ac.uk/)

 - 70 billion parameter LLaMA2 model training accelerated by 195%
-[[code]](https://github.com/hpcaitech/ColossalAI/tree/example/llama/examples/language/llama)
+[[code]](https://github.com/hpcaitech/ColossalAI/tree/main/examples/language/llama2)
 [[blog]](https://www.hpc-ai.tech/blog/70b-llama2-training)

 ### LLaMA1
diff --git a/docs/README-zh-Hans.md b/docs/README-zh-Hans.md
index dda4f86a2..41eebc59c 100644
--- a/docs/README-zh-Hans.md
+++ b/docs/README-zh-Hans.md
@@ -217,7 +217,7 @@ Colossal-AI 为您提供了一系列并行组件。我们的目标是让您的

 - 700亿参数LLaMA2训练加速195%
-[[code]](https://github.com/hpcaitech/ColossalAI/tree/example/llama/examples/language/llama)
+[[code]](https://github.com/hpcaitech/ColossalAI/tree/main/examples/language/llama2)
 [[blog]](https://www.hpc-ai.tech/blog/70b-llama2-training)

 ### LLaMA1
diff --git a/examples/language/llama2/README.md b/examples/language/llama2/README.md
index 16b263c13..c8fc86d29 100644
--- a/examples/language/llama2/README.md
+++ b/examples/language/llama2/README.md
@@ -6,7 +6,7 @@

 - 70 billion parameter LLaMA2 model training accelerated by 195%
-[[code]](https://github.com/hpcaitech/ColossalAI/tree/example/llama/examples/language/llama)
+[[code]](https://github.com/hpcaitech/ColossalAI/tree/main/examples/language/llama2)
 [[blog]](https://www.hpc-ai.tech/blog/70b-llama2-training)

 ### LLaMA1