From ce97790ed73f7962ab1ceae057a020168b45dda4 Mon Sep 17 00:00:00 2001
From: binmakeswell
Date: Thu, 14 Sep 2023 23:19:25 +0800
Subject: [PATCH] [doc] fix llama2 code link (#4726)

* [doc] fix llama2 code link

* [doc] fix llama2 code link

* [doc] fix llama2 code link
---
 README.md                          | 2 +-
 docs/README-zh-Hans.md             | 2 +-
 examples/language/llama2/README.md | 2 +-
 3 files changed, 3 insertions(+), 3 deletions(-)

diff --git a/README.md b/README.md
index 0ddcdab74..25d3b8f83 100644
--- a/README.md
+++ b/README.md
@@ -224,7 +224,7 @@ Acceleration of [AlphaFold Protein Structure](https://alphafold.ebi.ac.uk/)
 <br/>
 
 - 70 billion parameter LLaMA2 model training accelerated by 195%
-[[code]](https://github.com/hpcaitech/ColossalAI/tree/example/llama/examples/language/llama)
+[[code]](https://github.com/hpcaitech/ColossalAI/tree/main/examples/language/llama2)
 [[blog]](https://www.hpc-ai.tech/blog/70b-llama2-training)
 
 ### LLaMA1

diff --git a/docs/README-zh-Hans.md b/docs/README-zh-Hans.md
index dda4f86a2..41eebc59c 100644
--- a/docs/README-zh-Hans.md
+++ b/docs/README-zh-Hans.md
@@ -217,7 +217,7 @@ Colossal-AI 为您提供了一系列并行组件。我们的目标是让您的
 <br/>
 
 - 700亿参数LLaMA2训练加速195%
-[[code]](https://github.com/hpcaitech/ColossalAI/tree/example/llama/examples/language/llama)
+[[code]](https://github.com/hpcaitech/ColossalAI/tree/main/examples/language/llama2)
 [[blog]](https://www.hpc-ai.tech/blog/70b-llama2-training)
 
 ### LLaMA1

diff --git a/examples/language/llama2/README.md b/examples/language/llama2/README.md
index 16b263c13..c8fc86d29 100644
--- a/examples/language/llama2/README.md
+++ b/examples/language/llama2/README.md
@@ -6,7 +6,7 @@
 <br/>
 
 - 70 billion parameter LLaMA2 model training accelerated by 195%
-[[code]](https://github.com/hpcaitech/ColossalAI/tree/example/llama/examples/language/llama)
+[[code]](https://github.com/hpcaitech/ColossalAI/tree/main/examples/language/llama2)
 [[blog]](https://www.hpc-ai.tech/blog/70b-llama2-training)
 
 ### LLaMA1