Mirror of https://github.com/hpcaitech/ColossalAI.git, synced 2025-06-18 11:48:53 +00:00
[doc] fix llama2 code link (#4726)

* [doc] fix llama2 code link
* [doc] fix llama2 code link
* [doc] fix llama2 code link
parent 20190b49a5
commit ce97790ed7
@@ -224,7 +224,7 @@ Acceleration of [AlphaFold Protein Structure](https://alphafold.ebi.ac.uk/)
 </p>

 - 70 billion parameter LLaMA2 model training accelerated by 195%
-[[code]](https://github.com/hpcaitech/ColossalAI/tree/example/llama/examples/language/llama)
+[[code]](https://github.com/hpcaitech/ColossalAI/tree/main/examples/language/llama2)
 [[blog]](https://www.hpc-ai.tech/blog/70b-llama2-training)

 ### LLaMA1
@@ -217,7 +217,7 @@ Colossal-AI 为您提供了一系列并行组件。我们的目标是让您的
 </p>

 - 700亿参数LLaMA2训练加速195%
-[[code]](https://github.com/hpcaitech/ColossalAI/tree/example/llama/examples/language/llama)
+[[code]](https://github.com/hpcaitech/ColossalAI/tree/main/examples/language/llama2)
 [[blog]](https://www.hpc-ai.tech/blog/70b-llama2-training)

 ### LLaMA1
@@ -6,7 +6,7 @@
 </p>

 - 70 billion parameter LLaMA2 model training accelerated by 195%
-[[code]](https://github.com/hpcaitech/ColossalAI/tree/example/llama/examples/language/llama)
+[[code]](https://github.com/hpcaitech/ColossalAI/tree/main/examples/language/llama2)
 [[blog]](https://www.hpc-ai.tech/blog/70b-llama2-training)

 ### LLaMA1