[example] add llama pretraining (#4257)

This commit is contained in:
binmakeswell
2023-07-17 21:07:44 +08:00
committed by GitHub
parent 9a4842c571
commit 7ff11b5537
3 changed files with 32 additions and 0 deletions

@@ -0,0 +1,11 @@
# Pretraining LLaMA: best practices for building LLaMA-like base models
<p id="ColossalChat-Speed" align="center">
<img src="https://raw.githubusercontent.com/hpcaitech/public_assets/main/examples/images/LLaMA_pretraining.png" width=600/>
</p>
- 38% speedup for pretraining a 65-billion-parameter LLaMA base model
[[code]](https://github.com/hpcaitech/ColossalAI/tree/example/llama/examples/language/llama)
[[blog]](https://www.hpc-ai.tech/blog/large-model-pretraining)
> Because the main branch is under active development, this example is temporarily maintained on an [independent branch](https://github.com/hpcaitech/ColossalAI/tree/example/llama/examples/language/llama) to keep the code stable.