ColossalAI/examples/language/llama

Pretraining LLaMA: best practices for building LLaMA-like base models

  • 65-billion-parameter large model pretraining accelerated by 38% [code] [blog]

Because the main branch is under active development, this example is temporarily kept on an independent branch to keep the code stable.