
Pretraining LLaMA: best practices for building LLaMA-like base models

  • 65-billion-parameter large model pretraining accelerated by 38% [code] [blog]

Because the main branch is under active development, this example is temporarily maintained on a separate branch to keep its code stable.