ColossalAI/examples/language/llama2/attn.py
2024-02-27 11:22:07 +08:00

Symbolic link (1 line, 85 B, detected as Python), pointing to:

../../../applications/Colossal-LLaMA-2/colossal_llama2/utils/flash_attention_patch.py
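The relative target is resolved against the directory that contains the symlink, so `attn.py` in `ColossalAI/examples/language/llama2/` points back into the repository's `applications/` tree. A minimal Python sketch of that resolution, using only the paths shown in the listing above (note the target string is exactly 85 bytes, matching the reported file size):

```python
import posixpath

# Directory containing the symlink attn.py (from the listing above)
link_dir = "ColossalAI/examples/language/llama2"

# Relative target stored inside the symlink; its length is the 85 B file size
target = (
    "../../../applications/Colossal-LLaMA-2/"
    "colossal_llama2/utils/flash_attention_patch.py"
)
assert len(target) == 85

# Resolve the relative target against the symlink's own directory
resolved = posixpath.normpath(posixpath.join(link_dir, target))
print(resolved)
# ColossalAI/applications/Colossal-LLaMA-2/colossal_llama2/utils/flash_attention_patch.py
```

Each `..` climbs one level (`llama2` → `language` → `examples` → the repo root), so the example file and the canonical patch in `applications/` stay in sync without duplicating code.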