[kernel] updated unittests for coloattention (#4389)

Updated ColoAttention tests to check outputs and gradients.
flybird1111
2023-08-09 14:24:45 +08:00
committed by GitHub
parent 089c365fa0
commit 458ae331ad
3 changed files with 94 additions and 52 deletions
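
For context on what "checking outputs and gradients" involves, below is a minimal, hypothetical sketch, not the repository's actual test code, that compares a flash-attention kernel against a plain PyTorch reference. flash_attn_func is the flash-attn 2.x API; reference_attention and check_flash_attention are illustrative names; a CUDA device with fp16 support and loose tolerances are assumed.

import math
import torch
from flash_attn import flash_attn_func  # requires flash_attn>=2.0


def reference_attention(q, k, v):
    # q, k, v: (batch, seqlen, nheads, headdim), same layout as flash_attn_func.
    scale = 1.0 / math.sqrt(q.shape[-1])
    scores = torch.einsum("bqhd,bkhd->bhqk", q.float(), k.float()) * scale
    probs = torch.softmax(scores, dim=-1)
    return torch.einsum("bhqk,bkhd->bqhd", probs, v.float()).to(q.dtype)


def check_flash_attention(batch=2, seqlen=128, nheads=4, headdim=64):
    torch.manual_seed(0)
    shape = (batch, seqlen, nheads, headdim)
    q = torch.randn(shape, device="cuda", dtype=torch.float16, requires_grad=True)
    k = torch.randn(shape, device="cuda", dtype=torch.float16, requires_grad=True)
    v = torch.randn(shape, device="cuda", dtype=torch.float16, requires_grad=True)

    # Compare forward outputs (illustrative fp16 tolerances).
    out_flash = flash_attn_func(q, k, v, dropout_p=0.0, causal=False)
    out_ref = reference_attention(q, k, v)
    assert torch.allclose(out_flash, out_ref, atol=1e-2, rtol=1e-2)

    # Compare gradients by backpropagating the same upstream gradient.
    dout = torch.randn_like(out_flash)
    grads_flash = torch.autograd.grad(out_flash, (q, k, v), dout, retain_graph=True)
    grads_ref = torch.autograd.grad(out_ref, (q, k, v), dout)
    for g_flash, g_ref in zip(grads_flash, grads_ref):
        assert torch.allclose(g_flash, g_ref, atol=1e-2, rtol=1e-2)


if __name__ == "__main__":
    check_flash_attention()  # needs a CUDA GPU
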


@@ -13,6 +13,7 @@ torchrec==0.2.0
 contexttimer
 einops
 triton==2.0.0.dev20221202
-git+https://github.com/HazyResearch/flash-attention.git@c422fee3776eb3ea24e011ef641fd5fbeb212623#egg=flash_attn
 requests==2.27.1 # downgrade to avoid huggingface error https://github.com/huggingface/transformers/issues/17611
 SentencePiece
+ninja
+flash_attn>=2.0
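
Since the tests now depend on a flash-attn 2.x release instead of the pinned HazyResearch git commit, a hedged sketch of how a test module could skip cleanly when that requirement is unmet; pytest and the __version__ attribute check are assumptions about the environment, not the repository's actual code.

import pytest

try:
    import flash_attn
    # flash-attn 2.x exposes __version__; fall back to "0" if it does not.
    HAS_FLASH_ATTN_2 = int(getattr(flash_attn, "__version__", "0").split(".")[0]) >= 2
except ImportError:
    HAS_FLASH_ATTN_2 = False


@pytest.mark.skipif(not HAS_FLASH_ATTN_2, reason="requires flash_attn>=2.0")
def test_attention_output_and_grad():
    # Placeholder body; the real checks compare outputs and gradients
    # against a reference implementation (see the sketch above).
    ...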


@@ -10,4 +10,5 @@ contexttimer
 ninja
 torch>=1.11
 safetensors
+flash_attn>=2.0
 einops