ColossalAI/requirements/requirements-test.txt

diffusers
fbgemm-gpu==0.2.0
pytest
coverage==7.2.3
git+https://github.com/hpcaitech/pytest-testmon
torchvision
transformers==4.33.0
timm
titans
torchaudio
torchx-nightly==2022.6.29 # torchrec 0.2.0 requires torchx-nightly. This package is updated every day. We fix the version to a specific date to avoid breaking changes.
torchrec==0.2.0
contexttimer
einops
triton==2.1.0
requests==2.27.1 # downgrade to avoid huggingface error https://github.com/huggingface/transformers/issues/17611
SentencePiece
ninja
flash_attn==2.0.5
datasets
pydantic
ray
# auto-gptq does not yet support torch 1.12
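
# Usage note (not part of the original file): this is a standard pip requirements
# file, so the pinned test dependencies can be installed with, for example:
#   pip install -r requirements/requirements-test.txt
# The relative path assumes you run the command from the ColossalAI repository root;
# adjust it to match your checkout layout.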