[Fix/Infer] Remove unused deps and revise requirements (#5341)

* remove flash-attn dep

* rm padding llama

* revise infer requirements

* move requirements out of module
Yuanheng Zhao
2024-02-06 17:27:45 +08:00
committed by GitHub
parent 631862f339
commit 1dedb57747
5 changed files with 2 additions and 551 deletions


@@ -1,5 +1,2 @@
 ordered_set
-transformers==4.34.0
-auto-gptq==0.5.0
-git+https://github.com/ModelTC/lightllm.git@ece7b43f8a6dfa74027adc77c2c176cff28c76c8
-git+https://github.com/Dao-AILab/flash-attention.git@017716451d446e464dde9aca3a3c1ed2209caaa9
+transformers==4.36.2
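
For context, the removed entries use pip's VCS requirement syntax, which pins a dependency to an exact git commit rather than a released version; a minimal sketch of that form (the repository name and SHA below are illustrative placeholders, not real projects):

```
# Pin a package to a specific commit of its git repository
git+https://github.com/example-org/example-pkg.git@0123456789abcdef0123456789abcdef01234567
# Pin a package to an exact released version from PyPI
example-pkg==1.2.3
```

Replacing commit-pinned git installs with exact PyPI versions, as this commit does for transformers, makes installs faster and reproducible without requiring git access at build time.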