yuquanle


> python 3.10.14, Cuda 12.1, Ubuntu 22.04.4 LTS, torch==2.3.0, flash-attn==2.5.8 works (2.5.9post1 has the same failure)

Thanks. I tried Python 3.9.19 with torch==2.3.0 and flash-attn==2.5.8, and it works.
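
For reference, a minimal sketch of a sanity check for the pinned environment (torch==2.3.0, flash-attn==2.5.8 from the comment above); the tensor shapes and the `flash_attn_func` call are illustrative, not from the original comment, and require a CUDA GPU:

```python
# Sanity check that the pinned flash-attn build imports and runs.
# Assumes torch==2.3.0 and flash-attn==2.5.8 (versions from the comment above).
import torch
import flash_attn
from flash_attn import flash_attn_func

print(torch.__version__, flash_attn.__version__)

# flash_attn_func expects (batch, seqlen, nheads, headdim) fp16/bf16 CUDA tensors.
# Shapes here are arbitrary, chosen only to exercise the kernel.
q = torch.randn(1, 128, 8, 64, dtype=torch.float16, device="cuda")
k = torch.randn(1, 128, 8, 64, dtype=torch.float16, device="cuda")
v = torch.randn(1, 128, 8, 64, dtype=torch.float16, device="cuda")

out = flash_attn_func(q, k, v)
print(out.shape)  # expected: torch.Size([1, 128, 8, 64])
```

If this script prints the version numbers and the output shape without raising an ImportError or a CUDA kernel error, the flash-attn install matches the torch/CUDA toolchain.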