高德政
> python 3.6, pytorch 1.6, cuda 10.2
>
> `conda install pytorch==1.6 torchvision torchaudio cudatoolkit=10.2 -c pytorch`
>
> detectron2 0.2.1
>
> ```
> python -m pip install detectron2==0.2.1 -f ...
> ```
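If it helps, here is a minimal sanity check for that environment, assuming pytorch and detectron2 installed as pinned above:

```python
import torch
import detectron2

# Confirm the installed versions match the pinned environment
print(torch.__version__)         # expect 1.6.x
print(torch.version.cuda)        # expect 10.2
print(torch.cuda.is_available()) # expect True on a GPU machine
print(detectron2.__version__)    # expect 0.2.1
```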
> python 3.10, pytorch==2.7.0, cuda 12.4: flash-attn==2.8.1 doesn't work for me, and neither does flash-attn==2.6.3, but flash-attn==2.7.4.post1 works. It is slow to download via plain `pip install`, though, so install the prebuilt wheel directly instead: `pip install https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.6cxx11abiTRUE-cp310-cp310-linux_x86_64.whl` ...
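A quick smoke test to verify the wheel actually loads against the installed torch/cuda; a minimal sketch, assuming a CUDA GPU is available (the shapes and dtype here are just illustrative):

```python
import torch
import flash_attn
from flash_attn import flash_attn_func

print(torch.__version__, torch.version.cuda)  # expect 2.7.0, 12.4
print(flash_attn.__version__)                 # expect 2.7.4.post1

# Tiny forward pass: q/k/v have shape (batch, seqlen, nheads, headdim)
# and must be fp16 or bf16 on GPU for flash-attn kernels.
q = k = v = torch.randn(1, 8, 2, 64, device="cuda", dtype=torch.float16)
out = flash_attn_func(q, k, v)
print(out.shape)  # torch.Size([1, 8, 2, 64])
```

If the import succeeds and the forward pass runs without a CUDA/ABI error, the wheel matches your torch build.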