
Running the code fails with an error telling me to `pip install flash_attn`

Open zRzRzRzRzRzRzR opened this issue 2 years ago • 2 comments

Error

Running a copy of the official cli_demo, the following error is raised in init_model():

ImportError: This modeling file requires the following packages that were not found in your environment: flash_attn. Run `pip install flash_attn`

Question

Is the flash_attn package strictly required when loading the model? If so, does it have to be flash_attn 2? I didn't see it mentioned in requirements.txt. Also, according to the official download page, this package cannot be installed on Windows.
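Since the ImportError above is raised by the model's remote modeling file before loading even starts, one way to diagnose the problem is to check whether flash_attn is importable in the current environment. A minimal sketch (the helper name `has_flash_attn` is my own, not from the repo):

```python
import importlib.util

def has_flash_attn() -> bool:
    """Return True if the flash_attn package is importable in this environment."""
    # find_spec() locates the package without importing it, so this check
    # is cheap and does not trigger flash_attn's CUDA extension loading.
    return importlib.util.find_spec("flash_attn") is not None

print(has_flash_attn())
```

If this prints False, init_model() will fail with the ImportError shown above until flash_attn is installed.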

zRzRzRzRzRzRzR · Jan 21 '24 09:01

+1

Switching to an older version works on my machine: `pip install flash-attn==1.0.4 --no-build-isolation`

LF-LLL · Jan 24 '24 09:01


On the flash_attn installation problem — answer: first install the matching version of cuda-nvcc (https://anaconda.org/nvidia/cuda-nvcc), then install a prebuilt flash_attn wheel from https://github.com/Dao-AILab/flash-attention/releases/, for example: `pip install https://github.com/Dao-AILab/flash-attention/releases/download/v2.3.3/flash_attn-2.3.3+cu122torch2.1cxx11abiFALSE-cp38-cp38-linux_x86_64.whl`
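The wheel filename above encodes the CUDA version (cu122), the torch version (torch2.1), and the CPython ABI tag (cp38), and all three must match your environment. A small sketch for finding your interpreter's ABI tag, so you can pick the right `-cpXY-cpXY-` wheel from the releases page:

```python
import sys

# Build the CPython ABI tag (e.g. "cp38" for Python 3.8, "cp310" for 3.10);
# it must match the "-cp38-cp38-" segment of the prebuilt wheel's filename.
abi_tag = f"cp{sys.version_info.major}{sys.version_info.minor}"
print(abi_tag)
```

CUDA and torch versions can be checked similarly with `nvcc --version` and `torch.__version__` / `torch.version.cuda`; installing a wheel built for a different combination typically fails at import time.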

yecphaha · Jan 25 '24 03:01