cannot import name 'Gemma2FlashAttention2' from 'transformers.models.gemma2.modeling_gemma2'
The latest version of transformers is required for its bug fixes.
cannot import name 'is_torch_greater_or_equal_than_2_0' from 'transformers.pytorch_utils'
After upgrading to transformers 4.48.2, we get this import error from FlagEmbedding:
File /opt/homebrew/lib/python3.11/site-packages/FlagEmbedding/inference/reranker/decoder_only/models/gemma_model.py:56
     54 from .gemma_config import CostWiseGemmaConfig
     55 from transformers.models.gemma2.modeling_gemma2 import Gemma2RMSNorm, Gemma2RotaryEmbedding, rotate_half, apply_rotary_pos_emb
---> 56 from transformers.models.gemma2.modeling_gemma2 import Gemma2MLP, repeat_kv, Gemma2Attention, Gemma2FlashAttention2, Gemma2SdpaAttention, GEMMA2_ATTENTION_CLASSES, Gemma2DecoderLayer, GEMMA2_START_DOCSTRING
     57 from transformers.models.gemma2.modeling_gemma2 import GEMMA2_INPUTS_DOCSTRING
     59 if is_flash_attn_2_available():

ImportError: cannot import name 'Gemma2FlashAttention2' from 'transformers.models.gemma2.modeling_gemma2' (/opt/homebrew/lib/python3.11/site-packages/transformers/models/gemma2/modeling_gemma2.py)
This class no longer exists in current transformers releases.
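As a stopgap until the dependency itself is patched, the removed class can be imported defensively. This is only a hypothetical sketch of the pattern, not FlagEmbedding's actual code; it assumes callers can handle the class being absent on newer transformers:

```python
# Compatibility shim (hypothetical): Gemma2FlashAttention2 exists in older
# transformers releases (e.g. 4.44.x) but was removed in newer ones.
# Catching ImportError also covers the case where transformers itself
# is not installed (ModuleNotFoundError subclasses ImportError).
try:
    from transformers.models.gemma2.modeling_gemma2 import Gemma2FlashAttention2
except ImportError:
    Gemma2FlashAttention2 = None  # removed upstream; fall back gracefully

if Gemma2FlashAttention2 is None:
    # Downstream code would need to branch here, e.g. use Gemma2Attention
    # (which still exists) instead of the dedicated flash-attention class.
    pass
```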
Hello, @mxchinegod. Thank you for pointing out this issue! We've released the latest version, 1.3.4, on PyPI, where this issue has been fixed.
Updated to FlagEmbedding==1.3.4; still hitting this error.
Pinning the dependency with pip install transformers==4.44 can help.
pip install transformers==4.44.2
Can confirm this is fixed in 1.3.5 without needing to pin transformers==4.44.2.
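For anyone verifying their environment before re-running, a quick stdlib-only check of the installed versions can confirm whether the fix applies. The helper name below is hypothetical:

```python
# Check installed package versions using only the standard library,
# without importing the packages themselves (so a broken import in
# FlagEmbedding cannot crash the check).
from importlib import metadata


def installed_version(pkg: str) -> str:
    """Return the installed version of pkg, or 'not installed'."""
    try:
        return metadata.version(pkg)
    except metadata.PackageNotFoundError:
        return "not installed"


for pkg in ("FlagEmbedding", "transformers"):
    print(pkg, installed_version(pkg))
```

If this reports FlagEmbedding 1.3.5 or later, the transformers pin should no longer be necessary.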