Can FlagEmbedding's dependency on transformers be upgraded to a newer version?
I have other packages that depend on higher versions of transformers, such as sentence-transformers and marker-pdf. Could future releases relax the transformers version requirement?
flagembedding 1.3.2 has requirement transformers==4.44.2, but you have transformers 4.46.3.
marker-pdf 0.3.10 has requirement transformers<5.0.0,>=4.45.2, but you have transformers 4.44.2.
I’ve also encountered a bug caused by a conflict between transformers==4.44.2 and accelerate==1.2.0. It would be great if you could update the transformers dependency to the latest available version.
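Conflicts like the two pip warnings above can be checked programmatically. Below is a minimal standard-library sketch (the helper names are mine, not part of pip or FlagEmbedding) that compares plain `X.Y.Z` version strings against a `>=`/`<` range; it deliberately ignores pre-release tags, which is enough for the pins quoted here:

```python
# Minimal version-conflict check using only the standard library.
# Versions are compared as integer tuples, so only plain X.Y.Z strings
# like the pins above are handled (no pre-release/local tags).

def as_tuple(version: str) -> tuple:
    """Turn '4.44.2' into (4, 44, 2) for ordered comparison."""
    return tuple(int(part) for part in version.split("."))

def satisfies(installed: str, minimum: str = None, maximum_exclusive: str = None) -> bool:
    """Check an installed version against optional >= and < bounds."""
    v = as_tuple(installed)
    if minimum is not None and v < as_tuple(minimum):
        return False
    if maximum_exclusive is not None and v >= as_tuple(maximum_exclusive):
        return False
    return True

# flagembedding 1.3.2 pins transformers==4.44.2, while marker-pdf 0.3.10
# requires >=4.45.2,<5.0.0 -- no single version can satisfy both:
print(satisfies("4.44.2", minimum="4.45.2", maximum_exclusive="5.0.0"))  # False
print(satisfies("4.46.3", minimum="4.45.2", maximum_exclusive="5.0.0"))  # True
```

This is essentially what `pip check` reports after the fact: the exact pin `==4.44.2` and the range `>=4.45.2,<5.0.0` have an empty intersection.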
File /data/user/usr/anaconda3/envs/py11/lib/python3.11/site-packages/FlagEmbedding/inference/reranker/decoder_only/models/gemma_model.py:56
54 from .gemma_config import CostWiseGemmaConfig
55 from transformers.models.gemma2.modeling_gemma2 import Gemma2RMSNorm, Gemma2RotaryEmbedding, rotate_half, apply_rotary_pos_emb
---> 56 from transformers.models.gemma2.modeling_gemma2 import Gemma2MLP, repeat_kv, Gemma2Attention, Gemma2FlashAttention2, Gemma2SdpaAttention, GEMMA2_ATTENTION_CLASSES, Gemma2DecoderLayer, GEMMA2_START_DOCSTRING
57 from transformers.models.gemma2.modeling_gemma2 import GEMMA2_INPUTS_DOCSTRING
59 if is_flash_attn_2_available():
ImportError: cannot import name 'GEMMA2_ATTENTION_CLASSES' from 'transformers.models.gemma2.modeling_gemma2'
This error is raised with transformers==4.47.0.
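Until the pin is relaxed upstream, one general way to tolerate symbols that were removed between transformers releases (like `GEMMA2_ATTENTION_CLASSES` here) is to probe for them instead of importing them unconditionally. A minimal sketch; the `optional_attr` helper is my own illustration, not part of FlagEmbedding's or transformers' API:

```python
import importlib

def optional_attr(module_name: str, attr: str, default=None):
    """Return module.attr if the module imports and the attribute exists,
    otherwise the given default -- avoids a hard ImportError at import time."""
    try:
        module = importlib.import_module(module_name)
    except ImportError:
        return default
    return getattr(module, attr, default)

# Probing instead of `from ... import GEMMA2_ATTENTION_CLASSES` would let
# `from FlagEmbedding import BGEM3FlagModel` keep working on newer
# transformers releases where the symbol no longer exists:
attention_classes = optional_attr(
    "transformers.models.gemma2.modeling_gemma2",
    "GEMMA2_ATTENTION_CLASSES",
    default={},
)
```

The code that consumes the probed value would still need a fallback path for newer releases, so this is a stopgap, not a substitute for updating the dependency.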
@545999961 I have the same problem when simply importing:
from FlagEmbedding import BGEM3FlagModel
File d:\conda\win\envs\prod\Lib\site-packages\FlagEmbedding\inference\reranker\decoder_only\models\gemma_model.py:56
     54 from .gemma_config import CostWiseGemmaConfig
     55 from transformers.models.gemma2.modeling_gemma2 import Gemma2RMSNorm, Gemma2RotaryEmbedding, rotate_half, apply_rotary_pos_emb
---> 56 from transformers.models.gemma2.modeling_gemma2 import Gemma2MLP, repeat_kv, Gemma2Attention, Gemma2FlashAttention2, Gemma2SdpaAttention, GEMMA2_ATTENTION_CLASSES, Gemma2DecoderLayer, GEMMA2_START_DOCSTRING
     57 from transformers.models.gemma2.modeling_gemma2 import GEMMA2_INPUTS_DOCSTRING
     59 if is_flash_attn_2_available():
ImportError: cannot import name 'GEMMA2_ATTENTION_CLASSES' from 'transformers.models.gemma2.modeling_gemma2' (d:\conda\win\envs\prod\Lib\site-packages\transformers\models\gemma2\modeling_gemma2.py)
Are there any updated plans or timelines for upgrading to the latest version of transformers? There are security updates in the latest release that we really need to align with: https://github.com/huggingface/transformers/issues/34840
https://github.com/FlagOpen/FlagEmbedding/issues/1356
Adding to the urgency around this. Would be great to have this merged.
Yes, can we please have some action on this? There is even a pull request for it; it just needs to be merged and released.
Hello, everyone! We've just released version 1.3.4 on PyPI. This update fixes the dependency issue discussed here. Thanks to @Hypothesis-Z for submitting PR #1343!
@hanhainebula Thanks. By the way, `GEMMA2_START_DOCSTRING` no longer exists in current transformers. What is the latest transformers version I can use? Could this be fixed, either by supporting up-to-date transformers or by adding version bounds (`>=`, `<=`) as a workaround?
@n0isy Hello, we have updated the version. Alternatively, you can install transformers==4.46.0.
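After upgrading, it can help to confirm which versions the resolver actually installed. A small standard-library sketch (`importlib.metadata` is available from Python 3.8; the distribution names are the ones discussed in this thread, and `installed_version` is my own helper):

```python
# Report the installed versions of the distributions discussed above.
from importlib.metadata import version, PackageNotFoundError

def installed_version(dist_name: str) -> str:
    """Return the installed version of a distribution, or 'not installed'."""
    try:
        return version(dist_name)
    except PackageNotFoundError:
        return "not installed"

for dist in ("FlagEmbedding", "transformers", "accelerate"):
    print(f"{dist}: {installed_version(dist)}")
```

Running `pip check` afterwards will confirm whether any requirement conflicts remain in the environment.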