
Can FlagEmbedding's dependency on transformers be upgraded to newer versions?

Open firezym opened this issue 1 year ago • 9 comments

I have other packages that depend on higher versions of transformers, such as sentence-transformers, marker-pdf, and so on. Could future releases relax the transformers version requirement?

flagembedding 1.3.2 has requirement transformers==4.44.2, but you have transformers 4.46.3.
marker-pdf 0.3.10 has requirement transformers<5.0.0,>=4.45.2, but you have transformers 4.44.2.
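For illustration, the exact-pin conflict reported above can be checked mechanically. `satisfies_exact_pin` below is a hypothetical helper, not part of pip or FlagEmbedding, and it only handles `==` pins (real resolvers handle full PEP 440 specifiers):

```python
def satisfies_exact_pin(installed_version: str, pin: str):
    """Check an exact '==' requirement string against an installed version.

    Minimal sketch for illustration only; pip's resolver handles ranges,
    extras, and pre-releases, which this deliberately does not.
    """
    package, _, required = pin.partition("==")
    return package, installed_version == required

# The conflict reported above: flagembedding 1.3.2 pins transformers==4.44.2
package, ok = satisfies_exact_pin("4.46.3", "transformers==4.44.2")
print(package, ok)  # prints: transformers False -> unsatisfied pin
```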

firezym avatar Nov 29 '24 01:11 firezym

I’ve also encountered a bug caused by a conflict between transformers==4.44.2 and accelerate==1.2.0. It would be great if you could update the transformers dependency to the latest available version.

Nimtaa avatar Dec 12 '24 09:12 Nimtaa

File /data/user/usr/anaconda3/envs/py11/lib/python3.11/site-packages/FlagEmbedding/inference/reranker/decoder_only/models/gemma_model.py:56
     54 from .gemma_config import CostWiseGemmaConfig
     55 from transformers.models.gemma2.modeling_gemma2 import Gemma2RMSNorm, Gemma2RotaryEmbedding, rotate_half, apply_rotary_pos_emb
---> 56 from transformers.models.gemma2.modeling_gemma2 import Gemma2MLP, repeat_kv, Gemma2Attention, Gemma2FlashAttention2, Gemma2SdpaAttention, GEMMA2_ATTENTION_CLASSES, Gemma2DecoderLayer, GEMMA2_START_DOCSTRING
     57 from transformers.models.gemma2.modeling_gemma2 import GEMMA2_INPUTS_DOCSTRING
     59 if is_flash_attn_2_available():

ImportError: cannot import name 'GEMMA2_ATTENTION_CLASSES' from 'transformers.models.gemma2.modeling_gemma2'

error raised for transformers==4.47.0

conderls avatar Dec 16 '24 01:12 conderls
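One common workaround for this class of ImportError is to guard imports of symbols that an upstream library has removed. This is only a sketch of the pattern, not FlagEmbedding's actual fix; it is demonstrated with the standard library so it runs without transformers installed:

```python
import importlib

def optional_import(module_name: str, symbol: str, default=None):
    """Return `symbol` from `module_name`, or `default` if either the
    module or the symbol is missing (as happens with
    GEMMA2_ATTENTION_CLASSES on newer transformers releases)."""
    try:
        module = importlib.import_module(module_name)
        return getattr(module, symbol)
    except (ImportError, AttributeError):
        return default

# Demonstrated with the standard library to keep the sketch self-contained:
pi = optional_import("math", "pi")                          # exists -> math.pi
gone = optional_import("math", "NO_SUCH", default="fallback")  # -> "fallback"
```

Code that later uses the guarded symbol must of course handle the `default` case explicitly.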

@545999961 I have the same problem when simply importing:

from FlagEmbedding import BGEM3FlagModel

File d:\conda\win\envs\prod\Lib\site-packages\FlagEmbedding\inference\reranker\decoder_only\models\gemma_model.py:56
     54 from .gemma_config import CostWiseGemmaConfig
     55 from transformers.models.gemma2.modeling_gemma2 import Gemma2RMSNorm, Gemma2RotaryEmbedding, rotate_half, apply_rotary_pos_emb
---> 56 from transformers.models.gemma2.modeling_gemma2 import Gemma2MLP, repeat_kv, Gemma2Attention, Gemma2FlashAttention2, Gemma2SdpaAttention, GEMMA2_ATTENTION_CLASSES, Gemma2DecoderLayer, GEMMA2_START_DOCSTRING
     57 from transformers.models.gemma2.modeling_gemma2 import GEMMA2_INPUTS_DOCSTRING
     59 if is_flash_attn_2_available():

ImportError: cannot import name 'GEMMA2_ATTENTION_CLASSES' from 'transformers.models.gemma2.modeling_gemma2' (d:\conda\win\envs\prod\Lib\site-packages\transformers\models\gemma2\modeling_gemma2.py)

firezym avatar Dec 30 '24 06:12 firezym

Are there any plans or a timeline for upgrading to the latest version of transformers? The latest release contains security fixes that we really need to align with: https://github.com/huggingface/transformers/issues/34840

Nimtaa avatar Jan 17 '25 10:01 Nimtaa

https://github.com/FlagOpen/FlagEmbedding/issues/1356

Adding to the urgency around this. Would be great to have this merged.

mxchinegod avatar Feb 01 '25 18:02 mxchinegod

Yes, can we please have some action on this? There is even a pull request for it; it just needs to be merged and released.

jesnie avatar Feb 07 '25 08:02 jesnie

Hello, everyone! We've just released version 1.3.4 on PyPI. This update fixes the dependency issue discussed here. Thanks to @Hypothesis-Z for submitting PR #1343!

hanhainebula avatar Feb 07 '25 13:02 hanhainebula

@hanhainebula Thanks. By the way, GEMMA2_START_DOCSTRING no longer exists in current transformers releases. What is the latest transformers version I can use? Could this be fixed, either by supporting up-to-date transformers or by adding version-conditional guards (>= / <=)?

n0isy avatar May 27 '25 23:05 n0isy
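A version-conditional guard of the kind suggested (>= / <=) could look like the following sketch. The 4.47 threshold is taken from the error reports above; the parsing is deliberately minimal and ignores pre-release/dev suffixes that PEP 440 allows:

```python
def version_tuple(version: str):
    """Parse 'X.Y.Z' into a comparable tuple of ints (minimal sketch;
    real code should use packaging.version for full PEP 440 support)."""
    return tuple(int(part) for part in version.split(".")[:3])

def has_gemma2_attention_classes(transformers_version: str) -> bool:
    # GEMMA2_ATTENTION_CLASSES was reported missing as of transformers 4.47.0
    return version_tuple(transformers_version) < version_tuple("4.47.0")

print(has_gemma2_attention_classes("4.44.2"))  # True
print(has_gemma2_attention_classes("4.47.0"))  # False
```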

@n0isy Hello, we have released an updated version. Alternatively, you can install transformers==4.46.0.

545999961 avatar May 28 '25 07:05 545999961