safetensors_rust.SafetensorError: Error while deserializing header: HeaderTooLarge
/root/miniconda3/envs/internvl/lib/python3.9/site-packages/transformers/generation/configuration_utils.py:397: UserWarning: do_sample is set to False. However, top_p is set to None -- this flag is only used in sample-based generation modes. You should set do_sample=True or unset top_p. This was detected when initializing the generation config instance, which means the corresponding file may hold incorrect parameterization and should be fixed.
warnings.warn(
Loading checkpoint shards: 0%| | 0/2 [00:00<?, ?it/s]
Traceback (most recent call last):
File "/yangtao/programs/adv2/InternVL/main.py", line 87, in <module>
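For context, `HeaderTooLarge` from safetensors usually means a shard file is not a valid checkpoint: most often it is still a Git LFS pointer file or was truncated during download, so the first 8 bytes do not decode to a sensible header length. A small diagnostic sketch (not from this thread; the checkpoint directory path is a placeholder) that checks each shard:

```python
# Hypothetical diagnostic: a valid .safetensors file starts with an 8-byte
# little-endian header length followed by a JSON header. An LFS pointer or
# truncated download yields an implausible length, which safetensors reports
# as HeaderTooLarge.
import glob
import os
import struct

CKPT_DIR = "path/to/InternVL2-4B"  # adjust to your local checkpoint directory

for path in sorted(glob.glob(os.path.join(CKPT_DIR, "*.safetensors"))):
    size = os.path.getsize(path)
    with open(path, "rb") as f:
        head = f.read(8)
    if len(head) < 8:
        print(f"{path}: only {size} bytes -- clearly truncated, re-download it")
        continue
    header_len = struct.unpack("<Q", head)[0]
    ok = header_len < size  # the JSON header must fit inside the file
    status = "looks plausible" if ok else "invalid (re-download this shard)"
    print(f"{path}: {size} bytes, declared header length {header_len} -> {status}")
```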
The v1.5-4b model loads without any problem, but when I load the v2-4b model I get this error. How can I solve it?
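The exact loading code was not posted, but a minimal reproduction sketch along the lines of the standard transformers loading path used in the InternVL examples would look like this (the Hub id is illustrative; substitute your local path if you downloaded the weights manually):

```python
# Minimal loading sketch, assuming the checkpoint is loaded through the
# standard transformers API with remote code enabled.
import torch
from transformers import AutoModel, AutoTokenizer

path = "OpenGVLab/InternVL2-4B"  # illustrative; replace with your model path

model = AutoModel.from_pretrained(
    path,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
).eval()
tokenizer = AutoTokenizer.from_pretrained(path, trust_remote_code=True, use_fast=False)
```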
We recommend strictly following the environment setup for version 2.0, especially the transformers version. Also, please use the latest code.
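As a quick check that your environment matches the 2.0 setup, something like the following can help (the pinned version shown is an assumed example; consult the repository's requirements for the authoritative value):

```python
# Sanity-check the installed transformers version against the pin expected by
# the InternVL 2.0 setup. 4.37.2 is an assumed example -- verify it against
# the 2.0 environment documentation.
import transformers

EXPECTED = "4.37.2"  # assumed pin
installed = transformers.__version__
if installed != EXPECTED:
    print(f"transformers {installed} installed; the 2.0 setup expects {EXPECTED}. "
          f"Consider: pip install transformers=={EXPECTED}")
else:
    print(f"transformers {installed} matches the expected pin.")
```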
Can you provide your code and environment information so that we can reproduce this problem?
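To make reproduction easier, a short snippet like this can be used to collect the requested environment information (package names here are the obvious ones for this stack; add anything else relevant to your setup):

```python
# Print the versions of the key packages plus Python and CUDA so the
# maintainers can reproduce the reported environment.
import platform

import torch
import transformers
import safetensors

print("python      :", platform.python_version())
print("torch       :", torch.__version__, "| cuda:", torch.version.cuda)
print("transformers:", transformers.__version__)
print("safetensors :", safetensors.__version__)
```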
This issue has not been updated for more than two weeks, so it may already be resolved. I am closing it temporarily; please reopen it if necessary.