loading model: /home/ps/app/edison/TigerBot/TigerResearch/tigerbot-70b-chat-4bit-exl2...
Traceback (most recent call last):
  File "/home/ps/app/edison/TigerBot/other_infer/exllamav2_hf_infer.py", line 300, in <module>
    fire.Fire(main)
  File "/home/ps/anaconda3/envs/FastChat/lib/python3.10/site-packages/fire/core.py", line 141, in Fire
    component_trace = _Fire(component, args, parsed_flag_args, context, name)
  File "/home/ps/anaconda3/envs/FastChat/lib/python3.10/site-packages/fire/core.py", line 475, in _Fire
    component, remaining_args = _CallAndUpdateTrace(
  File "/home/ps/anaconda3/envs/FastChat/lib/python3.10/site-packages/fire/core.py", line 691, in _CallAndUpdateTrace
    component = fn(*varargs, **kwargs)
  File "/home/ps/app/edison/TigerBot/other_infer/exllamav2_hf_infer.py", line 190, in main
    model = get_model(model_path)
  File "/home/ps/app/edison/TigerBot/other_infer/exllamav2_hf_infer.py", line 177, in get_model
    model = Exllamav2HF.from_pretrained(model)
  File "/home/ps/app/edison/TigerBot/other_infer/exllamav2_hf_infer.py", line 161, in from_pretrained
    config.prepare()
  File "/home/ps/anaconda3/envs/FastChat/lib/python3.10/site-packages/exllamav2/config.py", line 112, in prepare
    with safe_open(st_file, framework = "pt", device = "cpu") as f:
safetensors_rust.SafetensorError: Error while deserializing header: InvalidHeaderDeserialization
The dependencies were installed exactly as the requirements specify. I got the error above when running this command:
CUDA_VISIBLE_DEVICES=0 python other_infer/exllamav2_hf_infer.py --model_path TigerResearch/tigerbot-70b-chat-4bit-exl2
It looks like the model was not fully downloaded. I suggest deleting the local weights and re-downloading the model.
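Before re-downloading everything, you can check which `.safetensors` shards are actually truncated. Below is a minimal sketch (not part of the repo) that parses just the safetensors header by hand: per the format, the first 8 bytes are a little-endian u64 giving the JSON header length, followed by the JSON header itself. A download that stopped partway typically fails one of these checks, which is exactly what surfaces later as `InvalidHeaderDeserialization`. The `model_dir` path is an assumption based on the `--model_path` in the command above.

```python
import json
import struct
from pathlib import Path

def check_safetensors_header(path):
    """Return True if the file's safetensors header parses, False otherwise.

    Safetensors layout: 8-byte little-endian u64 header length,
    then that many bytes of JSON, then raw tensor data.
    """
    try:
        with open(path, "rb") as f:
            prefix = f.read(8)
            if len(prefix) != 8:
                return False  # file truncated before the length field
            (header_len,) = struct.unpack("<Q", prefix)
            header = f.read(header_len)
            if len(header) != header_len:
                return False  # download stopped mid-header
            json.loads(header)  # malformed JSON -> corrupt/incomplete file
            return True
    except (OSError, ValueError):
        return False

if __name__ == "__main__":
    # Assumed local weights directory; adjust to where the model was saved.
    model_dir = Path("TigerResearch/tigerbot-70b-chat-4bit-exl2")
    for st in sorted(model_dir.glob("*.safetensors")):
        status = "OK" if check_safetensors_header(st) else "CORRUPT/INCOMPLETE"
        print(f"{st.name}: {status}")
```

Any shard reported as corrupt can then be deleted and re-fetched individually instead of re-downloading the full 70B checkpoint.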