huggingface_hub.utils._validators.HFValidationError: Repo id must be in the form 'repo_name' or 'namespace/repo_name'
python demo.py --cfg-path eval_configs/minigpt4_eval.yaml
Initializing Chat
Loading VIT
Loading VIT Done
Loading Q-Former
Loading Q-Former Done
Loading LLAMA
Traceback (most recent call last):
  File "/home/Startupcolors/doze/wound/MiniGPT-4/demo.py", line 57, in [...]
huggingface_hub.utils._validators.HFValidationError: Repo id must be in the form 'repo_name' or 'namespace/repo_name'. Use `repo_type` argument if needed.
I'm also having the same problem. It is somehow related to this step:
- Prepare the pretrained MiniGPT-4 checkpoint: To play with our pretrained model, download the pretrained checkpoint here. Then, set the path to the pretrained checkpoint in the evaluation config file eval_configs/minigpt4_eval.yaml at Line 11.
The path at line 11 is interpreted as a Hugging Face Hub repo id rather than a relative local path.
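To see why that happens, here is a hedged sketch (not the actual transformers code) of the resolution logic: the loader first checks whether the string exists on disk, and only then falls back to treating it as a Hub repo id, which must look like 'repo_name' or 'namespace/repo_name'. The function and regex below are illustrative, not the library's real internals.

```python
import os
import re

# Illustrative repo-id shape: one segment, or namespace/name, with no
# further "/" separators.
REPO_ID_RE = re.compile(r"^[\w.\-]+(/[\w.\-]+)?$")

def resolve_model_source(path_or_id: str) -> str:
    """Sketch of how a model string is classified (assumption, simplified)."""
    if os.path.exists(path_or_id):
        return f"local path: {path_or_id}"
    if REPO_ID_RE.fullmatch(path_or_id):
        return f"hub repo id: {path_or_id}"
    # Mirrors the HFValidationError seen in the traceback above.
    raise ValueError(
        "Repo id must be in the form 'repo_name' or 'namespace/repo_name': "
        + path_or_id
    )
```

A relative checkpoint path that does not exist from the current working directory and contains several "/" segments fails repo-id validation, which is exactly the error in this issue.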
Ok, I fixed this. Apparently you have to use an absolute path to your Vicuna checkpoint (starting with "/"); otherwise it gets interpreted as a hosted Hugging Face model.
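Concretely, that means editing the config so the checkpoint paths are absolute. A sketch, assuming the key names and layout of the MiniGPT-4 configs (depending on the repo version, llama_model may live in the model config rather than eval_configs/minigpt4_eval.yaml; the paths below are placeholders to adjust to your setup):

```yaml
model:
  arch: mini_gpt4
  # Absolute path (leading "/") so it is treated as a local directory,
  # not a Hugging Face Hub repo id:
  llama_model: "/home/you/checkpoints/vicuna-13b"
  # Line 11: the pretrained MiniGPT-4 checkpoint, also absolute:
  ckpt: "/home/you/checkpoints/pretrained_minigpt4.pth"
```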
Thanks, me too, but I am getting this error
Initializing Chat
Loading VIT
Loading VIT Done
Loading Q-Former
Loading Q-Former Done
Loading LLAMA
/home/Startupcolors/miniconda3/envs/fit/lib/python3.11/site-packages/transformers/tokenization_utils_base.py:1714: FutureWarning: Calling LlamaTokenizer.from_pretrained() with the path to a single file or url is deprecated and won't be possible anymore in v5. Use a model identifier or the path to a directory instead.
warnings.warn(
╭───────────────────── Traceback (most recent call last) ──────────────────────╮
│ /home/Startupcolors/doze/wound/MiniGPT-4/demo.py:57 in
> Ok, I fixed this. Apparently you have to use an absolute path to your Vicuna checkpoint (starting with "/"); otherwise it gets interpreted as a hosted Hugging Face model.
Hello, I got the same error. Could you please explain specifically how to modify it? Thanks.
I got the same error, even after changing the Vicuna checkpoint to an absolute path (starting with "/").
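If the path is already absolute but the error persists, the FutureWarning in the log above suggests the path points at a single file rather than the weights directory. A hedged pre-flight check sketch (the expected directory layout is an assumption based on this thread, not the repo's exact schema):

```python
import os

def check_checkpoint_path(path):
    """Return a list of likely problems with a llama_model path (sketch)."""
    problems = []
    if not os.path.isabs(path):
        # Relative strings are parsed as Hub repo ids (the first error here).
        problems.append("path is not absolute")
    if not os.path.isdir(path):
        # Per the FutureWarning above, the tokenizer wants the weights
        # *directory* (config.json, tokenizer files, shards), not one file.
        problems.append("path is not an existing directory")
    return problems
```

Running this on the value you put at line 11 of the config should return an empty list before demo.py has a chance of loading LLAMA.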
I ran into the same error as well.
Any help?