Traceback (most recent call last):
  File "app.py", line 35, in <module>
    model = AutoModel.get_model(model_args, tune_strategy='none', ds_config=ds_config)
  File "/home/zebin/work/gpt/LMFlow/src/lmflow/models/auto_model.py", line 16, in get_model
    return HFDecoderModel(model_args, *args, **kwargs)
  File "/home/zebin/work/gpt/LMFlow/src/lmflow/models/hf_decoder_model.py", line 220, in __init__
    self.backend_model = AutoModelForCausalLM.from_pretrained(
  File "/usr/local/lib/python3.8/dist-packages/transformers/models/auto/auto_factory.py", line 441, in from_pretrained
    config, kwargs = AutoConfig.from_pretrained(
  File "/usr/local/lib/python3.8/dist-packages/transformers/models/auto/configuration_auto.py", line 908, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/transformers/configuration_utils.py", line 573, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/transformers/configuration_utils.py", line 628, in _get_config_dict
    resolved_config_file = cached_file(
  File "/usr/local/lib/python3.8/dist-packages/transformers/utils/hub.py", line 380, in cached_file
    raise EnvironmentError(
OSError: path to /robin-7b/ does not appear to have a file named config.json.

It seems the path is set incorrectly. Could you paste the full command you ran? Thank you!
LMFlow/service# python3 app.py
robin-7b is a LoRA checkpoint, so it cannot be loaded directly as a full model. The correct usage is:
model_name_or_path = [the path to llama-7b]
lora_path = [the path to robin-7b]
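For background: a LoRA checkpoint directory ships only adapter weights (e.g. adapter_config.json and the adapter weight file), not a full model's config.json, which is exactly why AutoModelForCausalLM.from_pretrained raises the OSError above when pointed at it. A quick way to tell the two apart before loading (a minimal illustrative helper, not part of LMFlow):

```python
import os

def looks_like_full_model(path):
    """Return True if `path` looks like a full HF checkpoint.

    A full checkpoint ships config.json; a LoRA checkpoint ships
    adapter_config.json instead, so config.json is absent.
    """
    return os.path.isfile(os.path.join(path, "config.json"))
```

So model_name_or_path must point at the full base model (llama-7b here), and the LoRA directory goes in the separate lora_path argument.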
This issue has been marked as stale because it has not had recent activity. If you think this still needs to be addressed, please feel free to reopen it. Thanks.