
Model loading failed for inference - KeyError: 'gpt_bigcode'

Open · shailja-thakur opened this issue 2 years ago · 4 comments


```
KeyError                                  Traceback (most recent call last)
/tmp/ipykernel_1004512/2718782402.py in <module>
      2 tokenizer = AutoTokenizer.from_pretrained(checkpoint, use_auth_token=True)
      3 # to save memory consider using fp16 or bf16 by specifying torch.dtype=torch.float16 for example
----> 4 model = AutoModelForCausalLM.from_pretrained(checkpoint, use_auth_token=True).to(device)

~/anaconda3/envs/verilog_gpt/lib/python3.9/site-packages/transformers/models/auto/auto_factory.py in from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
    421         kwargs["_from_auto"] = True
    422         if not isinstance(config, PretrainedConfig):
--> 423             config, kwargs = AutoConfig.from_pretrained(
    424                 pretrained_model_name_or_path, return_unused_kwargs=True, trust_remote_code=trust_remote_code, **kwargs
    425             )

~/anaconda3/envs/verilog_gpt/lib/python3.9/site-packages/transformers/models/auto/configuration_auto.py in from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
    743             return config_class.from_pretrained(pretrained_model_name_or_path, **kwargs)
    744         elif "model_type" in config_dict:
--> 745             config_class = CONFIG_MAPPING[config_dict["model_type"]]
    746             return config_class.from_dict(config_dict, **kwargs)
    747         else:

~/anaconda3/envs/verilog_gpt/lib/python3.9/site-packages/transformers/models/auto/configuration_auto.py in __getitem__(self, key)
    450             return self._extra_content[key]
    451         if key not in self._mapping:
--> 452             raise KeyError(key)
    453         value = self._mapping[key]
    454         module_name = model_type_to_module_name(key)

KeyError: 'gpt_bigcode'
```
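
For reference, a minimal sketch of the loading code the traceback points at. The `bigcode/starcoder` checkpoint name and the device choice are assumptions; only the variable names appear in the traceback.

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

checkpoint = "bigcode/starcoder"  # assumption: the StarCoder checkpoint being loaded
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(checkpoint, use_auth_token=True)
# to save memory, consider fp16 or bf16, e.g. torch_dtype=torch.float16
model = AutoModelForCausalLM.from_pretrained(checkpoint, use_auth_token=True).to(device)
```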

shailja-thakur · May 08 '23 17:05

Can you provide more information about the issue you're experiencing? It seems to be about gpt_bigcode; how did you get that error?

IeatToilets · May 09 '23 02:05

Did you install the latest version of transformers?

lvwerra · May 09 '23 07:05

The problem probably lies in the version of transformers you are using. Anything >=4.28.1 should be fine.
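
A quick sketch for checking whether the installed release already registers the gpt_bigcode model type (an offline check; no model download needed):

```python
# check the installed transformers version and whether it knows the gpt_bigcode model type
import transformers
from transformers.models.auto.configuration_auto import CONFIG_MAPPING

print(transformers.__version__)         # expect >= 4.28.1
print("gpt_bigcode" in CONFIG_MAPPING)  # False on older releases, which is what raises the KeyError
```

If the second check prints False, upgrading with `pip install -U transformers` should make the KeyError go away.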

ArmelRandy · May 09 '23 12:05

Yes, I already had the transformers 4.29.dev version, but it was conflicting with a lower version.

Works now.


shailja-thakur · May 09 '23 13:05