
Lab 6 not working: "Falcon has now been fully ported into the Hugging Face transformers library"

paleze opened this issue 1 year ago • 2 comments

I tried to run Lab 6 on my laptop following the instructions, but I received this message:

```
WARNING: You are currently loading Falcon using legacy code contained in the model repository. Falcon has now been fully ported into the Hugging Face transformers library. For the most up-to-date and high-performance version of the Falcon model code, please update to the latest version of transformers and then load the model without the trust_remote_code=True argument.
```

I experimented with updating the transformers version, but without success. Please review this lab.
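For reference, my understanding of what the warning suggests is roughly the following — a sketch only, assuming an up-to-date transformers release (the model name matches the lab; the prompt, dtype, and variable names are illustrative, and max_length/top_k mirror the lab's flags):

```python
import torch
import transformers
from transformers import AutoTokenizer

model_name = "tiiuae/falcon-7b"  # model loaded by the lab script
tokenizer = AutoTokenizer.from_pretrained(model_name)

# With Falcon ported into transformers, the pipeline should load
# natively, i.e. without trust_remote_code=True.
generator = transformers.pipeline(
    "text-generation",
    model=model_name,
    tokenizer=tokenizer,
    torch_dtype=torch.bfloat16,  # illustrative; the lab may use another dtype
)

result = generator("Write one sentence about MLOps.", max_length=25, top_k=5)
print(result)
```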

Thank you.

paleze • Mar 26 '24 08:03

Hello @paleze, the warning should not prevent the lab from running. Could you please try again?

eduand-alvarez • May 01 '24 13:05

Hello @eduand-alvarez, I just retried: I pulled the main branch, recreated the conda env, and followed the lab instructions. I still receive an error:

```
python Falcon_HF_Pipelines.py --falcon_version "7b" --max_length 25 --top_k 5

tokenizer_config.json: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 287/287 [00:00<00:00, 24.1kB/s]
tokenizer.json: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 2.73M/2.73M [00:00<00:00, 5.39MB/s]
special_tokens_map.json: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 281/281 [00:00<00:00, 140kB/s]
config.json: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 1.05k/1.05k [00:00<00:00, 85.5kB/s]
configuration_falcon.py: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 7.16k/7.16k [00:00<00:00, 3.18MB/s]
A new version of the following files was downloaded from https://huggingface.co/tiiuae/falcon-7b:
- configuration_falcon.py
. Make sure to double-check they do not contain any added malicious code. To avoid downloading new versions of the code file, you can pin a revision.

WARNING: You are currently loading Falcon using legacy code contained in the model repository. Falcon has now been fully ported into the Hugging Face transformers library. For the most up-to-date and high-performance version of the Falcon model code, please update to the latest version of transformers and then load the model without the trust_remote_code=True argument.

modeling_falcon.py: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 56.9k/56.9k [00:00<00:00, 11.1MB/s]
A new version of the following files was downloaded from https://huggingface.co/tiiuae/falcon-7b:
- modeling_falcon.py
. Make sure to double-check they do not contain any added malicious code. To avoid downloading new versions of the code file, you can pin a revision.
pytorch_model.bin.index.json: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16.9k/16.9k [00:00<00:00, 17.5MB/s]
pytorch_model-00001-of-00002.bin: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 9.95G/9.95G [24:17<00:00, 6.83MB/s]
pytorch_model-00002-of-00002.bin: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 4.48G/4.48G [10:55<00:00, 6.84MB/s]
Downloading shards: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 2/2 [35:14<00:00, 1057.13s/it]
Traceback (most recent call last):
  File "/home/calinet/Documents/Code/certified-developer/MLOps_Professional/lab6/sample/Falcon_HF_Pipelines.py", line 69, in <module>
    main(FLAGS)
  File "/home/calinet/Documents/Code/certified-developer/MLOps_Professional/lab6/sample/Falcon_HF_Pipelines.py", line 16, in main
    generator = transformers.pipeline(
  File "/home/calinet/miniconda3/envs/lab6/lib/python3.9/site-packages/transformers/pipelines/__init__.py", line 788, in pipeline
    framework, model = infer_framework_load_model(
  File "/home/calinet/miniconda3/envs/lab6/lib/python3.9/site-packages/transformers/pipelines/base.py", line 279, in infer_framework_load_model
    raise ValueError(f"Could not load model {model} with any of the following classes: {class_tuple}.")
ValueError: Could not load model tiiuae/falcon-7b with any of the following classes: (<class 'transformers.models.auto.modeling_auto.AutoModelForCausalLM'>,).
```
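In case it helps narrow this down, loading the model directly with AutoModelForCausalLM surfaces the underlying exception instead of the pipeline's summary ValueError, and pinning a revision (as the download notice suggests) avoids pulling updated remote-code files on each run. A rough sketch, assuming a recent transformers release; "main" here is a placeholder for an actual commit hash:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/falcon-7b"

# Placeholder: pin a specific commit hash from the model repo on the Hub
# to avoid re-downloading updated code files on every run.
revision = "main"

tokenizer = AutoTokenizer.from_pretrained(model_id, revision=revision)

# Loading outside the pipeline raises the real error on failure,
# rather than "Could not load model ... with any of the following classes".
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    revision=revision,
    torch_dtype=torch.bfloat16,
)
```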

paleze • May 02 '24 14:05