transformers
fix from_pretrained in offline mode when model is preloaded in cache
What does this PR do?
Fixes # (issue): fixes model loading in offline mode when the model is already present in the local cache.
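For context, offline loading can be forced either through the TRANSFORMERS_OFFLINE environment variable or through the local_files_only argument of from_pretrained. A minimal sketch of the second path (the model id below is just a placeholder, assumed to be in the cache already):

from transformers import AutoModelForImageClassification

# Resolve everything from the local cache, without contacting the Hub.
model = AutoModelForImageClassification.from_pretrained(
    "some-user/some-model",  # placeholder id, assumed already cached
    local_files_only=True,
)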
Who can review?
@ArthurZucker @Wauplin
Reproduce:
- download a private model from the Hub while online, so it is stored in the local cache (see the snapshot_download sketch after the script below)
- run the following script:
import os

# Force offline mode before importing transformers, so no Hub calls are attempted.
os.environ['TRANSFORMERS_OFFLINE'] = "true"

from transformers.models.auto.modeling_auto import AutoModelForImageClassification

model_id = "Narsil/private"
revision = "main"
kwargs = {'use_auth_token': None}

# Before the fix this fails even though the model is already in the local cache.
model = AutoModelForImageClassification.from_pretrained(model_id, revision=revision, **kwargs)
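For the first step (getting the model into the cache), a possible sketch using huggingface_hub — the repo id is the one from the script above, and token handling is omitted:

from huggingface_hub import snapshot_download

# Run this once while online (and authenticated, since the repo is private)
# so the weights and config land in the local cache used by from_pretrained.
snapshot_download(repo_id="Narsil/private", revision="main")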
Thanks! Can you add this example as a test, using the HF Hub to download a snapshot of a .bin file into a temp repo? And can you test both the offline flag and local_files_only in from_pretrained? 🤗
Done @ArthurZucker, added a unit test -> it fails before the fix and succeeds after. The test is a bit ugly, but I could not find a cleaner way to do it...
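For reference, a rough sketch of what such a test could look like; the tiny model id and the subprocess approach are assumptions for illustration, not the exact test added in this PR:

import subprocess
import sys
import unittest

from transformers import AutoModelForImageClassification

# Placeholder tiny model id, used only for illustration.
MODEL_ID = "hf-internal-testing/tiny-random-ResNetForImageClassification"

# The offline flag is read when transformers is imported, so the offline load
# runs in a fresh subprocess where the variable is set before the import.
LOAD_SCRIPT = f"""
import os
os.environ["TRANSFORMERS_OFFLINE"] = "1"
from transformers import AutoModelForImageClassification
AutoModelForImageClassification.from_pretrained("{MODEL_ID}")
"""

class OfflineFromPretrainedTest(unittest.TestCase):
    def test_offline_load_from_cache(self):
        # Populate the local cache while online.
        AutoModelForImageClassification.from_pretrained(MODEL_ID)

        # 1) Offline via the environment variable, in a subprocess.
        subprocess.check_call([sys.executable, "-c", LOAD_SCRIPT])

        # 2) Offline via the local_files_only argument.
        AutoModelForImageClassification.from_pretrained(MODEL_ID, local_files_only=True)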