galai
load model error
When I load the huge model with model = gal.load_model("huge"), I get the exception: Exception: Model "facebook/galactica-120b" on the Hub doesn't have a tokenizer. I cleared the cache directory in .cache/huggingface/hub/ and ran the script again, but the error is still there. What's the problem?
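For reference, a minimal script reproducing the call described above (the import alias and the comment are assumptions based on the library's documented usage, not part of the original report):

import galai as gal

# Loading the 120B "huge" checkpoint; this is the call that raises
# 'Model "facebook/galactica-120b" on the Hub doesn't have a tokenizer'.
model = gal.load_model("huge")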
What are your galai and transformers versions? What do you get for:
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("facebook/galactica-120b")
?