Donagh Horgan
I'm seeing the same for the 125m and 350m OPT tokenizers (haven't checked the larger ones):

```python
>>> AutoTokenizer.from_pretrained("facebook/opt-350m")
PreTrainedTokenizer(name_or_path='facebook/opt-350m', vocab_size=50265, model_max_len=1000000000000000019884624838656, is_fast=False, padding_side='right', truncation_side='right', special_tokens={'bos_token': AddedToken("", rstrip=False, lstrip=False, single_word=False,...
```
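For reference, a minimal sketch of how I'm checking this (assuming the `transformers` library is installed and the Hugging Face hub is reachable; the model names are just the two I tested):

```python
from transformers import AutoTokenizer

# Load each OPT tokenizer and inspect the special tokens it reports.
for model_name in ("facebook/opt-125m", "facebook/opt-350m"):
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    # In the output above, bos_token shows up as an empty AddedToken("").
    print(model_name, repr(tokenizer.bos_token), repr(tokenizer.eos_token))
```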
I'm seeing this error too. I saw that a change to `Backups.proto` was also addressed in #60 (apparently successfully), so I tried to copy the approach, bumping `Backups.proto` to the...