Jason
Hi @Narsil, I'm using transformers 4.16.2. Thank you for the tip regarding T5TokenizerFast. I was able to mitigate the issue by explicitly adding special tokens, like so:

```
tokenizer.add_tokens([f"_{n}"...
```
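A minimal sketch of the mitigation described above, assuming a `t5-small` checkpoint and ten underscore-prefixed tokens (both are illustrative assumptions, since the original snippet is truncated): registering the strings via `tokenizer.add_tokens` makes the fast tokenizer treat each one as a single token instead of splitting it.

```python
# Hypothetical sketch: "t5-small" and range(10) are assumptions for
# illustration; the original comment's exact token list is truncated.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("t5-small")

new_tokens = [f"_{n}" for n in range(10)]  # "_0", "_1", ..., "_9"
num_added = tokenizer.add_tokens(new_tokens)  # returns count of newly added tokens

# Each registered string now maps to its own id rather than being split.
ids = tokenizer.convert_tokens_to_ids(new_tokens)
print(num_added, ids[:3])
```

If the tokens feed a model, remember to grow the embedding matrix afterwards with `model.resize_token_embeddings(len(tokenizer))`.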