Lucile Saulnier
Hi @YBooks, thank you very much for the detailed issue :hugs:! I see that you have already proposed a fix that has been merged and that solves the problem...
Hi @AndreaSottana, thank you very much for sharing a feature proposal! :hugs: I understand your use case; my feeling is that, for the moment, I will not push for the...
> @SaulLu Thank you very much for your detailed feedback and suggestion. Before moving forward to revise the code w.r.t. the add_tokens feature, it would be great if you could...
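For reference, a minimal sketch of how the `add_tokens` API is typically used in `transformers`; the checkpoint and the tokens below are placeholders, not the ones from the PR under discussion:

```python
from transformers import AutoModel, AutoTokenizer

# Placeholder checkpoint; any model with a matching tokenizer works the same way.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# add_tokens returns the number of tokens actually added (tokens already in
# the vocabulary are skipped), and the new tokens are never split afterwards.
num_added = tokenizer.add_tokens(["<custom_token>", "new_domain_word"])
print(f"Added {num_added} tokens")

# The model's embedding matrix must be resized to match the new vocabulary size.
model.resize_token_embeddings(len(tokenizer))
```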
Hi @datquocnguyen , I echo [Lysandre's answer](https://github.com/huggingface/transformers/pull/17254#issuecomment-1143221043): thank you for working very hard on this PR :hugs:, and I also think it would be a very good fit for...
> Cleaning the "merges" file will definitely result in different encoding outputs between the slow and fast tokenizers. For example, in the case of FlauBERT, the slow and fast tokenizers...
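One quick way to surface this kind of divergence is to encode the same text with both tokenizer variants and compare the results. A sketch, assuming the checkpoint ships both a slow (Python) and a fast (Rust-backed) tokenizer; the checkpoint and sample text are placeholders:

```python
from transformers import AutoTokenizer

# Placeholder checkpoint; substitute the model under discussion, provided it
# has both a slow and a fast tokenizer available.
checkpoint = "bert-base-cased"
slow = AutoTokenizer.from_pretrained(checkpoint, use_fast=False)
fast = AutoTokenizer.from_pretrained(checkpoint, use_fast=True)

text = "A sentence with rare subwords to exercise the merges file."
slow_ids = slow(text)["input_ids"]
fast_ids = fast(text)["input_ids"]

# Any mismatch here reveals an encoding divergence between the two tokenizers.
assert slow_ids == fast_ids, (slow.convert_ids_to_tokens(slow_ids),
                              fast.convert_ids_to_tokens(fast_ids))
```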
Hi @farahdian , thank you very much for your contribution. I see that several tests have failed; is this still a work in progress?
Sounds great! I'd be happy to give you a hand. I think in your case it would be best if the title of the PR started with `[WIP]` and a...
From your error message, what I understand is that you don't have the `make` command installed on your computer. As [this reference](https://www.computerhope.com/unix/umake.htm) explains:

> On Unix-like operating systems, make is a utility for...
Hi @farahdian , Just a quick message to see how you're doing with adding the tests on your end. :relaxed:
Hi @farahdian , Thanks for the update. What is your working directory when you run the `make fixup` command?