Results: 5 comments by Hiroaki Hayashi

@diogovieira You might have already resolved the issue, but setting the relevant Python path might help:

```vim
let g:python3_host_prog = "/path/to/your/python"
```

Correct: there was a breaking change in `transformers` that changed the order of variable declarations within `PreTrainedTokenizer`. I'm on it.

A 16 GB graphics card may be somewhat limiting. You could try CPU offloading or offloading to NVMe to save more memory, as well as multi-node training.
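As a rough illustration of the offloading idea, here is a minimal sketch of a DeepSpeed ZeRO stage 3 configuration with optimizer state on NVMe and parameters on CPU. This is an assumption about the setup, not a config from the original thread; the path, batch size, and choice of devices are placeholders to tune for your hardware.

```python
# Hypothetical DeepSpeed config enabling ZeRO stage 3 with CPU/NVMe offload.
# "/local_nvme" and the batch size are illustrative placeholders.
ds_config = {
    "zero_optimization": {
        "stage": 3,
        # Move optimizer state off the GPU onto fast local NVMe storage.
        "offload_optimizer": {"device": "nvme", "nvme_path": "/local_nvme"},
        # Keep full parameters in CPU RAM, paging shards to GPU as needed.
        "offload_param": {"device": "cpu"},
    },
    "train_micro_batch_size_per_gpu": 1,
}
```

The trade-off is bandwidth: NVMe and CPU offload free GPU memory but add transfer latency per step, so they suit memory-bound rather than throughput-bound training.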

Hey @FranxYao, thanks for the great work on this paper. I was wondering: did you use the prompt in the code or the modified prompt above for the figure...

Great find! I totally missed that. The file order differing after each clone but staying consistent within a clone is probably because files are copied and registered by the file system...