CSV FILE
https://github.com/Georgetown-IR-Lab/ExtendedSumm/blob/cf9d74f2b55b8322c476ac89a49f6655fee219f5/src/prepro/data_builder.py#L563
Could you please tell me how to obtain this CSV file? Or, if we are using JSON files, can we use the `_format_to_bert` function instead of `format_to_bert`?
Hi @CJPJ007,
Thank you for reporting this! Please see #3.
Best, Sajad
Thanks, that works, but could you say something about the screenshot attached here? Training is taking too long. I am training on Google Colab.

@CJPJ007, what are your virtual machine's specifications on Colab? I've been training (and testing) the model on my machine and it works fine. The specifications are as follows: NVIDIA GeForce GTX 1080, 12 GB memory. Would you please change the `report_every` training argument to 1 to see whether training is progressing?
Google Colab VM specifications: GPU Tesla P100-PCIE-16GB, 13 GB RAM. I changed that argument, but there is no change: training stays stuck at that Pdb prompt, remaining in debugging mode. If I try to continue from there by entering `unt`, the following error arises:
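For reference, being stuck at a `(Pdb)` prompt usually means a `pdb.set_trace()` breakpoint was left in the code. This is a minimal sketch (the function name and breakpoint location are hypothetical, not taken from the repository) showing why `c` (continue), not `unt` (until), resumes a run:

```python
import pdb  # standard-library debugger


def train_step():
    loss = 0.0
    # A leftover breakpoint like the one below drops execution into the
    # debugger. At the (Pdb) prompt, 'c' continues execution, 'q' quits,
    # and 'unt' (until) only advances to a later line in the current
    # frame, which can appear to hang inside a training loop.
    # pdb.set_trace()
    loss += 1.0
    return loss


print(train_step())  # prints 1.0
```

If such a `pdb.set_trace()` call remains in the training script, removing it (or typing `c` at the prompt) should let training proceed.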

Can you solve this problem? Could you tell me how to solve it? Thanks!
Hello friend, have you solved this problem? Could you tell me how? Thank you. Best, cht