benleetownsend

7 comments by benleetownsend

Thanks for your interest in finetune. The API for loading base model files saved in this way is: ``` new_model = Classifier(base_model=<base model class>, base_model_path=<path to saved base model>) ``` where the base model object is...
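In case it helps, here is a minimal sketch of that call; the RoBERTa base model class and the file path are illustrative assumptions, not taken from your setup:

```python
# Minimal sketch, assuming the saved base model is RoBERTa-based and lives at
# "./my_base_model.jl" (both placeholders).
from finetune import Classifier
from finetune.base_models import RoBERTa

new_model = Classifier(
    base_model=RoBERTa,                    # the base model class the file was saved from
    base_model_path="./my_base_model.jl",  # path to the saved base model file
)
```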

If you wanted to use this new model frequently and wanted a cleaner API, you could also do: ``` class NewBaseModel(RoBERTa): # replace RoBERTa with whatever model this is based...
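A rough sketch of that subclassing pattern; the `settings` override shown is an assumption about how the default path could be pointed at the saved file, not an exact recipe:

```python
# Rough sketch of the subclassing pattern; NewBaseModel and the settings
# override are illustrative assumptions.
from finetune import Classifier
from finetune.base_models import RoBERTa

class NewBaseModel(RoBERTa):  # replace RoBERTa with whatever model this is based on
    # Assumed override: make the saved file the default base model path.
    settings = dict(RoBERTa.settings, base_model_path="./my_base_model.jl")

new_model = Classifier(base_model=NewBaseModel)
```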

Are you trying to run finetune on TensorFlow 2.0 by any chance?

Thanks for the bug report; I can reproduce the second issue. For now, you can set the kwarg chunk_long_sequences=False and this should reinstate the previous behaviour.
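Something along these lines; the model type and training data are placeholders, only `chunk_long_sequences=False` comes from the workaround above:

```python
# Workaround sketch: disable long-sequence chunking to restore the previous behaviour.
from finetune import Classifier

model = Classifier(chunk_long_sequences=False)
model.fit(train_texts, train_labels)  # train_texts / train_labels are placeholders
```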

Can you provide a minimal reproducible example for the first issue?

Can you run `pip freeze | grep "tensorflow\|finetune"` and send me the output?

So, we explicitly run validation on the final model regardless of the validation interval. This is for the purpose of keep_best_model; otherwise we can accidentally waste the final set of...
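For reference, a hedged sketch of the relevant config; keep_best_model is named above, while the `val_interval` key and its value are assumptions about how the validation interval is configured:

```python
# Illustrative config sketch; the val_interval name and value are assumptions.
from finetune import Classifier

model = Classifier(
    keep_best_model=True,  # revert to the best checkpoint found during validation
    val_interval=150,      # validate every 150 steps; the final model is always validated regardless
)
```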