Is there a config file for ModernBERT-Large?
Hello, I have been trying to replicate your pretraining results and I wanted to understand which model reported in the paper corresponds to which yaml file listed in the repository.
I think (based on this pull request) that flex-bert-base.yaml is the ModernBERT base reported in the paper, but I am not clear on the difference between flex-bert-base.yaml and flex-bert-base-parallel.yaml. I also do not see a config file for ModernBERT-Large.
Thank you for any assistance you can provide. I am interested in pretraining ModernBERT-Large on my own dataset with different tokenizers.

LeAnn
It's not merged yet. You can find the config files here:
https://github.com/AnswerDotAI/ModernBERT/tree/pretraining_documentation/yamls/modernbert
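For swapping in a different tokenizer, the relevant setting in configs of this kind is usually a top-level tokenizer field. The fragment below is only an illustrative sketch: the key names and values are assumptions, not copied from the repo's yamls, so check the linked directory for the actual schema.

```yaml
# Illustrative sketch only; key names are assumptions, not the repo's actual schema.
tokenizer_name: bert-base-uncased   # replace with the tokenizer you want to pretrain with
model:
  hidden_size: 1024                 # ModernBERT-Large uses a 1024-dim hidden state
  num_hidden_layers: 28             # and 28 transformer layers, per the paper
  num_attention_heads: 16
```

If you change the tokenizer, also make sure the model's vocabulary size matches the new tokenizer's vocabulary.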