bio-transformers
pytorch_lightning.utilities.exceptions.MisconfigurationException: You have asked for `amp_level='O2'` but it's only supported with `amp_backend='apex'`.
I am trying to fine-tune on a set of sequences using an A100 GPU and the exact example script from the GitHub main page, and it throws this error.
How can I fix this?
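The exception means `amp_level='O2'` (an Apex-only option) was passed to the Trainer while the default native AMP backend was in use. A minimal sketch of the two likely workarounds, assuming an older (pre-2.0) pytorch_lightning Trainer signature that accepted `amp_backend` and `amp_level` arguments; the exact kwargs are an assumption, not taken from the bio-transformers script itself:

```python
# Option 1 (assumption): use native AMP and drop `amp_level` entirely.
# Native AMP only needs `precision=16`; `amp_level` does not apply to it.
native_amp_kwargs = {
    "gpus": 1,
    "precision": 16,  # no amp_level / amp_backend keys at all
}

# Option 2 (assumption): keep amp_level='O2' but explicitly select the
# Apex backend. This requires NVIDIA Apex to be installed separately.
apex_kwargs = {
    "gpus": 1,
    "precision": 16,
    "amp_backend": "apex",  # tells Lightning to route AMP through Apex
    "amp_level": "O2",      # Apex optimization level, now valid
}

# Hypothetical usage (uncomment with pytorch_lightning installed):
# import pytorch_lightning as pl
# trainer = pl.Trainer(**native_amp_kwargs)
```

If the `amp_level` argument is hard-coded inside the bio-transformers fine-tuning call rather than in your own script, the same idea applies: either remove it or pass `amp_backend='apex'` alongside it.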
I am hitting the same error. How should I deal with it?