evaluation
fix: update the version of transformers to support in evaluating gpt-neo models
- Evaluated on: GPT Neo
An error occurs if you try to evaluate gpt-neo-related models (e.g. "EleutherAI/gpt-neo-125M") with the pinned version of transformers. This is due to a change in the modeling_gpt_neo.py file in the latest version of transformers.
Hey @YU-Anthony, thanks for flagging this. I agree that we want to bump outdated versions. However, it seems that bumping transformers alone introduces compatibility issues with other packages. I will open a separate PR that removes hard-coded version requirements in favor of >=.
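For illustration, the proposed change would look something like the following in a requirements file. The package names and version numbers here are hypothetical examples, not the project's actual pins:

```
# Before: exact pins break when a dependency needs a newer release
transformers==4.11.0

# After: minimum-version constraint allows compatible upgrades
transformers>=4.11.0
```

A `>=` constraint lets pip resolve a transformers release that both supports gpt-neo models and satisfies the other packages' requirements, at the cost of less reproducible installs.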