Apply ruff flake8-comprehensions rules.
Fix #2424
Enables flake8-comprehensions checks in ruff and adds the automatically generated fixes. This should strictly improve performance by removing unnecessary lookups, iterations, and function calls. It should also improve readability and produce more efficient Python bytecode.
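For context, these are illustrative examples of the kind of rewrites the flake8-comprehensions rules (ruff's C4xx family) perform; they are generic patterns, not lines taken from this PR's diff:

```python
# C400/C401: unnecessary generator wrapped in list()/set() —
# a direct comprehension avoids the extra generator and function call.
before = list(x * 2 for x in range(5))
after = [x * 2 for x in range(5)]
assert before == after

# C402: unnecessary generator passed to dict() —
# a dict comprehension builds the mapping directly.
before_d = dict((k, k * k) for k in range(3))
after_d = {k: k * k for k in range(3)}
assert before_d == after_d

# C408: unnecessary dict() call with keyword arguments —
# a literal is both faster and clearer.
before_c = dict(a=1, b=2)
after_c = {"a": 1, "b": 2}
assert before_c == after_c
```

In each case the before/after forms are semantically equivalent, which is why ruff can apply these fixes automatically.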
It looks like the two failing checks are flakes?
Hi @Skylion007! Thanks, I've read the changes and they seem reasonable! However, when we moved to ruff last week we made it compatible with the ruff configuration in transformers: https://github.com/huggingface/transformers/blob/main/pyproject.toml#L8. The idea is for contributors that work on both codebases to have a consistent experience. I'd like to invite more discussion from the rest of the team before we apply this change /cc @patrickvonplaten @patil-suraj @williamberman @yiyixuxu @sayakpaul
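For reference, keeping the configs aligned would mean both repos carry the same ruff rule selection. A hypothetical sketch of what enabling these rules in `pyproject.toml` could look like (this is an illustration, not the actual transformers or diffusers config at the linked line):

```toml
[tool.ruff]
# Hypothetical: add the flake8-comprehensions rule family (C4)
# on top of whatever rule selection the repo already uses.
extend-select = ["C4"]
```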
The idea is for contributors that work on both codebases to have a consistent experience.
Exactly this. If enabling this change would create a disparity between the formatting configurations of transformers and diffusers then we need to reconsider it.
Looks like transformers applied this though: https://github.com/huggingface/transformers/pull/21694/files#diff-50c86b7ed8ac2cf95bd48334961bf0530cdc77b5a56f852c5c61b89d735fd711R8
@Skylion007 would you mind syncing with main and reformatting?
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.
Superseded by #2827