
Support equivalent of torcheval's `merge_state`

Open ytang137 opened this issue 1 year ago • 1 comments

🚀 Feature

Support a method equivalent to torcheval's merge_state to allow explicitly reducing metrics when not used under DDP. This is an updated request from https://github.com/Lightning-AI/torchmetrics/issues/2063.

Motivation

When used within DDP, torchmetrics objects support automatic syncing and reduction across ranks. However, there doesn't seem to be support for reduction outside DDP. This would be a good feature to have because it would allow using torchmetrics for distributed evaluation with frameworks other than DDP.

Pitch

Enabling manual reduction would make torchmetrics more widely applicable, since it could then be used with distributed frameworks other than DDP, such as Ray.
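To illustrate the kind of API being requested, here is a minimal, self-contained sketch of a `merge_state`-style method, modeled on torcheval's `Metric.merge_state`. The `MeanMetric` class and its method names are hypothetical and are not part of the torchmetrics API; the point is that each worker's metric keeps additive state that the driver can fold together without any `torch.distributed` collective.

```python
# Hypothetical sketch of the requested API: a merge_state method that
# folds other metric instances' internal state into this one, mirroring
# torcheval's Metric.merge_state. Not actual torchmetrics code.

class MeanMetric:
    """Tracks a running mean via two reducible states: total and count."""

    def __init__(self) -> None:
        self.total = 0.0
        self.count = 0

    def update(self, value: float) -> None:
        self.total += value
        self.count += 1

    def merge_state(self, others: list["MeanMetric"]) -> "MeanMetric":
        # Explicit cross-worker reduction: sum the additive states.
        for other in others:
            self.total += other.total
            self.count += other.count
        return self

    def compute(self) -> float:
        return self.total / self.count if self.count else 0.0


# Usage: each (e.g. Ray) worker returns its metric to the driver,
# which merges them explicitly, without DDP.
workers = [MeanMetric(), MeanMetric()]
workers[0].update(1.0)
workers[0].update(3.0)
workers[1].update(5.0)

merged = workers[0].merge_state(workers[1:])
print(merged.compute())  # (1 + 3 + 5) / 3 = 3.0
```

The key design point is that metric states are chosen to be associative and commutative under addition, so merging can happen in any order on the driver.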

Alternatives

Additional context

ytang137 avatar Jul 22 '24 19:07 ytang137

@Borda Thanks for considering this request. I am curious whether there are any updates on this issue.

tachwali avatar Aug 04 '24 15:08 tachwali