Support equivalent of torcheval's `merge_state`
🚀 Feature
Support a method equivalent to torcheval's `merge_state` to allow explicitly reducing metrics when not used under DDP. This is an updated request from https://github.com/Lightning-AI/torchmetrics/issues/2063.
Motivation
When used within DDP, torchmetrics objects automatically sync and reduce state across ranks. However, there does not appear to be any support for explicitly reducing metric state outside of DDP. This would be a valuable feature because it would allow torchmetrics to be used for distributed evaluation with frameworks other than DDP.
Pitch
Enabling manual reduction makes torchmetrics more widely applicable because it can be used with distributed frameworks other than DDP, such as Ray.
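To illustrate the requested semantics, here is a minimal self-contained sketch (plain Python, not the torchmetrics API; the `MeanMetric` class and its `merge_state` method are hypothetical, modeled on torcheval's `Metric.merge_state(metrics)` signature). Each worker updates its own metric instance on its data shard, and the driver merges the instances explicitly with no DDP process group involved:

```python
class MeanMetric:
    """Toy running-mean metric with the explicit merge_state this issue requests."""

    def __init__(self):
        self.total = 0.0
        self.count = 0

    def update(self, value):
        # Accumulate local state on one worker.
        self.total += value
        self.count += 1

    def merge_state(self, others):
        # Fold other instances' state into self, mirroring torcheval's
        # merge_state(metrics) signature; no collective ops required.
        for other in others:
            self.total += other.total
            self.count += other.count
        return self

    def compute(self):
        return self.total / self.count


# e.g. two Ray workers each evaluated a shard of the dataset:
worker_a, worker_b = MeanMetric(), MeanMetric()
for v in (1.0, 2.0):
    worker_a.update(v)
for v in (3.0, 4.0, 5.0):
    worker_b.update(v)

# The driver merges the per-worker states explicitly.
merged = MeanMetric()
merged.merge_state([worker_a, worker_b])
print(merged.compute())  # 15.0 / 5 = 3.0
```

The same pattern would apply to any torchmetrics state (sums, counts, concatenated tensors), since each metric already defines per-state reduction functions for DDP syncing.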
Alternatives
Additional context
@Borda Thanks for considering this request. I am curious whether there are any updates on this issue.