
Support equivalent of torcheval's merge_state #2643

Open
ytang137 opened this issue Jul 22, 2024 · 1 comment
Labels
enhancement (New feature or request), question (Further information is requested)

Comments

@ytang137

🚀 Feature

Support a method equivalent to torcheval's merge_state to allow explicit reduction of metric states when not running under DDP. This is an updated request from #2063.
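
For reference, this is roughly how torcheval's merge_state is used today (a minimal sketch; the metric choice and values are illustrative only):

```python
import torch
from torcheval.metrics import MulticlassAccuracy

# Two metric instances, e.g. one per worker process outside of DDP.
metric_a = MulticlassAccuracy()
metric_b = MulticlassAccuracy()

metric_a.update(torch.tensor([0, 1, 2, 3]), torch.tensor([0, 1, 2, 2]))
metric_b.update(torch.tensor([0, 1, 2, 3]), torch.tensor([0, 1, 1, 3]))

# Explicitly fold metric_b's state into metric_a, then compute once
# over all samples seen by both instances.
metric_a.merge_state([metric_b])
print(metric_a.compute())
```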

Motivation

When used within DDP, torchmetrics objects support automatic syncing and reduction across ranks. However, there doesn't seem to be a way to reduce metric states outside of DDP. This would be a useful feature because it would allow torchmetrics to be used for distributed evaluation with frameworks other than DDP.

Pitch

Enabling manual reduction would make torchmetrics more widely applicable, since it could then be used with distributed frameworks other than DDP, such as Ray. A usage sketch is given below.
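
As a sketch of what this could look like in a Ray-style workflow, assuming a hypothetical torcheval-style merge_state method on torchmetrics metrics (this method does not exist in torchmetrics today; the metric, function names, and values are illustrative only):

```python
import torch
from torchmetrics.classification import BinaryAccuracy

def evaluate_shard(preds: torch.Tensor, target: torch.Tensor) -> BinaryAccuracy:
    # Each non-DDP worker (e.g. a Ray task) updates its own metric instance.
    metric = BinaryAccuracy()
    metric.update(preds, target)
    return metric

# On the driver: metric objects collected from the workers (e.g. via ray.get).
shard_metrics = [
    evaluate_shard(torch.tensor([0.9, 0.2]), torch.tensor([1, 0])),
    evaluate_shard(torch.tensor([0.4, 0.8]), torch.tensor([1, 1])),
]

final = shard_metrics[0]
final.merge_state(shard_metrics[1:])  # hypothetical, torcheval-style API
print(final.compute())
```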

Alternatives

Additional context

@ytang137 ytang137 added the enhancement New feature or request label Jul 22, 2024
@Borda Borda added the question Further information is requested label Jul 24, 2024
@tachwali

tachwali commented Aug 4, 2024

@Borda Thanks for considering this request. Are there any updates on this issue?
