FLOPS in ResNet50_ImageNet #94
fvcore gives an output of 4.09G, but it also prints:
Skipped operation aten::batch_norm 53 time(s)
Skipped operation aten::max_pool2d 1 time(s)
Skipped operation aten::add_ 16 time(s)
Skipped operation aten::adaptive_avg_pool2d 1 time(s)
Perhaps those papers ignore the computation of some of these operators.
@jkhu29 you're right! ptflops also counts batch norms and poolings as non-zero ops, which is why it reports slightly larger numbers than expected.
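For a sense of why these ops are small but non-zero: at inference time a batch-norm layer reduces to an elementwise scale-and-shift, so one common convention charges it roughly 2 FLOPs per output element. A minimal sketch (the layer shape is an assumed example from the standard ResNet-50 architecture, and `batchnorm_flops` is a hypothetical helper, not part of either library):

```python
def batchnorm_flops(channels, height, width):
    # At inference, batch norm folds into y = x * scale + shift:
    # roughly 2 FLOPs per output element.
    return 2 * channels * height * width

# e.g. the first batch norm in ResNet-50, which sees a 64-channel
# 112x112 feature map (shape assumed from the standard architecture):
print(batchnorm_flops(64, 112, 112))  # 1605632
```

Summed over all 53 batch-norm layers plus the poolings, contributions on this order plausibly account for the gap between the two tools.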
How can I ignore the batch norm and pooling ops in ptflops?
You can pass all the nn.Modules to exclude via
I get 4.12B FLOPs using your code, whereas almost all research papers report 4.09B FLOPs for this configuration (the PyTorch default pretrained model, 76.15% test accuracy).
Can you please modify the code or explain the reason for the 0.03B increase in FLOPs?
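For scale, the gap described here is under one percent of the paper figure; a quick check using the totals quoted in this thread:

```python
reported = 4.12e9  # FLOPs reported by ptflops for ResNet-50 (from this thread)
paper = 4.09e9     # figure commonly quoted in papers
extra = reported - paper
print(f"{extra:.2e} extra FLOPs, {extra / paper:.2%} of the paper figure")
# 3.00e+07 extra FLOPs, 0.73% of the paper figure
```

A discrepancy of this size is consistent with counting conventions (which ops are charged) rather than a bug in either tool.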