What's the relationship between the number of GPUs and the batch size (global batch size)? #13314

Hi @HuangChiEn, in the case shown in your sample code, the effective (global) batch size will be 128 (32 per GPU × 4 GPUs).

This behaves differently depending on which distributed strategy is used; you can read more about it in the documentation on distributed training strategies.
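The arithmetic behind the answer above can be sketched as follows. This is a minimal illustration, assuming the default DDP-style setup where each process gets its own DataLoader with the per-process batch size; the variable names are illustrative, not from the discussion:

```python
# Under DDP, each of N processes runs its own DataLoader with the
# per-process batch size, so each optimizer step consumes
# per_gpu_batch_size * num_gpus samples in total.
per_gpu_batch_size = 32   # batch_size passed to the DataLoader
num_gpus = 4              # e.g. number of devices given to the Trainer

global_batch_size = per_gpu_batch_size * num_gpus
print(global_batch_size)  # 128
```

Note that other strategies can differ: for example, a DataParallel-style strategy splits a single batch across devices, in which case the DataLoader's batch size already is the global batch size.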

Answer selected by HuangChiEn
Labels: distributed, accelerator: cuda