Hi!
I see that almost all MONAI multi-GPU tutorial scripts use torch.nn.DistributedDataParallel. Does torch.nn.DistributedDataParallel always give better performance than torch.nn.DataParallel when using MONAI for multi-GPU training?

Replies: 1 comment 1 reply

Hi @hw-ju, it's recommended by PyTorch to use DistributedDataParallel rather than DataParallel for multi-GPU training, even on a single node. Thanks.
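For reference, here is a minimal sketch of what single-node DistributedDataParallel training with a MONAI network might look like. This is a hypothetical example rather than code from the tutorials: the dummy in-memory dataset, the UNet configuration, and the hyperparameters are placeholders, and it assumes a `torchrun --nproc_per_node=<num_gpus>` launch on CUDA GPUs.

```python
# Minimal single-node DDP sketch (hypothetical, not taken from the MONAI tutorials).
# Launch with: torchrun --nproc_per_node=<num_gpus> ddp_example.py
import os

import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, TensorDataset
from torch.utils.data.distributed import DistributedSampler

from monai.losses import DiceLoss
from monai.networks.nets import UNet


def main():
    # torchrun sets LOCAL_RANK for each spawned process.
    local_rank = int(os.environ["LOCAL_RANK"])
    dist.init_process_group(backend="nccl")
    torch.cuda.set_device(local_rank)
    device = torch.device(f"cuda:{local_rank}")

    # Dummy 3D volumes standing in for a real MONAI Dataset/CacheDataset.
    images = torch.rand(8, 1, 32, 32, 32)
    labels = torch.randint(0, 2, (8, 1, 32, 32, 32)).float()
    dataset = TensorDataset(images, labels)

    # DistributedSampler gives each rank a distinct shard of the data.
    sampler = DistributedSampler(dataset)
    loader = DataLoader(dataset, batch_size=2, sampler=sampler)

    model = UNet(
        spatial_dims=3,
        in_channels=1,
        out_channels=2,
        channels=(16, 32, 64, 128),
        strides=(2, 2, 2),
    ).to(device)
    # One process per GPU; DDP synchronizes gradients across ranks.
    model = DDP(model, device_ids=[local_rank])

    loss_fn = DiceLoss(to_onehot_y=True, softmax=True)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

    for epoch in range(2):
        sampler.set_epoch(epoch)  # reshuffle shards each epoch
        for image, label in loader:
            image, label = image.to(device), label.to(device)
            optimizer.zero_grad()
            loss = loss_fn(model(image), label)
            loss.backward()
            optimizer.step()
        if dist.get_rank() == 0:
            print(f"epoch {epoch} loss {loss.item():.4f}")

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```

The main reason PyTorch recommends DistributedDataParallel is that it runs one process per GPU, so it avoids the Python GIL contention and the per-iteration model replication and scatter/gather overhead of the single-process, multi-threaded DataParallel; in practice this usually gives better multi-GPU scaling, even on a single machine.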