Warning: find_unused_parameters=True was specified in DDP constructor, but did not find any unused parameters. This flag results in an extra traversal of the autograd graph every iteration, which can adversely affect performance. #6761
-
Hi, I started noticing the following warning message after setting up a new conda environment with PyTorch 1.8.1, an update from my previous environment that used PyTorch 1.7.0.
Is this a real concern? How can we disable it?

```python
trainer = pl.Trainer(
    val_check_interval=0.1,
    gpus=-1,
    accelerator="ddp",
    callbacks=[checkpoint_callback, early_stop_callback],
    precision=16,
)
```

Packages:
-
Hi @athenawisdoms, the docs here cover how you can disable `find_unused_parameters` and speed up your DDP training: https://pytorch-lightning.readthedocs.io/en/latest/benchmarking/performance.html#when-using-ddp-set-find-unused-parameters-false
-
Hi! I added this warning in native PyTorch as a way to remind users to disable this flag when performance is critical and there are no unused parameters. One note: as the warning indicates, it can be a false positive if your model has control flow that results in unused parameters in future iterations.
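To make the trade-off concrete in plain PyTorch (independent of Lightning), here is a minimal sketch; the `GatedNet` model and the single-process `gloo` setup are illustrative assumptions, not code from this thread:

```python
import os
import torch
import torch.distributed as dist
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP

# Single-process "gloo" process group so the sketch runs on CPU without a launcher.
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")
dist.init_process_group("gloo", rank=0, world_size=1)

class GatedNet(nn.Module):
    """Toy model with control flow: branch_b is only used on some iterations,
    so its parameters receive no gradient on the others."""
    def __init__(self):
        super().__init__()
        self.branch_a = nn.Linear(8, 1)
        self.branch_b = nn.Linear(8, 1)

    def forward(self, x, use_b):
        return self.branch_b(x) if use_b else self.branch_a(x)

# find_unused_parameters=True costs an extra traversal of the autograd graph
# each iteration, but is required here because some parameters do not
# contribute to the loss every step. If all parameters are always used,
# leave it at the default False for better performance.
model = DDP(GatedNet(), find_unused_parameters=True)

x = torch.randn(4, 8)
for step in range(2):
    out = model(x, use_b=(step % 2 == 1))  # branch_b is unused on even steps
    out.sum().backward()

dist.destroy_process_group()
```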
-
Hi. The warning goes away if you pass the DDP plugin as below.

```python
from pytorch_lightning.plugins import DDPPlugin

trainer = pl.Trainer(
    val_check_interval=0.1,
    gpus=-1,
    accelerator="ddp",
    callbacks=[checkpoint_callback, early_stop_callback],
    plugins=DDPPlugin(find_unused_parameters=False),
    precision=16,
)
```
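Note that `find_unused_parameters=False` is forwarded to the underlying `torch.nn.parallel.DistributedDataParallel` wrapper, so it is only safe when every registered parameter contributes to the loss on every iteration; otherwise DDP will raise an error about parameters that did not receive gradients.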
-
Hello. For those who are using PyTorch Lightning 1.6.0, use:

```python
from pytorch_lightning.strategies.ddp import DDPStrategy

trainer = pl.Trainer(
    strategy=DDPStrategy(find_unused_parameters=False),
    accelerator="gpu",
    devices=3,
)
```
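If I recall correctly, recent Lightning versions also register a string alias for this configuration, so the same effect can be had without importing the strategy class; treat the exact alias below as an assumption to verify against your installed version:

```python
# Assumed registered strategy alias (PyTorch Lightning 1.6+); verify before relying on it.
trainer = pl.Trainer(
    strategy="ddp_find_unused_parameters_false",
    accelerator="gpu",
    devices=3,
)
```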