What's the proper way to do logging with nested PL modules? #10594
Unanswered
siyuanfeng-tri asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule

Suppose I have PyTorch Lightning modules A and B, and another PyTorch Lightning module C that holds both A and B. I make a Trainer out of C and proceed with the normal workflow. The training and validation loops run fine, but I can't seem to properly log (e.g. by calling self.log()) from A or B. I think the issue is that A and B are not visible to / registered with the Trainer, and thus they never get a logger attached. What's the recommended way to resolve this?
Thanks!
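For concreteness, here is a minimal sketch of the setup being described. All module and metric names (ChildA, ParentC, "child_a/activation_mean") are illustrative, not from the original post; a second child B would behave the same way as A:

```python
import torch
from torch import nn
import pytorch_lightning as pl


class ChildA(pl.LightningModule):
    """Hypothetical child module, used as a submodule of the parent."""

    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(4, 4)

    def forward(self, x):
        out = self.layer(x)
        # The problematic call: this child was never handed to the Trainer,
        # so it has no trainer/logger reference and this log call fails.
        self.log("child_a/activation_mean", out.mean())
        return out


class ParentC(pl.LightningModule):
    """Parent module C that holds the children; the Trainer is built from this."""

    def __init__(self):
        super().__init__()
        self.a = ChildA()
        self.head = nn.Linear(4, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = nn.functional.mse_loss(self.head(self.a(x)), y)
        self.log("train_loss", loss)  # logging from the parent works fine
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)
```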
Replies: 2 comments
-
Hey @siyuanfeng-tri. You would need to override `__setattr__` so that child modules are tracked, and the trainer (plus Lightning's internal logging state) is propagated to them:

```python
def __setattr__(self, key, value):
    # Track the attribute names of any child modules as they are assigned.
    # (ModuleWrapperBase is a Lightning Flash class; see the link below.)
    if isinstance(value, (LightningModule, ModuleWrapperBase)):
        self._children.append(key)
    # When the Trainer or Lightning's internal logging state is set on this
    # module, mirror the assignment onto every tracked child.
    patched_attributes = ["_current_fx_name", "_current_hook_fx_name", "_results"]
    if isinstance(value, Trainer) or key in patched_attributes:
        if hasattr(self, "_children"):
            for child in self._children:
                setattr(getattr(self, child), key, value)
    super().__setattr__(key, value)
```

Check this out from Lightning Flash: https://github.com/PyTorchLightning/lightning-flash/blob/master/flash/core/model.py#L91. This is not supported in Lightning itself yet, but it is supported by the Flash Task.
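As a follow-up illustration, here is a minimal sketch of how that Flash-style override might be adapted to the parent module from the question, reusing ChildA from the earlier sketch. ModuleWrapperBase is Flash-specific, so this version checks only for LightningModule; the `_children` list and the class names are assumptions for the example, not part of Lightning's public API:

```python
import pytorch_lightning as pl


class ParentC(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self._children = []  # must exist before any child module is assigned
        self.a = ChildA()    # recorded as a child by __setattr__ below

    def __setattr__(self, key, value):
        # Remember the attribute names of any child LightningModules.
        if isinstance(value, pl.LightningModule):
            self._children.append(key)
        # When Lightning assigns the Trainer (or patches its internal logging
        # state) on this parent, forward the same assignment to every child
        # so that self.log() works from inside them as well.
        patched_attributes = ["_current_fx_name", "_current_hook_fx_name", "_results"]
        if isinstance(value, pl.Trainer) or key in patched_attributes:
            if hasattr(self, "_children"):
                for child in self._children:
                    setattr(getattr(self, child), key, value)
        super().__setattr__(key, value)
```

The idea is that Lightning only attaches the trainer and per-hook logging state to the module it was given (C); the override mirrors those assignments onto each registered child, so the children can log as if they had been attached themselves.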
-
Awesome, I will give it a try, thanks so much!