C:\...miniconda3\envs\envpt\Lib\site-packages\lightning\pytorch\utilities\parsing.py:208: Attribute 'logging_metrics' is an instance of nn.Module and is already saved during checkpointing. It is recommended to ignore them using self.save_hyperparameters(ignore=['logging_metrics']).
This is caused by self.save_hyperparameters() in the __init__ method of TemporalFusionTransformer: save_hyperparameters() uses inspect and the caller's frame to identify all the hyperparameters, so nn.Module arguments such as logging_metrics are captured as well.
Is there a reason to keep this behavior, or should we add handling in __init__?
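For context, here is a minimal sketch (not Lightning's actual implementation, and the class names are hypothetical) of how a frame-based save_hyperparameters can work: it inspects the calling __init__ frame and records every argument it finds there, which is why nn.Module-valued arguments like logging_metrics get picked up unless explicitly passed to ignore:

```python
import inspect


class HParamsMixin:
    """Illustrative sketch of frame-based hyperparameter capture.

    This mimics the mechanism described above: the method walks up to
    the caller's __init__ frame and records every named argument, so
    any nn.Module argument is captured too unless it is ignored.
    """

    def save_hyperparameters(self, ignore=()):
        # The caller's frame is the __init__ that invoked this method.
        frame = inspect.currentframe().f_back
        args, _, _, values = inspect.getargvalues(frame)
        self.hparams = {
            name: values[name]
            for name in args
            if name != "self" and name not in ignore
        }


class TinyModel(HParamsMixin):
    def __init__(self, hidden_size, logging_metrics=None):
        # Mirrors the warning's suggestion: skip the nn.Module attribute.
        self.save_hyperparameters(ignore=["logging_metrics"])


m = TinyModel(hidden_size=16, logging_metrics=object())
print(m.hparams)  # logging_metrics is excluded from the captured hparams
```

With ignore=["logging_metrics"], only hidden_size lands in hparams; without it, the module object would be saved both as a hyperparameter and again via normal checkpointing, which is exactly the duplication the warning complains about.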
dr-upsilon changed the title from "Is there a reason that pytorch-forecasting saves loss and logging_metrics multiple times?" to "Is there a reason that pytorch-forecasting seemingly unnecessarily saves loss and logging_metrics multiple times?" on Oct 22, 2024