The OP in openmm/openmm-torch#133 was led to believe that the Graph Network is equivariant because TorchMD-Net exposes the "equivariant_group" parameter, even though that parameter is only used by TensorNet.
This happens a lot in the current parameter handling: "irrelevant" parameters are silently ignored, sometimes in subtle ways.
For instance, if a user sets y_weight=1 but the dataset does not provide energies, the code just ignores y_weight.
I am opening this issue to discuss a mechanism for these cases to produce, at least, a warning.
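One possible shape for such a mechanism is to keep a per-model whitelist of accepted parameters and warn on anything outside it. The sketch below is purely illustrative: the names `MODEL_PARAMS` and `check_args`, and the parameter sets, are hypothetical and not part of TorchMD-Net's actual API.

```python
# Hypothetical sketch: warn when a user-supplied parameter is not
# accepted by the selected model. MODEL_PARAMS and check_args are
# illustrative names, not TorchMD-Net's real API.
import warnings

# Per-model whitelists of accepted parameters (illustrative subset).
MODEL_PARAMS = {
    "graph-network": {"hidden_channels", "num_layers", "cutoff"},
    "tensornet": {"hidden_channels", "num_layers", "cutoff",
                  "equivariant_group"},
}

def check_args(model: str, args: dict) -> None:
    """Emit a warning for every argument the chosen model ignores."""
    accepted = MODEL_PARAMS[model]
    for name in args:
        if name not in accepted:
            warnings.warn(
                f"Parameter '{name}' is ignored by model '{model}'",
                stacklevel=2,
            )

# With this in place, the scenario from the linked issue would warn:
check_args("graph-network",
           {"hidden_channels": 128, "equivariant_group": "O(3)"})
```

The same check could cover dataset-dependent options like `y_weight`, by making the accepted set depend on what the dataset provides.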
Arguably, a better way to handle these architecture-specific parameters would be to treat them like the dataset_args parameter, so that one would have to write:

```yaml
model: "graph-network"
model_args:
  hidden_channels: 128
  ...
```
This would make it clear that:
- The params are for the model.
- Some parameters work for some models but not for others.
However, this change would break old parameter files. Perhaps there could be a notion of versions in the parameter files?
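A version field could make the migration from the flat layout non-breaking: files without a version would be treated as legacy and rewritten into the nested layout on load. The sketch below is a hypothetical illustration; `KNOWN_MODEL_KEYS`, `migrate`, and the version numbering are assumptions, not an existing TorchMD-Net feature.

```python
# Hypothetical sketch of versioned parameter files: configs without a
# "version" field are treated as the legacy flat layout and migrated
# to the nested "model_args" layout. All names here are illustrative.
KNOWN_MODEL_KEYS = {"hidden_channels", "num_layers", "cutoff"}

def migrate(config: dict) -> dict:
    """Upgrade a legacy flat config to the nested version-2 layout."""
    if config.get("version", 1) >= 2:
        return config  # already in the new nested format
    migrated = {"version": 2, "model": config["model"], "model_args": {}}
    for key, value in config.items():
        if key in KNOWN_MODEL_KEYS:
            # Model-specific keys move under "model_args".
            migrated["model_args"][key] = value
        elif key != "model":
            # Everything else (trainer options, etc.) stays top-level.
            migrated[key] = value
    return migrated

old = {"model": "graph-network", "hidden_channels": 128, "y_weight": 1}
new = migrate(old)
# new == {"version": 2, "model": "graph-network",
#         "model_args": {"hidden_channels": 128}, "y_weight": 1}
```

Old files would keep working through the migration path, while new files written with `version: 2` would get the stricter, explicit grouping.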