Print warnings when incompatible parameters are used #271

@RaulPPelaez

Description

The OP in openmm/openmm-torch#133 was led to believe the Graph Network is equivariant because TorchMD-Net exposes the "equivariant_group" parameter (which is only used by TensorNet).
This happens a lot with the current parameter handling: parameters that are irrelevant to the chosen model are silently ignored, sometimes in subtle ways.

For instance, if a user sets y_weight=1 but the dataset does not provide energies, y_weight is silently ignored.

I am opening this issue to discuss a mechanism for these situations to result in, at least, a warning.
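One possible shape for such a mechanism is a small sketch like the following, which compares the user-supplied options against the parameters a model class actually accepts (read off its `__init__` signature) and warns about the rest. The helper name `warn_unused_params` and the `GraphNetwork` stand-in class are hypothetical, not part of TorchMD-Net.

```python
import inspect
import warnings


def warn_unused_params(model_cls, user_args: dict) -> None:
    """Warn about user-supplied parameters the chosen model does not accept.

    Hypothetical helper: assumes the model exposes its options through
    its __init__ signature, which may not hold for every architecture.
    """
    accepted = set(inspect.signature(model_cls.__init__).parameters) - {"self"}
    for name in user_args:
        if name not in accepted:
            warnings.warn(
                f"Parameter '{name}' is not used by {model_cls.__name__} "
                "and will be ignored."
            )


class GraphNetwork:  # stand-in for the real model class
    def __init__(self, hidden_channels=128, num_layers=6):
        self.hidden_channels = hidden_channels
        self.num_layers = num_layers


# "equivariant_group" is not a GraphNetwork parameter, so this warns:
warn_unused_params(GraphNetwork, {"hidden_channels": 64, "equivariant_group": "O(3)"})
```

A variant of the same check could run once at config-parsing time, before the model is built, so the warning appears at the top of the log rather than deep inside training output.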

Arguably, the better way to handle these architecture-specific parameters would be to treat them similarly to the dataset_args parameter, so that one would have to write:

model: "graph-network"
model_args:
    hidden_channels: 128
    ...

So that it is clear that:

  1. The params are for the model.
  2. Some parameters work for some models but not for others.

However, this change would break old parameter files. Perhaps there could be a notion of versions in the parameter files?
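To avoid breaking old parameter files, a version field could gate a one-time migration: files without it are treated as the old flat layout and their model-specific keys are moved under model_args. This is only a sketch; the `migrate_config` helper, the `MODEL_KEYS` set, and the version numbering are all illustrative assumptions, not existing TorchMD-Net behavior.

```python
import warnings

# Illustrative set of keys that would move under model_args;
# the real list would come from the registered model architectures.
MODEL_KEYS = {"hidden_channels", "num_layers", "equivariant_group"}


def migrate_config(conf: dict) -> dict:
    """Migrate an old-style flat parameter dict to the model_args layout.

    Configs without a "version" key are assumed to predate the new layout.
    """
    if conf.get("version", 0) >= 1:
        return conf  # already in the new layout, nothing to do
    warnings.warn("Old-style parameter file detected; migrating to model_args.")
    migrated = {k: v for k, v in conf.items() if k not in MODEL_KEYS}
    migrated["model_args"] = {k: v for k, v in conf.items() if k in MODEL_KEYS}
    migrated["version"] = 1
    return migrated


old = {"model": "graph-network", "hidden_channels": 128}
new = migrate_config(old)
# new == {"model": "graph-network", "model_args": {"hidden_channels": 128}, "version": 1}
```

With this in place, old files keep working (with a deprecation warning) while new files get the explicit model/model_args separation.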
