Loss Functions

These classes define loss functions for model optimization, as well as a generalized Loss class that combines several losses of different types.

class mlcg.nn.losses.Loss(losses, weights=None)

Generalized loss class that combines one or more loss functions into a single weighted objective.

Parameters:
  • losses (List[_Loss]) – List of torch loss modules

  • weights (Optional[List[float]]) – List of corresponding weights for each loss in the losses list. By default, each loss is weighted equally by 1.0.
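
A minimal usage sketch for combining the force losses defined below (the training-loop comment is an assumption; only the constructor arguments shown are documented here):

    from mlcg.nn.losses import Loss, ForceMSE, ForceRMSE

    # Weight the MSE term twice as heavily as the RMSE term;
    # with weights=None every loss would be weighted by 1.0.
    combined = Loss(
        losses=[ForceMSE(force_kwd="forces"), ForceRMSE(force_kwd="forces")],
        weights=[2.0, 1.0],
    )
    # The combined module is then evaluated wherever a single loss would be,
    # e.g., on predicted/reference AtomicData batches during training.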

class mlcg.nn.losses.ForceRMSE(force_kwd='forces', size_average=None, reduce=None, reduction='mean')

Force root-mean-square error loss, as defined by:

\[L\left(f,\hat{f}\right) = \sqrt{ \frac{1}{Nd}\sum_{i=1}^{N} \left\Vert f_i - \hat{f}_i \right\Vert ^2 }\]

where \(f\) are the predicted forces, \(\hat{f}\) are the reference forces, \(N\) is the number of examples/structures, and \(d\) is the real-space dimensionality (e.g., \(d=3\) for proteins).

Parameters:
  • force_kwd (str) – String specifying the force key in an AtomicData instance

  • size_average (Optional[bool]) – If True, the loss is normalized by the batch size

  • reduce (Optional[bool]) – If True, the loss is reduced to a scalar

  • reduction (str) – Specifies the reduction applied to the output: 'none', 'mean', or 'sum'. See the PyTorch loss documentation for details.
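
As a sanity check of the formula above, the RMSE is the square root of the element-wise mean of squared differences. A minimal sketch with plain tensors (illustrative only, not mlcg's internal implementation, which operates on AtomicData):

    import torch

    N, d = 10, 3                # 10 structures, 3 real-space dimensions
    f = torch.randn(N, d)       # predicted forces
    f_ref = torch.randn(N, d)   # reference forces

    # sqrt( (1/(N*d)) * sum_i ||f_i - f_ref_i||^2 )
    rmse = torch.sqrt(((f - f_ref) ** 2).mean())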

class mlcg.nn.losses.ForceMSE(force_kwd='forces', size_average=None, reduce=None, reduction='mean')

Force mean squared error loss, as defined by:

\[L\left(f,\hat{f}\right) = \frac{1}{Nd}\sum_{i=1}^{N} \left\Vert f_i - \hat{f}_i \right\Vert ^2\]

where \(f\) are the predicted forces, \(\hat{f}\) are the reference forces, \(N\) is the number of examples/structures, and \(d\) is the real-space dimensionality (e.g., \(d=3\) for proteins).

Parameters:
  • force_kwd (str) – String specifying the force key in an AtomicData instance

  • size_average (Optional[bool]) – If True, the loss is normalized by the batch size

  • reduce (Optional[bool]) – If True, the loss is reduced to a scalar

  • reduction (str) – Specifies the reduction applied to the output: 'none', 'mean', or 'sum'. See the PyTorch loss documentation for details.
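
The MSE is the same quantity without the square root, and under the default 'mean' reduction it coincides with PyTorch's built-in mse_loss. A short illustrative sketch (again with plain tensors, not the library's internals):

    import torch
    import torch.nn.functional as F

    N, d = 10, 3                # 10 structures, 3 real-space dimensions
    f = torch.randn(N, d)       # predicted forces
    f_ref = torch.randn(N, d)   # reference forces

    # (1/(N*d)) * sum_i ||f_i - f_ref_i||^2
    mse = ((f - f_ref) ** 2).mean()
    assert torch.allclose(mse, F.mse_loss(f, f_ref, reduction="mean"))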