Documentation
¶
Overview ¶
Package regularizers adds tools to facilitate adding regularization to the learned weights.
It defines a standard Regularizer interface and several methods that implement it.
Layers like layers.Dense, layers.DenseWithBias and kan.Config will take regularizers as inputs.
Index ¶
Constants ¶
const (
	// ParamL2 context hyperparameter defines the amount of L2 regularization of kernels.
	// Each layer may decide independently to implement it or not.
	// layers.Dense, layers.DenseWithBias, layers.FNN, kan.New and layers.Convolution kernels look at this hyperparameter.
	// The value should be a float64.
	// The default is `0.0`.
	ParamL2 = "l2_regularization"

	// ParamL1 context hyperparameter defines the amount of L1 regularization of kernels.
	// Each layer may decide independently to implement it or not.
	// layers.Dense, layers.DenseWithBias, layers.FNN, kan.New and layers.Convolution kernels look at this hyperparameter.
	// The value should be a float64.
	// The default is `0.0`.
	ParamL1 = "l1_regularization"
)
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type Regularizer ¶
Regularizer is a function that will add a regularization term (train.AddLoss) to the loss relative to the given weights (Variables).
Notice it takes variables (and not their graph nodes) as inputs, as some specialized regularizers may want to update the variables after gradient updates to impose constraints -- e.g., L1 will set weights to 0 if they are smaller than the regularization amount; a monotonicity regularizer may force weights to be monotonic in some direction.
func Combine ¶
func Combine(regs ...Regularizer) Regularizer
Combine the provided regularizers into one -- simply apply all of them. If regs is empty, this returns a nil regularizer. If regs has only one element, it is returned. If any of the regs is nil, it is skipped.
func ConstantL1 ¶ added in v0.13.0
func ConstantL1(amount float64) Regularizer
ConstantL1 returns an L1 regularizer applied to the ConsecutiveDifference of the last axis of the weights. This has the effect of pushing each value towards its neighbours, that is, a constant function.
This is useful for control points in piecewise-linear, piecewise-constant or b-spline functions, when one wants points that receive little training signal to move towards the mean of their neighbours.
func FromContext ¶
func FromContext(ctx *context.Context) Regularizer
FromContext returns a regularizer from context hyperparameters. It may be nil if no regularization is configured.
It currently looks at the ParamL2 and ParamL1 hyperparameters.
func L1 ¶
func L1(amount float64) Regularizer
L1 creates an L1 regularizer (abs(x) * amount) with the given static amount.
It also adds an update rule that sets values of x to 0 if their absolute value is smaller than the amount -- to avoid flipping between small positive and negative values.
func L2 ¶
func L2(amount float64) Regularizer
L2 creates an L2 regularizer (x^2 * amount) with the given static amount.