regularization

package
v1.24.0 Latest
Warning

This package is not in the latest version of its module.

Published: Mar 26, 2026 License: Apache-2.0 Imports: 8 Imported by: 0

Documentation

Overview

Package regularization provides regularization layers for neural networks.

Stability: stable

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func BuildDropout

func BuildDropout[T tensor.Float](
	engine compute.Engine[T],
	ops numeric.Arithmetic[T],
	_ string,
	_ map[string]*graph.Parameter[T],
	attributes map[string]any,
) (graph.Node[T], error)

BuildDropout constructs a Dropout node from the provided attributes.

func BuildFeatureDropout

func BuildFeatureDropout[T tensor.Float](
	engine compute.Engine[T],
	ops numeric.Arithmetic[T],
	_ string,
	_ map[string]*graph.Parameter[T],
	attributes map[string]any,
) (graph.Node[T], error)

BuildFeatureDropout constructs a FeatureDropout node from the provided attributes.

Types

type Dropout

type Dropout[T tensor.Float] struct {
	graph.NoParameters[T]
	// contains filtered or unexported fields
}

Dropout implements inverted dropout regularization. During training, each element is zeroed with probability `rate` and the surviving elements are scaled by 1/(1-rate) so that expected values are preserved. During evaluation (the default mode) the input is returned unchanged.

func NewDropout

func NewDropout[T tensor.Float](engine compute.Engine[T], ops numeric.Arithmetic[T], rate T) *Dropout[T]

NewDropout creates a new Dropout layer with the given drop rate. The rate must be in [0, 1). A rate of 0 disables dropout entirely.

func (*Dropout[T]) Attributes

func (d *Dropout[T]) Attributes() map[string]any

Attributes returns the non-tensor attributes of the layer.

func (*Dropout[T]) Backward

func (d *Dropout[T]) Backward(ctx context.Context, _ types.BackwardMode, dOut *tensor.TensorNumeric[T], inputs ...*tensor.TensorNumeric[T]) ([]*tensor.TensorNumeric[T], error)

Backward computes the backward pass. In evaluation mode the upstream gradient is returned unchanged. In training mode the upstream gradient is multiplied by the cached mask from the most recent Forward call.
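The mask-multiplication step described above can be sketched in isolation. This is a hedged illustration of the technique, not the package's implementation; dropoutBackward is a hypothetical name, and the mask is assumed to already carry the 1/(1-rate) scale from the forward pass.

```go
package main

import "fmt"

// dropoutBackward multiplies the upstream gradient element-wise by the mask
// cached during the forward pass. Mask entries are either 0 (dropped) or
// 1/(1-rate) (kept), so dropped positions receive zero gradient and kept
// positions are scaled by the same factor used in the forward pass.
func dropoutBackward(dOut, mask []float64) []float64 {
	dIn := make([]float64, len(dOut))
	for i := range dOut {
		dIn[i] = dOut[i] * mask[i]
	}
	return dIn
}

func main() {
	// Mask from a forward pass with rate = 0.5: the scale factor is 2.
	fmt.Println(dropoutBackward([]float64{1, 1, 1}, []float64{0, 2, 2})) // [0 2 2]
}
```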

func (*Dropout[T]) Forward

func (d *Dropout[T]) Forward(ctx context.Context, inputs ...*tensor.TensorNumeric[T]) (*tensor.TensorNumeric[T], error)

Forward computes the forward pass. In evaluation mode the input is returned unchanged. In training mode each element is independently zeroed with probability rate, and surviving elements are scaled by 1/(1-rate) (inverted dropout).

func (*Dropout[T]) IsTraining

func (d *Dropout[T]) IsTraining() bool

IsTraining returns whether the layer is in training mode.

func (*Dropout[T]) OpType

func (d *Dropout[T]) OpType() string

OpType returns the operation type.

func (*Dropout[T]) OutputShape

func (d *Dropout[T]) OutputShape() []int

OutputShape returns the output shape from the most recent Forward call.

func (*Dropout[T]) SetTraining

func (d *Dropout[T]) SetTraining(training bool)

SetTraining enables or disables training mode.

type FeatureDropout

type FeatureDropout[T tensor.Float] struct {
	graph.NoParameters[T]
	// contains filtered or unexported fields
}

FeatureDropout implements feature-level (column-wise) inverted dropout. During training, entire feature columns are zeroed with probability rate, and surviving columns are scaled by 1/(1-rate). During evaluation the input is returned unchanged.

func NewFeatureDropout

func NewFeatureDropout[T tensor.Float](engine compute.Engine[T], ops numeric.Arithmetic[T], rate T) *FeatureDropout[T]

NewFeatureDropout creates a new FeatureDropout layer with the given drop rate. The rate must be in [0, 1). A rate of 0 disables dropout entirely.

func (*FeatureDropout[T]) Attributes

func (d *FeatureDropout[T]) Attributes() map[string]any

Attributes returns the non-tensor attributes of the layer.

func (*FeatureDropout[T]) Backward

func (d *FeatureDropout[T]) Backward(ctx context.Context, _ types.BackwardMode, dOut *tensor.TensorNumeric[T], inputs ...*tensor.TensorNumeric[T]) ([]*tensor.TensorNumeric[T], error)

Backward computes the backward pass. In evaluation mode the upstream gradient is returned unchanged. In training mode the upstream gradient is multiplied by the cached mask from the most recent Forward call.

func (*FeatureDropout[T]) Forward

func (d *FeatureDropout[T]) Forward(ctx context.Context, inputs ...*tensor.TensorNumeric[T]) (*tensor.TensorNumeric[T], error)

Forward computes the forward pass. In evaluation mode the input is returned unchanged. In training mode entire feature columns (axis=1) are independently zeroed with probability rate, and surviving columns are scaled by 1/(1-rate).

func (*FeatureDropout[T]) IsTraining

func (d *FeatureDropout[T]) IsTraining() bool

IsTraining returns whether the layer is in training mode.

func (*FeatureDropout[T]) OpType

func (d *FeatureDropout[T]) OpType() string

OpType returns the operation type.

func (*FeatureDropout[T]) OutputShape

func (d *FeatureDropout[T]) OutputShape() []int

OutputShape returns the output shape from the most recent Forward call.

func (*FeatureDropout[T]) SetTraining

func (d *FeatureDropout[T]) SetTraining(training bool)

SetTraining enables or disables training mode.
