adagrad

package v0.7.0

This package is not in the latest version of its module.

Published: May 24, 2021 License: BSD-2-Clause Imports: 3 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type AdaGrad

type AdaGrad struct {
	Config
}

AdaGrad assigns a different learning rate to each parameter, based on the sum of squares of all its historical gradients.

References
Adaptive Subgradient Methods for Online Learning and Stochastic Optimization
http://www.jmlr.org/papers/volume12/duchi11a/duchi11a.pdf
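The per-parameter rule described above divides each update by the square root of the accumulated squared gradients, with Epsilon guarding against division by zero. A minimal self-contained sketch of that rule in plain Go (an illustrative reimplementation, not the package's actual code, which operates on spago's mat.Matrix values):

```go
package main

import (
	"fmt"
	"math"
)

// adaGrad holds the hyper-parameters and the per-element
// running sum of squared gradients.
type adaGrad struct {
	lr, epsilon float64
	sumSq       []float64 // accumulated squared gradients
}

func newAdaGrad(lr, epsilon float64, size int) *adaGrad {
	return &adaGrad{lr: lr, epsilon: epsilon, sumSq: make([]float64, size)}
}

// delta returns the element-wise update to subtract from the
// parameters: lr * g / (sqrt(sumSq) + epsilon).
func (o *adaGrad) delta(grad []float64) []float64 {
	out := make([]float64, len(grad))
	for i, g := range grad {
		o.sumSq[i] += g * g
		out[i] = o.lr * g / (math.Sqrt(o.sumSq[i]) + o.epsilon)
	}
	return out
}

func main() {
	opt := newAdaGrad(0.1, 1e-8, 2)
	params := []float64{1.0, -2.0}
	grad := []float64{0.5, -0.5}
	// Repeated steps with the same gradient: the growing
	// accumulator shrinks each successive update.
	for step := 0; step < 3; step++ {
		for i, d := range opt.delta(grad) {
			params[i] -= d
		}
	}
	fmt.Printf("%.4f\n", params)
}
```

Because the accumulator only grows, the effective learning rate of each parameter decays monotonically over training, which is the defining trait of AdaGrad.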

func New

func New(c Config) *AdaGrad

New returns a new AdaGrad optimizer, initialized according to the given configuration.

func (*AdaGrad) Delta

func (o *AdaGrad) Delta(param nn.Param) mat.Matrix

Delta returns the difference between the current params and where the method wants them to be.

func (*AdaGrad) Label

func (o *AdaGrad) Label() int

Label returns the enumeration-like value which identifies this gradient descent method.

func (*AdaGrad) NewSupport

func (o *AdaGrad) NewSupport(r, c int) *nn.Payload

NewSupport returns a new support structure with the given dimensions.

type Config

type Config struct {
	gd.MethodConfig
	LR      mat.Float
	Epsilon mat.Float
}

Config provides configuration settings for an AdaGrad optimizer.

func NewConfig

func NewConfig(lr, epsilon mat.Float) Config

NewConfig returns a new AdaGrad Config.

func NewDefaultConfig

func NewDefaultConfig() Config

NewDefaultConfig returns a new Config with generically reasonable default values.
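The two constructors follow a common pattern: NewConfig takes the hyper-parameters explicitly, while NewDefaultConfig fills them in. A self-contained sketch of that pattern (the concrete default values below are illustrative placeholders, not necessarily the library's actual defaults):

```go
package main

import "fmt"

// config mirrors the shape of adagrad.Config: a learning rate
// and an epsilon term that guards against division by zero.
type config struct {
	lr      float64
	epsilon float64
}

// newConfig returns a config with explicit hyper-parameters.
func newConfig(lr, epsilon float64) config {
	return config{lr: lr, epsilon: epsilon}
}

// newDefaultConfig returns a config with placeholder defaults
// (illustrative values only; consult the package for its real ones).
func newDefaultConfig() config {
	return newConfig(0.01, 1e-10)
}

func main() {
	c := newDefaultConfig()
	fmt.Println(c.lr, c.epsilon)
}
```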
