Version: v0.7.0
Published: May 24, 2021
License: BSD-2-Clause
Imports: 3
Imported by: 0
Documentation
type AdaGrad struct {
Config
}
AdaGrad assigns a different learning rate to each parameter, using the sum of squares of all its historical gradients.
References
Adaptive Subgradient Methods for Online Learning and Stochastic Optimization
http://www.jmlr.org/papers/volume12/duchi11a/duchi11a.pdf
New returns a new AdaGrad optimizer, initialized according to the given configuration.
Delta returns the difference between the current params and where the method wants them to be.
Label returns the enumeration-like value which identifies this gradient descent method.
NewSupport returns a new support structure with the given dimensions.
Config provides configuration settings for an AdaGrad optimizer.
NewConfig returns a new AdaGrad Config.
func NewDefaultConfig() Config
NewDefaultConfig returns a new Config with generically reasonable default values.
Source Files