gd

package
v0.5.1
Published: Mar 7, 2021 License: BSD-2-Clause Imports: 6 Imported by: 0

Documentation

Index

Constants

const (
	// None represents the absence of a specific gradient descent optimization method.
	None int = iota
	// SGD represents the SGD gradient descent optimization method.
	SGD
	// AdaGrad represents the AdaGrad gradient descent optimization method.
	AdaGrad
	// Adam represents the Adam gradient descent optimization method.
	Adam
	// RAdam represents the RAdam gradient descent optimization method.
	RAdam
	// RMSProp represents the RMSProp gradient descent optimization method.
	RMSProp
)
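
These constants are the enumeration-like values returned by Method.Label (see Method below) to identify the concrete update rule. A minimal sketch, written as if inside this package; the helper name describeMethod is illustrative and not part of the API:

func describeMethod(m Method) string {
	switch m.Label() {
	case SGD:
		return "Stochastic Gradient Descent"
	case AdaGrad:
		return "AdaGrad"
	case Adam:
		return "Adam"
	case RAdam:
		return "Rectified Adam"
	case RMSProp:
		return "RMSProp"
	default:
		return "no specific method"
	}
}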

Variables

This section is empty.

Functions

func GetOrSetPayload

func GetOrSetPayload(param nn.Param, m Method) *nn.Payload

GetOrSetPayload returns the payload from param if it already exists; otherwise a new payload is created, assigned to the param, and returned.
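
A short sketch of the intended call pattern, written as if inside an update step that already has a Method m and an nn.Param param in scope (both assumed):

// The first call for a given param builds the payload through m.NewSupport
// and stores it on the param; later calls return that same payload, so the
// method's per-parameter state survives across update steps.
payload := GetOrSetPayload(param, m)
_ = payload // *nn.Payload carrying the method's support data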

Types

type BatchScheduler

type BatchScheduler interface {
	// IncBatch signals the occurrence of a new batch.
	IncBatch()
}

BatchScheduler is implemented by any value that has the IncBatch method.

type EpochScheduler

type EpochScheduler interface {
	// IncEpoch signals the occurrence of a new epoch.
	IncEpoch()
}

EpochScheduler is implemented by any value that has the IncEpoch method.

type ExampleScheduler

type ExampleScheduler interface {
	// IncExample signals the occurrence of a new example.
	IncExample()
}

ExampleScheduler is implemented by any value that has the IncExample method.
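
The three interfaces split the scheduling signals, so a training utility can depend only on the signal it emits; *GradientDescent below satisfies all of them. A minimal sketch, written as if inside this package, where notifyEpochEnd is a hypothetical helper and not part of the API:

// notifyEpochEnd forwards an end-of-epoch signal only to values that
// actually implement EpochScheduler; anything else is ignored.
func notifyEpochEnd(v interface{}) {
	if s, ok := v.(EpochScheduler); ok {
		s.IncEpoch()
	}
}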

type GradientDescent

type GradientDescent struct {
	// contains filtered or unexported fields
}

GradientDescent implements Gradient Descent (GD) optimization.

func NewOptimizer

func NewOptimizer(method Method, paramsIterator nn.ParamsGetter, opts ...Option) *GradientDescent

NewOptimizer returns a new GradientDescent optimizer. Gradient clipping is optional: if no clipping Option is provided, the gradients are left unclipped.
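
A minimal construction sketch: method is any value implementing Method (for example the toy fixedRate sketched under Method below), and paramsIterator is an nn.ParamsGetter over the model's parameters; both are assumed to be in scope.

optimizer := NewOptimizer(method, paramsIterator)
_ = optimizer // drive it with the Inc* and Optimize methods documented below

Options such as gradient clipping and a concurrency limit are passed as trailing arguments; see Option below.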

func (*GradientDescent) IncBatch

func (o *GradientDescent) IncBatch()

IncBatch signals the occurrence of a new batch.

func (*GradientDescent) IncEpoch

func (o *GradientDescent) IncEpoch()

IncEpoch signals the occurrence of a new epoch.

func (*GradientDescent) IncExample

func (o *GradientDescent) IncExample()

IncExample signals the occurrence of a new example.

func (*GradientDescent) Optimize

func (o *GradientDescent) Optimize()

Optimize optimizes the params, applying the optional gradient clipping. After the optimization, the params have zero gradients.
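
A sketch of where the notification and optimization calls typically sit in a training loop; optimizer, numEpochs, batches, and the forward/backward pass are placeholders assumed to be provided by the surrounding program:

for epoch := 0; epoch < numEpochs; epoch++ {
	for _, batch := range batches {
		for _, example := range batch {
			_ = example // forward pass, loss, and backward pass (omitted) accumulate gradients on the params
			optimizer.IncExample()
		}
		optimizer.IncBatch()
		optimizer.Optimize() // apply the method's deltas; the params' gradients are zeroed afterwards
	}
	optimizer.IncEpoch()
}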

type Method

type Method interface {
	// Label returns the enumeration-like value which identifies this gradient descent method.
	Label() int
	// Delta returns the difference between the current value of the param and where the method wants it to be.
	Delta(param nn.Param) mat.Matrix
	// NewSupport returns a new support structure with the given dimensions.
	NewSupport(r, c int) *nn.Payload
}

Method is implemented by any optimization method.
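
A minimal toy implementation, written as if inside this package, to show the shape of the interface; it is not one of the provided methods. It scales the raw gradient by a fixed learning rate and keeps no support data. param.Grad() and Matrix.ProdScalar are assumptions about the nn and mat packages already used here, not something this page documents.

// fixedRate is a toy Method: delta = lr * gradient, with no per-param state.
type fixedRate struct {
	lr mat.Float
}

// Label reports the generic None constant, since no dedicated label exists for this toy.
func (m *fixedRate) Label() int { return None }

// Delta returns the step to apply to the param: its gradient scaled by lr.
func (m *fixedRate) Delta(param nn.Param) mat.Matrix {
	return param.Grad().ProdScalar(m.lr) // assumed nn/mat calls
}

// NewSupport returns an empty payload: this toy method needs no extra state.
func (m *fixedRate) NewSupport(r, c int) *nn.Payload {
	return &nn.Payload{}
}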

type MethodConfig

type MethodConfig interface{}

MethodConfig is an empty interface implemented by the configuration structures of AdaGrad, Adam, RMSProp and SGD.

type Option

type Option func(*GradientDescent)

Option allows configuring a new GradientDescent to your specific needs.

func ClipGradByNorm

func ClipGradByNorm(max, normType mat.Float) Option

ClipGradByNorm is an option to clip the gradients by norm during training.

func ClipGradByValue

func ClipGradByValue(value mat.Float) Option

ClipGradByValue is an option to clip the gradients between -value and +value during training.

func ConcurrentComputations added in v0.3.0

func ConcurrentComputations(value int) Option

ConcurrentComputations sets the maximum number of concurrent computations handled by the GradientDescent for heavy tasks such as the param update steps. The value 1 corresponds to sequential execution.
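
A sketch combining the options above at construction time; method and paramsIterator are assumed to be in scope as in the NewOptimizer example, and the comment on normType is an assumption rather than documented behavior:

optimizer := NewOptimizer(
	method,
	paramsIterator,
	ClipGradByNorm(1.0, 2.0),  // clip by norm: max 1.0, normType 2.0 (assumed to select the L2 norm)
	ConcurrentComputations(4), // run up to 4 parameter updates concurrently
)
_ = optimizer

ClipGradByValue(5.0) could be used instead of ClipGradByNorm to clamp each gradient component to the range [-5, +5].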

