optimisation

package
v0.0.0-...-d247bc1
Published: Jun 7, 2014 License: MIT Imports: 1 Imported by: 0

Documentation

Overview

Package optimisation provides gradient-descent optimisation functions.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func BatchGradientDescent

func BatchGradientDescent(x, y, theta *mat64.Dense, alpha float64, epoch int) *mat64.Dense

BatchGradientDescent moves theta toward a local minimum of the cost function, recomputing the gradient over the entire dataset on each of epoch iterations with learning rate alpha. See http://en.wikipedia.org/wiki/Gradient_descent for more details.
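A minimal plain-slice sketch of the batch update rule described above (the package itself operates on *mat64.Dense matrices; the slice-based signature, helper name, and least-squares cost here are illustrative assumptions, not the package's API):

```go
package main

import "fmt"

// batchGradientDescent is an illustrative sketch: every epoch, theta is
// adjusted by the squared-error gradient computed over the WHOLE dataset.
// x holds feature rows, y the targets, alpha the learning rate.
func batchGradientDescent(x [][]float64, y []float64, theta []float64, alpha float64, epoch int) []float64 {
	m := float64(len(x))
	for e := 0; e < epoch; e++ {
		grad := make([]float64, len(theta))
		for i, row := range x {
			// prediction is the dot product theta · row
			pred := 0.0
			for j, v := range row {
				pred += theta[j] * v
			}
			err := pred - y[i]
			for j, v := range row {
				grad[j] += err * v
			}
		}
		// one step downhill, averaged over all m rows
		for j := range theta {
			theta[j] -= alpha * grad[j] / m
		}
	}
	return theta
}

func main() {
	// fit y = 2x (first column is the intercept term);
	// theta should approach [0, 2]
	x := [][]float64{{1, 1}, {1, 2}, {1, 3}, {1, 4}}
	y := []float64{2, 4, 6, 8}
	theta := batchGradientDescent(x, y, []float64{0, 0}, 0.05, 5000)
	fmt.Printf("theta = [%.2f %.2f]\n", theta[0], theta[1])
}
```

Because the full gradient is recomputed each epoch, every step is deterministic, at the cost of touching all rows per iteration.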

func StochasticGradientDescent

func StochasticGradientDescent(x, y, theta *mat64.Dense, alpha float64, epoch, procs int) *mat64.Dense

StochasticGradientDescent updates the parameters of theta from a single randomly selected row of the matrix at a time, rather than computing the cost function over the entire dataset, so each update is much cheaper. The trade-off is noisier, less accurate updates; this is mitigated by running multiple SGD processes in parallel (the number of goroutines spawned is specified by the procs variable) and averaging their results.
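The run-in-parallel-and-average scheme can be sketched as below (again with plain slices rather than *mat64.Dense; the helper names and seeding strategy are assumptions for illustration):

```go
package main

import (
	"fmt"
	"math/rand"
	"sync"
)

// stochasticGD performs one SGD run: each epoch updates a copy of theta
// using a single randomly chosen row instead of the full dataset.
func stochasticGD(x [][]float64, y []float64, theta []float64, alpha float64, epoch int, rng *rand.Rand) []float64 {
	out := append([]float64(nil), theta...)
	for e := 0; e < epoch; e++ {
		i := rng.Intn(len(x))
		pred := 0.0
		for j, v := range x[i] {
			pred += out[j] * v
		}
		err := pred - y[i]
		for j, v := range x[i] {
			out[j] -= alpha * err * v
		}
	}
	return out
}

// parallelSGD spawns procs goroutines, each running an independent SGD
// with its own seed, then averages the resulting parameter vectors.
func parallelSGD(x [][]float64, y []float64, theta []float64, alpha float64, epoch, procs int) []float64 {
	results := make([][]float64, procs)
	var wg sync.WaitGroup
	for p := 0; p < procs; p++ {
		wg.Add(1)
		go func(p int) {
			defer wg.Done()
			results[p] = stochasticGD(x, y, theta, alpha, epoch, rand.New(rand.NewSource(int64(p))))
		}(p)
	}
	wg.Wait()
	avg := make([]float64, len(theta))
	for _, r := range results {
		for j, v := range r {
			avg[j] += v / float64(procs)
		}
	}
	return avg
}

func main() {
	// same y = 2x fit as above; averaging 4 runs damps the SGD noise
	x := [][]float64{{1, 1}, {1, 2}, {1, 3}, {1, 4}}
	y := []float64{2, 4, 6, 8}
	theta := parallelSGD(x, y, []float64{0, 0}, 0.01, 20000, 4)
	fmt.Printf("theta = [%.2f %.2f]\n", theta[0], theta[1])
}
```

Each goroutine writes only to its own slot of results, so no mutex is needed; the WaitGroup is the only synchronisation point before averaging.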

Types

This section is empty.
