Documentation ¶
Overview ¶
Package optimisation provides gradient-descent optimisation functions.
Index ¶
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func BatchGradientDescent ¶
BatchGradientDescent finds a local minimum of a function by gradient descent, computing the cost over the entire dataset on every iteration. See http://en.wikipedia.org/wiki/Gradient_descent for more details.
func StochasticGradientDescent ¶
func StochasticGradientDescent(x, y, theta *mat64.Dense, alpha float64, epoch, procs int) *mat64.Dense
StochasticGradientDescent updates the parameters theta using one randomly selected row of the matrix at a time. It is faster than batch gradient descent because it does not compute the cost function over the entire dataset on every update; instead it calculates the error gradient from a single row. The trade-off is accuracy: the noise this introduces is reduced by running multiple SGD processes in parallel (the number of goroutines spawned is specified by the procs parameter) and taking the average of their results.
Types ¶
This section is empty.