infergo

Published: Nov 27, 2018 License: MIT

README

Learning programs in Go

Go as a platform for probabilistic inference: models are written in plain Go, and inference uses automatic differentiation and gradient-based optimization.

GoDoc

MIT license (see LICENSE)

Example

[more examples]

Learning parameters of the Normal distribution from observations.

Model
type Model struct {
	Data []float64
}

// x[0] is the mean, x[1] is the log stddev of the distribution
func (m *Model) Observe(x []float64) float64 {
	// Our prior is a unit normal ...
	ll := Normal.Logps(0, 1, x...)
	// ... but the posterior is based on data observations.
	ll += Normal.Logps(x[0], math.Exp(x[1]), m.Data...)
	return ll
}
Inference
// Data
m := &Model{[]float64{
	-0.854, 1.067, -1.220, 0.818, -0.749,
	0.805, 1.443, 1.069, 1.426, 0.308}}

// Parameters
mean, logs := 0., 0.
x := []float64{mean, logs}
	
// Optimization
opt := &infer.Momentum{
	Rate:  0.01,
	Decay: 0.998,
}
for iter := 0; iter != 1000; iter++ {
	opt.Step(m, x)
}
mean, logs = x[0], x[1]

// Posterior
hmc := &infer.HMC{
	L:   10,
	Eps: 0.1,
}
samples := make(chan []float64)
hmc.Sample(m, x, samples)
for i := 0; i != 1000; i++ {
	x = <-samples
}
hmc.Stop()

Directories

Path            Synopsis
ad              Package ad implements automatic differentiation of a model.
cmd
    ad
dist            Package dist provides differentiable distribution models.
examples
    gmm
    gmm/model       Gaussian mixture
    hello/model     Inferring parameters of the Normal distribution from observations
    ppv
    ppv/model       Determining the best bandwidth for page-per-visit prediction (http://dtolpin.github.io/posts/session-depth/)
    schools/model   The eight schools example as it appears in the PyStan documentation (taken from "Bayesian Data Analysis", Section 5.5, by Gelman et al.)
infer           Package infer contains inference algorithms: maximum likelihood estimation by gradient descent and approximation of the posterior by Markov Chain Monte Carlo methods (notably the Hamiltonian Monte Carlo family of algorithms).
mathx           Package mathx provides auxiliary elemental functions, ubiquitously useful but not found in package math.
model           Package model specifies the interface of a probabilistic model.
