README
Infergo — Go programs that learn
infergo is a probabilistic programming facility for the Go language. infergo allows writing probabilistic models in almost unrestricted Go and relies on automatic differentiation for optimization and inference.
Example
Learning parameters of the Normal distribution from observations.
Model
type Model struct {
    Data []float64
}

// x[0] is the mean, x[1] is the log stddev of the distribution
func (m *Model) Observe(x []float64) float64 {
    // Our prior is a unit normal ...
    ll := Normal.Logps(0, 1, x...)
    // ... but the posterior is based on data observations.
    ll += Normal.Logps(x[0], math.Exp(x[1]), m.Data...)
    return ll
}
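For completeness, the snippet above omits the package clause and imports it relies on. A minimal sketch of the surrounding file follows; the import path is an assumption based on the infergo repository layout (bitbucket.org/dtolpin/infergo), with the Normal distribution coming from the dist package listed below, and may need adjusting in another setup.

package model

import (
    "math"

    // Dot-import the distributions so that Normal can be used
    // unqualified, as in the snippet above.
    . "bitbucket.org/dtolpin/infergo/dist"
)

// Model holds the observations.
type Model struct {
    Data []float64
}

// Observe returns the log-likelihood of the parameters given the data;
// x[0] is the mean, x[1] is the log stddev of the distribution.
func (m *Model) Observe(x []float64) float64 {
    // Our prior is a unit normal ...
    ll := Normal.Logps(0, 1, x...)
    // ... but the posterior is based on data observations.
    ll += Normal.Logps(x[0], math.Exp(x[1]), m.Data...)
    return ll
}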
Inference
// Data
m := &Model{[]float64{
    -0.854, 1.067, -1.220, 0.818, -0.749,
    0.805, 1.443, 1.069, 1.426, 0.308}}

// Parameters
mean, logs := 0., 0.
x := []float64{mean, logs}

// Optimization
opt := &infer.Momentum{
    Rate:  0.01,
    Decay: 0.998,
}
for iter := 0; iter != 1000; iter++ {
    opt.Step(m, x)
}
mean, logs = x[0], x[1]

// Posterior
hmc := &infer.HMC{
    L:   10,
    Eps: 0.1,
}
samples := make(chan []float64)
hmc.Sample(m, x, samples)
for i := 0; i != 1000; i++ {
    x = <-samples
}
hmc.Stop()
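The final loop above merely drains the samples channel; in practice the draws would be accumulated into posterior summaries. Below is a sketch of a replacement for that loop in plain Go (fmt and math are assumed to be imported; nothing here is infergo-specific beyond the samples channel and the hmc sampler), averaging the draws to estimate the posterior means of the parameters.

// Average the draws to estimate the posterior means of the
// mean and the log stddev.
const draws = 1000
sum := [2]float64{}
for i := 0; i != draws; i++ {
    x = <-samples
    sum[0] += x[0]
    sum[1] += x[1]
}
hmc.Stop()
mean, logs = sum[0]/draws, sum[1]/draws
fmt.Printf("posterior mean: %.3f, stddev: %.3f\n", mean, math.Exp(logs))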
Acknowledgements
I owe a debt of gratitude to Frank Wood, who introduced me to probabilistic programming and inspired me to pursue probabilistic programming paradigms and applications. I also want to thank Jan-Willem van de Meent, with whom I had fruitful discussions of the motives, ideas, and implementation choices behind infergo, and whose thoughts and recommendations significantly influenced the design of infergo. Finally, I want to thank PUB+, the company I work for, for supporting me in the development of infergo and letting me experiment with applying probabilistic programming to critical decision-making in a production environment.
Directories
Path | Synopsis |
---|---|
ad | Package ad implements automatic differentiation of a model. |
cmd | |
dist | Package dist provides differentiable distribution models. |
examples | |
examples/gmm/model | Gaussian mixture |
examples/hello/model | Inferring parameters of the Normal distribution from observations |
examples/mt/model | Inferring parameters of the Normal distribution from observations |
examples/ppv/model | Determining the best bandwidth for page-per-visit prediction (http://dtolpin.github.io/posts/session-depth/) |
examples/schools/model | The eight schools example as it appears in the PyStan documentation (taken from "Bayesian Data Analysis", Section 5.5, by Gelman et al.): data { int<lower=0> J; // number of schools vector[J] y; // estimated treatment effects vector<lower=0>[J] sigma; // s.e. |
infer | Package infer contains inference algorithms: maximum likelihood estimation by gradient descent and approximation of the posterior by Markov Chain Monte Carlo methods (notably the Hamiltonian Monte Carlo family of algorithms). |
mathx | Package mathx provides auxiliary elemental functions, ubiquitously useful but not found in package math. |
model | Package model specifies the interface of a probabilistic model. |
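Package model, listed above, defines the interface that every probabilistic model implements. The authoritative declaration lives in that package; as a point of orientation, it amounts to a single Observe method returning the log-likelihood of the parameters, roughly:

// Model is the interface of a probabilistic model. Observe accepts
// the model parameters and returns the log-likelihood of the
// parameters given the data held by the model.
type Model interface {
    Observe(parameters []float64) float64
}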