Version: v1.0.8
Published: Jun 20, 2022 License: MIT

## Documentation ¶

### Overview ¶

Package infer contains inference algorithms: maximum likelihood estimation by gradient descent and approximation of the posterior by Markov Chain Monte Carlo methods (notably Hamiltonian Monte Carlo family of algorithms).

### Constants ¶

This section is empty.

### Variables ¶

This section is empty.

### Functions ¶

#### func FuncGrad ¶

```func FuncGrad(m model.Model) (
Func func(x []float64) float64,
Grad func(grad, x []float64),
)```

FuncGrad returns the function to minimize and the gradient, suitable as the Func and Grad fields of gonum's optimize.Problem; minimizing the returned function corresponds to maximizing the model's log-likelihood.
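As an illustration of how such a Func/Grad pair is consumed, here is a minimal standalone descent loop; funcGrad below is a hand-written stand-in for FuncGrad on a quadratic objective, not the package's function:

```go
package main

import "fmt"

// funcGrad mimics the shape of FuncGrad's return values for the
// objective f(x) = (x[0]-1)^2 + (x[1]+2)^2, standing in for a
// model's negative log-likelihood.
func funcGrad() (
	Func func(x []float64) float64,
	Grad func(grad, x []float64),
) {
	Func = func(x []float64) float64 {
		return (x[0]-1)*(x[0]-1) + (x[1]+2)*(x[1]+2)
	}
	Grad = func(grad, x []float64) {
		grad[0] = 2 * (x[0] - 1)
		grad[1] = 2 * (x[1] + 2)
	}
	return Func, Grad
}

// minimize runs plain gradient descent with a fixed step size.
func minimize(steps int, rate float64) []float64 {
	_, grad := funcGrad()
	x := make([]float64, 2)
	g := make([]float64, 2)
	for i := 0; i < steps; i++ {
		grad(g, x)
		for j := range x {
			x[j] -= rate * g[j]
		}
	}
	return x
}

func main() {
	f, _ := funcGrad()
	x := minimize(1000, 0.1)
	fmt.Printf("x = [%.3f %.3f], f(x) = %.6f\n", x[0], x[1], f(x))
	// → x = [1.000 -2.000], f(x) = 0.000000
}
```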

#### func Optimize ¶ added in v0.8.6

```func Optimize(
m model.Model, x []float64,
niter, nplateau int,
eps float64,
) (
iter int,
ll0, ll float64,
)```

Optimize wraps a gradient-based optimizer into an optimization loop with early stopping if a plateau is reached.
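A loop of the kind Optimize wraps can be sketched standalone; the stopping rule below (no improvement above eps for nplateau consecutive iterations) is an assumption about the implementation, and step stands in for an optimizer's Step:

```go
package main

import "fmt"

// optimize runs step for at most niter iterations and stops early
// once the objective has failed to improve by more than eps for
// nplateau consecutive iterations.
func optimize(
	step func(x []float64) float64,
	x []float64,
	niter, nplateau int,
	eps float64,
) (iter int, ll0, ll float64) {
	ll0 = step(x)
	ll = ll0
	plateau := 0
	for iter = 1; iter < niter; iter++ {
		next := step(x)
		if next-ll <= eps {
			plateau++
			if plateau >= nplateau {
				break
			}
		} else {
			plateau = 0
		}
		ll = next
	}
	return iter, ll0, ll
}

func main() {
	// Maximize ll(x) = -x^2 by gradient ascent; the improvement
	// shrinks geometrically, so the loop hits the plateau early.
	step := func(x []float64) float64 {
		x[0] += 0.1 * (-2 * x[0])
		return -x[0] * x[0]
	}
	x := []float64{3}
	iter, ll0, ll := optimize(step, x, 1000, 5, 1e-12)
	fmt.Printf("stopped after %d iterations: ll %g -> %g\n", iter, ll0, ll)
}
```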

### Types ¶

#### type Adam ¶

```type Adam struct {
Rate  float64 // learning rate
Beta1 float64 // first momentum factor
Beta2 float64 // second momentum factor
Eps   float64 // stabilizer
// contains filtered or unexported fields
}```

#### func (*Adam) Step ¶

```func (opt *Adam) Step(
m model.Model,
x []float64,
) (
ll float64,
)```

Step implements the Grad interface.
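The update Step applies can be sketched as the textbook Adam rule (Kingma & Ba, arXiv:1412.6980), written here for ascent since the package maximizes log-likelihood; the bias correction and the state layout are assumptions for illustration, not taken from the implementation:

```go
package main

import (
	"fmt"
	"math"
)

// adam holds Adam's state; the exported fields mirror the
// parameters of the type above.
type adam struct {
	Rate, Beta1, Beta2, Eps float64
	m, v                    []float64 // first and second moment estimates
	t                       int       // step counter
}

// step performs one textbook Adam ascent step given the gradient.
func (opt *adam) step(x, grad []float64) {
	if opt.m == nil {
		opt.m = make([]float64, len(x))
		opt.v = make([]float64, len(x))
	}
	opt.t++
	for i, g := range grad {
		opt.m[i] = opt.Beta1*opt.m[i] + (1-opt.Beta1)*g
		opt.v[i] = opt.Beta2*opt.v[i] + (1-opt.Beta2)*g*g
		// bias-corrected moment estimates
		mhat := opt.m[i] / (1 - math.Pow(opt.Beta1, float64(opt.t)))
		vhat := opt.v[i] / (1 - math.Pow(opt.Beta2, float64(opt.t)))
		x[i] += opt.Rate * mhat / (math.Sqrt(vhat) + opt.Eps)
	}
}

func main() {
	// Maximize ll(x) = -(x-3)^2; the gradient is -2(x-3).
	opt := &adam{Rate: 0.1, Beta1: 0.9, Beta2: 0.999, Eps: 1e-8}
	x := []float64{0}
	for i := 0; i < 500; i++ {
		opt.step(x, []float64{-2 * (x[0] - 3)})
	}
	fmt.Printf("x = %.2f\n", x[0])
}
```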

#### type DepthAdapter ¶

```type DepthAdapter struct {
DualAveraging
Depth float64 // target depth
}```

Parameters of adaptation to the target depth.

#### func (*DepthAdapter) Adapt ¶

```func (da *DepthAdapter) Adapt(
nuts *NUTS,
samples <-chan []float64,
nIter int,
)```

Adapt adapts the NUTS sampler to the target depth. At most nIter iterations are run.

#### type DualAveraging ¶

```type DualAveraging struct {
Rate float64
}```

Parameters of dual averaging.

#### func (*DualAveraging) Step ¶

`func (da *DualAveraging) Step(t, x, gradSum float64) float64`

Step implements Nesterov's primal-dual averaging, in a simplified form:

```chi = -gradSum/math.Sqrt(t)
eta = Rate/t
x = eta*chi + (1-eta)*x
```
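A standalone transcription of the update for a single scalar parameter (the concrete numbers in main are only an illustration):

```go
package main

import (
	"fmt"
	"math"
)

// step transcribes the dual-averaging update above.
func step(rate, t, x, gradSum float64) float64 {
	chi := -gradSum / math.Sqrt(t)
	eta := rate / t
	return eta*chi + (1-eta)*x
}

func main() {
	// One step with concrete numbers: chi = 4/2 = 2, eta = 0.5/4 = 0.125,
	// x = 0.125*2 + 0.875*1 = 1.125.
	fmt.Println(step(0.5, 4, 1, -4)) // → 1.125
}
```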

#### type Grad ¶

```type Grad interface {
Step(m model.Model, x []float64) (ll float64, grad []float64)
}```

Grad is the interface of gradient-based optimizers. Step makes a single step over parameters in the gradient direction.

#### type HMC ¶

```type HMC struct {
Sampler
// Parameters
L   int     // number of leapfrog steps
Eps float64 // leapfrog step size
}```

Vanilla Hamiltonian Monte Carlo Sampler.
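At HMC's core is a leapfrog simulation of Hamiltonian dynamics for L steps of size Eps. A standalone sketch of the integrator on a standard normal target (an illustration of the scheme, not the package's implementation):

```go
package main

import (
	"fmt"
	"math"
)

// leapfrog integrates Hamiltonian dynamics for l steps of size eps,
// given the gradient of the potential energy (the negative
// log-density of the target).
func leapfrog(x, p float64, gradU func(float64) float64, l int, eps float64) (float64, float64) {
	p -= 0.5 * eps * gradU(x) // initial half step for momentum
	for i := 0; i < l; i++ {
		x += eps * p // full step for position
		if i != l-1 {
			p -= eps * gradU(x) // full step for momentum
		}
	}
	p -= 0.5 * eps * gradU(x) // final half step for momentum
	return x, p
}

func main() {
	// Standard normal target: U(x) = x^2/2, so gradU(x) = x.
	gradU := func(x float64) float64 { return x }
	h := func(x, p float64) float64 { return x*x/2 + p*p/2 }
	x0, p0 := 1.0, 0.5
	x, p := leapfrog(x0, p0, gradU, 10, 0.1)
	// The leapfrog integrator nearly conserves the Hamiltonian,
	// which is what keeps HMC's acceptance rate high.
	fmt.Printf("energy drift: %.6f\n", math.Abs(h(x, p)-h(x0, p0)))
}
```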

#### func (*HMC) Sample ¶

```func (hmc *HMC) Sample(
m model.Model,
x []float64,
samples chan []float64,
)```

#### type MCMC ¶

```type MCMC interface {
Sample(
m model.Model,
x []float64,
samples chan []float64,
)
Stop()
}```

MCMC is the interface of MCMC samplers.

#### type Momentum ¶

```type Momentum struct {
Rate  float64 // learning rate
Decay float64 // rate decay
Gamma float64 // gradient momentum factor
// contains filtered or unexported fields
}```

Gradient ascent with momentum (https://www.nature.com/articles/323533a0). If the momentum factor is not set (and is thus 0), Momentum reduces to vanilla gradient ascent.
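The update can be sketched as the classic heavy-ball rule, written for ascent; the handling of Decay is omitted here, and the state layout is an assumption for illustration:

```go
package main

import "fmt"

// momentumStep performs one heavy-ball ascent step: the velocity u
// accumulates a gamma-weighted history of past gradients.
func momentumStep(x, u, grad []float64, rate, gamma float64) {
	for i := range x {
		u[i] = gamma*u[i] + rate*grad[i]
		x[i] += u[i]
	}
}

func main() {
	// Maximize ll(x) = -(x-1)^2; the gradient is -2(x-1).
	x := []float64{5}
	u := []float64{0}
	for i := 0; i < 200; i++ {
		momentumStep(x, u, []float64{-2 * (x[0] - 1)}, 0.05, 0.5)
	}
	fmt.Printf("x = %.3f\n", x[0]) // → x = 1.000
}
```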

#### func (*Momentum) Step ¶

```func (opt *Momentum) Step(
m model.Model,
x []float64,
) (
ll float64,
)```

Step implements the Grad interface.

#### type NUTS ¶

```type NUTS struct {
Sampler
// Parameters
Eps      float64 // step size
Delta    float64 // lower bound on energy for doubling
MaxDepth int     // maximum depth
// Statistics
// Depth belief is encoded as a vector of beta-bernoulli
// distributions. If the depth is greater than the element's
// index i, Depth[i] is incremented; for index depth,
// Depth[depth] is incremented.
Depth []float64 // depth belief
// contains filtered or unexported fields
}```

No U-Turn Sampler (https://arxiv.org/abs/1111.4246).

#### func (*NUTS) MeanDepth ¶

`func (nuts *NUTS) MeanDepth() float64`

MeanDepth returns the average observed depth.

#### func (*NUTS) Sample ¶

```func (nuts *NUTS) Sample(
m model.Model,
x []float64,
samples chan []float64,
)```

#### type Sampler ¶ added in v0.9.0

```type Sampler struct {
Stopped bool
Samples chan []float64
// Statistics
NAcc, NRej int // the number of accepted and rejected samples
}```

Sampler is the structure for embedding into concrete samplers.
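The embedding gives concrete samplers a common channel-based shape: a goroutine writes states to the samples channel, and shutdown is synchronized through the channel itself. A standalone sketch of that pattern (the random-walk kernel and the field names in lower case are stand-ins, not the package's code):

```go
package main

import (
	"fmt"
	"math/rand"
	"sync/atomic"
)

// sampler mimics the embedded structure: a stop flag and a channel
// over which the sampling goroutine delivers states.
type sampler struct {
	stopped atomic.Bool
	samples chan []float64
}

// sample runs a toy random-walk chain (a stand-in for a real MCMC
// kernel) in a goroutine, writing states until stopped.
func (s *sampler) sample(x []float64) {
	go func() {
		for !s.stopped.Load() {
			x[0] += rand.NormFloat64()
			s.samples <- append([]float64(nil), x...) // send a copy
		}
		close(s.samples)
	}()
}

// stop requests shutdown and drains the channel so the goroutine
// can observe the flag and exit; the channel doubles as the
// synchronization point, as in Sampler.Stop.
func (s *sampler) stop() {
	s.stopped.Store(true)
	for range s.samples {
	}
}

func main() {
	s := &sampler{samples: make(chan []float64)}
	s.sample([]float64{0})
	n := 0
	for i := 0; i < 100; i++ {
		<-s.samples
		n++
	}
	s.stop()
	fmt.Println("collected", n, "samples") // → collected 100 samples
}
```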

#### func (*Sampler) Stop ¶ added in v0.9.0

`func (s *Sampler) Stop()`

Stop stops a sampler gracefully, using the samples channel for synchronization. Stop must be called before further calls to differentiated code. A part of the MCMC interface.

#### type SgHMC ¶ added in v0.8.0

```type SgHMC struct {
Sampler
// Parameters
L int // number of steps
// The parameterization follows Equation (15) in
// https://arxiv.org/abs/1402.4102
Eta   float64 // learning rate
Alpha float64 // friction (1 - momentum)
V     float64 // diffusion
}```

Stochastic gradient Hamiltonian Monte Carlo sampler (https://arxiv.org/abs/1402.4102).

#### func (*SgHMC) Sample ¶ added in v0.8.0

```func (sghmc *SgHMC) Sample(
m model.Model,
x []float64,
samples chan []float64,
)```