Version: v0.9.1
Published: Mar 29, 2021 License: BSD-3-Clause

## Documentation ¶

### Overview ¶

Package samplemv implements advanced sampling routines from explicit and implicit probability distributions.

### Constants ¶

```
const (
	// Owen generates (scrambled) Halton samples using the Randomized van der Corput
	// algorithm described in
	//  A randomized Halton algorithm
	//  Art Owen
	//  https://arxiv.org/pdf/1706.02808.pdf
	// Currently limited to 1000-dimensional inputs.
	Owen = iota + 1
)
```

### Variables ¶

`var ErrRejection = errors.New("rejection: acceptance ratio above 1")`

ErrRejection is returned when the constant in Rejection is not sufficiently high.

### Functions ¶

This section is empty.

### Types ¶

#### type Halton ¶

```
type Halton struct {
	Kind HaltonKind
	Q    distmv.Quantiler
	Src  rand.Source
}
```

Halton is a type for sampling using the Halton sequence from the given distribution. The specific method for scrambling (or lack thereof) is specified by the HaltonKind. If src is not nil, it will be used to generate the randomness needed to scramble the sequence (if necessary), otherwise the rand package will be used. Halton panics if the HaltonKind is unrecognized or if q is nil.

Halton sequence random number generation is a quasi-Monte Carlo procedure where the samples are generated to be evenly spaced out across the distribution. Note that this means the sample locations are correlated with one another. The distmv.NewUnitUniform function can be used for easy sampling from the unit hypercube.
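The deterministic backbone of Halton sampling is the radical-inverse (van der Corput) sequence, with one prime base per dimension. The following standalone sketch shows how the unscrambled points are generated; it illustrates the idea only and is not the package's implementation, which adds Owen scrambling on top:

```go
package main

import "fmt"

// radicalInverse returns the radical inverse of n in the given base:
// the base-b digits of n are reflected about the radix point, so the
// integers 1, 2, 3, ... map to points that fill [0, 1) evenly.
func radicalInverse(n, base int) float64 {
	inv := 0.0
	f := 1.0 / float64(base)
	for n > 0 {
		inv += f * float64(n%base)
		n /= base
		f /= float64(base)
	}
	return inv
}

// halton2D returns the first count points of the 2-dimensional Halton
// sequence, using the prime bases 2 and 3 (one base per dimension).
func halton2D(count int) [][2]float64 {
	pts := make([][2]float64, count)
	for i := range pts {
		pts[i] = [2]float64{radicalInverse(i+1, 2), radicalInverse(i+1, 3)}
	}
	return pts
}

func main() {
	for _, p := range halton2D(4) {
		fmt.Printf("%.4f %.4f\n", p[0], p[1])
	}
}
```

Note how consecutive points land in different halves, quarters, and so on of each axis; this is the "evenly spaced out" property described above.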

#### func (Halton) Sample ¶

`func (h Halton) Sample(batch *mat.Dense)`

Sample generates rows(batch) samples using the Halton generation procedure.

#### type HaltonKind ¶

`type HaltonKind int`

HaltonKind specifies the type of algorithm used to generate Halton samples.

#### type IID ¶

```
type IID struct {
	Dist distmv.Rander
}
```

IID generates a set of independently and identically distributed samples from the input distribution.

#### func (IID) Sample ¶

`func (iid IID) Sample(batch *mat.Dense)`

Sample generates a set of identically and independently distributed samples.

#### type Importance ¶

```
type Importance struct {
	Target   distmv.LogProber
	Proposal distmv.RandLogProber
}
```

Importance is a type for performing importance sampling using the given Target and Proposal distributions.

Importance sampling is a variance reduction technique where samples are generated from a proposal distribution, q(x), instead of the target distribution p(x). This allows relatively unlikely samples in p(x) to be generated more frequently.

The importance sampling weight at x is given by p(x)/q(x). To reduce variance, a good proposal distribution will bound this sampling weight. This implies the support of q(x) should be at least as broad as p(x), and q(x) should be "fatter tailed" than p(x).

#### func (Importance) SampleWeighted ¶

`func (l Importance) SampleWeighted(batch *mat.Dense, weights []float64)`

SampleWeighted generates rows(batch) samples using the Importance sampling generation procedure.

The length of weights must equal rows(batch), otherwise SampleWeighted will panic.

#### type LatinHypercube ¶

```
type LatinHypercube struct {
	Q   distmv.Quantiler
	Src rand.Source
}
```

LatinHypercube is a type for sampling using Latin hypercube sampling from the given distribution. If src is not nil, it will be used to generate random numbers, otherwise rand.Float64 will be used.

Latin hypercube sampling divides the cumulative distribution function into equally spaced bins and guarantees that one sample is generated per bin. Within each bin, the location is randomly sampled. The distmv.NewUnitUniform function can be used for easy sampling from the unit hypercube.

#### func (LatinHypercube) Sample ¶

`func (l LatinHypercube) Sample(batch *mat.Dense)`

Sample generates rows(batch) samples using the LatinHypercube generation procedure.

#### type MHProposal ¶

```
type MHProposal interface {
	// ConditionalLogProb returns the probability of the first argument
	// conditioned on being at the second argument.
	//  p(x|y)
	// ConditionalLogProb panics if the input slices are not the same length.
	ConditionalLogProb(x, y []float64) (prob float64)

	// ConditionalRand generates a new random location conditioned on being
	// at the location y. If the first argument is nil, a new slice is
	// allocated and returned. Otherwise, the random location is stored
	// in-place into the first argument, and ConditionalRand will panic if
	// the input slice lengths differ.
	ConditionalRand(x, y []float64) []float64
}
```

MHProposal defines a proposal distribution for Metropolis Hastings.

#### type MetropolisHastingser ¶

```
type MetropolisHastingser struct {
	Initial  []float64
	Target   distmv.LogProber
	Proposal MHProposal
	Src      rand.Source

	BurnIn int
	Rate   int
}
```

MetropolisHastingser is a type for generating samples using the Metropolis Hastings algorithm (http://en.wikipedia.org/wiki/Metropolis%E2%80%93Hastings_algorithm), with the given target and proposal distributions, starting at the location specified by Initial. If src != nil, it will be used to generate random numbers, otherwise rand.Float64 will be used.

Metropolis-Hastings is a Markov-chain Monte Carlo algorithm that generates samples according to the distribution specified by target using the Markov chain implicitly defined by the proposal distribution. At each iteration, a proposal point is generated randomly from the current location. This proposal point is accepted with probability

```
p = min(1, (target(new) * proposal(current|new)) / (target(current) * proposal(new|current)))
```

If the new location is accepted, it becomes the new current location. If it is rejected, the current location remains. This is the sample stored in batch, ignoring BurnIn and Rate (discussed below).

The samples in Metropolis Hastings are correlated with one another through the Markov chain. As a result, the initial value can have a significant influence on the early samples, and so, typically, the first samples generated by the chain are ignored. This is known as "burn-in", and the number of samples ignored at the beginning is specified by BurnIn. The proper BurnIn value will depend on the mixing time of the Markov chain defined by the target and proposal distributions.

Many choose to have a sampling "rate" where a number of samples are ignored in between each kept sample. This helps decorrelate the samples from one another, but also reduces the number of available samples. This value is specified by Rate. If Rate is 0 it is defaulted to 1 (keep every sample).

The initial value is NOT changed during calls to Sample.

#### func (MetropolisHastingser) Sample ¶

`func (m MetropolisHastingser) Sample(batch *mat.Dense)`

Sample generates rows(batch) samples using the Metropolis Hastings sample generation method. The initial location is NOT updated during the call to Sample.

The number of columns in batch must equal len(m.Initial), otherwise Sample will panic.

#### type ProposalNormal ¶

```
type ProposalNormal struct {
	// contains filtered or unexported fields
}
```

ProposalNormal is a sampling distribution for Metropolis-Hastings. It has a fixed covariance matrix and changes the mean based on the current sampling location.

#### func NewProposalNormal ¶

`func NewProposalNormal(sigma *mat.SymDense, src rand.Source) (*ProposalNormal, bool)`

NewProposalNormal constructs a new ProposalNormal for use as a proposal distribution for Metropolis-Hastings. ProposalNormal is a multivariate normal distribution (implemented by distmv.Normal) where the covariance matrix is fixed and the mean of the distribution changes.

NewProposalNormal returns {nil, false} if the covariance matrix is not positive-definite.

#### func (*ProposalNormal) ConditionalLogProb ¶

`func (p *ProposalNormal) ConditionalLogProb(x, y []float64) (prob float64)`

ConditionalLogProb returns the probability of the first argument conditioned on being at the second argument.

```
p(x|y)
```

ConditionalLogProb panics if the input slices are not the same length or are not equal to the dimension of the covariance matrix.

#### func (*ProposalNormal) ConditionalRand ¶

`func (p *ProposalNormal) ConditionalRand(x, y []float64) []float64`

ConditionalRand generates a new random location conditioned on being at the location y. If the first argument is nil, a new slice is allocated and returned. Otherwise, the random location is stored in-place into the first argument, and ConditionalRand will panic if the input slice lengths differ or if they are not equal to the dimension of the covariance matrix.
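In one dimension the behavior of ProposalNormal reduces to a normal with fixed standard deviation whose mean tracks the conditioning point. A hypothetical pure-Go analogue (`proposal1D` is not part of the package):

```go
package main

import (
	"fmt"
	"math"
	"math/rand"
)

// proposal1D is a one-dimensional analogue of ProposalNormal: a normal
// distribution with fixed standard deviation whose mean is moved to the
// current sampling location y.
type proposal1D struct {
	sigma float64
	src   *rand.Rand
}

// conditionalLogProb returns log p(x|y) for a normal centered at y.
func (p proposal1D) conditionalLogProb(x, y float64) float64 {
	z := (x - y) / p.sigma
	return -0.5*z*z - math.Log(p.sigma*math.Sqrt(2*math.Pi))
}

// conditionalRand draws x ~ N(y, sigma^2).
func (p proposal1D) conditionalRand(y float64) float64 {
	return y + p.sigma*p.src.NormFloat64()
}

func main() {
	p := proposal1D{sigma: 0.5, src: rand.New(rand.NewSource(1))}
	fmt.Printf("%.4f\n", p.conditionalLogProb(1.0, 1.0))
	fmt.Printf("%.4f\n", p.conditionalRand(1.0))
}
```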

#### type Rejection ¶

```
type Rejection struct {
	C        float64
	Target   distmv.LogProber
	Proposal distmv.RandLogProber
	Src      rand.Source
	// contains filtered or unexported fields
}
```

Rejection is a type for sampling using the rejection sampling algorithm.

Rejection sampling generates points from the target distribution by using the proposal distribution. At each step of the algorithm, the proposed point is accepted with probability

```
p = target(x) / (proposal(x) * c)
```

where target(x) is the probability of the point according to the target distribution and proposal(x) is the probability according to the proposal distribution. The constant c must be chosen such that target(x) < proposal(x) * c for all x. The expected number of proposed samples is len(samples) * c.

The number of proposed locations during sampling can be found with a call to Proposed. If there was an error during sampling, all elements of batch are set to NaN and the error can be accessed with the Err method. If src != nil, it will be used to generate random numbers, otherwise rand.Float64 will be used.

Target may return the true log of the probability of the location, or it may return a value that is proportional to the probability (log probability plus a constant). This is useful for cases where the probability distribution is only known up to a normalization constant.

#### func (*Rejection) Err ¶

`func (r *Rejection) Err() error`

Err returns nil if the most recent call to Sample was successful, and returns ErrRejection if it was not.

#### func (*Rejection) Proposed ¶

`func (r *Rejection) Proposed() int`

Proposed returns the number of samples proposed during the most recent call to Sample.

#### func (*Rejection) Sample ¶

`func (r *Rejection) Sample(batch *mat.Dense)`

Sample generates rows(batch) samples using the Rejection sampling generation procedure. Rejection sampling may fail if the constant is insufficiently high, as described in the type comment for Rejection. If the generation fails, the samples are set to math.NaN(), and a call to Err will return a non-nil value.

#### type SampleUniformWeighted ¶

```
type SampleUniformWeighted struct {
	Sampler
}
```

SampleUniformWeighted wraps a Sampler type to create a WeightedSampler where all weights are equal.

#### func (SampleUniformWeighted) SampleWeighted ¶

`func (w SampleUniformWeighted) SampleWeighted(batch *mat.Dense, weights []float64)`

SampleWeighted generates rows(batch) samples from the embedded Sampler type and sets all of the weights equal to 1. If len(weights) does not equal rows(batch), SampleWeighted will panic.

#### type Sampler ¶

```
type Sampler interface {
	Sample(batch *mat.Dense)
}
```

Sampler generates a batch of samples according to the rule specified by the implementing type. The number of samples generated is equal to rows(batch), and the samples are stored in-place into the input.

#### type WeightedSampler ¶

```
type WeightedSampler interface {
	SampleWeighted(batch *mat.Dense, weights []float64)
}
```

WeightedSampler generates a batch of samples and their relative weights according to the rule specified by the implementing type. The number of samples generated is equal to rows(batch), and the samples and weights are stored in-place into the inputs. The length of weights must equal rows(batch), otherwise SampleWeighted will panic.