leabra

package v0.5.0
Published: Jun 12, 2019 License: BSD-3-Clause Imports: 29 Imported by: 0

Documentation

Overview

Package leabra provides the basic reference leabra implementation, for rate-coded activations and standard error-driven learning. Other packages provide spiking or deep leabra, PVLV, PBWM, etc.

The overall design seeks an "optimal" tradeoff between simplicity, transparency, ability to flexibly recombine and extend elements, and avoiding having to rewrite a bunch of stuff.

The *Stru elements handle the core structural components of the network, and hold emer.* interface pointers to elements such as emer.Layer, which provides a very minimal interface for these elements. Interfaces are automatically pointers, so think of these as generic pointers to your specific Layers etc.

This design means the same *Stru infrastructure can be re-used across different variants of the algorithm. Because we're keeping this infrastructure minimal and algorithm-free it should be much less confusing than dealing with the multiple levels of inheritance in C++ emergent. The actual algorithm-specific code is now fully self-contained, and largely orthogonalized from the infrastructure.

One specific cost of this is the need to cast the emer.* interface pointers into the specific types of interest, when accessing via the *Stru infrastructure.
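For example, a minimal sketch of such a cast, using the LeabraLayer interface documented below (the helper function name here is hypothetical):

func leabraLayer(lay emer.Layer) *Layer {
	// Cast the generic emer.Layer to the leabra-level interface, then get
	// the concrete base Layer via AsLeabra (see LeabraLayer below).
	return lay.(LeabraLayer).AsLeabra()
}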

The *Params elements contain all the (meta)parameters and associated methods for computing various functions. They are the equivalent of Specs from original emergent, but unlike specs they are local to each place they are used, and styling is used to apply common parameters across multiple layers etc. Params is a more explicit, recognizable name than specs, and it also helps avoid confusion with the rather different nature of the old specs. Pars is shorter but confusable with "Parents", so "Params" is less ambiguous.

Params are organized into four major categories, which are labeled functionally (rather than just structurally) to keep things clearer and better organized overall:

* ActParams -- activation params, at the Neuron level (in act.go)
* InhibParams -- inhibition params, at the Layer / Pool level (in inhib.go)
* LearnNeurParams -- learning parameters at the Neuron level (running-averages that drive learning)
* LearnSynParams -- learning parameters at the Synapse level (both in learn.go)
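For instance, a minimal sketch of setting a few such parameters directly on a layer (ly is a hypothetical *Layer; the field paths are taken from the type documentation below, and the values are purely illustrative):

ly.Act.Gbar.L = 0.2      // ActParams: neuron-level activation parameters
ly.Inhib.Layer.Gi = 1.8  // InhibParams: layer-level FFFB inhibition
ly.Learn.AvgL.Gain = 2.5 // LearnNeurParams: running-average params that drive learning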

The levels of structure and state are:

* Network
  * .Layers
    * .Pools: pooled inhibition state -- 1 for the layer plus 1 for each sub-pool (unit group) with inhibition
    * .RecvPrjns: receiving projections from other sending layers
    * .SendPrjns: sending projections to other receiving layers
    * .Neurons: neuron state variables
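A rough sketch of traversing this hierarchy (nt is a hypothetical *Network; this assumes emer.Layers can be ranged over as a slice of emer.Layer):

for _, ely := range nt.Layers {
	ly := ely.(LeabraLayer).AsLeabra() // cast from the emer.Layer interface
	for ni := range ly.Neurons {
		nrn := &ly.Neurons[ni]
		_ = nrn.Act // per-neuron state variable (see NeuronVars below)
	}
}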

There are methods on the Network that perform initialization and overall computation, by iterating over layers and calling methods there. This is typically how most users will run their models.
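A hedged sketch of a typical run loop over one alpha cycle (4 quarters of 25 cycles), assuming the standard Leabra timing; NewTime, AlphaCycStart, CycPerQtr, CycleInc, and QuarterInc are assumed from the Time type, which is not shown in this listing:

ltime := NewTime()    // assumed constructor for the Time state
net.AlphaCycInit()    // initialize for the new input pattern (after applying external inputs)
ltime.AlphaCycStart()
for qtr := 0; qtr < 4; qtr++ {
	for cyc := 0; cyc < ltime.CycPerQtr; cyc++ {
		net.Cycle(ltime)
		ltime.CycleInc()
	}
	net.QuarterFinal(ltime)
	ltime.QuarterInc()
}
net.DWt()     // compute weight changes from the running-average activations
net.WtFmDWt() // apply the weight changes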

Parallel computation across multiple CPU cores (threading) is achieved through persistent worker goroutines that listen for functions to run on thread-specific channels. Each layer has a designated thread number, so you can experiment with different ways of dividing up the computation. Timing data is kept for per-thread time use -- see TimeReport() on the network.
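For example, a small sketch of assigning layers to different threads (SetThread is documented below on LayerStru; the layer variables here are hypothetical):

hid1.SetThread(0) // this layer is computed on thread 0
hid2.SetThread(1) // this layer is computed on thread 1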

The Layer methods directly iterate over Neurons, Pools, and Prjns, and there is no finer-grained level of computation (e.g., at the individual Neuron level), except for the *Params methods that directly compute relevant functions. Thus, looking directly at the layer.go code should provide a clear sense of exactly how everything is computed -- you may need to refer to act.go, learn.go etc to see the relevant details, but at least the overall organization should be clear in layer.go.

Computational methods are generally named VarFmVar, to specifically name what variable is being computed from what other input variables; e.g., ActFmG computes activation from conductances G.

The Pools (type Pool, in pool.go) hold state used for computing pooled inhibition, but also are used to hold overall aggregate pooled state variables -- the first element in Pools applies to the layer itself, and subsequent ones are for each sub-pool (4D layers). These pools play the same role as the LeabraUnGpState structures in C++ emergent.
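In code terms, a minimal sketch using the Pools field documented on Layer below (ly is a hypothetical *Layer):

lpl := &ly.Pools[0] // layer-level aggregate pool
for pi := 1; pi < len(ly.Pools); pi++ {
	pl := &ly.Pools[pi] // one per sub-pool (unit group), for 4D layers
	_ = pl
}
_ = lpl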

Prjns directly support all synapse-level computation, and hold the LearnSynParams and iterate directly over all of their synapses. It is the exact same Prjn object that lives in the RecvPrjns of the receiver-side, and the SendPrjns of the sender-side, and it maintains and coordinates both sides of the state. This clarifies and simplifies a lot of code. There is no separate equivalent of LeabraConSpec / LeabraConState at the level of connection groups per unit per projection.

The pattern of connectivity between units is specified by the prjn.Pattern interface, and all the different standard options are available in the prjn package. The Pattern code generates a full tensor bitmap of binary 1's and 0's for connected (1's) and not-connected (0's) units, and can use any method to do so. This full lookup-table approach is not the most memory-efficient, but it is fully general and shouldn't be too bad memory-wise overall (fully bit-packed arrays are used, and these bitmaps don't need to be retained once connections have been established). This approach allows patterns to just focus on patterns, without any concern for how they are used to allocate actual connections.
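A hedged sketch of connecting two layers with a full connectivity pattern; prjn.NewFull and a ConnectLayers method on the network are assumptions from the emergent framework and are not shown in this listing:

pj := net.ConnectLayers(in, hid, prjn.NewFull(), emer.Forward)
_ = pj // the returned emer.Prjn can be cast to LeabraPrjn / *Prjn as needed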

Index

Constants

const NeuronVarStart = 2

NeuronVarStart is the index of fields in the Neuron structure where the float32 named variables start. Note: all non-float32 infrastructure variables must be at the start!

Variables

var KiT_ActNoiseType = kit.Enums.AddEnum(ActNoiseTypeN, false, nil)
var KiT_Layer = kit.Types.AddType(&Layer{}, LayerProps)
var KiT_Network = kit.Types.AddType(&Network{}, NetworkProps)
var KiT_NeurFlags = kit.Enums.AddEnum(NeurFlagsN, true, nil)
var KiT_Quarters = kit.Enums.AddEnum(QuartersN, true, nil)
var KiT_TimeScales = kit.Enums.AddEnum(TimeScalesN, false, nil)
var LayerProps = ki.Props{
	"ToolBar": ki.PropSlice{
		{"Defaults", ki.Props{
			"icon": "reset",
			"desc": "return all parameters to their intial default values",
		}},
		{"InitWts", ki.Props{
			"icon": "update",
			"desc": "initialize the layer's weight values according to prjn parameters, for all *sending* projections out of this layer",
		}},
		{"InitActs", ki.Props{
			"icon": "update",
			"desc": "initialize the layer's activation values",
		}},
		{"sep-act", ki.BlankProp{}},
		{"LesionNeurons", ki.Props{
			"icon": "close",
			"desc": "Lesion (set the Off flag) for given proportion of neurons in the layer (number must be 0 -- 1, NOT percent!)",
			"Args": ki.PropSlice{
				{"Proportion", ki.Props{
					"desc": "proportion (0 -- 1) of neurons to lesion",
				}},
			},
		}},
		{"UnLesionNeurons", ki.Props{
			"icon": "reset",
			"desc": "Un-Lesion (reset the Off flag) for all neurons in the layer",
		}},
	},
}
var NetworkProps = ki.Props{
	"ToolBar": ki.PropSlice{
		{"SaveWtsJSON", ki.Props{
			"label": "Save Wts...",
			"icon":  "file-save",
			"desc":  "Save json-formatted weights",
			"Args": ki.PropSlice{
				{"Weights File Name", ki.Props{
					"default-field": "WtsFile",
					"ext":           ".wts",
				}},
			},
		}},
		{"OpenWtsJSON", ki.Props{
			"label": "Open Wts...",
			"icon":  "file-open",
			"desc":  "Open json-formatted weights",
			"Args": ki.PropSlice{
				{"Weights File Name", ki.Props{
					"default-field": "WtsFile",
					"ext":           ".wts",
				}},
			},
		}},
		{"sep-file", ki.BlankProp{}},
		{"Build", ki.Props{
			"icon": "update",
			"desc": "build the network's neurons and synapses according to current params",
		}},
		{"InitWts", ki.Props{
			"icon": "update",
			"desc": "initialize the network weight values according to prjn parameters",
		}},
		{"InitActs", ki.Props{
			"icon": "update",
			"desc": "initialize the network activation values",
		}},
		{"sep-act", ki.BlankProp{}},
		{"AddLayer", ki.Props{
			"label": "Add Layer...",
			"icon":  "new",
			"desc":  "add a new layer to network",
			"Args": ki.PropSlice{
				{"Layer Name", ki.Props{}},
				{"Layer Shape", ki.Props{
					"desc": "shape of layer, typically 2D (Y, X) or 4D (Pools Y, Pools X, Units Y, Units X)",
				}},
				{"Layer Type", ki.Props{
					"desc": "type of layer -- used for determining how inputs are applied",
				}},
			},
		}},
		{"ConnectLayerNames", ki.Props{
			"label": "Connect Layers...",
			"icon":  "new",
			"desc":  "add a new connection between layers in the network",
			"Args": ki.PropSlice{
				{"Send Layer Name", ki.Props{}},
				{"Recv Layer Name", ki.Props{}},
				{"Pattern", ki.Props{
					"desc": "pattern to connect with",
				}},
				{"Prjn Type", ki.Props{
					"desc": "type of projection -- direction, or other more specialized factors",
				}},
			},
		}},
	},
}
var NeuronVars = []string{"Act", "Ge", "Gi", "Inet", "Vm", "Targ", "Ext", "AvgSS", "AvgS", "AvgM", "AvgL", "AvgLLrn", "AvgSLrn", "ActM", "ActP", "ActDif", "ActDel", "ActAvg", "Noise", "GiSyn", "GiSelf", "ActSent", "GeRaw", "GeInc", "GiRaw", "GiInc"}
var NeuronVarsMap map[string]int
var SynapseVars = []string{"Wt", "LWt", "DWt", "Norm", "Moment"}
var SynapseVarsMap map[string]int

Functions

func JsonToParams

func JsonToParams(b []byte) string

JsonToParams reformats JSON output into a suitable params display format

func NeuronVarByName

func NeuronVarByName(varNm string) (int, error)

NeuronVarByName returns the index of the variable in the Neuron, or error

func SigFun

func SigFun(w, gain, off float32) float32

SigFun is the sigmoid function for value w in 0-1 range, with gain and offset params

func SigFun61

func SigFun61(w float32) float32

SigFun61 is the sigmoid function for value w in 0-1 range, with default gain = 6, offset = 1 params
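A sketch of the general form of this sigmoid, assuming the standard Leabra weight contrast-enhancement function (the actual implementation may differ in details such as boundary handling); SigFun61 then corresponds to gain = 6, off = 1:

// sigFunSketch illustrates the assumed form: 1 / (1 + (off*(1-w)/w)^gain),
// clamped to 0 and 1 at the endpoints. Uses the standard library math package.
func sigFunSketch(w, gain, off float32) float32 {
	if w <= 0 {
		return 0
	}
	if w >= 1 {
		return 1
	}
	return 1 / (1 + float32(math.Pow(float64(off*(1-w)/w), float64(gain))))
}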

func SigInvFun

func SigInvFun(w, gain, off float32) float32

SigInvFun is the inverse of the sigmoid function

func SigInvFun61

func SigInvFun61(w float32) float32

SigInvFun61 is the inverse of the sigmoid function, with default gain = 6, offset = 1 params

Types

type ActAvg

type ActAvg struct {
	ActMAvg    float32 `desc:"running-average minus-phase activity -- used for adapting inhibition -- see ActAvgParams.Tau for time constant etc"`
	ActPAvg    float32 `desc:"running-average plus-phase activity -- used for netinput scaling -- see ActAvgParams.Tau for time constant etc"`
	ActPAvgEff float32 `desc:"ActPAvg * ActAvgParams.Adjust -- adjusted effective layer activity directly used in netinput scaling"`
}

ActAvg are running-average activation levels used for netinput scaling and adaptive inhibition

type ActAvgParams

type ActAvgParams struct {
	Init      float32 `` /* 462-byte string literal not displayed */
	Fixed     bool    `` /* 190-byte string literal not displayed */
	UseExtAct bool    `` /* 343-byte string literal not displayed */
	UseFirst  bool    `` /* 166-byte string literal not displayed */
	Tau       float32 `` /* 177-byte string literal not displayed */
	Adjust    float32 `` /* 576-byte string literal not displayed */

	Dt float32 `inactive:"+" view:"-" json:"-" xml:"-" desc:"rate = 1 / tau"`
}

ActAvgParams represents expected average activity levels in the layer. Used for the running-average computation that is then used for netinput scaling. Also specifies the time constant for updating the average, and the target value for adapting inhibition in inhib_adapt.

func (*ActAvgParams) AvgFmAct

func (aa *ActAvgParams) AvgFmAct(avg *float32, act float32)

AvgFmAct updates the running-average activation given average activity level in layer

func (*ActAvgParams) Defaults

func (aa *ActAvgParams) Defaults()

func (*ActAvgParams) EffFmAvg

func (aa *ActAvgParams) EffFmAvg(eff *float32, avg float32)

EffFmAvg updates the effective value from the running-average value

func (*ActAvgParams) EffInit

func (aa *ActAvgParams) EffInit() float32

EffInit returns the initial value applied during InitWts for the ActPAvgEff effective layer activity

func (*ActAvgParams) Update

func (aa *ActAvgParams) Update()

type ActInitParams

type ActInitParams struct {
	Decay float32 `def:"1" max:"1" min:"0" desc:"proportion to decay activation state toward initial values at start of every trial"`
	Vm    float32 `` /* 193-byte string literal not displayed */
	Act   float32 `def:"0" desc:"initial activation value -- typically 0"`
	Ge    float32 `` /* 268-byte string literal not displayed */
}

ActInitParams are initial values for key network state variables. Initialized at start of trial with InitActs or DecayState.

func (*ActInitParams) Defaults

func (ai *ActInitParams) Defaults()

func (*ActInitParams) Update

func (ai *ActInitParams) Update()

type ActNoiseParams

type ActNoiseParams struct {
	erand.RndParams
	Type  ActNoiseType `desc:"where and how to add processing noise"`
	Fixed bool         `` /* 227-byte string literal not displayed */
}

ActNoiseParams contains parameters for activation-level noise

func (*ActNoiseParams) Defaults

func (an *ActNoiseParams) Defaults()

func (*ActNoiseParams) Update

func (an *ActNoiseParams) Update()

type ActNoiseType

type ActNoiseType int

ActNoiseType are different types / locations of random noise for activations

const (
	// NoNoise means no noise added
	NoNoise ActNoiseType = iota

	// VmNoise means noise is added to the membrane potential.
	// IMPORTANT: this should NOT be used for rate-code (NXX1) activations,
	// because they do not depend directly on the vm -- this then has no effect
	VmNoise

	// GeNoise means noise is added to the excitatory conductance (Ge).
	// This should be used for rate coded activations (NXX1)
	GeNoise

	// ActNoise means noise is added to the final rate code activation
	ActNoise

	// GeMultNoise means that noise is multiplicative on the Ge excitatory conductance values
	GeMultNoise

	ActNoiseTypeN
)

The activation noise types

func (*ActNoiseType) FromString

func (i *ActNoiseType) FromString(s string) error

func (ActNoiseType) MarshalJSON

func (ev ActNoiseType) MarshalJSON() ([]byte, error)

func (ActNoiseType) String

func (i ActNoiseType) String() string

func (*ActNoiseType) UnmarshalJSON

func (ev *ActNoiseType) UnmarshalJSON(b []byte) error

type ActParams

type ActParams struct {
	XX1        XX1Params       `view:"inline" desc:"X/X+1 rate code activation function parameters"`
	OptThresh  OptThreshParams `view:"inline" desc:"optimization thresholds for faster processing"`
	Init       ActInitParams   `` /* 127-byte string literal not displayed */
	Dt         DtParams        `view:"inline" desc:"time and rate constants for temporal derivatives / updating of activation state"`
	Gbar       Chans           `view:"inline" desc:"[Defaults: 1, .2, 1, 1] maximal conductances levels for channels"`
	Erev       Chans           `view:"inline" desc:"[Defaults: 1, .3, .25, .1] reversal potentials for each channel"`
	Clamp      ClampParams     `view:"inline" desc:"how external inputs drive neural activations"`
	Noise      ActNoiseParams  `view:"inline" desc:"how, where, when, and how much noise to add to activations"`
	VmRange    minmax.F32      `view:"inline" desc:"range for Vm membrane potential -- [0, 2.0] by default"`
	ErevSubThr Chans           `inactive:"+" view:"-" json:"-" xml:"-" desc:"Erev - Act.Thr for each channel -- used in computing GeThrFmG among others"`
	ThrSubErev Chans           `inactive:"+" view:"-" json:"-" xml:"-" desc:"Act.Thr - Erev for each channel -- used in computing GeThrFmG among others"`
}

leabra.ActParams contains all the activation computation params and functions for basic Leabra, at the neuron level. This is included in leabra.Layer to drive the computation.

func (*ActParams) ActFmG

func (ac *ActParams) ActFmG(nrn *Neuron)

ActFmG computes rate-coded activation Act from conductances Ge and Gi

func (*ActParams) DecayState

func (ac *ActParams) DecayState(nrn *Neuron, decay float32)

DecayState decays the activation state toward initial values in proportion to the given decay parameter. Called with ac.Init.Decay by Layer during AlphaCycInit.

func (*ActParams) Defaults

func (ac *ActParams) Defaults()

func (*ActParams) GRawFmInc

func (ac *ActParams) GRawFmInc(nrn *Neuron)

GRawFmInc integrates G conductance from Inc delta-increment sent.

func (*ActParams) GeGiFmInc

func (ac *ActParams) GeGiFmInc(nrn *Neuron)

GeGiFmInc integrates Ge excitatory conductance from GeInc delta-increment sent and also GiRaw and GiSyn from GiInc.

func (*ActParams) GeThrFmG

func (ac *ActParams) GeThrFmG(nrn *Neuron) float32

GeThrFmG computes the threshold for Ge based on other conductances

func (*ActParams) HardClamp

func (ac *ActParams) HardClamp(nrn *Neuron)

HardClamp clamps activation from external input -- just does it -- use HasHardClamp to check

func (*ActParams) HasHardClamp

func (ac *ActParams) HasHardClamp(nrn *Neuron) bool

HasHardClamp returns true if this neuron has external input that should be hard clamped

func (*ActParams) InetFmG

func (ac *ActParams) InetFmG(vm, ge, gi, gk float32) float32

InetFmG computes net current from conductances and Vm

func (*ActParams) InitActs

func (ac *ActParams) InitActs(nrn *Neuron)

InitActs initializes activation state in neuron -- called during InitWts but otherwise not automatically called (DecayState is used instead)

func (*ActParams) InitGeGi

func (ac *ActParams) InitGeGi(nrn *Neuron)

InitGeGi initializes the Ge excitatory and Gi inhibitory conductance accumulation states. Always called at the start of a trial.

func (*ActParams) Update

func (ac *ActParams) Update()

Update must be called after any changes to parameters

func (*ActParams) VmFmG

func (ac *ActParams) VmFmG(nrn *Neuron)

VmFmG computes membrane potential Vm from conductances Ge and Gi. The Vm value is only used in pure rate-code computation within the sub-threshold regime because firing rate is a direct function of excitatory conductance Ge.

type AvgLParams

type AvgLParams struct {
	Init   float32 `def:"0.4" min:"0" max:"1" desc:"initial AvgL value at start of training"`
	Gain   float32 `` /* 501-byte string literal not displayed */
	Min    float32 `` /* 219-byte string literal not displayed */
	Tau    float32 `` /* 273-byte string literal not displayed */
	LrnMax float32 `` /* 570-byte string literal not displayed */
	LrnMin float32 `` /* 350-byte string literal not displayed */
	ErrMod bool    `def:"true" desc:"modulate amount learning by normalized level of error within layer"`
	ModMin float32 `` /* 200-byte string literal not displayed */

	Dt      float32 `view:"-" json:"-" xml:"-" inactive:"+" desc:"rate = 1 / tau"`
	LrnFact float32 `view:"-" json:"-" xml:"-" inactive:"+" desc:"(LrnMax - LrnMin) / (Gain - Min)"`
}

AvgLParams are parameters for computing the long-term floating average value, AvgL which is used for driving BCM-style hebbian learning in XCAL -- this form of learning increases contrast of weights and generally decreases overall activity of neuron, to prevent "hog" units -- it is computed as a running average of the (gain multiplied) medium-time-scale average activation at the end of the alpha-cycle. Also computes an adaptive amount of BCM learning, AvgLLrn, based on AvgL.

func (*AvgLParams) AvgLFmAvgM

func (al *AvgLParams) AvgLFmAvgM(avgM float32, avgL, lrn *float32)

AvgLFmAvgM computes long-term average activation value, and learning factor, from given medium-scale running average activation avgM

func (*AvgLParams) Defaults

func (al *AvgLParams) Defaults()

func (*AvgLParams) ErrModFmLayErr

func (al *AvgLParams) ErrModFmLayErr(layCosDiffAvg float32) float32

ErrModFmLayErr computes AvgLLrn multiplier from layer cosine diff avg statistic

func (*AvgLParams) Update

func (al *AvgLParams) Update()

type Chans

type Chans struct {
	E float32 `desc:"excitatory sodium (Na) AMPA channels activated by synaptic glutamate"`
	L float32 `desc:"constant leak (potassium, K+) channels -- determines resting potential (typically higher than resting potential of K)"`
	I float32 `desc:"inhibitory chloride (Cl-) channels activated by synaptic GABA"`
	K float32 `desc:"gated / active potassium channels -- typically hyperpolarizing relative to leak / rest"`
}

Chans are ion channels used in computing point-neuron activation function

func (*Chans) SetAll

func (ch *Chans) SetAll(e, l, i, k float32)

SetAll sets all the values

func (*Chans) SetFmMinusOther

func (ch *Chans) SetFmMinusOther(minus float32, oth Chans)

SetFmMinusOther sets all the values from given value minus other Chans

func (*Chans) SetFmOtherMinus

func (ch *Chans) SetFmOtherMinus(oth Chans, minus float32)

SetFmOtherMinus sets all the values from other Chans minus given value

type ClampParams

type ClampParams struct {
	Hard    bool       `` /* 200-byte string literal not displayed */
	Range   minmax.F32 `` /* 153-byte string literal not displayed */
	Gain    float32    `viewif:"!Hard" def:"0.02:0.5" desc:"soft clamp gain factor (Ge += Gain * Ext)"`
	Avg     bool       `` /* 181-byte string literal not displayed */
	AvgGain float32    `` /* 145-byte string literal not displayed */
}

ClampParams are for specifying how external inputs are clamped onto network activation values

func (*ClampParams) AvgGe

func (cp *ClampParams) AvgGe(ext, ge float32) float32

AvgGe computes Avg-based Ge clamping value if using that option

func (*ClampParams) Defaults

func (cp *ClampParams) Defaults()

func (*ClampParams) Update

func (cp *ClampParams) Update()

type CosDiffParams

type CosDiffParams struct {
	Tau float32 `` /* 592-byte string literal not displayed */

	Dt  float32 `inactive:"+" view:"-" json:"-" xml:"-" desc:"rate constant = 1 / Tau"`
	DtC float32 `inactive:"+" view:"-" json:"-" xml:"-" desc:"complement of rate constant = 1 - Dt"`
}

CosDiffParams specify how to integrate the cosine of the difference between plus and minus phase activations. Used to modulate the amount of hebbian learning, and the overall learning rate.

func (*CosDiffParams) AvgVarFmCos

func (cd *CosDiffParams) AvgVarFmCos(avg, vr *float32, cos float32)

AvgVarFmCos updates the average and variance from current cosine diff value

func (*CosDiffParams) Defaults

func (cd *CosDiffParams) Defaults()

func (*CosDiffParams) Update

func (cd *CosDiffParams) Update()

type CosDiffStats

type CosDiffStats struct {
	Cos        float32 `` /* 185-byte string literal not displayed */
	Avg        float32 `` /* 234-byte string literal not displayed */
	Var        float32 `` /* 193-byte string literal not displayed */
	AvgLrn     float32 `desc:"1 - Avg and 0 for non-Hidden layers"`
	ModAvgLLrn float32 `` /* 144-byte string literal not displayed */
}

CosDiffStats holds cosine-difference statistics at the layer level

func (*CosDiffStats) Init

func (cd *CosDiffStats) Init()

type DWtNormParams

type DWtNormParams struct {
	On       bool    `` /* 184-byte string literal not displayed */
	DecayTau float32 `` /* 172-byte string literal not displayed */
	NormMin  float32 `` /* 157-byte string literal not displayed */
	LrComp   float32 `` /* 264-byte string literal not displayed */
	Stats    bool    `` /* 134-byte string literal not displayed */

	DecayDt  float32 `inactive:"+" view:"-" json:"-" xml:"-" desc:"rate constant of decay = 1 / decay_tau"`
	DecayDtC float32 `inactive:"+" view:"-" json:"-" xml:"-" desc:"complement rate constant of decay = 1 - (1 / decay_tau)"`
}

DWtNormParams are weight change (dwt) normalization parameters, using MAX(ABS(dwt)) aggregated over Sending connections in a given projection for a given unit. Slowly decays and instantly resets to any current max(abs). Serves as an estimate of the variance in the weight changes, assuming zero net mean overall.

func (*DWtNormParams) Defaults

func (dn *DWtNormParams) Defaults()

func (*DWtNormParams) NormFmAbsDWt

func (dn *DWtNormParams) NormFmAbsDWt(norm *float32, absDwt float32) float32

NormFmAbsDWt updates the dwnorm running max_abs value -- it jumps up to max(abs_dwt) and then slowly decays -- and returns the effective normalization factor, as a multiplier, including lrate compensation.

func (*DWtNormParams) Update

func (dn *DWtNormParams) Update()

type DtParams

type DtParams struct {
	Integ  float32 `` /* 649-byte string literal not displayed */
	VmTau  float32 `` /* 501-byte string literal not displayed */
	GTau   float32 `` /* 601-byte string literal not displayed */
	AvgTau float32 `` /* 206-byte string literal not displayed */

	VmDt  float32 `view:"-" json:"-" xml:"-" desc:"nominal rate = Integ / tau"`
	GDt   float32 `view:"-" json:"-" xml:"-" desc:"rate = Integ / tau"`
	AvgDt float32 `view:"-" json:"-" xml:"-" desc:"rate = 1 / tau"`
}

DtParams are time and rate constants for temporal derivatives in Leabra (Vm, net input)

func (*DtParams) Defaults

func (dp *DtParams) Defaults()

func (*DtParams) GFmRaw

func (dp *DtParams) GFmRaw(geRaw float32, ge *float32)

func (*DtParams) Update

func (dp *DtParams) Update()

type FFFBInhib

type FFFBInhib struct {
	FFi    float32 `desc:"computed feedforward inhibition"`
	FBi    float32 `desc:"computed feedback inhibition (total)"`
	Gi     float32 `` /* 146-byte string literal not displayed */
	GiOrig float32 `desc:"original value of the inhibition (before any  group effects set in)"`
	LayGi  float32 `` /* 127-byte string literal not displayed */
}

FFFBInhib contains values for computed FFFB inhibition

func (*FFFBInhib) Init

func (fi *FFFBInhib) Init()

type FFFBParams

type FFFBParams struct {
	On       bool    `desc:"enable this level of inhibition"`
	Gi       float32 `` /* 226-byte string literal not displayed */
	FF       float32 `` /* 316-byte string literal not displayed */
	FB       float32 `` /* 253-byte string literal not displayed */
	FBTau    float32 `` /* 471-byte string literal not displayed */
	MaxVsAvg float32 `` /* 755-byte string literal not displayed */
	FF0      float32 `` /* 398-byte string literal not displayed */

	FBDt float32 `inactive:"+" view:"-" json:"-" xml:"-" desc:"rate = 1 / tau"`
}

FFFBParams parameterizes feedforward (FF) and feedback (FB) inhibition (FFFB) based on average (or maximum) netinput (FF) and activation (FB)

func (*FFFBParams) Defaults

func (fb *FFFBParams) Defaults()

func (*FFFBParams) FBInhib

func (fb *FFFBParams) FBInhib(avgAct float32) float32

FBInhib computes feedback inhibition value as function of average activation

func (*FFFBParams) FBUpdt

func (fb *FFFBParams) FBUpdt(fbi *float32, newFbi float32)

FBUpdt updates feedback inhibition using time-integration rate constant

func (*FFFBParams) FFInhib

func (fb *FFFBParams) FFInhib(avgGe, maxGe float32) float32

FFInhib returns the feedforward inhibition value based on average and max excitatory conductance within relevant scope

func (*FFFBParams) Inhib

func (fb *FFFBParams) Inhib(avgGe, maxGe, avgAct float32, inh *FFFBInhib)

Inhib is full inhibition computation for given pool activity levels and inhib state

func (*FFFBParams) Update

func (fb *FFFBParams) Update()

type InhibParams

type InhibParams struct {
	Layer  FFFBParams      `view:"inline" desc:"inhibition across the entire layer"`
	Pool   FFFBParams      `view:"inline" desc:"inhibition across sub-pools of units, for layers with 4D shape"`
	Self   SelfInhibParams `` /* 161-byte string literal not displayed */
	ActAvg ActAvgParams    `` /* 144-byte string literal not displayed */
}

leabra.InhibParams contains all the inhibition computation params and functions for basic Leabra. This is included in leabra.Layer to support computation. This also includes other misc layer-level params such as the running-average activation in the layer, which is used for netinput rescaling and potentially for adapting inhibition over time.

func (*InhibParams) Defaults

func (ip *InhibParams) Defaults()

func (*InhibParams) Update

func (ip *InhibParams) Update()

type LayFunChan

type LayFunChan chan func(ly LeabraLayer)

LayFunChan is a channel that runs LeabraLayer functions

type Layer

type Layer struct {
	LayerStru
	Act     ActParams       `desc:"Activation parameters and methods for computing activations"`
	Inhib   InhibParams     `desc:"Inhibition parameters and methods for computing layer-level inhibition"`
	Learn   LearnNeurParams `desc:"Learning parameters and methods that operate at the neuron level"`
	Neurons []Neuron        `` /* 133-byte string literal not displayed */
	Pools   []Pool          `` /* 234-byte string literal not displayed */
	CosDiff CosDiffStats    `desc:"cosine difference between ActM, ActP stats"`
}

leabra.Layer has parameters for running a basic rate-coded Leabra layer

func (*Layer) ActFmG

func (ly *Layer) ActFmG(ltime *Time)

ActFmG computes rate-code activation from Ge, Gi, Gl conductances and updates learning running-average activations from that Act

func (*Layer) AllParams

func (ly *Layer) AllParams() string

AllParams returns a listing of all parameters in the Layer

func (*Layer) AlphaCycInit

func (ly *Layer) AlphaCycInit()

AlphaCycInit handles all initialization at the start of a new input pattern, including computing netinput scaling from running average activation etc. The external input should already have been presented to the network at this point.

func (*Layer) ApplyExt

func (ly *Layer) ApplyExt(ext etensor.Tensor)

ApplyExt applies external input in the form of an etensor.Float32. If dimensionality of tensor matches that of layer, and is 2D or 4D, then each dimension is iterated separately, so any mismatch preserves dimensional structure. If the layer is a Target or Compare layer type, then it goes in Targ otherwise it goes in Ext
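For example, a minimal sketch of applying a 2D input pattern to an input layer (the etensor.NewFloat32 signature and SetFloat method are assumptions about the etensor package; inLay is a hypothetical layer):

pat := etensor.NewFloat32([]int{5, 5}, nil, []string{"Y", "X"})
pat.SetFloat([]int{2, 3}, 1) // activate one unit
inLay.ApplyExt(pat)          // goes to Ext (or Targ for Target / Compare layers)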

func (*Layer) ApplyExt1D

func (ly *Layer) ApplyExt1D(ext []float64)

ApplyExt1D applies external input in the form of a flat 1-dimensional slice of floats. If the layer is a Target or Compare layer type, then it goes in Targ, otherwise it goes in Ext.

func (*Layer) ApplyExtFlags

func (ly *Layer) ApplyExtFlags() (clrmsk, setmsk int32, toTarg bool)

ApplyExtFlags gets the clear mask and set mask for updating neuron flags based on layer type, and whether input should be applied to Targ (else Ext)

func (*Layer) AsLeabra

func (ly *Layer) AsLeabra() *Layer

AsLeabra returns this layer as a leabra.Layer -- all derived layers must redefine this to return the base Layer type, so that the LeabraLayer interface does not need to include accessors to all the basic stuff

func (*Layer) AvgLFmAvgM

func (ly *Layer) AvgLFmAvgM()

AvgLFmAvgM updates AvgL long-term running average activation that drives BCM Hebbian learning

func (*Layer) AvgMaxAct

func (ly *Layer) AvgMaxAct(ltime *Time)

AvgMaxAct computes the average and max Act stats, used in inhibition

func (*Layer) AvgMaxGe

func (ly *Layer) AvgMaxGe(ltime *Time)

AvgMaxGe computes the average and max Ge stats, used in inhibition

func (*Layer) Build

func (ly *Layer) Build() error

Build constructs the layer state, including calling Build on the projections. You MUST have properly configured the Inhib.Pool.On setting by this point, to properly allocate Pools for the unit groups if necessary.

func (*Layer) BuildPools

func (ly *Layer) BuildPools(nu int) error

BuildPools builds the inhibitory pools structures -- nu = number of units in layer

func (*Layer) BuildPrjns

func (ly *Layer) BuildPrjns() error

BuildPrjns builds the projections, recv-side

func (*Layer) BuildSubPools

func (ly *Layer) BuildSubPools()

BuildSubPools initializes neuron start / end indexes for sub-pools

func (*Layer) CosDiffFmActs

func (ly *Layer) CosDiffFmActs()

CosDiffFmActs computes the cosine difference in activation state between minus and plus phases. This is also used for modulating the amount of BCM hebbian learning.

func (*Layer) DWt

func (ly *Layer) DWt()

DWt computes the weight change (learning) -- calls DWt method on sending projections

func (*Layer) DecayState

func (ly *Layer) DecayState(decay float32)

DecayState decays activation state by given proportion (default is on ly.Act.Init.Decay)

func (*Layer) Defaults

func (ly *Layer) Defaults()

func (*Layer) GFmInc

func (ly *Layer) GFmInc(ltime *Time)

GFmInc integrates new synaptic conductances from increments sent during last SendGDelta.

func (*Layer) GScaleFmAvgAct

func (ly *Layer) GScaleFmAvgAct()

GScaleFmAvgAct computes the scaling factor for synaptic input conductances G, based on sending layer average activation. This attempts to automatically adjust for overall differences in raw activity coming into the units to achieve a general target of around .5 to 1 for the integrated Ge value.

func (*Layer) GenNoise

func (ly *Layer) GenNoise()

GenNoise generates random noise for all neurons

func (*Layer) HardClamp

func (ly *Layer) HardClamp()

HardClamp hard-clamps the activations in the layer -- called during AlphaCycInit for hard-clamped Input layers

func (*Layer) InhibFmGeAct

func (ly *Layer) InhibFmGeAct(ltime *Time)

InhibFmGeAct computes inhibition Gi from Ge and Act averages within relevant Pools

func (*Layer) InitActAvg

func (ly *Layer) InitActAvg()

InitActAvg initializes the running-average activation values that drive learning.

func (*Layer) InitActs

func (ly *Layer) InitActs()

InitActs fully initializes activation state -- only called automatically during InitWts

func (*Layer) InitExt

func (ly *Layer) InitExt()

InitExt initializes external input state -- called prior to applying external inputs

func (*Layer) InitGInc

func (ly *Layer) InitGInc()

InitGInc initializes the GeInc and GiInc increments -- optional

func (*Layer) InitWtSym

func (ly *Layer) InitWtSym()

InitWtSym initializes the weight symmetry -- higher layers copy weights from lower layers

func (*Layer) InitWts

func (ly *Layer) InitWts()

InitWts initializes the weight values in the network, i.e., resetting learning. Also calls InitActs.

func (*Layer) LesionNeurons

func (ly *Layer) LesionNeurons(prop float32) int

LesionNeurons lesions (sets the Off flag) for the given proportion (0-1) of neurons in the layer, and returns the number of neurons lesioned. Emits an error if prop > 1, as an indication that a percent might have been passed.

func (*Layer) MSE

func (ly *Layer) MSE(tol float32) (sse, mse float64)

MSE returns the sum-squared-error and mean-squared-error over the layer, in terms of ActP - ActM (valid even on non-target layers FWIW). Uses the given tolerance per-unit to count an error at all (e.g., .5 = activity just has to be on the right side of .5).

func (*Layer) Pool

func (ly *Layer) Pool(idx int) *Pool

Pool returns pool at given index

func (*Layer) PoolTry

func (ly *Layer) PoolTry(idx int) (*Pool, error)

PoolTry returns pool at given index, returns error if index is out of range

func (*Layer) QuarterFinal

func (ly *Layer) QuarterFinal(ltime *Time)

QuarterFinal does updating after end of a quarter

func (*Layer) ReadWtsJSON

func (ly *Layer) ReadWtsJSON(r io.Reader) error

ReadWtsJSON reads the weights from this layer from the receiver-side perspective in a JSON text format.

func (*Layer) SSE

func (ly *Layer) SSE(tol float32) float64

SSE returns the sum-squared-error over the layer, in terms of ActP - ActM (valid even on non-target layers FWIW). Uses the given tolerance per-unit to count an error at all (e.g., .5 = activity just has to be on the right side of .5). Use this in Python which only allows single return values.

func (*Layer) SendGDelta

func (ly *Layer) SendGDelta(ltime *Time)

SendGDelta sends change in activation since last sent, to increment recv synaptic conductances G, if above thresholds

func (*Layer) UnLesionNeurons

func (ly *Layer) UnLesionNeurons()

UnLesionNeurons unlesions (clears the Off flag) for all neurons in the layer

func (*Layer) UnitVal

func (ly *Layer) UnitVal(varNm string, idx []int) float32

UnitVal returns value of given variable name on given unit, using shape-based dimensional index

func (*Layer) UnitVal1D

func (ly *Layer) UnitVal1D(varNm string, idx int) float32

UnitVal1D returns value of given variable name on given unit, using 1-dimensional index.

func (*Layer) UnitVal1DTry

func (ly *Layer) UnitVal1DTry(varNm string, idx int) (float32, error)

UnitVal1DTry returns value of given variable name on given unit, using 1-dimensional index.

func (*Layer) UnitValTry

func (ly *Layer) UnitValTry(varNm string, idx []int) (float32, error)

UnitValTry returns value of given variable name on given unit, using shape-based dimensional index

func (*Layer) UnitVals

func (ly *Layer) UnitVals(varNm string) []float32

UnitVals is emer.Layer interface method to return values of given variable

func (*Layer) UnitValsTensor

func (ly *Layer) UnitValsTensor(varNm string) etensor.Tensor

UnitValsTensor returns values of given variable name on unit for each unit in the layer, as a float32 tensor in same shape as layer units.

func (*Layer) UnitValsTensorTry

func (ly *Layer) UnitValsTensorTry(varNm string) (etensor.Tensor, error)

UnitValsTensorTry returns values of given variable name on unit for each unit in the layer, as a float32 tensor in same shape as layer units.

func (*Layer) UnitValsTry

func (ly *Layer) UnitValsTry(varNm string) ([]float32, error)

UnitValsTry is emer.Layer interface method to return values of given variable

func (*Layer) UnitVarNames

func (ly *Layer) UnitVarNames() []string

UnitVarNames returns a list of variable names available on the units in this layer

func (*Layer) UpdateParams

func (ly *Layer) UpdateParams()

UpdateParams updates all params given any changes that might have been made to individual values including those in the receiving projections of this layer

func (*Layer) VarRange

func (ly *Layer) VarRange(varNm string) (min, max float32, err error)

VarRange returns the min / max values for the given variable. TODO: support r. s. projection values.

func (*Layer) WriteWtsJSON

func (ly *Layer) WriteWtsJSON(w io.Writer, depth int)

WriteWtsJSON writes the weights from this layer from the receiver-side perspective in a JSON text format. We build in the indentation logic to make it much faster and more efficient.

func (*Layer) WtBalFmWt

func (ly *Layer) WtBalFmWt()

WtBalFmWt computes the Weight Balance factors based on average recv weights

func (*Layer) WtFmDWt

func (ly *Layer) WtFmDWt()

WtFmDWt updates the weights from delta-weight changes -- on the sending projections

type LayerStru

type LayerStru struct {
	LeabraLay LeabraLayer    `` /* 299-byte string literal not displayed */
	Nm        string         `` /* 151-byte string literal not displayed */
	Cls       string         `desc:"Class is for applying parameter styles, can be space separated multiple tags"`
	Off       bool           `desc:"inactivate this layer -- allows for easy experimentation"`
	Shp       etensor.Shape  `` /* 219-byte string literal not displayed */
	Typ       emer.LayerType `` /* 161-byte string literal not displayed */
	Thr       int            `` /* 216-byte string literal not displayed */
	Rel       relpos.Rel     `view:"inline" desc:"Spatial relationship to other layer, determines positioning"`
	Ps        mat32.Vec3     `` /* 154-byte string literal not displayed */
	Idx       int            `` /* 258-byte string literal not displayed */
	RcvPrjns  emer.Prjns     `desc:"list of receiving projections into this layer from other layers"`
	SndPrjns  emer.Prjns     `desc:"list of sending projections from this layer to other layers"`
}

leabra.LayerStru manages the structural elements of the layer, which are common to any Layer type

func (*LayerStru) ApplyParams

func (ls *LayerStru) ApplyParams(pars *params.Sheet, setMsg bool) (bool, error)

ApplyParams applies given parameter style Sheet to this layer and its recv projections. Calls UpdateParams on anything set to ensure derived parameters are all updated. If setMsg is true, then a message is printed to confirm each parameter that is set. It always prints a message if a parameter fails to be set. Returns true if any params were set, and error if there were any errors.

func (*LayerStru) Class

func (ls *LayerStru) Class() string

func (*LayerStru) Config

func (ls *LayerStru) Config(shape []int, typ emer.LayerType)

Config configures the basic properties of the layer

func (*LayerStru) Index

func (ls *LayerStru) Index() int

func (*LayerStru) InitName

func (ls *LayerStru) InitName(lay emer.Layer, name string)

InitName MUST be called to initialize the layer's pointer to itself as an emer.Layer which enables the proper interface methods to be called. Also sets the name.

func (*LayerStru) Is2D

func (ls *LayerStru) Is2D() bool

func (*LayerStru) Is4D

func (ls *LayerStru) Is4D() bool

func (*LayerStru) IsOff

func (ls *LayerStru) IsOff() bool

func (*LayerStru) Label

func (ls *LayerStru) Label() string

func (*LayerStru) NPools

func (ls *LayerStru) NPools() int

NPools returns the number of unit sub-pools according to the shape parameters. Currently supported for a 4D shape, where the unit pools are the first 2 Y,X dims and then the units within the pools are the 2nd 2 Y,X dims

func (*LayerStru) NRecvPrjns

func (ls *LayerStru) NRecvPrjns() int

func (*LayerStru) NSendPrjns

func (ls *LayerStru) NSendPrjns() int

func (*LayerStru) Name

func (ls *LayerStru) Name() string

func (*LayerStru) NonDefaultParams

func (ls *LayerStru) NonDefaultParams() string

NonDefaultParams returns a listing of all parameters in the Layer that are not at their default values -- useful for setting param styles etc.

func (*LayerStru) Pos

func (ls *LayerStru) Pos() mat32.Vec3

func (*LayerStru) RecipToSendPrjn

func (ls *LayerStru) RecipToSendPrjn(spj emer.Prjn) (emer.Prjn, bool)

RecipToSendPrjn finds the reciprocal projection relative to the given sending projection found within the SendPrjns of this layer. This is then a recv prjn within this layer:

S=A -> R=B recip: R=A <- S=B -- ly = A -- we are the sender of srj and recv of rpj.

returns false if not found.

func (*LayerStru) RecvPrjn

func (ls *LayerStru) RecvPrjn(idx int) emer.Prjn

func (*LayerStru) RecvPrjns

func (ls *LayerStru) RecvPrjns() *emer.Prjns

func (*LayerStru) RelPos

func (ls *LayerStru) RelPos() relpos.Rel

func (*LayerStru) SendPrjn

func (ls *LayerStru) SendPrjn(idx int) emer.Prjn

func (*LayerStru) SendPrjns

func (ls *LayerStru) SendPrjns() *emer.Prjns

func (*LayerStru) SetClass

func (ls *LayerStru) SetClass(cls string)

func (*LayerStru) SetIndex

func (ls *LayerStru) SetIndex(idx int)

func (*LayerStru) SetOff

func (ls *LayerStru) SetOff(off bool)

func (*LayerStru) SetPos

func (ls *LayerStru) SetPos(pos mat32.Vec3)

func (*LayerStru) SetRelPos

func (ls *LayerStru) SetRelPos(rel relpos.Rel)

func (*LayerStru) SetShape

func (ls *LayerStru) SetShape(shape []int)

SetShape sets the layer shape and also uses default dim names

func (*LayerStru) SetThread

func (ls *LayerStru) SetThread(thr int)

func (*LayerStru) SetType

func (ls *LayerStru) SetType(typ emer.LayerType)

func (*LayerStru) Shape

func (ls *LayerStru) Shape() *etensor.Shape

func (*LayerStru) Size

func (ls *LayerStru) Size() mat32.Vec2

func (*LayerStru) Thread

func (ls *LayerStru) Thread() int

func (*LayerStru) Type

func (ls *LayerStru) Type() emer.LayerType

func (*LayerStru) TypeName

func (ls *LayerStru) TypeName() string

type LeabraLayer

type LeabraLayer interface {
	emer.Layer

	// AsLeabra returns this layer as a leabra.Layer -- all derived layers must redefine
	// this to return the base Layer type, so that the LeabraLayer interface does not
	// need to include accessors to all the basic stuff
	AsLeabra() *Layer

	// InitWts initializes the weight values in the network, i.e., resetting learning
	// Also calls InitActs
	InitWts()

	// InitActAvg initializes the running-average activation values that drive learning.
	InitActAvg()

	// InitActs fully initializes activation state -- only called automatically during InitWts
	InitActs()

	// InitWtSym initializes the weight symmetry -- higher layers copy weights from lower layers
	InitWtSym()

	// InitExt initializes external input state -- called prior to apply ext
	InitExt()

	// ApplyExt applies external input in the form of an etensor.Tensor
	// If the layer is a Target or Compare layer type, then it goes in Targ
	// otherwise it goes in Ext.
	ApplyExt(ext etensor.Tensor)

	// ApplyExt1D applies external input in the form of a flat 1-dimensional slice of floats
	// If the layer is a Target or Compare layer type, then it goes in Targ
	// otherwise it goes in Ext
	ApplyExt1D(ext []float64)

	// AlphaCycInit handles all initialization at start of new input pattern, including computing
	// netinput scaling from running average activation etc.
	// should already have presented the external input to the network at this point.
	AlphaCycInit()

	// AvgLFmAvgM updates AvgL long-term running average activation that drives BCM Hebbian learning
	AvgLFmAvgM()

	// GScaleFmAvgAct computes the scaling factor for synaptic conductance input
	// based on sending layer average activation.
	// This attempts to automatically adjust for overall differences in raw activity coming into the units
	// to achieve a general target of around .5 to 1 for the integrated G values.
	GScaleFmAvgAct()

	// GenNoise generates random noise for all neurons
	GenNoise()

	// DecayState decays activation state by given proportion (default is on ly.Act.Init.Decay)
	DecayState(decay float32)

	// HardClamp hard-clamps the activations in the layer -- called during AlphaCycInit
	// for hard-clamped Input layers
	HardClamp()

	// InitGInc initializes synaptic conductance increments -- optional
	InitGInc()

	// SendGDelta sends change in activation since last sent, to increment recv
	// synaptic conductances G, if above thresholds
	SendGDelta(ltime *Time)

	// GFmInc integrates new synaptic conductances from increments sent during last SendGDelta
	GFmInc(ltime *Time)

	// AvgMaxGe computes the average and max Ge stats, used in inhibition
	AvgMaxGe(ltime *Time)

	// InhibFmGeAct computes inhibition Gi from Ge and Act averages within relevant Pools
	InhibFmGeAct(ltime *Time)

	// ActFmG computes rate-code activation from Ge, Gi, Gl conductances
	// and updates learning running-average activations from that Act
	ActFmG(ltime *Time)

	// AvgMaxAct computes the average and max Act stats, used in inhibition
	AvgMaxAct(ltime *Time)

	// QuarterFinal does updating after end of a quarter
	QuarterFinal(ltime *Time)

	// CosDiffFmActs computes the cosine difference in activation state between minus and plus phases.
	// this is also used for modulating the amount of BCM hebbian learning
	CosDiffFmActs()

	// DWt computes the weight change (learning) -- calls DWt method on sending projections
	DWt()

	// WtFmDWt updates the weights from delta-weight changes -- on the sending projections
	WtFmDWt()

	// WtBalFmWt computes the Weight Balance factors based on average recv weights
	WtBalFmWt()
}

LeabraLayer defines the essential algorithmic API for Leabra, at the layer level. These are the methods that the leabra.Network calls on its layers at each step of processing. Other Layer types can selectively re-implement (override) these methods to modify the computation, while inheriting the basic behavior for non-overridden methods.

All of the structural API is in emer.Layer, which this interface also inherits for convenience.

type LeabraPrjn

type LeabraPrjn interface {
	emer.Prjn

	// AsLeabra returns this prjn as a leabra.Prjn -- all derived prjns must redefine
	// this to return the base Prjn type, so that the LeabraPrjn interface does not
	// need to include accessors to all the basic stuff.
	AsLeabra() *Prjn

	// InitWts initializes weight values according to Learn.WtInit params
	InitWts()

	// InitWtSym initializes weight symmetry -- is given the reciprocal projection where
	// the Send and Recv layers are reversed.
	InitWtSym(rpj LeabraPrjn)

	// InitGInc initializes the per-projection synaptic conductance threadsafe increments.
	// This is not typically needed (called during InitWts only) but can be called when needed
	InitGInc()

	// SendGDelta sends the delta-activation from sending neuron index si,
	// to integrate synaptic conductances on receivers
	SendGDelta(si int, delta float32)

	// RecvGInc increments the receiver's synaptic conductances from those of all the projections.
	RecvGInc()

	// DWt computes the weight change (learning) -- on sending projections
	DWt()

	// WtFmDWt updates the synaptic weight values from delta-weight changes -- on sending projections
	WtFmDWt()

	// WtBalFmWt computes the Weight Balance factors based on average recv weights
	WtBalFmWt()
}

LeabraPrjn defines the essential algorithmic API for Leabra, at the projection level. These are the methods that the leabra.Layer calls on its prjns at each step of processing. Other Prjn types can selectively re-implement (override) these methods to modify the computation, while inheriting the basic behavior for non-overridden methods.

All of the structural API is in emer.Prjn, which this interface also inherits for convenience.

type LearnNeurParams

type LearnNeurParams struct {
	ActAvg  LrnActAvgParams `view:"inline" desc:"parameters for computing running average activations that drive learning"`
	AvgL    AvgLParams      `view:"inline" desc:"parameters for computing AvgL long-term running average"`
	CosDiff CosDiffParams   `view:"inline" desc:"parameters for computing cosine diff between minus and plus phase"`
}

leabra.LearnNeurParams manages learning-related parameters at the neuron-level. This is mainly the running average activations that drive learning

func (*LearnNeurParams) AvgLFmAvgM

func (ln *LearnNeurParams) AvgLFmAvgM(nrn *Neuron)

AvgLFmAvgM computes the long-term average activation value, and learning factor, from the current AvgM. Called at start of new alpha-cycle.

func (*LearnNeurParams) AvgsFmAct

func (ln *LearnNeurParams) AvgsFmAct(nrn *Neuron)

AvgsFmAct updates the running averages based on current activation. Computed after new activation for current cycle is updated.

func (*LearnNeurParams) Defaults

func (ln *LearnNeurParams) Defaults()

func (*LearnNeurParams) InitActAvg

func (ln *LearnNeurParams) InitActAvg(nrn *Neuron)

InitActAvg initializes the running-average activation values that drive learning. Called by InitWts (at start of learning).

func (*LearnNeurParams) Update

func (ln *LearnNeurParams) Update()

type LearnSynParams

type LearnSynParams struct {
	Learn    bool            `desc:"enable learning for this projection"`
	Lrate    float32         `desc:"learning rate"`
	WtInit   erand.RndParams `view:"inline" desc:"initial random weight distribution"`
	XCal     XCalParams      `view:"inline" desc:"parameters for the XCal learning rule"`
	WtSig    WtSigParams     `view:"inline" desc:"parameters for the sigmoidal contrast weight enhancement"`
	Norm     DWtNormParams   `view:"inline" desc:"parameters for normalizing weight changes by abs max dwt"`
	Momentum MomentumParams  `view:"inline" desc:"parameters for momentum across weight changes"`
	WtBal    WtBalParams     `view:"inline" desc:"parameters for balancing strength of weight increases vs. decreases"`
}

leabra.LearnSynParams manages learning-related parameters at the synapse-level.

func (*LearnSynParams) CHLdWt

func (ls *LearnSynParams) CHLdWt(suAvgSLrn, suAvgM, ruAvgSLrn, ruAvgM, ruAvgL float32) (err, bcm float32)

CHLdWt returns the error-driven and bcm Hebbian weight change components for the temporally eXtended Contrastive Attractor Learning (XCAL), CHL version

func (*LearnSynParams) Defaults

func (ls *LearnSynParams) Defaults()

func (*LearnSynParams) InitWts

func (ls *LearnSynParams) InitWts(syn *Synapse)

InitWts initializes weight values based on WtInit randomness parameters. It also updates the linear weight value based on the sigmoidal weight value.

func (*LearnSynParams) LWtFmWt

func (ls *LearnSynParams) LWtFmWt(syn *Synapse)

LWtFmWt updates the linear weight value based on the current effective Wt value. The effective weight is sigmoidally contrast-enhanced relative to the linear weight.

func (*LearnSynParams) Update

func (ls *LearnSynParams) Update()

func (*LearnSynParams) WtFmDWt

func (ls *LearnSynParams) WtFmDWt(wbInc, wbDec float32, dwt, wt, lwt *float32)

WtFmDWt updates the synaptic weights from accumulated weight changes. wbInc and wbDec are the weight balance factors, wt is the sigmoidal contrast-enhanced weight, and lwt is the linear weight value.

func (*LearnSynParams) WtFmLWt

func (ls *LearnSynParams) WtFmLWt(syn *Synapse)

WtFmLWt updates the effective weight value based on the current linear Wt value. The effective weight is sigmoidally contrast-enhanced relative to the linear weight.

type LrnActAvgParams

type LrnActAvgParams struct {
	SSTau float32 `` /* 532-byte string literal not displayed */
	STau  float32 `` /* 378-byte string literal not displayed */
	MTau  float32 `` /* 518-byte string literal not displayed */
	LrnM  float32 `` /* 618-byte string literal not displayed */
	Init  float32 `def:"0.15" min:"0" max:"1" desc:"initial value for average"`

	SSDt float32 `view:"-" json:"-" xml:"-" inactive:"+" desc:"rate = 1 / tau"`
	SDt  float32 `view:"-" json:"-" xml:"-" inactive:"+" desc:"rate = 1 / tau"`
	MDt  float32 `view:"-" json:"-" xml:"-" inactive:"+" desc:"rate = 1 / tau"`
	LrnS float32 `view:"-" json:"-" xml:"-" inactive:"+" desc:"1-LrnM"`
}

LrnActAvgParams has rate constants for averaging over activations at different time scales, to produce the running average activation values that then drive learning in the XCAL learning rules

func (*LrnActAvgParams) AvgsFmAct

func (aa *LrnActAvgParams) AvgsFmAct(ruAct float32, avgSS, avgS, avgM, avgSLrn *float32)

AvgsFmAct computes averages based on current act

func (*LrnActAvgParams) Defaults

func (aa *LrnActAvgParams) Defaults()

func (*LrnActAvgParams) Update

func (aa *LrnActAvgParams) Update()

type MomentumParams

type MomentumParams struct {
	On     bool    `def:"true" desc:"whether to use standard simple momentum"`
	MTau   float32 `` /* 189-byte string literal not displayed */
	LrComp float32 `` /* 288-byte string literal not displayed */

	MDt  float32 `inactive:"+" view:"-" json:"-" xml:"-" desc:"rate constant of momentum integration = 1 / m_tau"`
	MDtC float32 `inactive:"+" view:"-" json:"-" xml:"-" desc:"complement rate constant of momentum integration = 1 - (1 / m_tau)"`
}

MomentumParams implements standard simple momentum -- accentuates consistent directions of weight change and cancels out dithering -- biologically captures slower timecourse of longer-term plasticity mechanisms.

func (*MomentumParams) Defaults

func (mp *MomentumParams) Defaults()

func (*MomentumParams) MomentFmDWt

func (mp *MomentumParams) MomentFmDWt(moment *float32, dwt float32) float32

MomentFmDWt updates synaptic moment variable based on dwt weight change value and returns new momentum factor * LrComp

func (*MomentumParams) Update

func (mp *MomentumParams) Update()

type Network

type Network struct {
	NetworkStru
	WtBalInterval int `def:"10" desc:"how frequently to update the weight balance average weight factor -- relatively expensive"`
	WtBalCtr      int `inactive:"+" desc:"counter for how long it has been since last WtBal"`
}

leabra.Network has parameters for running a basic rate-coded Leabra network

func (*Network) ActFmG

func (nt *Network) ActFmG(ltime *Time)

ActFmG computes rate-code activation from Ge, Gi, Gl conductances

func (*Network) AlphaCycInit

func (nt *Network) AlphaCycInit()

AlphaCycInit handles all initialization at start of new input pattern, including computing netinput scaling from running average activation etc.

func (*Network) AvgMaxAct

func (nt *Network) AvgMaxAct(ltime *Time)

AvgMaxAct computes the average and max Act stats, used in inhibition

func (*Network) AvgMaxGe

func (nt *Network) AvgMaxGe(ltime *Time)

AvgMaxGe computes the average and max Ge stats, used in inhibition

func (*Network) Cycle

func (nt *Network) Cycle(ltime *Time)

Cycle runs one cycle of activation updating:

* Sends Ge increments from sending to receiving layers
* Average and Max Ge stats
* Inhibition based on Ge stats and Act stats (computed at end of Cycle)
* Activation from Ge, Gi, and Gl
* Average and Max Act stats

This basic version doesn't use the time info, but more specialized types do, and we want to keep a consistent API for end-user code.

func (*Network) DWt

func (nt *Network) DWt()

DWt computes the weight change (learning) based on current running-average activation values

func (*Network) Defaults

func (nt *Network) Defaults()

Defaults sets all the default parameters for all layers and projections

func (*Network) InhibFmGeAct

func (nt *Network) InhibFmGeAct(ltime *Time)

InhibFmGeAct computes inhibition Gi from Ge and Act stats within relevant Pools

func (*Network) InitActs

func (nt *Network) InitActs()

InitActs fully initializes activation state -- not automatically called

func (*Network) InitExt

func (nt *Network) InitExt()

InitExt initializes external input state -- call prior to applying external inputs to layers

func (*Network) InitWts

func (nt *Network) InitWts()

InitWts initializes synaptic weights and all other associated long-term state variables including running-average state values (e.g., layer running average activations etc)

func (*Network) NewLayer

func (nt *Network) NewLayer() emer.Layer

NewLayer returns new layer of proper type

func (*Network) NewPrjn

func (nt *Network) NewPrjn() emer.Prjn

NewPrjn returns new prjn of proper type

func (*Network) QuarterFinal

func (nt *Network) QuarterFinal(ltime *Time)

QuarterFinal does updating after end of a quarter

func (*Network) SendGDelta

func (nt *Network) SendGDelta(ltime *Time)

SendGDelta sends each neuron's change in activation since the last send, if above threshold, and integrates the sent deltas into GeRaw and time-integrated Ge values

func (*Network) UpdateParams

func (nt *Network) UpdateParams()

UpdateParams updates all the derived parameters if any have changed, for all layers and projections

func (*Network) WtBalFmWt

func (nt *Network) WtBalFmWt()

WtBalFmWt updates the weight balance factors based on average recv weights

func (*Network) WtFmDWt

func (nt *Network) WtFmDWt()

WtFmDWt updates the weights from delta-weight changes. Also calls WtBalFmWt every WtBalInterval times

type NetworkStru

type NetworkStru struct {
	EmerNet emer.Network          `` /* 274-byte string literal not displayed */
	Nm      string                `desc:"overall name of network -- helps discriminate if there are multiple"`
	Layers  emer.Layers           `desc:"list of layers"`
	WtsFile string                `desc:"filename of last weights file loaded or saved"`
	LayMap  map[string]emer.Layer `view:"-" desc:"map of name to layers -- layer names must be unique"`
	MinPos  mat32.Vec3            `view:"-" desc:"minimum display position in network"`
	MaxPos  mat32.Vec3            `view:"-" desc:"maximum display position in network"`

	NThreads int                    `` /* 203-byte string literal not displayed */
	ThrLay   [][]emer.Layer         `` /* 179-byte string literal not displayed */
	ThrChans []LayFunChan           `view:"-" desc:"layer function channels, per thread"`
	ThrTimes []timer.Time           `view:"-" desc:"timers for each thread, so you can see how evenly the workload is being distributed"`
	FunTimes map[string]*timer.Time `view:"-" desc:"timers for each major function (step of processing)"`
	WaitGp   sync.WaitGroup         `view:"-" desc:"network-level wait group for synchronizing threaded layer calls"`
}

leabra.NetworkStru holds the basic structural components of a network (layers)

func (*NetworkStru) AddLayer

func (nt *NetworkStru) AddLayer(name string, shape []int, typ emer.LayerType) emer.Layer

AddLayer adds a new layer with given name and shape to the network. 2D and 4D layer shapes are generally preferred but not essential -- see the AddLayer2D and AddLayer4D convenience methods. 4D layers enable pool (unit-group) level inhibition in Leabra networks, for example. The shape is in row-major format with outer-most dimensions first: e.g., a 4D shape of 3, 2, 4, 5 = 3 rows (Y) of 2 cols (X) of pools, with each unit group having 4 rows (Y) of 5 (X) units.

func (*NetworkStru) AddLayer2D

func (nt *NetworkStru) AddLayer2D(name string, shapeY, shapeX int, typ emer.LayerType) emer.Layer

AddLayer2D adds a new layer with given name and 2D shape to the network. 2D and 4D layer shapes are generally preferred but not essential.

func (*NetworkStru) AddLayer4D

func (nt *NetworkStru) AddLayer4D(name string, nPoolsY, nPoolsX, nNeurY, nNeurX int, typ emer.LayerType) emer.Layer

AddLayer4D adds a new layer with given name and 4D shape to the network. 4D layers enable pool (unit-group) level inhibition in Leabra networks, for example. shape is in row-major format with outer-most dimensions first: e.g., 4D 3, 2, 4, 5 = 3 rows (Y) of 2 cols (X) of pools, with each pool having 4 rows (Y) of 5 (X) neurons.
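
For example, a small network can be assembled with these methods and then built and initialized. This sketch assumes the emer.Input / emer.Hidden / emer.Target layer types, the emer.Forward / emer.Back projection types, and prjn.NewFull() from the separate emer and prjn packages:

// buildNet sketches constructing a simple 3-layer network.
func buildNet() *leabra.Network {
	net := &leabra.Network{}
	net.InitName(net, "Demo") // required: sets the emer.Network self-pointer and name
	inp := net.AddLayer2D("Input", 5, 5, emer.Input)
	hid := net.AddLayer4D("Hidden", 2, 2, 4, 4, emer.Hidden) // 4D shape enables pool-level inhibition
	out := net.AddLayer2D("Output", 5, 5, emer.Target)
	full := prjn.NewFull()
	net.ConnectLayers(inp, hid, full, emer.Forward)
	net.ConnectLayers(hid, out, full, emer.Forward)
	net.ConnectLayers(out, hid, full, emer.Back)
	net.Defaults()
	net.Build()   // allocates neurons, pools, and synapses
	net.InitWts() // random initial weights + running-average state
	return net
}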

func (*NetworkStru) AllParams

func (nt *NetworkStru) AllParams() string

AllParams returns a listing of all parameters in the Network.

func (*NetworkStru) ApplyParams

func (nt *NetworkStru) ApplyParams(pars *params.Sheet, setMsg bool) (bool, error)

ApplyParams applies given parameter style Sheet to layers and prjns in this network. Calls UpdateParams to ensure derived parameters are all updated. If setMsg is true, then a message is printed to confirm each parameter that is set. It always prints a message if a parameter fails to be set. Returns true if any params were set, and an error if there were any errors.
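
A hedged sketch of applying a parameter sheet. This assumes the params.Sel / params.Params types from the separate emer params package, and the parameter paths and values here are illustrative only:

sheet := params.Sheet{
	{Sel: "Layer", Desc: "stronger layer-level inhibition",
		Params: params.Params{"Layer.Inhib.Layer.Gi": "1.8"}},
	{Sel: "Prjn", Desc: "slower learning rate",
		Params: params.Params{"Prjn.Learn.Lrate": "0.02"}},
}
applied, err := nt.ApplyParams(&sheet, true) // setMsg=true prints each parameter that is set
_ = applied
_ = err // handle as appropriate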

func (*NetworkStru) Bounds

func (nt *NetworkStru) Bounds() (min, max mat32.Vec3)

func (*NetworkStru) BoundsUpdt

func (nt *NetworkStru) BoundsUpdt()

BoundsUpdt updates the Min / Max display bounds for 3D display

func (*NetworkStru) Build

func (nt *NetworkStru) Build() error

Build constructs the layer and projection state based on the layer shapes and patterns of interconnectivity

func (*NetworkStru) BuildThreads

func (nt *NetworkStru) BuildThreads()

BuildThreads constructs the layer thread allocation based on Thread setting in the layers

func (*NetworkStru) ConnectLayerNames

func (nt *NetworkStru) ConnectLayerNames(send, recv string, pat prjn.Pattern, typ emer.PrjnType) (rlay, slay emer.Layer, pj emer.Prjn, err error)

ConnectLayerNames establishes a projection between two layers, referenced by name adding to the recv and send projection lists on each side of the connection. Returns error if not successful. Does not yet actually connect the units within the layers -- that requires Build.

func (*NetworkStru) ConnectLayers

func (nt *NetworkStru) ConnectLayers(send, recv emer.Layer, pat prjn.Pattern, typ emer.PrjnType) emer.Prjn

ConnectLayers establishes a projection between two layers, adding to the recv and send projection lists on each side of the connection. Returns nil if not successful. Does not yet actually connect the units within the layers -- that requires Build.

func (*NetworkStru) FunTimerStart

func (nt *NetworkStru) FunTimerStart(fun string)

FunTimerStart starts function timer for given function name -- ensures creation of timer

func (*NetworkStru) FunTimerStop

func (nt *NetworkStru) FunTimerStop(fun string)

FunTimerStop stops function timer -- timer must already exist

func (*NetworkStru) InitName

func (nt *NetworkStru) InitName(net emer.Network, name string)

InitName MUST be called to initialize the network's pointer to itself as an emer.Network which enables the proper interface methods to be called. Also sets the name.

func (*NetworkStru) Label

func (nt *NetworkStru) Label() string

func (*NetworkStru) Layer

func (nt *NetworkStru) Layer(idx int) emer.Layer

func (*NetworkStru) LayerByName

func (nt *NetworkStru) LayerByName(name string) emer.Layer

LayerByName returns a layer by looking it up by name in the layer map (nil if not found). Will create the layer map if it is nil or a different size than the layers slice, but otherwise it needs to be updated manually.

func (*NetworkStru) LayerByNameTry

func (nt *NetworkStru) LayerByNameTry(name string) (emer.Layer, error)

LayerByNameTry returns a layer by looking it up by name -- returns an error (and emits a log message) if the layer is not found

func (*NetworkStru) Layout

func (nt *NetworkStru) Layout()

Layout computes the 3D layout of layers based on their relative position settings

func (*NetworkStru) MakeLayMap

func (nt *NetworkStru) MakeLayMap()

MakeLayMap updates layer map based on current layers

func (*NetworkStru) NLayers

func (nt *NetworkStru) NLayers() int

func (*NetworkStru) Name

func (nt *NetworkStru) Name() string

emer.Network interface methods:

func (*NetworkStru) NonDefaultParams

func (nt *NetworkStru) NonDefaultParams() string

NonDefaultParams returns a listing of all parameters in the Network that are not at their default values -- useful for setting param styles etc.

func (*NetworkStru) OpenWtsJSON

func (nt *NetworkStru) OpenWtsJSON(filename gi.FileName) error

OpenWtsJSON opens network weights (and any other state that adapts with learning) from a JSON-formatted file

func (*NetworkStru) ReadWtsJSON

func (nt *NetworkStru) ReadWtsJSON(r io.Reader) error

ReadWtsJSON reads the weights for this network from the receiver-side perspective in a JSON text format.

func (*NetworkStru) SaveWtsJSON

func (nt *NetworkStru) SaveWtsJSON(filename gi.FileName) error

SaveWtsJSON saves network weights (and any other state that adapts with learning) to a JSON-formatted file

func (*NetworkStru) StartThreads

func (nt *NetworkStru) StartThreads()

StartThreads starts up the computation threads, which monitor the channels for work

func (*NetworkStru) StdVertLayout

func (nt *NetworkStru) StdVertLayout()

StdVertLayout arranges layers in a standard vertical (z axis stack) layout, by setting the Rel settings

func (*NetworkStru) StopThreads

func (nt *NetworkStru) StopThreads()

StopThreads stops the computation threads

func (*NetworkStru) ThrLayFun

func (nt *NetworkStru) ThrLayFun(fun func(ly LeabraLayer), funame string)

ThrLayFun calls the given function on each layer, using threaded (goroutine worker) computation if NThreads > 1, and otherwise just iterates over the layers in the current thread.

func (*NetworkStru) ThrTimerReset

func (nt *NetworkStru) ThrTimerReset()

ThrTimerReset resets the per-thread timers

func (*NetworkStru) ThrWorker

func (nt *NetworkStru) ThrWorker(tt int)

ThrWorker is the worker function run by the worker threads

func (*NetworkStru) TimerReport

func (nt *NetworkStru) TimerReport()

TimerReport reports the amount of time spent in each function, and in each thread

func (*NetworkStru) VarRange

func (nt *NetworkStru) VarRange(varNm string) (min, max float32, err error)

VarRange returns the min / max values for the given variable. TODO: support r. / s. projection values.

func (*NetworkStru) WriteWtsJSON

func (nt *NetworkStru) WriteWtsJSON(w io.Writer)

WriteWtsJSON writes the weights for this network from the receiver-side perspective in a JSON text format. We build in the indentation logic to make it much faster and more efficient.

type NeurFlags

type NeurFlags int32

NeurFlags are bit-flags encoding relevant binary state for neurons

const (
	// NeurOff flag indicates that this neuron has been turned off (i.e., lesioned)
	NeurOff NeurFlags = iota

	// NeurHasExt means the neuron has external input in its Ext field
	NeurHasExt

	// NeurHasTarg means the neuron has external target input in its Targ field
	NeurHasTarg

	// NeurHasCmpr means the neuron has external comparison input in its Targ field -- used for computing
	// comparison statistics but does not drive neural activity ever
	NeurHasCmpr

	NeurFlagsN
)

The neuron flags

func (*NeurFlags) FromString

func (i *NeurFlags) FromString(s string) error

func (NeurFlags) MarshalJSON

func (ev NeurFlags) MarshalJSON() ([]byte, error)

func (NeurFlags) String

func (i NeurFlags) String() string

func (*NeurFlags) UnmarshalJSON

func (ev *NeurFlags) UnmarshalJSON(b []byte) error

type Neuron

type Neuron struct {
	Flags   NeurFlags `desc:"bit flags for binary state variables"`
	SubPool int32     `` /* 184-byte string literal not displayed */
	Act     float32   `desc:"overall rate coded activation value -- what is sent to other neurons -- typically in range 0-1"`
	Ge      float32   `desc:"total excitatory synaptic conductance -- the net excitatory input to the neuron -- does *not* include Gbar.E"`
	Gi      float32   `desc:"total inhibitory synaptic conductance -- the net inhibitory input to the neuron -- does *not* include Gbar.I"`
	Inet    float32   `desc:"net current produced by all channels -- drives update of Vm"`
	Vm      float32   `desc:"membrane potential -- integrates Inet current over time"`

	Targ float32 `desc:"target value: drives learning to produce this activation value"`
	Ext  float32 `desc:"external input: drives activation of unit from outside influences (e.g., sensory input)"`

	AvgSS   float32 `` /* 244-byte string literal not displayed */
	AvgS    float32 `` /* 180-byte string literal not displayed */
	AvgM    float32 `` /* 138-byte string literal not displayed */
	AvgL    float32 `` /* 127-byte string literal not displayed */
	AvgLLrn float32 `` /* 356-byte string literal not displayed */
	AvgSLrn float32 `` /* 414-byte string literal not displayed */

	ActM   float32 `` /* 130-byte string literal not displayed */
	ActP   float32 `desc:"records the traditional posterior-cortical plus_phase activation, as activation at end of current alpha cycle"`
	ActDif float32 `` /* 164-byte string literal not displayed */
	ActDel float32 `desc:"delta activation: change in Act from one cycle to next -- can be useful to track where changes are taking place"`
	ActAvg float32 `` /* 216-byte string literal not displayed */

	Noise  float32 `desc:"noise value added to unit (ActNoiseParams determines distribution, and when / where it is added)"`
	GiSyn  float32 `` /* 168-byte string literal not displayed */
	GiSelf float32 `desc:"total amount of self-inhibition -- time-integrated to avoid oscillations"`

	ActSent float32 `desc:"last activation value sent (only send when diff is over threshold)"`
	GeRaw   float32 `desc:"raw excitatory conductance (net input) received from sending units (send delta's are added to this value)"`
	GeInc   float32 `desc:"delta increment in GeRaw sent using SendGeDelta"`
	GiRaw   float32 `desc:"raw inhibitory conductance (net input) received from sending units (send delta's are added to this value)"`
	GiInc   float32 `desc:"delta increment in GiRaw sent using SendGeDelta"`
}

leabra.Neuron holds all of the neuron (unit) level variables -- this is the most basic version with rate-code only and no optional features at all. All variables accessible via Unit interface must be float32 and start at the top, in contiguous order

func (*Neuron) ClearFlag

func (nrn *Neuron) ClearFlag(flag NeurFlags)

func (*Neuron) ClearMask

func (nrn *Neuron) ClearMask(mask int32)

func (*Neuron) HasFlag

func (nrn *Neuron) HasFlag(flag NeurFlags) bool

func (*Neuron) IsOff

func (nrn *Neuron) IsOff() bool

IsOff returns true if the neuron has been turned off (lesioned)

func (*Neuron) SetFlag

func (nrn *Neuron) SetFlag(flag NeurFlags)

func (*Neuron) SetMask

func (nrn *Neuron) SetMask(mask int32)

func (*Neuron) VarByIndex

func (nrn *Neuron) VarByIndex(idx int) float32

VarByIndex returns variable using index (0 = first variable in NeuronVars list)

func (*Neuron) VarByName

func (nrn *Neuron) VarByName(varNm string) (float32, error)

VarByName returns variable by name, or error

func (*Neuron) VarNames

func (nrn *Neuron) VarNames() []string

type OptThreshParams

type OptThreshParams struct {
	Send  float32 `def:"0.1" desc:"don't send activation when act <= send -- greatly speeds processing"`
	Delta float32 `` /* 129-byte string literal not displayed */
}

OptThreshParams provides optimization thresholds for faster processing

func (*OptThreshParams) Defaults

func (ot *OptThreshParams) Defaults()

func (*OptThreshParams) Update

func (ot *OptThreshParams) Update()

type Pool

type Pool struct {
	StIdx, EdIdx int             `desc:"starting and ending (exclusive) indexes for the list of neurons in this pool"`
	Inhib        FFFBInhib       `desc:"FFFB inhibition computed values"`
	Ge           minmax.AvgMax32 `desc:"average and max Ge excitatory conductance values, which drive FF inhibition"`
	Act          minmax.AvgMax32 `desc:"average and max Act activation values, which drive FB inhibition"`
	ActM         minmax.AvgMax32 `desc:"minus phase average and max Act activation values, for ActAvg updt"`
	ActP         minmax.AvgMax32 `desc:"plus phase average and max Act activation values, for ActAvg updt"`
	ActAvg       ActAvg          `desc:"running-average activation levels used for netinput scaling and adaptive inhibition"`
}

Pool contains computed values for FFFB inhibition, and various other state values for layers and pools (unit groups) that can be subject to inhibition, including:
* average / max stats on Ge and Act that drive inhibition
* average activity overall that is used for normalizing netin (at layer level)

func (*Pool) Init

func (pl *Pool) Init()

type Prjn

type Prjn struct {
	PrjnStru
	WtScale WtScaleParams  `desc:"weight scaling parameters: modulates overall strength of projection, using both absolute and relative factors"`
	Learn   LearnSynParams `desc:"synaptic-level learning parameters"`
	Syns    []Synapse      `desc:"synaptic state values, ordered by the sending layer units which owns them -- one-to-one with SConIdx array"`

	// misc state variables below:
	GScale float32         `` /* 145-byte string literal not displayed */
	GInc   []float32       `` /* 178-byte string literal not displayed */
	WbRecv []WtBalRecvPrjn `desc:"weight balance state variables for this projection, one per recv neuron"`
}

leabra.Prjn is a basic Leabra projection with synaptic learning parameters

func (*Prjn) AllParams

func (pj *Prjn) AllParams() string

AllParams returns a listing of all parameters in the Prjn

func (*Prjn) AsLeabra

func (pj *Prjn) AsLeabra() *Prjn

AsLeabra returns this prjn as a leabra.Prjn -- all derived prjns must redefine this to return the base Prjn type, so that the LeabraPrjn interface does not need to include accessors to all the basic stuff.

func (*Prjn) Build

func (pj *Prjn) Build() error

Build constructs the full connectivity among the layers as specified in this projection. Calls PrjnStru.BuildStru and then allocates the synaptic values in Syns accordingly.

func (*Prjn) DWt

func (pj *Prjn) DWt()

DWt computes the weight change (learning) -- on sending projections

func (*Prjn) Defaults

func (pj *Prjn) Defaults()

func (*Prjn) InitGInc

func (pj *Prjn) InitGInc()

InitGInc initializes the per-projection GInc threadsafe increment -- not typically needed (called during InitWts only) but can be called when needed

func (*Prjn) InitWtSym

func (pj *Prjn) InitWtSym(rpjp LeabraPrjn)

InitWtSym initializes weight symmetry -- is given the reciprocal projection where the Send and Recv layers are reversed.

func (*Prjn) InitWts

func (pj *Prjn) InitWts()

InitWts initializes weight values according to Learn.WtInit params

func (*Prjn) ReadWtsJSON

func (pj *Prjn) ReadWtsJSON(r io.Reader) error

ReadWtsJSON reads the weights for this projection from the receiver-side perspective in a JSON text format.

func (*Prjn) RecvGInc

func (pj *Prjn) RecvGInc()

RecvGInc increments the receiver's GeInc or GiInc from that of all the projections.

func (*Prjn) SendGDelta

func (pj *Prjn) SendGDelta(si int, delta float32)

SendGDelta sends the delta-activation from sending neuron index si, to integrate synaptic conductances on receivers

func (*Prjn) SetSynVal

func (pj *Prjn) SetSynVal(varnm string, sidx, ridx int, val float32) error

SetSynVal sets value of given variable name on the synapse between given send, recv unit indexes (1D, flat indexes) returns error for access errors.

func (*Prjn) SynVal

func (pj *Prjn) SynVal(varnm string, sidx, ridx int) float32

SynVal returns the value of the given variable name on the synapse between the given send, recv unit indexes (1D, flat indexes). See SynValTry for a version that reports access errors.

func (*Prjn) SynValTry

func (pj *Prjn) SynValTry(varnm string, sidx, ridx int) (float32, error)

SynValTry returns value of given variable name on the synapse between given send, recv unit indexes (1D, flat indexes) returns error for access errors.
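
For example, to inspect and adjust a single synapse between flat (1D) unit indexes, using one of the Synapse variable names (e.g., "Wt") with the accessors above:

// read the contrast-enhanced weight from sending unit 3 to receiving unit 7
wt, err := pj.SynValTry("Wt", 3, 7)
if err == nil {
	pj.SetSynVal("Wt", 3, 7, wt+0.05) // nudge it and write it back
}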

func (*Prjn) SynVals

func (pj *Prjn) SynVals(varnm string) []float32

SynVals returns values of given variable name on synapses for each synapse in the projection using the natural ordering of the synapses (sender based for Leabra)

func (*Prjn) SynValsTry

func (pj *Prjn) SynValsTry(varnm string) ([]float32, error)

SynValsTry returns values of given variable name on synapses for each synapse in the projection using the natural ordering of the synapses (sender based for Leabra)

func (*Prjn) SynVarNames

func (pj *Prjn) SynVarNames() []string

func (*Prjn) UpdateParams

func (pj *Prjn) UpdateParams()

UpdateParams updates all params given any changes that might have been made to individual values

func (*Prjn) WriteWtsJSON

func (pj *Prjn) WriteWtsJSON(w io.Writer, depth int)

WriteWtsJSON writes the weights from this projection from the receiver-side perspective in a JSON text format. We build in the indentation logic to make it much faster and more efficient.

func (*Prjn) WtBalFmWt

func (pj *Prjn) WtBalFmWt()

WtBalFmWt computes the Weight Balance factors based on average recv weights

func (*Prjn) WtFmDWt

func (pj *Prjn) WtFmDWt()

WtFmDWt updates the synaptic weight values from delta-weight changes -- on sending projections

type PrjnStru

type PrjnStru struct {
	LeabraPrj   LeabraPrjn      `` /* 269-byte string literal not displayed */
	Off         bool            `desc:"inactivate this projection -- allows for easy experimentation"`
	Cls         string          `desc:"Class is for applying parameter styles, can be space separated multiple tags"`
	Notes       string          `desc:"can record notes about this projection here"`
	Recv        emer.Layer      `` /* 169-byte string literal not displayed */
	Send        emer.Layer      `desc:"sending layer for this projection"`
	Pat         prjn.Pattern    `desc:"pattern of connectivity"`
	Typ         emer.PrjnType   `` /* 154-byte string literal not displayed */
	RConN       []int32         `view:"-" desc:"number of recv connections for each neuron in the receiving layer, as a flat list"`
	RConNAvgMax minmax.AvgMax32 `inactive:"+" desc:"average and maximum number of recv connections in the receiving layer"`
	RConIdxSt   []int32         `view:"-" desc:"starting index into ConIdx list for each neuron in receiving layer -- just a list incremented by ConN"`
	RConIdx     []int32         `` /* 213-byte string literal not displayed */
	RSynIdx     []int32         `` /* 185-byte string literal not displayed */
	SConN       []int32         `view:"-" desc:"number of sending connections for each neuron in the sending layer, as a flat list"`
	SConNAvgMax minmax.AvgMax32 `inactive:"+" desc:"average and maximum number of sending connections in the sending layer"`
	SConIdxSt   []int32         `view:"-" desc:"starting index into ConIdx list for each neuron in sending layer -- just a list incremented by ConN"`
	SConIdx     []int32         `` /* 213-byte string literal not displayed */
}

PrjnStru contains the basic structural information for specifying a projection of synaptic connections between two layers, and maintaining all the synaptic connection-level data. The exact same struct object is added to the Recv and Send layers, and it manages everything about the connectivity, and methods on the Prjn handle all the relevant computation.

func (*PrjnStru) ApplyParams

func (ps *PrjnStru) ApplyParams(pars *params.Sheet, setMsg bool) (bool, error)

ApplyParams applies given parameter style Sheet to this projection. Calls UpdateParams if anything was set, to ensure derived parameters are all updated. If setMsg is true, then a message is printed to confirm each parameter that is set. It always prints a message if a parameter fails to be set. Returns true if any params were set, and an error if there were any errors.

func (*PrjnStru) BuildStru

func (ps *PrjnStru) BuildStru() error

BuildStru constructs the full connectivity among the layers as specified in this projection. Calls Validate and returns an error if invalid. Pat.Connect is called to get the pattern of the connection. Then the connection indexes are configured according to that pattern.

func (*PrjnStru) Class

func (ps *PrjnStru) Class() string

func (*PrjnStru) Connect

func (ps *PrjnStru) Connect(slay, rlay emer.Layer, pat prjn.Pattern, typ emer.PrjnType)

Connect sets the connectivity between two layers and the pattern to use in interconnecting them

func (*PrjnStru) Init

func (ps *PrjnStru) Init(prjn emer.Prjn)

Init MUST be called to initialize the prjn's pointer to itself as an emer.Prjn which enables the proper interface methods to be called.

func (*PrjnStru) IsOff

func (ps *PrjnStru) IsOff() bool

func (*PrjnStru) Label

func (ps *PrjnStru) Label() string

func (*PrjnStru) Name

func (ps *PrjnStru) Name() string

func (*PrjnStru) NonDefaultParams

func (ps *PrjnStru) NonDefaultParams() string

NonDefaultParams returns a listing of all parameters in the projection that are not at their default values -- useful for setting param styles etc.

func (*PrjnStru) Pattern

func (ps *PrjnStru) Pattern() prjn.Pattern

func (*PrjnStru) RecvLay

func (ps *PrjnStru) RecvLay() emer.Layer

func (*PrjnStru) SendLay

func (ps *PrjnStru) SendLay() emer.Layer

func (*PrjnStru) SetClass

func (ps *PrjnStru) SetClass(cls string)

func (*PrjnStru) SetNIdxSt

func (ps *PrjnStru) SetNIdxSt(n *[]int32, avgmax *minmax.AvgMax32, idxst *[]int32, tn *etensor.Int32) int32

SetNIdxSt sets the *ConN and *ConIdxSt values given n tensor from Pat. Returns total number of connections for this direction.

func (*PrjnStru) SetOff

func (ps *PrjnStru) SetOff(off bool)

func (*PrjnStru) SetType

func (ps *PrjnStru) SetType(typ emer.PrjnType)

func (*PrjnStru) String

func (ps *PrjnStru) String() string

String satisfies fmt.Stringer for prjn

func (*PrjnStru) Type

func (ps *PrjnStru) Type() emer.PrjnType

func (*PrjnStru) TypeName

func (ps *PrjnStru) TypeName() string

func (*PrjnStru) Validate

func (ps *PrjnStru) Validate(logmsg bool) error

Validate tests for non-nil settings for the projection -- returns error message or nil if no problems (and logs them if logmsg = true)

type Quarters

type Quarters int32

Quarters are the different alpha trial quarters, as a bitflag, for use in relevant timing parameters where quarters need to be specified

const (
	// Q1 is the first quarter, which, due to 0-based indexing, shows up as Quarter = 0 in timer
	Q1 Quarters = iota
	Q2
	Q3
	Q4
	QuartersN
)

The quarters

func (*Quarters) FromString

func (i *Quarters) FromString(s string) error

func (Quarters) MarshalJSON

func (ev Quarters) MarshalJSON() ([]byte, error)

func (Quarters) String

func (i Quarters) String() string

func (*Quarters) UnmarshalJSON

func (ev *Quarters) UnmarshalJSON(b []byte) error

type SelfInhibParams

type SelfInhibParams struct {
	On  bool    `desc:"enable neuron self-inhibition"`
	Gi  float32 `` /* 247-byte string literal not displayed */
	Tau float32 `` /* 373-byte string literal not displayed */
	Dt  float32 `inactive:"+" view:"-" json:"-" xml:"-" desc:"rate = 1 / tau"`
}

SelfInhibParams defines parameters for Neuron self-inhibition -- activation of the neuron directly feeds back to produce a proportional additional contribution to Gi

func (*SelfInhibParams) Defaults

func (si *SelfInhibParams) Defaults()

func (*SelfInhibParams) Inhib

func (si *SelfInhibParams) Inhib(self *float32, act float32)

Inhib updates the self inhibition value based on current unit activation

func (*SelfInhibParams) Update

func (si *SelfInhibParams) Update()

type Synapse

type Synapse struct {
	Wt     float32 `desc:"synaptic weight value -- sigmoid contrast-enhanced"`
	LWt    float32 `` /* 216-byte string literal not displayed */
	DWt    float32 `desc:"change in synaptic weight, from learning"`
	Norm   float32 `` /* 162-byte string literal not displayed */
	Moment float32 `` /* 148-byte string literal not displayed */
}

leabra.Synapse holds state for the synaptic connection between neurons

func (*Synapse) SetVarByName

func (sy *Synapse) SetVarByName(varNm string, val float64) bool

func (*Synapse) VarByName

func (sy *Synapse) VarByName(varNm string) (float32, bool)

func (*Synapse) VarNames

func (sy *Synapse) VarNames() []string

type Time

type Time struct {
	Time      float32 `desc:"accumulated amount of time the network has been running, in simulation-time (not real world time), in seconds"`
	Cycle     int     `` /* 217-byte string literal not displayed */
	CycleTot  int     `` /* 151-byte string literal not displayed */
	Quarter   int     `` /* 224-byte string literal not displayed */
	PlusPhase bool    `desc:"true if this is the plus phase (final quarter = 3) -- else minus phase"`

	TimePerCyc float32 `def:"0.001" desc:"amount of time to increment per cycle"`
	CycPerQtr  int     `def:"25" desc:"number of cycles per quarter to run -- 25 = standard 100 msec alpha-cycle"`
}

leabra.Time contains all the timing state and parameter information for running a model

func NewTime

func NewTime() *Time

NewTime returns a new Time struct with default parameters

func (*Time) AlphaCycStart

func (tm *Time) AlphaCycStart()

AlphaCycStart starts a new alpha-cycle (set of 4 quarters)

func (*Time) CycleInc

func (tm *Time) CycleInc()

CycleInc increments at the cycle level

func (*Time) Defaults

func (tm *Time) Defaults()

Defaults sets default values

func (*Time) QuarterInc

func (tm *Time) QuarterInc()

QuarterInc increments at the quarter level, updating Quarter and PlusPhase

func (*Time) Reset

func (tm *Time) Reset()

Reset resets the counters all back to zero

type TimeScales

type TimeScales int32

TimeScales are the different time scales associated with overall simulation running, and can be used to parameterize the updating and control flow of simulations at different scales. The definitions become increasingly subjective and imprecise as the time scales increase. This is not used directly in the algorithm code -- all control is the responsibility of the end simulation. This list is designed to standardize terminology across simulations and establish a common conceptual framework for time -- it can easily be extended in specific simulations to add needed additional levels, although using one of the existing standard values is recommended wherever possible.

const (
	// Cycle is the finest time scale -- typically 1 msec -- a single activation update.
	Cycle TimeScales = iota

	// FastSpike is typically 10 cycles = 10 msec (100hz) = the fastest spiking time
	// generally observed in the brain.  This can be useful for visualizing updates
	// at a granularity in between Cycle and Quarter.
	FastSpike

	// Quarter is typically 25 cycles = 25 msec (40hz) = 1/4 of the 100 msec alpha trial
	// This is also the GammaCycle (gamma = 40hz), but we use Quarter functionally
	// by virtue of there being 4 per AlphaCycle.
	Quarter

	// Phase is either Minus or Plus phase -- Minus = first 3 quarters, Plus = last quarter
	Phase

	// BetaCycle is typically 50 cycles = 50 msec (20 hz) = one beta-frequency cycle.
	// Gating in the basal ganglia and associated updating in prefrontal cortex
	// occurs at this frequency.
	BetaCycle

	// AlphaCycle is typically 100 cycles = 100 msec (10 hz) = one alpha-frequency cycle,
	// which is the fundamental unit of learning in posterior cortex.
	AlphaCycle

	// ThetaCycle is typically 200 cycles = 200 msec (5 hz) = two alpha-frequency cycles.
	// This is the modal duration of a saccade, the update frequency of medial temporal lobe
	// episodic memory, and the minimal predictive learning cycle (perceive an Alpha 1, predict on 2).
	ThetaCycle

	// Event is the smallest unit of naturalistic experience that coheres unto itself
	// (e.g., something that could be described in a sentence).
	// Typically this is on the time scale of a few seconds: e.g., reaching for
	// something, catching a ball.
	Event

	// Trial is one unit of behavior in an experiment -- it is typically environmentally
	// defined instead of endogenously defined in terms of basic brain rhythms.
	// In the minimal case it could be one AlphaCycle, but could be multiple, and
	// could encompass multiple Events (e.g., one event is fixation, next is stimulus,
	// last is response)
	Trial

	// Sequence is a sequential group of Trials (not always needed).
	Sequence

	// Block is a collection of Trials, Sequences or Events, often used in experiments
	// when conditions are varied across blocks.
	Block

	// Epoch is used in two different contexts.  In machine learning, it represents a
	// collection of Trials, Sequences or Events that constitute a "representative sample"
	// of the environment.  In the simplest case, it is the entire collection of Trials
	// used for training.  In electrophysiology, it is a timing window used for organizing
	// the analysis of electrode data.
	Epoch

	// Run is a complete run of a model / subject, from training to testing, etc.
	// Often multiple runs are done in an Expt to obtain statistics over initial
	// random weights etc.
	Run

	// Expt is an entire experiment -- multiple Runs through a given protocol / set of
	// parameters.
	Expt

	// Scene is a sequence of events that constitutes the next larger-scale coherent unit
	// of naturalistic experience corresponding e.g., to a scene in a movie.
	// Typically consists of events that all take place in one location over
	// e.g., a minute or so. This could be a paragraph or a page or so in a book.
	Scene

	// Episode is a sequence of scenes that constitutes the next larger-scale unit
	// of naturalistic experience e.g., going to the grocery store or eating at a
	// restaurant, attending a wedding or other "event".
	// This could be a chapter in a book.
	Episode

	TimeScalesN
)

The time scales

func (*TimeScales) FromString

func (i *TimeScales) FromString(s string) error

func (TimeScales) MarshalJSON

func (ev TimeScales) MarshalJSON() ([]byte, error)

func (TimeScales) String

func (i TimeScales) String() string

func (*TimeScales) UnmarshalJSON

func (ev *TimeScales) UnmarshalJSON(b []byte) error

type WtBalParams

type WtBalParams struct {
	On     bool    `` /* 561-byte string literal not displayed */
	AvgThr float32 `` /* 351-byte string literal not displayed */
	HiThr  float32 `` /* 146-byte string literal not displayed */
	HiGain float32 `` /* 188-byte string literal not displayed */
	LoThr  float32 `` /* 145-byte string literal not displayed */
	LoGain float32 `` /* 273-byte string literal not displayed */
}

WtBalParams are weight balance soft renormalization params: maintains overall weight balance by progressively penalizing weight increases as a function of how strong the weights are overall (subject to thresholding) and long time-averaged activation. Plugs into soft bounding function.

func (*WtBalParams) Defaults

func (wb *WtBalParams) Defaults()

func (*WtBalParams) Update

func (wb *WtBalParams) Update()

func (*WtBalParams) WtBal

func (wb *WtBalParams) WtBal(wbAvg float32) (fact, inc, dec float32)

WtBal computes weight balance factors for increase and decrease based on extent to which weights and average act exceed thresholds

type WtBalRecvPrjn

type WtBalRecvPrjn struct {
	Avg  float32 `desc:"average of effective weight values that exceed WtBal.AvgThr across given Recv Neuron's connections for given Prjn"`
	Fact float32 `` /* 154-byte string literal not displayed */
	Inc  float32 `desc:"weight balance increment factor -- extra multiplier to add to weight increases to maintain overall weight balance"`
	Dec  float32 `desc:"weight balance decrement factor -- extra multiplier to add to weight decreases to maintain overall weight balance"`
}

WtBalRecvPrjn are state variables used in computing the WtBal weight balance function. There is one of these for each Recv Neuron participating in the projection.

func (*WtBalRecvPrjn) Init

func (wb *WtBalRecvPrjn) Init()

type WtScaleParams

type WtScaleParams struct {
	Abs float32 `def:"1" min:"0" desc:"absolute scaling, which is not subject to normalization: directly multiplies weight values"`
	Rel float32 `` /* 169-byte string literal not displayed */
}

WtScaleParams are weight scaling parameters: modulates overall strength of projection, using both absolute and relative factors

func (*WtScaleParams) Defaults

func (ws *WtScaleParams) Defaults()

func (*WtScaleParams) FullScale

func (ws *WtScaleParams) FullScale(savg, snu, ncon float32) float32

FullScale returns full scaling factor, which is product of Abs * Rel * SLayActScale

func (*WtScaleParams) SLayActScale

func (ws *WtScaleParams) SLayActScale(savg, snu, ncon float32) float32

SLayActScale computes the scaling factor based on the sending layer activity level (savg), number of units in the sending layer (snu), and number of recv connections (ncon). Uses a fixed sem_extra standard-error-of-the-mean (SEM) value of 2, added to the average expected number of active connections to receive, for purposes of computing scaling factors with partial connectivity. For 25% layer activity, the binomial SEM = sqrt(p(1-p)) = 0.43, so 3x ≈ 1.3, making 2 a reasonable default.
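
A sketch of the logic described above: the scale is one over the expected number of active inputs, where that expectation is the average number of active connections padded by 2 SEM and capped by what is actually possible. This illustrates the described computation and is not the package's verbatim code:

// sLayActScale sketches sending-layer activity scaling: savg = sending layer
// average activity, snu = number of sending units, ncon = recv connections.
func sLayActScale(savg, snu, ncon float32) float32 {
	const semExtra = 2                       // fixed SEM padding described above
	slayActN := float32(int(savg*snu + 0.5)) // expected number of active sending units
	if slayActN < 1 {
		slayActN = 1
	}
	if ncon >= snu { // full connectivity: every active sender is an input
		return 1 / slayActN
	}
	expActN := float32(int(savg*ncon+0.5)) + semExtra // expected active inputs + 2 SEM
	if expActN > ncon {
		expActN = ncon
	}
	if expActN > slayActN {
		expActN = slayActN
	}
	if expActN < 1 {
		expActN = 1
	}
	return 1 / expActN
}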

func (*WtScaleParams) Update

func (ws *WtScaleParams) Update()

type WtSigParams

type WtSigParams struct {
	Gain      float32 `def:"1,6" min:"0" desc:"gain (contrast, sharpness) of the weight contrast function (1 = linear)"`
	Off       float32 `def:"1" min:"0" desc:"offset of the function (1=centered at .5, >1=higher, <1=lower) -- 1 is standard for XCAL"`
	SoftBound bool    `def:"true" desc:"apply exponential soft bounding to the weight changes"`
}

WtSigParams are sigmoidal weight contrast enhancement function parameters

func (*WtSigParams) Defaults

func (ws *WtSigParams) Defaults()

func (*WtSigParams) LinFmSigWt

func (ws *WtSigParams) LinFmSigWt(sw float32) float32

LinFmSigWt returns linear weight from sigmoidal contrast-enhanced weight

func (*WtSigParams) SigFmLinWt

func (ws *WtSigParams) SigFmLinWt(lw float32) float32

SigFmLinWt returns sigmoidal contrast-enhanced weight from linear weight
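
The sigmoidal contrast-enhancement commonly used in Leabra has the form sig(w) = 1 / (1 + (Off * (1-w) / w) ^ Gain), and LinFmSigWt is its algebraic inverse. A sketch under that assumption (using math.Pow from the standard library):

// sigFmLinWt sketches contrast enhancement of a linear weight lw in (0, 1).
func sigFmLinWt(lw, gain, off float32) float32 {
	return 1 / (1 + float32(math.Pow(float64(off*(1-lw)/lw), float64(gain))))
}

// linFmSigWt sketches the inverse: recover the linear weight from the enhanced sw.
func linFmSigWt(sw, gain, off float32) float32 {
	return 1 / (1 + float32(math.Pow(float64((1-sw)/sw), float64(1/gain)))/off)
}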

func (*WtSigParams) Update

func (ws *WtSigParams) Update()

type XCalParams

type XCalParams struct {
	MLrn    float32 `` /* 316-byte string literal not displayed */
	SetLLrn bool    `` /* 459-byte string literal not displayed */
	LLrn    float32 `` /* 279-byte string literal not displayed */
	DRev    float32 `` /* 270-byte string literal not displayed */
	DThr    float32 `` /* 139-byte string literal not displayed */
	LrnThr  float32 `` /* 338-byte string literal not displayed */

	DRevRatio float32 `` /* 131-byte string literal not displayed */
}

XCalParams are parameters for the temporally eXtended Contrastive Attractor Learning function (XCAL), which is the standard learning equation for Leabra.

func (*XCalParams) DWt

func (xc *XCalParams) DWt(srval, thrP float32) float32

DWt is the XCAL function for weight change -- the "check mark" function -- no DGain, no ThrPMin
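
The "check mark" referred to here is the standard XCAL piecewise-linear function: zero below DThr, a negative (LTD-like) segment below the reversal point DRev * thrP, and the linear difference above it. A sketch under that reading:

// xcalDWt sketches the XCAL check-mark function: srval is the short-term
// sender*receiver activation product, thrP the floating plus-phase threshold.
func xcalDWt(srval, thrP, dRev, dThr float32) float32 {
	if srval < dThr {
		return 0 // below the learning threshold: no change
	}
	if srval > thrP*dRev {
		return srval - thrP // above the reversal point: signed error-driven change
	}
	return -srval * (1 - dRev) / dRev // below reversal: weight decrease
}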

func (*XCalParams) Defaults

func (xc *XCalParams) Defaults()

func (*XCalParams) LongLrate

func (xc *XCalParams) LongLrate(avgLLrn float32) float32

LongLrate returns the learning rate for long-term floating average component (BCM)

func (*XCalParams) Update

func (xc *XCalParams) Update()

type XX1Params

type XX1Params struct {
	Thr          float32 `` /* 152-byte string literal not displayed */
	Gain         float32 `` /* 305-byte string literal not displayed */
	NVar         float32 `` /* 372-byte string literal not displayed */
	VmActThr     float32 `` /* 182-byte string literal not displayed */
	SigMult      float32 `def:"0.33" view:"-" json:"-" xml:"-" desc:"multiplier on sigmoid used for computing values for net < thr"`
	SigMultPow   float32 `def:"0.8" view:"-" json:"-" xml:"-" desc:"power for computing sig_mult_eff as function of gain * nvar"`
	SigGain      float32 `def:"3" view:"-" json:"-" xml:"-" desc:"gain multiplier on (net - thr) for sigmoid used for computing values for net < thr"`
	InterpRange  float32 `def:"0.01" view:"-" json:"-" xml:"-" desc:"interpolation range above zero to use interpolation"`
	GainCorRange float32 `` /* 130-byte string literal not displayed */
	GainCor      float32 `def:"0.1" view:"-" json:"-" xml:"-" desc:"gain correction multiplier -- how much to correct gains"`

	SigGainNVar float32 `view:"-" json:"-" xml:"-" desc:"sig_gain / nvar"`
	SigMultEff  float32 `` /* 145-byte string literal not displayed */
	SigValAt0   float32 `view:"-" json:"-" xml:"-" desc:"0.5 * sig_mult_eff -- used for interpolation portion"`
	InterpVal   float32 `view:"-" json:"-" xml:"-" desc:"function value at interp_range - sig_val_at_0 -- for interpolation"`
}

XX1Params are the X/(X+1) rate-coded activation function parameters for leabra using the GeLin (g_e linear) rate coded activation function

func (*XX1Params) Defaults

func (xp *XX1Params) Defaults()

func (*XX1Params) NoisyXX1

func (xp *XX1Params) NoisyXX1(x float32) float32

NoisyXX1 computes the Noisy x/(x+1) function -- directly computes close approximation to x/(x+1) convolved with a gaussian noise function with variance nvar. No need for a lookup table -- very reasonable approximation for standard range of parameters (nvar = .01 or less -- higher values of nvar are less accurate with large gains, but ok for lower gains)

func (*XX1Params) NoisyXX1Gain

func (xp *XX1Params) NoisyXX1Gain(x, gain float32) float32

NoisyXX1Gain computes the noisy x/(x+1) function -- directly computes close approximation to x/(x+1) convolved with a gaussian noise function with variance nvar. No need for a lookup table -- very reasonable approximation for standard range of parameters (nvar = .01 or less -- higher values of nvar are less accurate with large gains, but ok for lower gains). Using external gain factor.

func (*XX1Params) Update

func (xp *XX1Params) Update()

func (*XX1Params) XX1

func (xp *XX1Params) XX1(x float32) float32

XX1 computes the basic x/(x+1) function
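
The underlying rate-code function is simply x / (x + 1), typically applied to the gain-scaled amount by which the excitatory conductance exceeds threshold (where exactly Gain is applied is an assumption here):

// xx1 is the basic saturating rate-code function described above.
func xx1(x float32) float32 { return x / (x + 1) }

// e.g., act := xx1(gain * (ge - thr)) // assumed usage with gain-scaled excess Ge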

func (*XX1Params) XX1GainCor

func (xp *XX1Params) XX1GainCor(x float32) float32

XX1GainCor computes x/(x+1) with gain correction within GainCorRange to compensate for convolution effects

func (*XX1Params) XX1GainCorGain

func (xp *XX1Params) XX1GainCorGain(x, gain float32) float32

XX1GainCorGain computes x/(x+1) with gain correction within GainCorRange to compensate for convolution effects -- using external gain factor
