package deep

v0.5.0
Published: Jun 12, 2019 License: BSD-3-Clause Imports: 8 Imported by: 0

Documentation

Overview

Package deep provides the DeepLeabra variant of Leabra, which performs predictive learning by attempting to predict the activation states over the Pulvinar nucleus of the thalamus (in posterior sensory cortex). These states are driven phasically every 100 msec by deep layer 5 intrinsic bursting (5IB) neurons, which have strong, focal (essentially 1-to-1) connections onto the Pulvinar Thalamic Relay Cell (TRC) neurons.

This package allows you to specify layer types as Super, Deep, and TRC, each of which drives the specific forms of computation associated with that layer type.

DeepLeabra captures both the predictive learning and attentional modulation functions of the deep layer and thalamocortical circuitry.

* Super layer neurons reflect the superficial layers of the neocortex, but they are also the basis for directly computing the DeepBurst activation signal that reflects deep layer 5 IB bursting activation, via thresholding of the superficial layer activations (bursting is thought to have a higher threshold).

* The alpha-cycle quarter(s) when DeepBurst is updated and broadcast are set in DeepBurstParams.BurstQtr (default Q4; can also be, e.g., Q2 and Q4 for beta-frequency updating). During these quarters, the DeepBurst from Super layers is continuously sent via BurstTRC projections to TRC layers (using efficient delta-based computation) to drive plus-phase outcome states in those layers. At the end of the burst quarter(s), BurstCtxt projections convey the DeepBurst signal to Deep layer neurons, where it is integrated into the DeepCtxt value representing the temporally-delayed context information. Note: Deep layers also compute a DeepBurst value themselves, which can be sent via self projections to reflect the extensive deep-to-deep lateral connectivity that provides more extensive temporal context information.

* Deep layer neurons reflect the layer 6 regular spiking CT corticothalamic neurons that project into the thalamus, and back up to all the other laminae within a microcolumn, where they drive a multiplicative attentional modulation signal. These neurons receive the DeepBurst activation via a BurstCtxt projection type, typically once every 100 msec, and integrate it into the DeepCtxt value, which is added to other excitatory conductance inputs to drive the overall activation (Act) of these neurons. Due to the bursting nature of the DeepBurst inputs, these Deep layer neurons reflect what the superficial layers encoded on the *previous* timestep -- thus they represent a temporally-delayed context state.

* Deep layer neurons project to the TRC (Pulvinar) neurons via standard Act-driven projections that integrate into standard Ge excitatory input in TRC neurons, to drive the prediction aspect of predictive learning. They also can project back to the Super layer neurons via a DeepAttn projection to drive attentional modulation of activity there.

* TRC layer neurons receive a BurstTRC projection from the Super layer (typically a one-to-one projection), which drives the plus-phase "outcome" activation state of these Pulvinar layers (Super actually computes the 5IB DeepBurst activation). These layers also receive regular connections from Deep layers, which drive the prediction of this plus-phase outcome state, based on the temporally-delayed deep layer context information.

* The attentional effects are implemented via DeepAttn projections from Deep to Super layers, which are typically fixed, non-learning, one-to-one projections that drive the AttnGe excitatory conductance in Super layers. AttnGe then drives the computation of the DeepAttn and DeepLrn values that modulate (i.e., multiply) the activation (DeepAttn) or learning rate (DeepLrn) of these superficial neurons.
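The DeepBurst thresholding described above can be sketched in standalone Go. This is an illustrative reading, not the package's actual code: the deepBurstFmAct helper and its thrRel/thrAbs arguments mirror the ThrRel and ThrAbs thresholding parameters of DeepBurstParams, with the relative threshold taken against the layer's maximum activation.

```go
package main

import "fmt"

// deepBurstFmAct is a minimal standalone sketch of DeepBurst thresholding:
// a unit's burst value equals its superficial activation when that activation
// exceeds both a relative threshold (thrRel * layer max activation) and an
// absolute threshold (thrAbs); otherwise it is zero.
func deepBurstFmAct(act, actMax, thrRel, thrAbs float32) float32 {
	thr := thrRel * actMax
	if thrAbs > thr {
		thr = thrAbs
	}
	if act < thr {
		return 0 // below the bursting threshold: no burst signal
	}
	return act
}

func main() {
	// With layer max activation 0.9, thrRel 0.1, thrAbs 0.2, the effective
	// threshold is max(0.09, 0.2) = 0.2.
	fmt.Println(deepBurstFmAct(0.5, 0.9, 0.1, 0.2)) // above threshold: bursts
	fmt.Println(deepBurstFmAct(0.1, 0.9, 0.1, 0.2)) // below threshold: silent
}
```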

All of the relevant parameters are in the params.go file, in the Deep*Params classes, which are then fields in the deep.Layer.

* DeepBurstParams (layer DeepBurst) has the BurstQtr when DeepBurst is updated, and the thresholding parameters.

* DeepCtxtParams (layer DeepCtxt) has parameters for integrating the DeepCtxt input.

* DeepTRCParams (layer DeepTRC) has parameters for how to compute TRC plus phase activation states based on the TRCBurstGe excitatory input from the BurstTRC projections.

* DeepAttnParams (layer DeepAttn) has the parameters for computing DeepAttn and DeepLrn from AttnGe.

Index

Constants

const (
	// Super are superficial-layer neurons, which also compute DeepBurst activation as a
	// thresholded version of superficial activation, and send that to both TRC (for plus
	// phase outcome) and Deep layers (for DeepCtxt temporal context).
	Super emer.LayerType = emer.LayerTypeN + iota

	// Deep are deep-layer neurons, reflecting activation of layer 6 regular spiking
	// CT corticothalamic neurons, which drive both attention in Super (via DeepAttn
	// projections) and predictions in TRC (Pulvinar) via standard projections.
	Deep

	// TRC are thalamic relay cell neurons, typically in the Pulvinar, which alternately reflect
	// predictions driven by Deep layer projections, and actual outcomes driven by BurstTRC
	// projections from corresponding Super layer neurons that provide strong driving inputs to
	// TRC neurons.
	TRC

	LayerTypeN
)

The DeepLeabra layer types

const (
	// BurstCtxt are projections from Superficial layers to Deep layers that
	// send DeepBurst activations to drive updating of DeepCtxt excitatory
	// conductance, at the end of a DeepBurst quarter.  These projections also
	// use a special learning rule that takes into account the temporal delays
	// in the activation states.
	BurstCtxt emer.PrjnType = emer.PrjnTypeN + iota

	// BurstTRC are projections from Superficial layers to TRC (thalamic relay cell)
	// neurons (e.g., in the Pulvinar) that send DeepBurst activation continuously
	// during the DeepBurst quarter(s), driving the TRCBurstGe value, which then drives
	// the plus-phase activation state of the TRC representing the "outcome" against
	// which prior predictions are (implicitly) compared via the temporal difference
	// in TRC activation state.
	BurstTRC

	// DeepAttn are projections from Deep layers (representing layer 6 regular-spiking
	// CT corticothalamic neurons) up to corresponding Superficial layer neurons, that drive
	// the attentional modulation of activations there (i.e., DeepAttn and DeepLrn values).
	// This is sent continuously all the time from deep layers using the standard delta-based
	// Ge computation, and aggregated into the AttnGe variable on Super neurons.
	DeepAttn

	PrjnTypeN
)

The DeepLeabra prjn types

Variables

var AllNeuronVars []string

var KiT_LayerType = kit.Enums.AddEnum(LayerTypeN, false, nil)

var KiT_Network = kit.Types.AddType(&Network{}, NetworkProps)

var KiT_PrjnType = kit.Enums.AddEnum(PrjnTypeN, false, nil)

var NetworkProps = leabra.NetworkProps

var NeuronVars = []string{"ActNoAttn", "DeepBurst", "DeepBurstPrv", "DeepCtxt", "TRCBurstGe", "DeepBurstSent", "AttnGe", "DeepAttn", "DeepLrn"}

var NeuronVarsMap map[string]int

Functions

func NeuronVarByName

func NeuronVarByName(varNm string) (int, error)

NeuronVarByName returns the index of the variable in the Neuron, or an error if the name is not found.
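A name-to-index lookup like this is typically backed by a map built once from the variable list. The sketch below is an assumption about the implementation, not the package's actual code, and the variable list is a shortened, illustrative subset of NeuronVars:

```go
package main

import "fmt"

// neuronVars is an illustrative subset of the package's NeuronVars list.
var neuronVars = []string{"ActNoAttn", "DeepBurst", "DeepCtxt"}

// neuronVarsMap is the lazily built name-to-index lookup.
var neuronVarsMap map[string]int

// neuronVarByName returns the index of the named variable, or an error.
func neuronVarByName(varNm string) (int, error) {
	if neuronVarsMap == nil { // build the lookup map on first use
		neuronVarsMap = make(map[string]int, len(neuronVars))
		for i, nm := range neuronVars {
			neuronVarsMap[nm] = i
		}
	}
	i, ok := neuronVarsMap[varNm]
	if !ok {
		return -1, fmt.Errorf("variable %q not found", varNm)
	}
	return i, nil
}

func main() {
	i, err := neuronVarByName("DeepBurst")
	fmt.Println(i, err)
	_, err = neuronVarByName("NoSuchVar")
	fmt.Println(err)
}
```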

Types

type DeepAttnParams

type DeepAttnParams struct {
	On  bool    `desc:"Enable the computation of DeepAttn, DeepLrn from AttnGe (otherwise, DeepAttn and DeepLrn = 1)"`
	Min float32 `` /* 239-byte string literal not displayed */
	Thr float32 `` /* 256-byte string literal not displayed */

	Range float32 `` /* 128-byte string literal not displayed */
}

DeepAttnParams are parameters determining how the DeepAttn and DeepLrn attentional modulation is computed from the AttnGe inputs received via DeepAttn projections

func (*DeepAttnParams) DeepAttnFmG

func (db *DeepAttnParams) DeepAttnFmG(lrn float32) float32

DeepAttnFmG returns the DeepAttn value computed from the DeepLrn value.

func (*DeepAttnParams) DeepLrnFmG

func (db *DeepAttnParams) DeepLrnFmG(attnG, attnMax float32) float32

DeepLrnFmG returns the DeepLrn value computed from AttnGe and MAX(AttnGe) across the layer; it is simply the max-normalized value.
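The max-normalization is straightforward; how DeepAttn is then derived from DeepLrn is not spelled out here, so the mapping below into a [min, min+rng] range is only a plausible reading of the Min and Range parameter names in DeepAttnParams, not the package's actual formula:

```go
package main

import "fmt"

// deepLrnFmG sketches DeepLrnFmG: the max-normalized AttnGe value.
func deepLrnFmG(attnG, attnMax float32) float32 {
	if attnMax <= 0 {
		return 1 // no attentional signal: no modulation
	}
	return attnG / attnMax
}

// deepAttnFmG sketches DeepAttnFmG under the assumption that DeepAttn
// maps DeepLrn linearly into [min, min+rng].
func deepAttnFmG(lrn, min, rng float32) float32 {
	return min + rng*lrn
}

func main() {
	lrn := deepLrnFmG(0.25, 0.5) // half of the layer max
	fmt.Println(lrn, deepAttnFmG(lrn, 0.5, 0.5))
}
```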

func (*DeepAttnParams) Defaults

func (db *DeepAttnParams) Defaults()

func (*DeepAttnParams) Update

func (db *DeepAttnParams) Update()

type DeepBurstParams

type DeepBurstParams struct {
	BurstQtr    leabra.Quarters `` /* 210-byte string literal not displayed */
	On          bool            `` /* 157-byte string literal not displayed */
	FmActNoAttn bool            `` /* 285-byte string literal not displayed */
	ThrRel      float32         `` /* 373-byte string literal not displayed */
	ThrAbs      float32         `` /* 266-byte string literal not displayed */
}

DeepBurstParams are parameters determining how the DeepBurst activation is computed from the superficial layer activation values.

func (*DeepBurstParams) Defaults

func (db *DeepBurstParams) Defaults()

func (*DeepBurstParams) IsBurstQtr

func (db *DeepBurstParams) IsBurstQtr(qtr int) bool

IsBurstQtr returns true if the given quarter (0-3) is set as a Bursting quarter according to BurstQtr settings

func (*DeepBurstParams) NextIsBurstQtr

func (db *DeepBurstParams) NextIsBurstQtr(qtr int) bool

NextIsBurstQtr returns true if the quarter after the given quarter (0-3) is set as a bursting quarter according to the BurstQtr settings. Wraps around: if qtr=3 and Q0 is a burst quarter, it returns true.

func (*DeepBurstParams) PrevIsBurstQtr

func (db *DeepBurstParams) PrevIsBurstQtr(qtr int) bool

PrevIsBurstQtr returns true if the quarter before the given quarter (0-3) is set as a bursting quarter according to the BurstQtr settings. Wraps around: if qtr=0 and Q3 is a burst quarter, it returns true.
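The quarter bookkeeping can be sketched with a small bitmask type. The quarters type here stands in for leabra.Quarters and is an assumption about its representation; the wrap-around arithmetic is the point:

```go
package main

import "fmt"

// quarters is a bitmask of the four alpha-cycle quarters (0-3),
// standing in for leabra.Quarters in this sketch.
type quarters uint8

func (q quarters) has(qtr int) bool { return q&(1<<uint(qtr)) != 0 }

func isBurstQtr(q quarters, qtr int) bool     { return q.has(qtr) }
func nextIsBurstQtr(q quarters, qtr int) bool { return q.has((qtr + 1) % 4) }
func prevIsBurstQtr(q quarters, qtr int) bool { return q.has((qtr + 3) % 4) } // +3 == -1 mod 4

func main() {
	q4 := quarters(1 << 3) // bursting in Q4 only (quarter index 3)
	fmt.Println(isBurstQtr(q4, 3), nextIsBurstQtr(q4, 2))
	// wrap-around: the quarter after Q4 (index 3) is Q1 (index 0)
	q1 := quarters(1 << 0)
	fmt.Println(nextIsBurstQtr(q1, 3), prevIsBurstQtr(q1, 1))
}
```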

func (*DeepBurstParams) Update

func (db *DeepBurstParams) Update()

type DeepCtxtParams

type DeepCtxtParams struct {
	FmPrv float32 `` /* 382-byte string literal not displayed */
	FmNew float32 `view:"-" inactive:"+" desc:"1 - FmPrv -- new context amount"`
}

DeepCtxtParams are parameters determining how the DeepCtxt temporal context state is computed based on BurstCtxt projections from Super layers to Deep layers.

func (*DeepCtxtParams) DeepCtxtFmGe

func (db *DeepCtxtParams) DeepCtxtFmGe(ge, dctxt float32) float32

DeepCtxtFmGe computes the new DeepCtxt value based on current excitatory conductance of DeepBurst signals received, and current (now previous) DeepCtxt value.
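Given the FmPrv parameter and the derived FmNew = 1 - FmPrv in DeepCtxtParams, a natural reading of this update is a weighted mix of the previous context and the newly received DeepBurst conductance. The exact formula in the package is an assumption here:

```go
package main

import "fmt"

// deepCtxtFmGe sketches the DeepCtxt update: the new context is a
// weighted mix of the previous context (weight fmPrv) and the newly
// received DeepBurst excitatory conductance (weight 1 - fmPrv).
func deepCtxtFmGe(ge, prvCtxt, fmPrv float32) float32 {
	return fmPrv*prvCtxt + (1-fmPrv)*ge
}

func main() {
	// fmPrv = 0: context is fully replaced by the new input each burst quarter
	fmt.Println(deepCtxtFmGe(0.75, 0.25, 0))
	// fmPrv = 0.5: half old context, half new input
	fmt.Println(deepCtxtFmGe(0.75, 0.25, 0.5))
}
```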

func (*DeepCtxtParams) Defaults

func (db *DeepCtxtParams) Defaults()

func (*DeepCtxtParams) Update

func (db *DeepCtxtParams) Update()

type DeepLayer

type DeepLayer interface {
	leabra.LeabraLayer

	// AsDeep returns this layer as a deep.Layer -- all derived layers must redefine
	// this to return the deep Layer type, so that the DeepLayer interface does not
	// need to include accessors to all the fields.
	AsDeep() *Layer

	// AvgMaxAttnGe computes the average and max AttnGe stats
	AvgMaxAttnGe(ltime *leabra.Time)

	// DeepAttnFmG computes DeepAttn and DeepLrn from AttnGe input,
	// and then applies the DeepAttn modulation to the Act activation value.
	DeepAttnFmG(ltime *leabra.Time)

	// AvgMaxActNoAttn computes the average and max ActNoAttn stats
	AvgMaxActNoAttn(ltime *leabra.Time)

	// DeepBurstFmAct updates the DeepBurst layer 5 IB bursting value from the
	// current Act (superficial activation), subject to thresholding.
	DeepBurstFmAct(ltime *leabra.Time)

	// SendTRCBurstGeDelta sends change in DeepBurst activation since last sent, over BurstTRC
	// projections.
	SendTRCBurstGeDelta(ltime *leabra.Time)

	// TRCBurstGeFmInc computes the TRCBurstGe input from sent values
	TRCBurstGeFmInc(ltime *leabra.Time)

	// AvgMaxTRCBurstGe computes the average and max TRCBurstGe stats
	AvgMaxTRCBurstGe(ltime *leabra.Time)

	// SendDeepCtxtGe sends full DeepBurst activation over BurstCtxt projections to integrate
	// DeepCtxtGe excitatory conductance on deep layers.
	// This must be called at the end of the DeepBurst quarter for this layer.
	SendDeepCtxtGe(ltime *leabra.Time)

	// DeepCtxtFmGe integrates new DeepCtxtGe excitatory conductance from projections, and computes
	// overall DeepCtxt value.  This must be called at the end of the DeepBurst quarter for this layer,
	// after SendDeepCtxtGe.
	DeepCtxtFmGe(ltime *leabra.Time)

	// DeepBurstPrv saves DeepBurst as DeepBurstPrv
	DeepBurstPrv(ltime *leabra.Time)
}

DeepLayer defines the essential algorithmic API for DeepLeabra at the layer level.

type DeepPrjn

type DeepPrjn interface {
	leabra.LeabraPrjn

	// SendDeepCtxtGe sends the full DeepBurst activation from sending neuron index si,
	// to integrate DeepCtxtGe excitatory conductance on receivers
	SendDeepCtxtGe(si int, dburst float32)

	// SendTRCBurstGeDelta sends the delta-DeepBurst activation from sending neuron index si,
	// to integrate TRCBurstGe excitatory conductance on receivers
	SendTRCBurstGeDelta(si int, delta float32)

	// SendAttnGeDelta sends the delta-activation from sending neuron index si,
	// to integrate into AttnGeInc excitatory conductance on receivers
	SendAttnGeDelta(si int, delta float32)

	// RecvDeepCtxtGeInc increments the receiver's DeepCtxtGe from that of all the projections
	RecvDeepCtxtGeInc()

	// RecvTRCBurstGeInc increments the receiver's TRCBurstGe from that of all the projections
	RecvTRCBurstGeInc()

	// RecvAttnGeInc increments the receiver's AttnGe from that of all the projections
	RecvAttnGeInc()

	// DWtDeepCtxt computes the weight change (learning) -- for DeepCtxt projections
	DWtDeepCtxt()
}

DeepPrjn defines the essential algorithmic API for DeepLeabra at the projection level.

type DeepTRCParams

type DeepTRCParams struct {
	Binarize bool    `` /* 244-byte string literal not displayed */
	BinThr   float32 `` /* 164-byte string literal not displayed */
	BinOn    float32 `def:"0.3" viewif:"Binarize" desc:"Effective value for units above threshold -- lower value around 0.3 or so seems best."`
	BinOff   float32 `def:"0" viewif:"Binarize" desc:"Effective value for units below threshold -- typically 0."`
}

DeepTRCParams provides parameters for how the plus-phase (outcome) state of thalamic relay cell (e.g., Pulvinar) neurons is computed from the BurstTRC projections that drive TRCBurstGe excitatory conductance.

func (*DeepTRCParams) BurstGe

func (tp *DeepTRCParams) BurstGe(burstGe float32) float32

BurstGe returns effective excitatory conductance to use for burst-quarter time in TRC layer.
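The field comments on DeepTRCParams suggest a simple binarization: input above BinThr maps to BinOn (default 0.3) and input below it to BinOff (default 0), with Binarize off passing the raw value through. The sketch below follows that reading and is not the package's actual code:

```go
package main

import "fmt"

// trcBurstGe sketches DeepTRCParams.BurstGe: when binarize is on, the
// TRC burst conductance above binThr maps to binOn and below it to
// binOff; otherwise the raw value is used.
func trcBurstGe(burstGe float32, binarize bool, binThr, binOn, binOff float32) float32 {
	if !binarize {
		return burstGe
	}
	if burstGe >= binThr {
		return binOn
	}
	return binOff
}

func main() {
	fmt.Println(trcBurstGe(0.6, true, 0.4, 0.3, 0))  // above threshold -> binOn
	fmt.Println(trcBurstGe(0.2, true, 0.4, 0.3, 0))  // below threshold -> binOff
	fmt.Println(trcBurstGe(0.6, false, 0.4, 0.3, 0)) // binarize off -> raw value
}
```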

func (*DeepTRCParams) Defaults

func (tp *DeepTRCParams) Defaults()

func (*DeepTRCParams) Update

func (tp *DeepTRCParams) Update()

type Layer

type Layer struct {
	leabra.Layer                 // access as .Layer
	DeepBurst    DeepBurstParams `` /* 132-byte string literal not displayed */
	DeepCtxt     DeepCtxtParams  `desc:"parameters for computing DeepCtxt in Deep layers, from BurstCtxt inputs from Super senders"`
	DeepTRC      DeepTRCParams   `` /* 131-byte string literal not displayed */
	DeepAttn     DeepAttnParams  `` /* 166-byte string literal not displayed */
	DeepNeurs    []Neuron        `` /* 151-byte string literal not displayed */
	DeepPools    []Pool          `` /* 247-byte string literal not displayed */
}

deep.Layer is the DeepLeabra layer, based on basic rate-coded leabra.Layer

func (*Layer) ActFmG

func (ly *Layer) ActFmG(ltime *leabra.Time)

ActFmG computes rate-code activation from Ge, Gi, Gl conductances and updates learning running-average activations from that Act

func (*Layer) AsDeep

func (ly *Layer) AsDeep() *Layer

AsDeep returns this layer as a deep.Layer -- all derived layers must redefine this to return the deep Layer type, so that the DeepLayer interface does not need to include accessors to all the fields.

func (*Layer) AsLeabra

func (ly *Layer) AsLeabra() *leabra.Layer

AsLeabra returns this layer as a leabra.Layer -- all derived layers must redefine this to return the base Layer type, so that the LeabraLayer interface does not need to include accessors to all the basic stuff

func (*Layer) AvgMaxAct

func (ly *Layer) AvgMaxAct(ltime *leabra.Time)

AvgMaxAct computes the average and max Act stats, used in inhibition. The Deep version also computes AvgMaxActNoAttn.

func (*Layer) AvgMaxActNoAttn

func (ly *Layer) AvgMaxActNoAttn(ltime *leabra.Time)

AvgMaxActNoAttn computes the average and max ActNoAttn stats

func (*Layer) AvgMaxAttnGe

func (ly *Layer) AvgMaxAttnGe(ltime *leabra.Time)

AvgMaxAttnGe computes the average and max AttnGe stats

func (*Layer) AvgMaxGe

func (ly *Layer) AvgMaxGe(ltime *leabra.Time)

AvgMaxGe computes the average and max Ge stats, used in inhibition. The Deep version also computes AttnGe stats.

func (*Layer) AvgMaxTRCBurstGe

func (ly *Layer) AvgMaxTRCBurstGe(ltime *leabra.Time)

AvgMaxTRCBurstGe computes the average and max TRCBurstGe stats

func (*Layer) Build

func (ly *Layer) Build() error

Build constructs the layer state, including calling Build on the projections. You MUST have properly configured the Inhib.Pool.On setting by this point, to properly allocate Pools for the unit groups if necessary.

func (*Layer) DecayState

func (ly *Layer) DecayState(decay float32)

func (*Layer) DeepAttnFmG

func (ly *Layer) DeepAttnFmG(ltime *leabra.Time)

DeepAttnFmG computes DeepAttn and DeepLrn from AttnGe input, and then applies the DeepAttn modulation to the Act activation value.

func (*Layer) DeepBurstFmAct

func (ly *Layer) DeepBurstFmAct(ltime *leabra.Time)

DeepBurstFmAct updates the DeepBurst layer 5 IB bursting value from the current Act (superficial activation), subject to thresholding.

func (*Layer) DeepBurstPrv

func (ly *Layer) DeepBurstPrv(ltime *leabra.Time)

DeepBurstPrv saves DeepBurst as DeepBurstPrv

func (*Layer) DeepCtxtFmGe

func (ly *Layer) DeepCtxtFmGe(ltime *leabra.Time)

DeepCtxtFmGe integrates new DeepCtxtGe excitatory conductance from projections, and computes overall DeepCtxt value, only on Deep layers. This must be called at the end of the DeepBurst quarter for this layer, after SendDeepCtxtGe.

func (*Layer) Defaults

func (ly *Layer) Defaults()

func (*Layer) GFmInc

func (ly *Layer) GFmInc(ltime *leabra.Time)

GFmInc integrates new synaptic conductances from increments sent during last SendGDelta.

func (*Layer) GScaleFmAvgAct

func (ly *Layer) GScaleFmAvgAct()

GScaleFmAvgAct computes the scaling factor for synaptic input conductances G, based on sending layer average activation. This attempts to automatically adjust for overall differences in raw activity coming into the units to achieve a general target of around .5 to 1 for the integrated G values. DeepLeabra version separately normalizes the Deep projection types.

func (*Layer) InitActs

func (ly *Layer) InitActs()

func (*Layer) QuarterFinal

func (ly *Layer) QuarterFinal(ltime *leabra.Time)

QuarterFinal does updating after end of a quarter

func (*Layer) SendDeepCtxtGe

func (ly *Layer) SendDeepCtxtGe(ltime *leabra.Time)

SendDeepCtxtGe sends full DeepBurst activation over BurstCtxt projections to integrate DeepCtxtGe excitatory conductance on deep layers. This must be called at the end of the DeepBurst quarter for this layer.

func (*Layer) SendGDelta

func (ly *Layer) SendGDelta(ltime *leabra.Time)

SendGDelta sends change in activation since last sent, if above thresholds. Deep version sends either to standard Ge or AttnGe for DeepAttn projections.

func (*Layer) SendTRCBurstGeDelta

func (ly *Layer) SendTRCBurstGeDelta(ltime *leabra.Time)

SendTRCBurstGeDelta sends change in DeepBurst activation since last sent, over BurstTRC projections.
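The "efficient delta mechanism" can be sketched in standalone Go: each sender transmits only the change in its DeepBurst value since the last send, and receivers accumulate those deltas, so the receiver-side total tracks the sum of current sender values without re-sending everything each cycle. Real projections also apply synaptic weights and per-receiver fan-in, which this sketch omits:

```go
package main

import "fmt"

// sender holds a unit's current DeepBurst value and the last value it sent.
type sender struct {
	burst, burstSent float32
}

// sendDeltas accumulates each sender's change since its last send into the
// receiver conductance, then records the value as sent.
func sendDeltas(senders []sender, recvGe *float32) {
	for i := range senders {
		s := &senders[i]
		delta := s.burst - s.burstSent
		if delta != 0 { // only changed senders do any work
			*recvGe += delta
			s.burstSent = s.burst
		}
	}
}

func main() {
	senders := []sender{{burst: 0.5}, {burst: 0.25}}
	var ge float32
	sendDeltas(senders, &ge)
	fmt.Println(ge) // first send delivers the full values
	senders[0].burst = 0.25 // activity drops; only the change is sent
	sendDeltas(senders, &ge)
	fmt.Println(ge) // total still matches the sum of current values
}
```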

func (*Layer) TRCBurstGeFmInc

func (ly *Layer) TRCBurstGeFmInc(ltime *leabra.Time)

TRCBurstGeFmInc computes the TRCBurstGe input from sent values

func (*Layer) UnitVal1DTry

func (ly *Layer) UnitVal1DTry(varNm string, idx int) (float32, error)

UnitVal1DTry returns value of given variable name on given unit, using 1-dimensional index.

func (*Layer) UnitValTry

func (ly *Layer) UnitValTry(varNm string, idx []int) (float32, error)

UnitValTry returns value of given variable name on given unit, using shape-based dimensional index

func (*Layer) UnitValsTry

func (ly *Layer) UnitValsTry(varNm string) ([]float32, error)

UnitValsTry is emer.Layer interface method to return values of given variable

func (*Layer) UnitVarNames

func (ly *Layer) UnitVarNames() []string

UnitVarNames returns a list of variable names available on the units in this layer

func (*Layer) UpdateParams

func (ly *Layer) UpdateParams()

UpdateParams updates all params given any changes that might have been made to individual values including those in the receiving projections of this layer

type Network

type Network struct {
	leabra.Network
}

deep.Network has parameters for running a DeepLeabra network

func (*Network) Cycle

func (nt *Network) Cycle(ltime *leabra.Time)

Cycle runs one cycle of activation updating. The Deep version adds a call to update DeepBurst at the end.

func (*Network) DeepBurst

func (nt *Network) DeepBurst(ltime *leabra.Time)

DeepBurst is called at the end of Cycle; it computes DeepBurst and sends it to other layers.

func (*Network) DeepCtxt

func (nt *Network) DeepCtxt(ltime *leabra.Time)

DeepCtxt sends DeepBurst to Deep layers and integrates DeepCtxt on Deep layers.

func (*Network) Defaults

func (nt *Network) Defaults()

Defaults sets all the default parameters for all layers and projections

func (*Network) NewLayer

func (nt *Network) NewLayer() emer.Layer

NewLayer returns new layer of proper type

func (*Network) NewPrjn

func (nt *Network) NewPrjn() emer.Prjn

NewPrjn returns new prjn of proper type

func (*Network) QuarterFinal

func (nt *Network) QuarterFinal(ltime *leabra.Time)

QuarterFinal does updating after end of a quarter

func (*Network) UpdateParams

func (nt *Network) UpdateParams()

UpdateParams updates all the derived parameters if any have changed, for all layers and projections

type Neuron

type Neuron struct {
	ActNoAttn     float32 `` /* 320-byte string literal not displayed */
	DeepBurst     float32 `` /* 694-byte string literal not displayed */
	DeepBurstPrv  float32 `desc:"DeepBurst from the previous alpha trial -- this is typically used for learning in the BurstCtxt projection."`
	DeepCtxtGe    float32 `` /* 357-byte string literal not displayed */
	DeepCtxt      float32 `` /* 322-byte string literal not displayed */
	TRCBurstGe    float32 `` /* 202-byte string literal not displayed */
	DeepBurstSent float32 `desc:"Last DeepBurst activation value sent, for computing TRCBurstGe using efficient delta mechanism."`
	AttnGe        float32 `` /* 314-byte string literal not displayed */
	DeepAttn      float32 `` /* 493-byte string literal not displayed */
	DeepLrn       float32 `` /* 246-byte string literal not displayed */
}

deep.Neuron holds the extra neuron (unit) level variables for DeepLeabra computation. DeepLeabra includes both attentional and predictive learning functions of the deep layers and thalamocortical circuitry. These are maintained in a separate parallel slice from the leabra.Neuron variables.

func (*Neuron) VarByIndex

func (nrn *Neuron) VarByIndex(idx int) float32

VarByIndex returns variable using index (0 = first variable in NeuronVars list)

func (*Neuron) VarByName

func (nrn *Neuron) VarByName(varNm string) (float32, error)

VarByName returns variable by name, or error

func (*Neuron) VarNames

func (nrn *Neuron) VarNames() []string

type Pool

type Pool struct {
	ActNoAttn  minmax.AvgMax32
	TRCBurstGe minmax.AvgMax32
	AttnGe     minmax.AvgMax32
}

deep.Pool contains extra statistics used in DeepLeabra

type Prjn

type Prjn struct {
	leabra.Prjn             // access as .Prjn
	DeepCtxtGeInc []float32 `desc:"local accumulator for DeepCtxt excitatory conductance from sending units -- not a delta -- the full value"`
	TRCBurstGeInc []float32 `desc:"local increment accumulator for TRCBurstGe excitatory conductance from sending units -- this will be thread-safe"`
	AttnGeInc     []float32 `desc:"local increment accumulator for AttnGe excitatory conductance from sending units -- this will be thread-safe"`
}

deep.Prjn is the DeepLeabra projection, based on basic rate-coded leabra.Prjn

func (*Prjn) AsLeabra

func (pj *Prjn) AsLeabra() *leabra.Prjn

AsLeabra returns this prjn as a leabra.Prjn -- all derived prjns must redefine this to return the base Prjn type, so that the LeabraPrjn interface does not need to include accessors to all the basic stuff.

func (*Prjn) Build

func (pj *Prjn) Build() error

func (*Prjn) DWt

func (pj *Prjn) DWt()

DWt computes the weight change (learning) on sending projections. The Deep version supports the DeepCtxt temporal learning option.

func (*Prjn) DWtDeepCtxt

func (pj *Prjn) DWtDeepCtxt()

DWtDeepCtxt computes the weight change (learning) -- for DeepCtxt projections

func (*Prjn) Defaults

func (pj *Prjn) Defaults()

func (*Prjn) InitGInc

func (pj *Prjn) InitGInc()

func (*Prjn) RecvAttnGeInc

func (pj *Prjn) RecvAttnGeInc()

RecvAttnGeInc increments the receiver's AttnGe from that of all the projections

func (*Prjn) RecvDeepCtxtGeInc

func (pj *Prjn) RecvDeepCtxtGeInc()

RecvDeepCtxtGeInc increments the receiver's DeepCtxtGe from that of all the projections

func (*Prjn) RecvTRCBurstGeInc

func (pj *Prjn) RecvTRCBurstGeInc()

RecvTRCBurstGeInc increments the receiver's TRCBurstGe from that of all the projections

func (*Prjn) SendAttnGeDelta

func (pj *Prjn) SendAttnGeDelta(si int, delta float32)

SendAttnGeDelta sends the delta-activation from sending neuron index si, to integrate into AttnGeInc excitatory conductance on receivers

func (*Prjn) SendDeepCtxtGe

func (pj *Prjn) SendDeepCtxtGe(si int, dburst float32)

SendDeepCtxtGe sends the full DeepBurst activation from sending neuron index si, to integrate DeepCtxtGe excitatory conductance on receivers

func (*Prjn) SendTRCBurstGeDelta

func (pj *Prjn) SendTRCBurstGeDelta(si int, delta float32)

SendTRCBurstGeDelta sends the delta-DeepBurst activation from sending neuron index si, to integrate TRCBurstGe excitatory conductance on receivers

func (*Prjn) UpdateParams

func (pj *Prjn) UpdateParams()
