emer

package v1.4.31
Warning: This package is not in the latest version of its module.
Published: Sep 27, 2023 License: BSD-3-Clause Imports: 17 Imported by: 184

README


Package emer provides minimal interfaces for the basic structural elements of neural networks including:

  • emer.Network, emer.Layer, emer.Unit, emer.Prjn (projection that interconnects layers)

These interfaces are intended to be just sufficient to support visualization and generic analysis functions, while explicitly avoiding exposure of ANY algorithmic aspects, so that those can be encoded purely in the implementation structs.

At this point, given the extra complexity it would require, these interfaces do not support the ability to build or modify networks.

The package also provides the emer.Params object for managing parameters, which handles standard parameter set logic and applying parameter sets to networks, along with the NetSize map for configuring network sizes.

Documentation

Overview

Package emer provides minimal interfaces for the basic structural elements of neural networks including:

  • emer.Network, emer.Layer, emer.Unit, emer.Prjn (projection that interconnects layers)

These interfaces are intended to be just sufficient to support visualization and generic analysis functions, while explicitly avoiding exposure of ANY algorithmic aspects, so that those can be encoded purely in the implementation structs.

At this point, given the extra complexity it would require, these interfaces do not support the ability to build or modify networks.

Index

Constants

const (
	Version     = "v1.4.31"
	GitCommit   = "b2fd8b0"          // the commit JUST BEFORE the release
	VersionDate = "2023-09-27 23:06" // UTC
)

Variables

var KiT_LayerType = kit.Enums.AddEnum(LayerTypeN, kit.NotBitFlag, nil)
var KiT_PrjnType = kit.Enums.AddEnum(PrjnTypeN, kit.NotBitFlag, nil)
var LayerDimNames2D = []string{"Y", "X"}

LayerDimNames2D provides the standard Shape dimension names for 2D layers

var LayerDimNames4D = []string{"PoolY", "PoolX", "NeurY", "NeurX"}

LayerDimNames4D provides the standard Shape dimension names for 4D layers which have Pools and then neurons within pools.
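
For example, these dimension names correspond, in order, to the shape dims passed to Layer.Config (see the Layer interface below). A minimal sketch, assuming ly is an emer.Layer being configured as a 4D layer with 2x3 pools of 4x5 neurons each:

	// dims follow LayerDimNames4D order: PoolY, PoolX, NeurY, NeurX
	ly.Config([]int{2, 3, 4, 5}, emer.Hidden)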

Functions

func CenterPoolIdxs added in v1.1.55

func CenterPoolIdxs(ly Layer, n int) []int

CenterPoolIdxs returns the indexes for n x n center pools of given 4D layer. Useful for setting RepIdxs on Layer. Will crash if called on non-4D layers.

func CenterPoolShape added in v1.3.13

func CenterPoolShape(ly Layer, n int) []int

CenterPoolShape returns shape for n x n center pools of given 4D layer. Useful for setting RepShape on Layer.
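
A minimal sketch of using CenterPoolIdxs and CenterPoolShape together to set the representative units of a 4D layer, assuming ly is an emer.Layer (see SetRepIdxsShape in the Layer interface below):

	// use the central 2 x 2 pools as the representative units
	idxs := emer.CenterPoolIdxs(ly, 2)
	shape := emer.CenterPoolShape(ly, 2)
	ly.SetRepIdxsShape(idxs, shape)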

func Layer2DRepIdxs added in v1.4.22

func Layer2DRepIdxs(ly Layer, maxSize int) (idxs, shape []int)

Layer2DRepIdxs returns neuron indexes and corresponding 2D shape for the representative neurons within a large 2D layer, for passing to [SetRepIdxsShape]. These neurons are used for the raster plot in the GUI and for computing PCA, among other cases where the full set of neurons is problematic. The lower-left corner of neurons up to given maxSize is selected.
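
A minimal sketch, assuming ly is a large 2D emer.Layer and 12 is an illustrative maxSize:

	idxs, shape := emer.Layer2DRepIdxs(ly, 12)
	ly.SetRepIdxsShape(idxs, shape)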

func NetworkHyperParams added in v1.1.51

func NetworkHyperParams(net Network, sheet *params.Sheet) params.Flex

NetworkHyperParams returns the compiled hyperparameters from given Sheet for each layer and projection in the network -- applies the standard CSS styling logic for the hyperparameters.

Types

type LayNames added in v1.0.7

type LayNames []string

LayNames is a list of layer names. Has convenience methods for adding, validating.

func (*LayNames) Add added in v1.0.7

func (ln *LayNames) Add(laynm ...string)

Add adds given layer name(s) to list

func (*LayNames) AddAllBut added in v1.0.7

func (ln *LayNames) AddAllBut(net Network, excl ...string)

AddAllBut adds all layers in network except those in the exclude list

func (*LayNames) AddOne added in v1.1.13

func (ln *LayNames) AddOne(laynm string)

AddOne adds one layer name to the list. This is the Python version, since Python doesn't support varargs.

func (*LayNames) Layers added in v1.1.2

func (ln *LayNames) Layers(net Network) (lays []Layer, err error)

Layers returns a slice of emer.Layer elements in the given network based on the layer names. An error is returned if any are not found.

func (*LayNames) Validate added in v1.0.7

func (ln *LayNames) Validate(net Network, ctxt string) error

Validate ensures that LayNames layers are valid. ctxt is string for error message to provide context.
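
A minimal sketch of typical LayNames usage, assuming net is an emer.Network and the layer names are illustrative:

	var ln emer.LayNames
	ln.Add("Input", "Hidden", "Output")
	if err := ln.Validate(net, "stats layers"); err != nil {
		fmt.Println(err)
	}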

type LaySize added in v1.1.51

type LaySize struct {

	// Y (vertical) size of layer -- in units for 2D, or number of pools (outer dimension) for 4D layer
	Y int `desc:"Y (vertical) size of layer -- in units for 2D, or number of pools (outer dimension) for 4D layer"`

	// X (horizontal) size of layer -- in units for 2D, or number of pools (outer dimension) for 4D layer
	X int `desc:"X (horizontal) size of layer -- in units for 2D, or number of pools (outer dimension) for 4D layer"`

	// Y (vertical) size of each pool in units, only for 4D layers (inner dimension)
	PoolY int `desc:"Y (vertical) size of each pool in units, only for 4D layers (inner dimension)"`

	// X (horizontal) size of each pool in units, only for 4D layers (inner dimension)
	PoolX int `desc:"X (horizontal) size of each pool in units, only for 4D layers (inner dimension)"`
}

LaySize contains parameters for size of layers

type Layer

type Layer interface {
	params.Styler // TypeName, Name, and Class methods for parameter styling

	// InitName MUST be called to initialize the layer's pointer to itself as an emer.Layer
	// which enables the proper interface methods to be called.  Also sets the name, and
	// the parent network that this layer belongs to (which layers may want to retain).
	InitName(lay Layer, name string, net Network)

	// Label satisfies the gi.Labeler interface for getting the name of objects generically
	Label() string

	// SetName sets name of layer
	SetName(nm string)

	// SetClass sets CSS-style class name(s) for this layer (space-separated if multiple)
	SetClass(cls string)

	// AddClass adds a CSS-style class name(s) for this layer,
	// ensuring that it is not a duplicate, and properly space separated.
	AddClass(cls string)

	// IsOff returns true if layer has been turned Off (lesioned) -- for experimentation
	IsOff() bool

	// SetOff sets the "off" (lesioned) status of layer. Also sets the Off state of all
	// projections from this layer to other layers.
	SetOff(off bool)

	// Shape returns the organization of units in the layer, in terms of an array of dimensions.
	// Row-major ordering is default (Y then X), outer-most to inner-most.
	// if 2D, then it is a simple Y,X layer with no sub-structure (pools).
	// If 4D, then it is the number of pools (Y, X) followed by the number of units per pool (Y, X).
	Shape() *etensor.Shape

	// Is2D() returns true if this is a 2D layer (no Pools)
	Is2D() bool

	// Is4D() returns true if this is a 4D layer (has Pools as inner 2 dimensions)
	Is4D() bool

	// Idx4DFrom2D returns the 4D index from 2D coordinates
	// within which inner dims are interleaved.  Returns false if 2D coords are invalid.
	Idx4DFrom2D(x, y int) ([]int, bool)

	// Type returns the functional type of layer according to LayerType (extensible in
	// more specialized algorithms)
	Type() LayerType

	// SetType sets the functional type of layer
	SetType(typ LayerType)

	// Config configures the basic parameters of the layer
	Config(shape []int, typ LayerType)

	// RelPos returns the relative 3D position specification for this layer
	// for display in the 3D NetView -- see Pos() for display conventions.
	RelPos() relpos.Rel

	// SetRelPos sets the relative 3D position specification for this layer
	SetRelPos(r relpos.Rel)

	// Pos returns the 3D position of the lower-left-hand corner of the layer.
	// The 3D view has layers arranged in X-Y planes stacked vertically along the Z axis.
	// Somewhat confusingly, this differs from the standard 3D graphics convention,
	// where the vertical dimension is Y and Z is the depth dimension.  However, in the
	// more "layer-centric" way of thinking about it, it is natural for the width & height
	// to map onto X and Y, and then Z is left over for stacking vertically.
	Pos() mat32.Vec3

	// SetPos sets the 3D position of this layer -- will generally be overwritten by
	// automatic RelPos setting, unless that doesn't specify a valid relative position.
	SetPos(pos mat32.Vec3)

	// Size returns the display size of this layer for the 3D view -- see Pos() for general info.
	// This is multiplied by the RelPos.Scale factor to rescale layer sizes, and takes
	// into account 2D and 4D layer structures.
	Size() mat32.Vec2

	// Index returns a 0..n-1 index of the position of the layer within list of layers
	// in the network.  For backprop networks, index position has computational significance.
	// For Leabra networks, it only has significance in determining who gets which weights for
	// enforcing initial weight symmetry -- higher layers get weights from lower layers.
	Index() int

	// SetIndex sets the layer index
	SetIndex(idx int)

	// UnitVarNames returns a list of variable names available on the units in this layer.
	// This is typically a global list so do not modify!
	UnitVarNames() []string

	// UnitVarProps returns a map of unit variable properties, with the key being the
	// name of the variable, and the value gives a space-separated list of
	// go-tag-style properties for that variable.
	// The NetView recognizes the following properties:
	// range:"##" = +- range around 0 for default display scaling
	// min:"##" max:"##" = min, max display range
	// auto-scale:"+" or "-" = use automatic scaling instead of fixed range or not.
	// zeroctr:"+" or "-" = control whether zero-centering is used
	// desc:"txt" tooltip description of the variable
	// Note: this is a global list so do not modify!
	UnitVarProps() map[string]string

	// UnitVarIdx returns the index of given variable within the Neuron,
	// according to *this layer's* UnitVarNames() list (using a map to lookup index),
	// or -1 and error message if not found.
	UnitVarIdx(varNm string) (int, error)

	// UnitVarNum returns the number of Neuron-level variables
	// for this layer.  This is needed for extending indexes in derived types.
	UnitVarNum() int

	// UnitVal1D returns value of given variable index on given unit,
	// using 1-dimensional index, and a data parallel index di,
	// for networks capable of processing multiple input patterns in parallel.
	// returns NaN on invalid index.
	// This is the core unit var access method used by other methods,
	// so it is the only one that needs to be updated for derived layer types.
	UnitVal1D(varIdx int, idx, di int) float32

	// UnitVals fills in values of given variable name on unit,
	// for each unit in the layer, into given float32 slice (only resized if not big enough).
	// di is a data parallel index di, for networks capable of processing input patterns in parallel.
	// Returns error on invalid var name.
	UnitVals(vals *[]float32, varNm string, di int) error

	// UnitValsTensor fills in values of given variable name on unit
	// for each unit in the layer, into given tensor.
	// di is a data parallel index di, for networks capable of processing input patterns in parallel.
	// If tensor is not already big enough to hold the values, it is
	// set to the same shape as the layer.
	// Returns error on invalid var name.
	UnitValsTensor(tsr etensor.Tensor, varNm string, di int) error

	// UnitValsRepTensor fills in values of given variable name on unit
	// for a smaller subset of representative units in the layer, into given tensor.
	// di is a data parallel index di, for networks capable of processing input patterns in parallel.
	// This is used for computationally intensive stats or displays that work
	// much better with a smaller number of units.
	// The set of representative units are defined by SetRepIdxs -- all units
	// are used if no such subset has been defined.
	// If tensor is not already big enough to hold the values, it is
	// set to RepShape to hold all the values if subset is defined,
	// otherwise it calls UnitValsTensor and is identical to that.
	// Returns error on invalid var name.
	UnitValsRepTensor(tsr etensor.Tensor, varNm string, di int) error

	// RepIdxs returns the current set of representative unit indexes.
	// which are a smaller subset of units that represent the behavior
	// of the layer, for computationally intensive statistics and displays
	// (e.g., PCA, ActRF, NetView rasters).
	// Returns nil if none has been set (in which case all units should be used).
	// See utility function CenterPoolIdxs that returns indexes of
	// units in the central pools of a 4D layer.
	RepIdxs() []int

	// RepShape returns the shape to use for the subset of representative
	// unit indexes, in terms of an array of dimensions.  See Shape() for more info.
	// Layers that set RepIdxs should also set this, otherwise a 1D array
	// of len RepIdxs will be used.
	// See utility function CenterPoolShape that returns shape of
	// units in the central pools of a 4D layer.
	RepShape() *etensor.Shape

	// SetRepIdxsShape sets the RepIdxs and RepShape, given as a list of dimension sizes
	SetRepIdxsShape(idxs, shape []int)

	// UnitVal returns value of given variable name on given unit,
	// using shape-based dimensional index.
	// Returns NaN on invalid var name or index.
	// di is a data parallel index di, for networks capable of processing input patterns in parallel.
	UnitVal(varNm string, idx []int, di int) float32

	// NRecvPrjns returns the number of receiving projections
	NRecvPrjns() int

	// RecvPrjn returns a specific receiving projection
	RecvPrjn(idx int) Prjn

	// NSendPrjns returns the number of sending projections
	NSendPrjns() int

	// SendPrjn returns a specific sending projection
	SendPrjn(idx int) Prjn

	// SendNameTry looks for a projection connected to this layer whose sender layer has a given name
	SendNameTry(sender string) (Prjn, error)

	// SendNameTypeTry looks for a projection connected to this layer whose sender layer has a given name and type
	SendNameTypeTry(sender, typ string) (Prjn, error)

	// RecvNameTry looks for a projection connected to this layer whose receiver layer has a given name
	RecvNameTry(recv string) (Prjn, error)

	// RecvNameTypeTry looks for a projection connected to this layer whose receiver layer has a given name and type
	RecvNameTypeTry(recv, typ string) (Prjn, error)

	// RecvPrjnVals fills in values of given synapse variable name,
	// for projection from given sending layer and neuron 1D index,
	// for all receiving neurons in this layer,
	// into given float32 slice (only resized if not big enough).
	// prjnType is the string representation of the prjn type -- used if non-empty,
	// useful when there are multiple projections between two layers.
	// Returns error on invalid var name.
	// If the receiving neuron is not connected to the given sending layer or neuron
	// then the value is set to mat32.NaN().
	// Returns error on invalid var name or lack of recv prjn (vals always set to nan on prjn err).
	RecvPrjnVals(vals *[]float32, varNm string, sendLay Layer, sendIdx1D int, prjnType string) error

	// SendPrjnVals fills in values of given synapse variable name,
	// for projection into given receiving layer and neuron 1D index,
	// for all sending neurons in this layer,
	// into given float32 slice (only resized if not big enough).
	// prjnType is the string representation of the prjn type -- used if non-empty,
	// useful when there are multiple projections between two layers.
	// Returns error on invalid var name.
	// If the sending neuron is not connected to the given receiving layer or neuron
	// then the value is set to mat32.NaN().
	// Returns error on invalid var name or lack of recv prjn (vals always set to nan on prjn err).
	SendPrjnVals(vals *[]float32, varNm string, recvLay Layer, recvIdx1D int, prjnType string) error

	// Defaults sets default parameter values for all Layer and recv projection parameters
	Defaults()

	// UpdateParams() updates parameter values for all Layer and recv projection parameters,
	// based on any other params that might have changed.
	UpdateParams()

	// ApplyParams applies given parameter style Sheet to this layer and its recv projections.
	// Calls UpdateParams on anything set to ensure derived parameters are all updated.
	// If setMsg is true, then a message is printed to confirm each parameter that is set.
	// it always prints a message if a parameter fails to be set.
	// returns true if any params were set, and error if there were any errors.
	ApplyParams(pars *params.Sheet, setMsg bool) (bool, error)

	// NonDefaultParams returns a listing of all parameters in the Layer that
	// are not at their default values -- useful for setting param styles etc.
	NonDefaultParams() string

	// AllParams returns a listing of all parameters in the Layer
	AllParams() string

	// WriteWtsJSON writes the weights from this layer from the receiver-side perspective
	// in a JSON text format.  We build in the indentation logic to make it much faster and
	// more efficient.
	WriteWtsJSON(w io.Writer, depth int)

	// ReadWtsJSON reads the weights from this layer from the receiver-side perspective
	// in a JSON text format.  This is for a set of weights that were saved *for one layer only*
	// and is not used for the network-level ReadWtsJSON, which reads into a separate
	// structure -- see SetWts method.
	ReadWtsJSON(r io.Reader) error

	// SetWts sets the weights for this layer from weights.Layer decoded values
	SetWts(lw *weights.Layer) error

	// Build constructs the layer and projection state based on the layer shapes
	// and patterns of interconnectivity
	Build() error

	// VarRange returns the min / max values for given variable
	// over the layer
	VarRange(varNm string) (min, max float32, err error)
}

Layer defines the basic interface for neural network layers, used for managing the structural elements of a network, and for visualization, I/O, etc. Interfaces are automatically pointers -- think of this as a pointer to your specific layer type, with a very basic interface for accessing general structural properties. Nothing algorithm-specific is implemented here -- all of that goes in your specific layer struct.
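
For example, unit values can be read generically through this interface without knowing the concrete layer type. A minimal sketch, assuming ly is an emer.Layer and "Act" is one of its UnitVarNames:

	var acts []float32
	if err := ly.UnitVals(&acts, "Act", 0); err == nil { // di = 0: first data-parallel input
		fmt.Println(ly.Name(), len(acts))
	}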

type LayerType

type LayerType int32

LayerType is the type of the layer: Input, Hidden, Target, Compare. Class parameter styles automatically key off of these types. Specialized algorithms can extend this to other types, but these types encompass most standard neural network models.

const (
	// Hidden is an internal representational layer that does not receive direct input / targets
	Hidden LayerType = iota

	// Input is a layer that receives direct external input in its Ext inputs
	Input

	// Target is a layer that receives direct external target inputs used for driving plus-phase learning
	Target

	// Compare is a layer that receives external comparison inputs, which drive statistics but
	// do NOT drive activation or learning directly
	Compare

	LayerTypeN
)

The layer types
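
A minimal sketch of converting between strings and LayerType values, using the FromString and String methods listed below:

	var lt emer.LayerType
	if err := lt.FromString("Input"); err == nil {
		fmt.Println(lt.String()) // prints: Input
	}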

func (*LayerType) FromString

func (i *LayerType) FromString(s string) error

func (LayerType) MarshalJSON

func (ev LayerType) MarshalJSON() ([]byte, error)

func (LayerType) String

func (i LayerType) String() string

func (*LayerType) UnmarshalJSON

func (ev *LayerType) UnmarshalJSON(b []byte) error

type Layers

type Layers []Layer

Layers is a slice of layers

func (*Layers) ElemLabel

func (ls *Layers) ElemLabel(idx int) string

ElemLabel satisfies the gi.SliceLabeler interface to provide labels for slice elements

type NetParams added in v1.4.18

type NetParams struct {

	// [view: no-inline] full collection of param sets to use
	Params netparams.Sets `view:"no-inline" desc:"full collection of param sets to use"`

	// optional additional sheets of parameters to apply after Base -- can use multiple names separated by spaces (don't put spaces in Sheet names!)
	ExtraSheets string `` /* 148-byte string literal not displayed */

	// optional additional tag to add to file names, logs to identify params / run config
	Tag string `desc:"optional additional tag to add to file names, logs to identify params / run config"`

	// [view: -] the network to apply parameters to
	Network Network `view:"-" desc:"the network to apply parameters to"`

	// [view: -] list of hyper parameters compiled from the network parameters, using the layers and projections from the network, so that the same styling logic as for regular parameters can be used
	NetHypers params.Flex `` /* 198-byte string literal not displayed */

	// print out messages for each parameter that is set
	SetMsg bool `desc:"print out messages for each parameter that is set"`
}

NetParams handles standard parameters for a Network only (use econfig and a Config struct for other configuration params) Assumes a Set named "Base" has the base-level parameters, which are always applied first, followed optionally by additional Set(s) that can have different parameters to try.

func (*NetParams) Config added in v1.4.18

func (pr *NetParams) Config(pars netparams.Sets, extraSheets, tag string, net Network)

Config configures the ExtraSheets, Tag, and Network fields

func (*NetParams) Name added in v1.4.18

func (pr *NetParams) Name() string

Name returns name of current set of parameters, including Tag. If ExtraSheets is empty then it returns "Base", otherwise returns ExtraSheets.

func (*NetParams) RunName added in v1.4.18

func (pr *NetParams) RunName(startRun int) string

RunName returns the standard name for a simulation run, based on params Name() and the starting run number if > 0 (large models are often run separately).

func (*NetParams) SetAll added in v1.4.18

func (pr *NetParams) SetAll() error

SetAll sets all parameters, using the "Base" Set then any ExtraSheets. Does a Validate call first.
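
A minimal sketch of configuring and applying NetParams, assuming paramSets is a netparams.Sets containing a "Base" set and net is an emer.Network:

	var pars emer.NetParams
	pars.Config(paramSets, "", "", net) // no ExtraSheets, no Tag
	if err := pars.SetAll(); err != nil {
		fmt.Println(err)
	}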

func (*NetParams) SetAllSheet added in v1.4.18

func (pr *NetParams) SetAllSheet(sheetName string) error

SetAllSheet sets parameters for given Sheet name to the Network

func (*NetParams) SetNetworkMap added in v1.4.18

func (pr *NetParams) SetNetworkMap(net Network, vals map[string]any) error

SetNetworkMap applies params from given map of values. The map keys are Selector:Path and the value is the value to apply, as a string.
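
A minimal sketch, assuming pr is a configured *emer.NetParams and net is an emer.Network; the selector and path in the map key are illustrative only, following the params Selector:Path convention:

	err := pr.SetNetworkMap(net, map[string]any{
		"#Hidden:Layer.Inhib.Layer.Gi": "1.1", // hypothetical selector and path
	})
	if err != nil {
		fmt.Println(err)
	}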

func (*NetParams) SetNetworkSheet added in v1.4.18

func (pr *NetParams) SetNetworkSheet(net Network, sh *params.Sheet, setName string)

SetNetworkSheet applies params from given sheet

func (*NetParams) Validate added in v1.4.18

func (pr *NetParams) Validate() error

Validate checks that the Network has been set

type NetSize added in v1.1.51

type NetSize params.Flex

NetSize is a network schema for holding params for layer sizes. Values can be queried for getting sizes when configuring the network. Uses params.Flex to support flexible parameter specification.

func (*NetSize) AddLayers added in v1.1.51

func (ns *NetSize) AddLayers(names []string, class string)

AddLayers adds layer(s) of given class -- most efficient to add each class separately en masse.

func (*NetSize) ApplySheet added in v1.1.51

func (ns *NetSize) ApplySheet(sheet *params.Sheet, setMsg bool)

ApplySheet applies given sheet of parameters to each layer

func (*NetSize) JSONString added in v1.1.51

func (ns *NetSize) JSONString() string

func (*NetSize) LayX added in v1.1.51

func (ns *NetSize) LayX(name string, def int) int

LayX returns the X value = horizontal size of 2D layer or number of pools (outer dimension) for 4D layer, for the given layer, if it is set there. Otherwise returns the provided default value.

func (*NetSize) LayY added in v1.1.51

func (ns *NetSize) LayY(name string, def int) int

LayY returns the Y value = vertical size of 2D layer or number of pools (outer dimension) for 4D layer, for the given layer, if it is set there. Otherwise returns the provided default value.

func (*NetSize) Layer added in v1.1.51

func (ns *NetSize) Layer(name string) (*LaySize, error)

Layer returns the layer size for given layer name -- returns nil if not found, in which case an error is emitted and returned.

func (*NetSize) PoolX added in v1.1.51

func (ns *NetSize) PoolX(name string, def int) int

PoolX returns the Pool X value (4D inner dim) = size of pool in units for the given layer, if it is set there. Otherwise returns the provided default value.

func (*NetSize) PoolY added in v1.1.51

func (ns *NetSize) PoolY(name string, def int) int

PoolY returns the Pool Y value (4D inner dim) = size of pool in units for the given layer, if it is set there. Otherwise returns the provided default value.

type Network

type Network interface {
	// InitName MUST be called to initialize the network's pointer to itself as an emer.Network
	// which enables the proper interface methods to be called.  Also sets the name.
	InitName(net Network, name string)

	// Name() returns name of the network
	Name() string

	// Label satisfies the gi.Labeler interface for getting the name of objects generically
	Label() string

	// NLayers returns the number of layers in the network
	NLayers() int

	// Layer returns layer (as emer.Layer interface) at given index -- does not
	// do extra bounds checking
	Layer(idx int) Layer

	// LayerByName returns layer of given name, nil if not found.
	// Layer names must be unique and a map is used so this is a fast operation
	LayerByName(name string) Layer

	// LayerByNameTry returns layer of given name,
	// returns error if not found.
	// Layer names must be unique and a map is used so this is a fast operation
	LayerByNameTry(name string) (Layer, error)

	// Defaults sets default parameter values for everything in the Network
	Defaults()

	// UpdateParams() updates parameter values for all Network parameters,
	// based on any other params that might have changed.
	UpdateParams()

	// ApplyParams applies given parameter style Sheet to layers and prjns in this network.
	// Calls UpdateParams on anything set to ensure derived parameters are all updated.
	// If setMsg is true, then a message is printed to confirm each parameter that is set.
	// it always prints a message if a parameter fails to be set.
	// returns true if any params were set, and error if there were any errors.
	ApplyParams(pars *params.Sheet, setMsg bool) (bool, error)

	// NonDefaultParams returns a listing of all parameters in the Network that
	// are not at their default values -- useful for setting param styles etc.
	NonDefaultParams() string

	// AllParams returns a listing of all parameters in the Network
	AllParams() string

	// KeyLayerParams returns a listing for all layers in the network,
	// of the most important layer-level params (specific to each algorithm).
	KeyLayerParams() string

	// KeyPrjnParams returns a listing for all Recv projections in the network,
	// of the most important projection-level params (specific to each algorithm).
	KeyPrjnParams() string

	// UnitVarNames returns a list of variable names available on the units in this network.
	// This list determines what is shown in the NetView (and the order of vars list).
	// Not all layers need to support all variables, but must safely return mat32.NaN() for
	// unsupported ones.
	// This is typically a global list so do not modify!
	UnitVarNames() []string

	// UnitVarProps returns a map of unit variable properties, with the key being the
	// name of the variable, and the value gives a space-separated list of
	// go-tag-style properties for that variable.
	// The NetView recognizes the following properties:
	// range:"##" = +- range around 0 for default display scaling
	// min:"##" max:"##" = min, max display range
	// auto-scale:"+" or "-" = use automatic scaling instead of fixed range or not.
	// zeroctr:"+" or "-" = control whether zero-centering is used
	// desc:"txt" tooltip description of the variable
	// Note: this is typically a global list so do not modify!
	UnitVarProps() map[string]string

	// SynVarNames returns the names of all the variables on the synapses in this network.
	// This list determines what is shown in the NetView (and the order of vars list).
	// Not all projections need to support all variables, but must safely return mat32.NaN() for
	// unsupported ones.
	// This is typically a global list so do not modify!
	SynVarNames() []string

	// SynVarProps returns a map of synapse variable properties, with the key being the
	// name of the variable, and the value gives a space-separated list of
	// go-tag-style properties for that variable.
	// The NetView recognizes the following properties:
	// range:"##" = +- range around 0 for default display scaling
	// min:"##" max:"##" = min, max display range
	// auto-scale:"+" or "-" = use automatic scaling instead of fixed range or not.
	// zeroctr:"+" or "-" = control whether zero-centering is used
	// Note: this is typically a global list so do not modify!
	SynVarProps() map[string]string

	// WriteWtsJSON writes network weights (and any other state that adapts with learning)
	// to JSON-formatted output.
	WriteWtsJSON(w io.Writer) error

	// ReadWtsJSON reads network weights (and any other state that adapts with learning)
	// from JSON-formatted input.  Reads into a temporary weights.Network structure that
	// is then passed to SetWts to actually set the weights.
	ReadWtsJSON(r io.Reader) error

	// SetWts sets the weights for this network from weights.Network decoded values
	SetWts(nw *weights.Network) error

	// SaveWtsJSON saves network weights (and any other state that adapts with learning)
	// to a JSON-formatted file.  If filename has .gz extension, then file is gzip compressed.
	SaveWtsJSON(filename gi.FileName) error

	// OpenWtsJSON opens network weights (and any other state that adapts with learning)
	// from a JSON-formatted file.  If filename has .gz extension, then file is gzip uncompressed.
	OpenWtsJSON(filename gi.FileName) error

	// Bounds returns the minimum and maximum display coordinates of the network for 3D display
	Bounds() (min, max mat32.Vec3)

	// VarRange returns the min / max values for given variable
	VarRange(varNm string) (min, max float32, err error)

	// LayersByClass returns a list of layer names by given class(es).
	// Lists are compiled when network Build() function called.
	// The layer Type is always included as a Class, along with any other
	// space-separated strings specified in Class for parameter styling, etc.
	// If no classes are passed, all layer names in order are returned.
	LayersByClass(classes ...string) []string

	// MaxParallelData returns the maximum number of data inputs that can be
	// processed in parallel by the network.
	// The NetView supports display of up to this many data elements.
	MaxParallelData() int

	// NParallelData returns the current number of data inputs currently being
	// processed in parallel by the network.
	// Logging supports recording each of these where appropriate.
	NParallelData() int
}

Network defines the basic interface for a neural network, used for managing the structural elements of a network, and for visualization, I/O, etc
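
A minimal sketch of generic traversal through the Network interface, assuming net is an emer.Network:

	for li := 0; li < net.NLayers(); li++ {
		ly := net.Layer(li)
		if ly.IsOff() {
			continue // skip lesioned layers
		}
		fmt.Println(li, ly.Name(), ly.Type().String())
	}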

type Params added in v1.1.51

type Params struct {

	// [view: no-inline] full collection of param sets to use
	Params params.Sets `view:"no-inline" desc:"full collection of param sets to use"`

	// optional additional set(s) of parameters to apply after Base -- can use multiple names separated by spaces (don't put spaces in Set names!)
	ExtraSets string `` /* 146-byte string literal not displayed */

	// optional additional tag to add to file names, logs to identify params / run config
	Tag string `desc:"optional additional tag to add to file names, logs to identify params / run config"`

	// [view: -] map of objects to apply parameters to -- the key is the name of the Sheet for each object, e.g.,
	Objects map[string]any `` /* 148-byte string literal not displayed */

	// [view: -] list of hyper parameters compiled from the network parameters, using the layers and projections from the network, so that the same styling logic as for regular parameters can be used
	NetHypers params.Flex `` /* 198-byte string literal not displayed */

	// print out messages for each parameter that is set
	SetMsg bool `desc:"print out messages for each parameter that is set"`
}

Params handles standard parameters for a Network and other objects. Assumes a Set named "Base" has the base-level parameters, which are always applied first, followed optionally by additional Set(s) that can have different parameters to try.

func (*Params) AddLayers added in v1.1.51

func (pr *Params) AddLayers(names []string, class string)

AddLayers adds layer(s) of given class to the NetSize for sizing params. Most efficient to add each class separately en masse.

func (*Params) AddNetSize added in v1.1.51

func (pr *Params) AddNetSize() *NetSize

AddNetSize adds a new Network Schema object to those configured by params. The network schema can be retrieved using NetSize() method, and also the direct LayX, ..Y, PoolX, ..Y methods can be used to directly access values.
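
A minimal sketch, assuming pr is a *emer.Params and the layer names and default size are illustrative:

	ns := pr.AddNetSize()
	ns.AddLayers([]string{"Hidden1", "Hidden2"}, "Hidden")
	fmt.Println(pr.LayY("Hidden1", 10)) // falls back to the default of 10 if not set in params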

func (*Params) AddNetwork added in v1.1.51

func (pr *Params) AddNetwork(net Network)

AddNetwork adds network to those configured by params -- replaces any existing network that was set previously.

func (*Params) AddObject added in v1.1.51

func (pr *Params) AddObject(name string, object any)

AddObject adds given object with given sheet name that applies to this object. It is based on a map keyed on the name, so any existing object is replaced (safe to call repeatedly).

func (*Params) AddSim added in v1.1.51

func (pr *Params) AddSim(sim any)

AddSim adds Sim object to those configured by params -- replaces any existing.
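
Taken together, a minimal sketch of typical Params usage, assuming mySets is a params.Sets containing a "Base" set, net is an emer.Network, and sim is the simulation object; SetAll is described below:

	var pr emer.Params
	pr.Params = mySets
	pr.AddNetwork(net)
	pr.AddSim(&sim)
	if err := pr.SetAll(); err != nil { // applies "Base" plus any ExtraSets
		fmt.Println(err)
	}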

func (*Params) LayX added in v1.1.51

func (pr *Params) LayX(name string, def int) int

LayX returns the X value = horizontal size of 2D layer or number of pools (outer dimension) for 4D layer, for given layer from NetSize, if it is set there. Otherwise returns the provided default value.

func (*Params) LayY added in v1.1.51

func (pr *Params) LayY(name string, def int) int

LayY returns the Y value = vertical size of 2D layer or number of pools (outer dimension) for 4D layer, for given layer from NetSize, if it is set there. Otherwise returns the provided default value.

func (*Params) Name added in v1.1.51

func (pr *Params) Name() string

Name returns name of current set of parameters, including Tag. If ExtraSets is empty then it returns "Base", otherwise returns ExtraSets.

func (*Params) NetSize added in v1.1.51

func (pr *Params) NetSize() *NetSize

NetSize returns the NetSize network size configuration object, or nil if it was not added.

func (*Params) PoolX added in v1.1.51

func (pr *Params) PoolX(name string, def int) int

PoolX returns the Pool X value (4D inner dim) = size of pool in units for given layer from NetSize, if it is set there. Otherwise returns the provided default value.

func (*Params) PoolY added in v1.1.51

func (pr *Params) PoolY(name string, def int) int

PoolY returns the Pool Y value (4D inner dim) = size of pool in units for given layer from NetSize, if it is set there. Otherwise returns the provided default value.

func (*Params) RunName added in v1.3.3

func (pr *Params) RunName(startRun int) string

RunName returns the standard name for a simulation run, based on params Name() and the starting run number if > 0 (large models are often run separately).

func (*Params) SetAll added in v1.1.51

func (pr *Params) SetAll() error

SetAll sets all parameters, using "Base" Set then any ExtraSets, for all the Objects that have been added. Does a Validate call first.

func (*Params) SetAllSet added in v1.1.51

func (pr *Params) SetAllSet(setName string) error

SetAllSet sets parameters for given Set name to all Objects

func (*Params) SetNetworkMap added in v1.4.14

func (pr *Params) SetNetworkMap(net Network, vals map[string]any) error

SetNetworkMap applies params from given map of values. The map keys are Selector:Path and the value is the value to apply, as a string.

func (*Params) SetNetworkSheet added in v1.4.14

func (pr *Params) SetNetworkSheet(net Network, sh *params.Sheet, setName string)

SetNetworkSheet applies params from given sheet

func (*Params) SetObject added in v1.1.51

func (pr *Params) SetObject(objName string) error

SetObject sets parameters, using "Base" Set then any ExtraSets, for the given object name (e.g., "Network" or "Sim" etc). Does not do Validate or collect hyper parameters.

func (*Params) SetObjectSet added in v1.1.51

func (pr *Params) SetObjectSet(objName, setName string) error

SetObjectSet sets parameters for given Set name to given object

func (*Params) Validate added in v1.1.51

func (pr *Params) Validate() error

Validate checks that there are sheets with the names for the Objects that have been added.

type Prjn

type Prjn interface {
	params.Styler // TypeName, Name, and Class methods for parameter styling

	// Init MUST be called to initialize the prjn's pointer to itself as an emer.Prjn
	// which enables the proper interface methods to be called.
	Init(prjn Prjn)

	// SendLay returns the sending layer for this projection
	SendLay() Layer

	// RecvLay returns the receiving layer for this projection
	RecvLay() Layer

	// Pattern returns the pattern of connectivity for interconnecting the layers
	Pattern() prjn.Pattern

	// SetPattern sets the pattern of connectivity for interconnecting the layers.
	// Returns Prjn so it can be chained to set other properties too
	SetPattern(pat prjn.Pattern) Prjn

	// Type returns the functional type of projection according to PrjnType (extensible in
	// more specialized algorithms)
	Type() PrjnType

	// SetType sets the functional type of projection according to PrjnType
	// Returns Prjn so it can be chained to set other properties too
	SetType(typ PrjnType) Prjn

	// PrjnTypeName returns the string rep of functional type of projection
	// according to PrjnType (extensible in more specialized algorithms, by
	// redefining this method as needed).
	PrjnTypeName() string

	// SetClass sets CSS-style class name(s) for this projection (space-separated if multiple)
	// Returns Prjn so it can be chained to set other properties too
	SetClass(cls string) Prjn

	// AddClass adds a CSS-style class name(s) for this prjn,
	// ensuring that it is not a duplicate, and properly space separated.
	AddClass(cls string)

	// Label satisfies the gi.Labeler interface for getting the name of objects generically
	Label() string

	// IsOff returns true if projection or either send or recv layer has been turned Off.
	// Useful for experimentation
	IsOff() bool

	// SetOff sets the projection Off status (i.e., lesioned). Careful: Layer.SetOff(true) will
	// reactivate that layer's projections, so projection-level lesioning should always be called
	// after layer-level lesioning.
	SetOff(off bool)

	// SynVarNames returns the names of all the variables on the synapse
	// This is typically a global list so do not modify!
	SynVarNames() []string

	// SynVarProps returns a map of synapse variable properties, with the key being the
	// name of the variable, and the value gives a space-separated list of
	// go-tag-style properties for that variable.
	// The NetView recognizes the following properties:
	// range:"##" = +- range around 0 for default display scaling
	// min:"##" max:"##" = min, max display range
	// auto-scale:"+" or "-" = use automatic scaling instead of fixed range or not.
	// zeroctr:"+" or "-" = control whether zero-centering is used
	// Note: this is a global list so do not modify!
	SynVarProps() map[string]string

	// SynIdx returns the index of the synapse between given send, recv unit indexes
	// (1D, flat indexes). Returns -1 if synapse not found between these two neurons.
	// This requires searching within connections for receiving unit (a bit slow).
	SynIdx(sidx, ridx int) int

	// SynVarIdx returns the index of given variable within the synapse,
	// according to *this prjn's* SynVarNames() list (using a map to lookup index),
	// or -1 and error message if not found.
	SynVarIdx(varNm string) (int, error)

	// SynVarNum returns the number of synapse-level variables
	// for this prjn.  This is needed for extending indexes in derived types.
	SynVarNum() int

	// Syn1DNum returns the number of synapses for this prjn as a 1D array.
	// This is the max idx for SynVal1D and the number of vals set by SynVals.
	Syn1DNum() int

	// SynVal1D returns value of given variable index (from SynVarIdx) on given SynIdx.
	// Returns NaN on invalid index.
	// This is the core synapse var access method used by other methods,
	// so it is the only one that needs to be updated for derived layer types.
	SynVal1D(varIdx int, synIdx int) float32

	// SynVals sets values of given variable name for each synapse, using the natural ordering
	// of the synapses (sender based for Leabra),
	// into given float32 slice (only resized if not big enough).
	// Returns error on invalid var name.
	SynVals(vals *[]float32, varNm string) error

	// SynVal returns value of given variable name on the synapse
	// between given send, recv unit indexes (1D, flat indexes).
	// Returns mat32.NaN() for access errors.
	SynVal(varNm string, sidx, ridx int) float32

	// SetSynVal sets value of given variable name on the synapse
	// between given send, recv unit indexes (1D, flat indexes).
	// Typically only supports base synapse variables and is not extended
	// for derived types.
	// Returns error for access errors.
	SetSynVal(varNm string, sidx, ridx int, val float32) error

	// Defaults sets default parameter values for all Prjn parameters
	Defaults()

	// UpdateParams() updates parameter values for all Prjn parameters,
	// based on any other params that might have changed.
	UpdateParams()

	// ApplyParams applies given parameter style Sheet to this projection.
	// Calls UpdateParams if anything set to ensure derived parameters are all updated.
	// If setMsg is true, then a message is printed to confirm each parameter that is set.
	// it always prints a message if a parameter fails to be set.
	// returns true if any params were set, and error if there were any errors.
	ApplyParams(pars *params.Sheet, setMsg bool) (bool, error)

	// NonDefaultParams returns a listing of all parameters in the Projection that
	// are not at their default values -- useful for setting param styles etc.
	NonDefaultParams() string

	// AllParams returns a listing of all parameters in the Projection
	AllParams() string

	// WriteWtsJSON writes the weights from this projection from the receiver-side perspective
	// in a JSON text format.  We build in the indentation logic to make it much faster and
	// more efficient.
	WriteWtsJSON(w io.Writer, depth int)

	// ReadWtsJSON reads the weights from this projection from the receiver-side perspective
	// in a JSON text format.  This is for a set of weights that were saved *for one prjn only*
	// and is not used for the network-level ReadWtsJSON, which reads into a separate
	// structure -- see SetWts method.
	ReadWtsJSON(r io.Reader) error

	// SetWts sets the weights for this projection from weights.Prjn decoded values
	SetWts(pw *weights.Prjn) error

	// Build constructs the full connectivity among the layers as specified in this projection.
	Build() error
}

Prjn defines the basic interface for a projection which connects two layers. Name is set automatically to: SendLay().Name() + "To" + RecvLay().Name()
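
A minimal sketch of generic synapse-value access through the Prjn interface, assuming pj is an emer.Prjn, sidx and ridx are flat (1D) unit indexes, and "Wt" is one of its SynVarNames:

	wt := pj.SynVal("Wt", sidx, ridx) // mat32.NaN() if the synapse does not exist
	fmt.Println(pj.Name(), wt)

	var wts []float32
	if err := pj.SynVals(&wts, "Wt"); err == nil {
		fmt.Println(len(wts)) // one value per synapse, in natural ordering
	}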

func RecvNameTry added in v1.3.32

func RecvNameTry(l Layer, recv string) (Prjn, error)

func RecvNameTypeTry added in v1.3.32

func RecvNameTypeTry(l Layer, recv, typ string) (Prjn, error)

func SendNameTry added in v1.3.32

func SendNameTry(l Layer, sender string) (Prjn, error)

These functions are kept here to make it easier for other packages to implement the emer.Layer interface by just calling these methods.

func SendNameTypeTry added in v1.3.32

func SendNameTypeTry(l Layer, sender, typ string) (Prjn, error)

type PrjnType

type PrjnType int32

PrjnType is the type of the projection (extensible for more specialized algorithms). Class parameter styles automatically key off of these types.

const (
	// Forward is a feedforward, bottom-up projection from sensory inputs to higher layers
	Forward PrjnType = iota

	// Back is a feedback, top-down projection from higher layers back to lower layers
	Back

	// Lateral is a lateral projection within the same layer / area
	Lateral

	// Inhib is an inhibitory projection that drives inhibitory synaptic inputs instead of excitatory
	Inhib

	PrjnTypeN
)

The projection types

func (*PrjnType) FromString

func (i *PrjnType) FromString(s string) error

func (PrjnType) MarshalJSON

func (ev PrjnType) MarshalJSON() ([]byte, error)

func (PrjnType) String

func (i PrjnType) String() string

func (*PrjnType) UnmarshalJSON

func (ev *PrjnType) UnmarshalJSON(b []byte) error

type Prjns

type Prjns []Prjn

Prjns is a slice of projections

func (*Prjns) Add

func (pl *Prjns) Add(p Prjn)

Add adds a projection to the list

func (*Prjns) ElemLabel

func (pl *Prjns) ElemLabel(idx int) string

ElemLabel satisfies the gi.SliceLabeler interface to provide labels for slice elements

func (*Prjns) Recv

func (pl *Prjns) Recv(recv Layer) (Prjn, bool)

Recv finds the projection with given recv layer

func (*Prjns) RecvName

func (pl *Prjns) RecvName(recv string) Prjn

RecvName finds the projection with given recv layer name; returns nil if not found. See the Try version for error checking.

func (*Prjns) RecvNameTry

func (pl *Prjns) RecvNameTry(recv string) (Prjn, error)

RecvNameTry finds the projection with given recv layer name. Returns an error if not found.

func (*Prjns) RecvNameTypeTry added in v1.0.0

func (pl *Prjns) RecvNameTypeTry(recv, typ string) (Prjn, error)

RecvNameTypeTry finds the projection with given recv layer name and Type string. Returns an error if not found.

func (*Prjns) Send

func (pl *Prjns) Send(send Layer) (Prjn, bool)

Send finds the projection with given send layer

func (*Prjns) SendName

func (pl *Prjns) SendName(sender string) Prjn

SendName finds the projection with given send layer name; returns nil if not found. See the Try version for error checking.

func (*Prjns) SendNameTry

func (pl *Prjns) SendNameTry(sender string) (Prjn, error)

SendNameTry finds the projection with given send layer name. Returns an error if not found.

func (*Prjns) SendNameTypeTry added in v1.0.0

func (pl *Prjns) SendNameTypeTry(sender, typ string) (Prjn, error)

SendNameTypeTry finds the projection with given send layer name and Type string. Returns an error if not found.
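
A minimal sketch of looking up a projection in a Prjns list by its sending layer name, assuming pjs is an emer.Prjns and "Input" is an illustrative layer name:

	if pj, err := pjs.SendNameTry("Input"); err == nil {
		fmt.Println(pj.Name())
	}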
