leabra

package module
v0.0.0-...-da82786
Warning

This package is not in the latest version of its module.

Published: Feb 21, 2022 License: BSD-3-Clause Imports: 0 Imported by: 0

README

leabra-sleep

Modified Leabra for Sleep

Documentation

Overview

Package leabra is the overall repository for all standard Leabra algorithm code implemented in the Go language (golang) with Python wrappers.

This top-level of the repository has no functional code -- everything is organized into the following sub-repositories:

* leabra: the core standard implementation with the minimal set of standard mechanisms exclusively using rate-coded neurons -- there are too many differences with spiking, so that is now separated out into a different package.

* deep: the DeepLeabra version which performs predictive learning by attempting to predict the activation states over the Pulvinar nucleus of the thalamus (in posterior sensory cortex), which are driven phasically every 100 msec by deep layer 5 intrinsic bursting (5IB) neurons that have strong focal (essentially 1-to-1) connections onto the Pulvinar Thalamic Relay Cell (TRC) neurons.

* examples: these actually compile into runnable programs and provide the starting point for your own simulations. examples/ra25 is the place to start for the most basic standard template of a model that learns a small set of input / output patterns in a classic supervised-learning manner.

* python: follow the instructions in the README.md file to build a python wrapper that will allow you to fully control the models using Python.

Directories

Path Synopsis
chans
Package chans provides standard neural conductance channels for computing a point-neuron approximation based on the standard equivalent RC circuit model of a neuron (i.e., basic Ohm's law equations).
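The equivalent-circuit computation can be sketched in a few lines of Go. This is an illustrative toy, not the chans API: the channel values, names, and the simple Euler step are assumptions for the sketch.

```go
package main

import "fmt"

// Chan is one conductance channel in the equivalent RC circuit.
// By Ohm's law it contributes current G * (E - Vm), where E is the
// channel's reversal (driving) potential.
type Chan struct {
	G float64 // conductance (normalized)
	E float64 // reversal potential (normalized 0..1)
}

// VmStep performs one Euler integration step of the membrane
// potential Vm given the current set of channels.
func VmStep(vm, dt float64, chans []Chan) float64 {
	var i float64
	for _, c := range chans {
		i += c.G * (c.E - vm) // sum channel currents
	}
	return vm + dt*i
}

func main() {
	// hypothetical excitatory, leak, and inhibitory channels
	chans := []Chan{{G: 0.4, E: 1.0}, {G: 0.1, E: 0.3}, {G: 0.2, E: 0.25}}
	vm := 0.3
	for t := 0; t < 100; t++ {
		vm = VmStep(vm, 0.1, chans)
	}
	fmt.Printf("settled Vm = %.3f\n", vm)
}
```

At steady state Vm equals the conductance-weighted mean of the reversal potentials, sum(G*E)/sum(G), which for these values is 0.48/0.7 ≈ 0.686.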
deep
Package deep provides the DeepLeabra variant of Leabra, which performs predictive learning by attempting to predict the activation states over the Pulvinar nucleus of the thalamus (in posterior sensory cortex), which are driven phasically every 100 msec by deep layer 5 intrinsic bursting (5IB) neurons that have strong focal (essentially 1-to-1) connections onto the Pulvinar Thalamic Relay Cell (TRC) neurons.
examples
bench
bench runs a benchmark model with 5 layers (3 hidden, Input, Output) all of the same size, for benchmarking different size networks.
deep_fsa
deep_fsa runs a DeepLeabra network on the classic Reber grammar finite state automaton problem.
hip
hip runs a hippocampus model on the AB-AC paired associate learning task
hip_bench
hip_bench runs a hippocampus model for testing parameters and new learning ideas
ra25
ra25 runs a simple random-associator four-layer leabra network that uses the standard supervised learning paradigm to learn mappings between 25 random input / output patterns defined over 5x5 input / output layers (i.e., 25 units)
sir2
sir illustrates the dynamic gating of information into PFC active maintenance, by the basal ganglia (BG).
fffb
Package fffb provides feedforward (FF) and feedback (FB) inhibition (FFFB) based on average (or maximum) excitatory netinput (FF) and activation (FB).
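The FFFB computation reduces to a few lines: a thresholded feedforward term driven by average net input, plus a time-integrated feedback term driven by average activation. This sketch uses illustrative parameter values and a single simplified integrator; it is not the fffb package API.

```go
package main

import (
	"fmt"
	"math"
)

// FFFB holds gains for feedforward (FF) and feedback (FB) inhibition.
// Values used below are illustrative, not the package defaults.
type FFFB struct {
	Gi   float64 // overall inhibition gain
	FF   float64 // feedforward gain on average net input
	FB   float64 // feedback gain on average activation
	FF0  float64 // threshold below which FF inhibition is zero
	FBDt float64 // integration rate for the feedback term
	FBi  float64 // integrated feedback inhibition (state)
}

// GiFmNetAct returns the total inhibitory conductance for one cycle,
// from the layer's average excitatory net input and average activation.
func (f *FFFB) GiFmNetAct(avgNet, avgAct float64) float64 {
	ffi := f.FF * math.Max(avgNet-f.FF0, 0) // thresholded feedforward drive
	f.FBi += f.FBDt * (f.FB*avgAct - f.FBi) // feedback drive, integrated over cycles
	return f.Gi * (ffi + f.FBi)
}

func main() {
	f := &FFFB{Gi: 1.8, FF: 1.0, FB: 1.0, FF0: 0.1, FBDt: 0.7}
	for cyc := 0; cyc < 5; cyc++ {
		gi := f.GiFmNetAct(0.5, 0.2) // feedback term ramps up over cycles
		fmt.Printf("cycle %d: gi = %.3f\n", cyc, gi)
	}
}
```

The FF0 threshold means weak net input produces no feedforward inhibition at all, while the integrated FB term makes inhibition track the layer's own activation with a short lag.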
hip
Package hip provides special hippocampus algorithms for implementing the Theta-phase hippocampus model from Ketz, Morkonda, & O'Reilly (2013).
knadapt
Package knadapt provides code for sodium (Na) gated potassium (K) currents that drive adaptation (accommodation) in neural firing.
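The adaptation dynamic can be sketched as a first-order process: the K conductance rises with activity (tracking Na influx) and decays back toward zero with a time constant. This is a toy single-channel version with made-up parameter values; the actual knadapt package uses multiple (fast, medium, slow) time scales and a different parameterization.

```go
package main

import "fmt"

// KNa is a toy Na-gated K adaptation conductance: activity drives it
// up toward Max (at rate Rise), and it decays back with time constant
// Tau. Parameter values here are illustrative only.
type KNa struct {
	Rise float64 // activity-driven rise rate
	Max  float64 // maximum conductance
	Tau  float64 // decay time constant (in cycles)
	G    float64 // current adaptation conductance (state)
}

// Step updates the conductance given the neuron's current activation.
func (k *KNa) Step(act float64) {
	k.G += k.Rise*act*(k.Max-k.G) - k.G/k.Tau
}

func main() {
	k := &KNa{Rise: 0.05, Max: 0.3, Tau: 100}
	for t := 0; t < 200; t++ { // sustained firing: adaptation builds up
		k.Step(1.0)
	}
	fmt.Printf("after sustained activity: gKNa = %.3f\n", k.G)
	for t := 0; t < 200; t++ { // silence: adaptation decays away
		k.Step(0.0)
	}
	fmt.Printf("after silence:            gKNa = %.3f\n", k.G)
}
```

Because the built-up conductance adds to the K (inhibitory-like) current, sustained firing progressively damps the neuron's response, which is the accommodation effect the package implements.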
leabra
Package leabra provides the basic reference leabra implementation, for rate-coded activations and standard error-driven learning.
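The error-driven part of leabra learning is classically described by the contrastive-Hebbian (CHL / GeneRec) rule: the weight change is the difference between the sender-receiver coproduct in the plus (outcome) phase and in the minus (expectation) phase. The current implementation uses the related XCAL formulation; this sketch shows only the classic form, with made-up activation values.

```go
package main

import "fmt"

// CHLDWt returns the contrastive-Hebbian weight change for one synapse:
// lrate * (plus-phase coproduct - minus-phase coproduct). It is positive
// when the outcome drives the sender-receiver pair more strongly than
// the network's own expectation did.
func CHLDWt(lrate, sMinus, rMinus, sPlus, rPlus float64) float64 {
	return lrate * (sPlus*rPlus - sMinus*rMinus)
}

func main() {
	// expectation underestimated the outcome -> weight increases
	fmt.Printf("dwt = %+.4f\n", CHLDWt(0.04, 0.5, 0.3, 0.5, 0.8))
	// expectation overestimated the outcome -> weight decreases
	fmt.Printf("dwt = %+.4f\n", CHLDWt(0.04, 0.5, 0.8, 0.5, 0.3))
}
```

When expectation and outcome match, the two coproducts cancel and no learning occurs, which is what makes the rule error-driven rather than purely Hebbian.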
nxx1
Package nxx1 provides the Noisy-X-over-X-plus-1 activation function that well-characterizes the neural response function empirically, as a saturating sigmoid-like nonlinear response with an initial largely-linear regime.
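Without the noise, the core X-over-X-plus-1 function is a one-liner: zero below threshold, then gain*x/(gain*x+1) above it. The actual package convolves this with Gaussian noise to smooth the hard threshold; that convolution is omitted in this sketch, and the gain value below is illustrative.

```go
package main

import "fmt"

// XX1 is the deterministic core of Noisy-XX1. x is the net input
// relative to threshold: the response is zero at or below threshold,
// nearly linear for small positive x (gx/(gx+1) ~ gx), and saturates
// toward 1 for large x.
func XX1(gain, x float64) float64 {
	if x <= 0 {
		return 0
	}
	gx := gain * x
	return gx / (gx + 1)
}

func main() {
	for _, x := range []float64{-0.01, 0.0, 0.005, 0.01, 0.1, 1.0} {
		fmt.Printf("x = %6.3f -> act = %.3f\n", x, XX1(100, x))
	}
}
```

The half-activation point falls where gain*x = 1 (here x = 0.01), which is one way the gain parameter sets the sharpness of the response.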
pbwm
Package pbwm provides the prefrontal cortex basal ganglia working memory (PBWM) model of the basal ganglia (BG) and prefrontal cortex (PFC) circuitry that supports dynamic BG gating of PFC robust active maintenance.
