goNEAT

NeuroEvolution: evolving Artificial Neural Network topologies from scratch

Overview

This repository provides an implementation of the NeuroEvolution of Augmenting Topologies (NEAT) method written in Go.

NeuroEvolution (NE) is the artificial evolution of Neural Networks (NN) using genetic algorithms to find optimal NN parameters and topology. NeuroEvolution of NNs may involve a search for the optimal weights of connections between NN nodes as well as a search for the optimal topology of the resulting network graph. The NEAT method implemented in this work searches for both the optimal connection weights and the network graph topology for a given task (the number of NN nodes per layer and their interconnections).

Minimum requirements

Requirement    Notes
Go version     Go 1.15 or higher

Releases

Please do not depend on master as your production branch. Use releases instead.

Quick Start

You can evaluate the NEAT algorithm performance by running the following command:

cd $GOPATH/src/github.com/yaricom/goNEAT
go run executor.go -out ./out/xor -context ./data/xor.neat -genome ./data/xorstartgenes -experiment XOR

Or

make run-xor

The command above runs the XOR problem solver experiment and saves the collected data samples. You can use the saved experimental data for analysis with standard plotting libraries, as in the figure below.

The XOR results plot

The figure was created using Matplotlib. You can find more details in the Jupyter notebook.

Documentation

You can find the algorithm performance evaluation and related documentation in the project's wiki.

The goNEAT library saves the results of experiments in the NumPy NPZ format, which allows analysis of the collected experimental data samples with a variety of readily available Python libraries.

For your reference, we included a Jupyter notebook with an example analysis of the collected experimental data, which can be used as a starter kit to analyze data samples acquired from your own experiments.

Installation

Make sure you have at least Go 1.15.x installed on your system and execute the following command:


go get github.com/yaricom/goNEAT

For new projects, consider using v2 of the library with the following import:

import "github.com/AISystemsInc/goNEAT/v2"

Essential Packages

genetics package

The genetics package provides the genetic part of the NEAT algorithm, describing all the machinery related to genome mutation, mating, and speciation of the population of organisms.

It contains implementations of all the important types related to the NEAT algorithm:

  • Gene type in this system specifies a "Connection Gene."
  • MIMOControlGene type is the Multiple-Input Multiple-Output (MIMO) control gene, which allows the creation of modular genomes.
  • Genome type is the primary source of genotype information used to create a phenotype.
  • Organism type combines a Genotype (Genome) and a Phenotype (Network) with fitness information, i.e. the genotype and phenotype together.
  • Population type is a group of Organisms including their Species.
  • Species type is a group of similar Organisms. Reproduction takes place mostly within a single species, so that compatible organisms can mate.

Additionally, it contains a variety of utility functions to serialize/deserialize the types listed above using two supported data formats (see the genome-loading sketch after this list):

  • plain text
  • YAML
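
For example, the start genome of an experiment can be loaded from a plain-text genome file, similar to what the bundled experiment executor does. The sketch below is a minimal illustration: it assumes the imports "os" and "log" plus the genetics package, and it assumes a genetics.ReadGenome(r io.Reader, id int) reader as used by the executor; the exact function name and signature may differ between library versions.

// open the plain-text start genome shipped with the repository
genomeFile, err := os.Open("./data/xorstartgenes")
if err != nil {
    log.Fatal("failed to open start genome file: ", err)
}
defer genomeFile.Close()

// read the start genome (1 is the genome ID); ReadGenome is assumed here
startGenome, err := genetics.ReadGenome(genomeFile, 1)
if err != nil {
    log.Fatal("failed to read start genome: ", err)
}
log.Println("start genome loaded successfully")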

The current implementation supports sequential and parallel execution of an evolution epoch, which is controlled by the corresponding parameter in the NEAT context options.

math package

Package math defines standard mathematical primitives used by the NEAT algorithm as well as utility functions.

network package

Package network provides data structures and utilities to describe Artificial Neural Networks and network solvers.

The most important types are:

  • NNode type defines a node of the network and is a part of the organism's genotype as well as its phenotype.
  • Link type is a connection from one node to another with an associated weight.
  • Network type is a collection of all nodes within an organism's phenotype, which effectively defines the Neural Network topology.
  • Solver type defines the network solver interface, which allows propagation of activation waves through the underlying network graph.

The current implementation supports two types of network solvers (a usage sketch follows this list):

  • FastModularNetworkSolver is the network solver implementation intended for simulation of large neural networks.
  • The standard network solver implemented by the Network type itself.
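
A rough illustration of the solver workflow is sketched below: sensor values are loaded, the activation wave is propagated, and the outputs are read back. The method names LoadSensors, ForwardSteps, ReadOutputs, and Flush are assumptions based on the descriptions above, not confirmed signatures; check the network package documentation for the exact Solver interface.

// activate a phenotype through the generic Solver interface
// NOTE: the method names below are assumed, not confirmed against the package
func activate(solver network.Solver, inputs []float64) ([]float64, error) {
    // load sensor values (e.g. the two XOR inputs plus a bias)
    if err := solver.LoadSensors(inputs); err != nil {
        return nil, err
    }
    // propagate the activation wave through the network graph
    if _, err := solver.ForwardSteps(3); err != nil {
        return nil, err
    }
    outputs := solver.ReadOutputs()
    // reset the internal state before the next evaluation
    if _, err := solver.Flush(); err != nil {
        return nil, err
    }
    return outputs, nil
}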

experiment package

Package experiment defines standard evolutionary epoch evaluators and experimental data sample collectors. It provides a standardized approach to defining experiments using the NEAT algorithm implementation.

The most important type here is the GenerationEvaluator interface, which evaluates one generation (epoch) of evolution for a specific task. You can find example GenerationEvaluator implementations in the bundled experiments:

  • pole - single- and double-pole balancing experiments
  • xor - XOR solver experiment

The following code snippet demonstrates how to run experiments using different implementations of GenerationEvaluator and the Experiment.Execute method:

// create experiment
expt := experiment.Experiment{
    Id:       0,
    Trials:   make(experiment.Trials, neatOptions.NumRuns),
    RandSeed: seed,
}
var generationEvaluator experiment.GenerationEvaluator
switch *experimentName {
case "XOR":
    expt.MaxFitnessScore = 16.0 // as given by fitness function definition
    generationEvaluator = xor.NewXORGenerationEvaluator(outDir)
case "cart_pole":
    expt.MaxFitnessScore = 1.0 // as given by fitness function definition
    generationEvaluator = pole.NewCartPoleGenerationEvaluator(outDir, true, 500000)
case "cart_2pole_markov":
    expt.MaxFitnessScore = 1.0 // as given by fitness function definition
    generationEvaluator = pole.NewCartDoublePoleGenerationEvaluator(outDir, true, pole.ContinuousAction)
case "cart_2pole_non-markov":
    generationEvaluator = pole.NewCartDoublePoleGenerationEvaluator(outDir, false, pole.ContinuousAction)
default:
    log.Fatalf("Unsupported experiment: %s", *experimentName)
}

// prepare to execute
errChan := make(chan error)
ctx, cancel := context.WithCancel(context.Background())

// run the experiment in a separate goroutine
go func() {
    if err := expt.Execute(neat.NewContext(ctx, neatOptions), startGenome, generationEvaluator, nil); err != nil {
        errChan <- err
    } else {
        errChan <- nil
    }
}()
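
The snippet above only launches the experiment. One way to wait for its completion, and to stop it cleanly on Ctrl+C, is a plain select over the error channel and OS signals, roughly as the bundled executor does (this sketch assumes the additional imports "os", "os/signal", "syscall", and "log"):

// wait for the experiment to finish, or cancel it on SIGINT/SIGTERM
signals := make(chan os.Signal, 1)
signal.Notify(signals, syscall.SIGINT, syscall.SIGTERM)

select {
case sig := <-signals:
    log.Printf("got signal %v, terminating the experiment", sig)
    cancel()  // stop the running evolution epochs
    <-errChan // wait for the experiment goroutine to return
case err := <-errChan:
    if err != nil {
        log.Fatalf("experiment failed: %v", err)
    }
    log.Println("experiment completed successfully")
}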

For more details, take a look at the experiment executor implementation provided with the goNEAT library.

neat package

Package neat is an entry point to the NEAT algorithm. It defines the NEAT execution context and configuration options.

You can find all available configuration options in the Options type documentation.

The configuration options can be saved either using plain text or the YAML format. We recommend using the YAML format for new projects because it allows for a more flexible setup and detailed documentation of the configuration parameters.

Take a look at the example configuration file to get a better understanding.

The NEAT context options can be read as follows:

// Loading YAML options
optFile, err := os.Open("./data/xor_test.neat.yml")
if err != nil {
	return err
}
options, err := neat.LoadYAMLOptions(optFile)

Or with plain-text format:

// Loading plain-text options
optFile, err := os.Open("./data/xor_test.neat")
if err != nil {
	return err
}
options, err := neat.LoadNeatOptions(optFile)
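
Once loaded, the options are typically wrapped into the NEAT execution context and handed to the experiment, as in the executor snippet above. A minimal sketch, assuming the "context" import:

// wrap the loaded options into the NEAT execution context for the experiment run
ctx, cancel := context.WithCancel(context.Background())
defer cancel()

neatContext := neat.NewContext(ctx, options)
// neatContext is then passed to Experiment.Execute, as shown in the experiment package section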

Conclusion

The experiments described in this work confirm that the presented NEAT algorithm implementation can evolve new structures in Artificial Neural Networks (the XOR experiment) and can solve reinforcement learning tasks under conditions of incomplete knowledge (single-pole and double-pole balancing).

We hope you will find great applications for the provided NEAT algorithm implementation, as well as for the utilities to run experiments and collect relevant data samples, in your research and work projects.

Support this work

If you found this library helpful, please consider supporting further work on the project by donating.

You can help to evolve this project either by pressing Sponsor or by sending some funds to:

  • LTC: LPi2hvnMQLWy1BKbjtyPeEqVcfyPfQLErs
  • DOGE: D9u3YQJfpjYQT67ZQRub97jjgiiG7S3S6x

References

This source code is maintained and managed by Iaroslav Omelianenko.


Directories

Path           Synopsis
experiment     Package experiment defines standard evolutionary epoch evaluators and experimental data sample collectors.
experiments
  pole         Package pole provides the definition of the pole balancing experiments, a classic Reinforcement Learning task proposed by Richard Sutton and Charles Anderson.
  xor          Package xor defines the XOR experiment, which serves to check that network topology actually evolves and everything works as expected.
neat           Package neat implements the NeuroEvolution of Augmenting Topologies (NEAT) method, which can be used to evolve specific Artificial Neural Networks from scratch using genetic algorithms.
  genetics     Package genetics holds data holders and helper utilities used to implement the genetic evolution algorithm.
  math         Package math defines standard mathematical primitives used by the NEAT algorithm as well as utility functions.
  network      Package network provides data structures and utilities to describe Artificial Neural Networks and network solvers.
