eaopt

Version: v0.4.2
Published: Nov 16, 2020 License: MIT Imports: 11 Imported by: 7



eaopt is an evolutionary optimization library

Changelog

  • 11/11/18: a simple version of OpenAI's evolution strategy has been implemented; it's called OES.
  • 02/08/18: gago has now become eaopt. You can still do everything you could do before, but the scope is now larger than genetic algorithms. The goal is to implement many more evolutionary optimization algorithms on top of the existing codebase.

Example

The following example attempts to minimize the Drop-Wave function using a genetic algorithm. The Drop-Wave function is known to have a minimum value of -1 when each of its arguments is equal to 0.

package main

import (
    "fmt"
    m "math"
    "math/rand"

    "github.com/MaxHalford/eaopt"
)

// A Vector contains float64s.
type Vector []float64

// Evaluate a Vector with the Drop-Wave function which takes two variables as
// input and reaches a minimum of -1 in (0, 0). The function is simple so there
// isn't any error handling to do.
func (X Vector) Evaluate() (float64, error) {
    var (
        numerator   = 1 + m.Cos(12*m.Sqrt(m.Pow(X[0], 2)+m.Pow(X[1], 2)))
        denominator = 0.5*(m.Pow(X[0], 2)+m.Pow(X[1], 2)) + 2
    )
    return -numerator / denominator, nil
}

// Mutate a Vector by resampling each element from a normal distribution with
// probability 0.8.
func (X Vector) Mutate(rng *rand.Rand) {
    eaopt.MutNormalFloat64(X, 0.8, rng)
}

// Crossover a Vector with another Vector by applying uniform crossover.
func (X Vector) Crossover(Y eaopt.Genome, rng *rand.Rand) {
    eaopt.CrossUniformFloat64(X, Y.(Vector), rng)
}

// Clone a Vector to produce a new one that points to a different slice.
func (X Vector) Clone() eaopt.Genome {
    var Y = make(Vector, len(X))
    copy(Y, X)
    return Y
}

// VectorFactory returns a random vector by generating 2 values uniformly
// distributed between -10 and 10.
func VectorFactory(rng *rand.Rand) eaopt.Genome {
    return Vector(eaopt.InitUnifFloat64(2, -10, 10, rng))
}

func main() {
    // Instantiate a GA with a GAConfig
    var ga, err = eaopt.NewDefaultGAConfig().NewGA()
    if err != nil {
        fmt.Println(err)
        return
    }

    // Set the number of generations to run for
    ga.NGenerations = 10

    // Add a custom print function to track progress
    ga.Callback = func(ga *eaopt.GA) {
        fmt.Printf("Best fitness at generation %d: %f\n", ga.Generations, ga.HallOfFame[0].Fitness)
    }

    // Find the minimum
    err = ga.Minimize(VectorFactory)
    if err != nil {
        fmt.Println(err)
        return
    }
}

>>> Best fitness at generation 0: -0.550982
>>> Best fitness at generation 1: -0.924220
>>> Best fitness at generation 2: -0.987282
>>> Best fitness at generation 3: -0.987282
>>> Best fitness at generation 4: -0.987282
>>> Best fitness at generation 5: -0.987282
>>> Best fitness at generation 6: -0.987282
>>> Best fitness at generation 7: -0.997961
>>> Best fitness at generation 8: -0.999954
>>> Best fitness at generation 9: -0.999995
>>> Best fitness at generation 10: -0.999999

All the examples can be found in this repository.

Background

Evolutionary optimization algorithms are a subdomain of evolutionary computation. Their goal is to minimize/maximize a function without using any gradient information (usually because there isn't any gradient available). They share the common property of exploring the search space by breeding, mutating, evaluating, and sorting so-called individuals. Most evolutionary algorithms are designed to handle real-valued functions; however, in practice they are commonly used for handling more exotic problems. For example genetic algorithms can be used to find the optimal structure of a neural network.

eaopt provides implementations for various evolutionary optimization algorithms. Implementation-wise, the idea is that most (if not all) of said algorithms can be written as special cases of a genetic algorithm. This is made possible by using a generic definition of a genetic algorithm that allows the mutation, crossover, selection, and replacement procedures to be modified at will. The GA struct is thus the most flexible struct in eaopt; the other algorithms are written on top of it. If you don't find an algorithm that suits your needs you can easily write your own operators (as is done in most of the examples).

Features

  • Different evolutionary algorithms are available with a consistent API
  • You can practically do anything by using the GA struct
  • Speciation and migration procedures are available
  • Common genetic operators (mutation, crossover, selection, migration, speciation) are already implemented
  • Function evaluation can be done in parallel if your function is costly

Usage

General advice

  • Evolutionary algorithms are usually designed for solving specific kinds of problems. Take a look at the Minimize function of each method to get an idea of what type of function it can optimize.
  • Use the associated constructor function of each method to initialize it. For example use the NewPSO function instead of instantiating the PSO struct yourself. Along with making your life easier, these functions provide the added benefit of checking for parameter input errors.
  • If you're going to use the GA struct then be aware that some evolutionary operators are already implemented in eaopt (you don't necessarily have to reinvent the wheel).
  • Don't feel overwhelmed by the fact that algorithms are implemented as special cases of genetic algorithms. It doesn't matter if you just want to get things done, it just makes things easier under the hood.

Genetic algorithms

Overview

Genetic algorithms are the backbone of eaopt. Most of the other algorithms available in eaopt are implemented as special cases of GAs. A GA isn't an algorithm per se, but rather a blueprint which can be used to optimize any kind of problem.

In a nutshell, a GA solves an optimization problem by doing the following:

  1. Generate random solutions to a problem.
  2. Assign a fitness to each solution.
  3. Check if a new best solution has been found.
  4. Apply genetic operators following a given evolutionary model.
  5. Repeat from step 2 until the stopping criterion is satisfied.
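
The five steps can be sketched in a few lines of plain Go. This is an illustrative toy (a mutation-only scheme minimizing x² with made-up parameters), not eaopt's actual internals:

```go
package main

import (
	"fmt"
	"math/rand"
	"sort"
)

// evolve runs the five steps above with a mutation-only scheme on f(x) = x².
func evolve(seed int64) float64 {
	rng := rand.New(rand.NewSource(seed))
	fitness := func(x float64) float64 { return x * x }

	// 1. Generate random solutions.
	pop := make([]float64, 20)
	for i := range pop {
		pop[i] = rng.Float64()*20 - 10
	}
	best := pop[0]

	for gen := 0; gen < 50; gen++ {
		// 2. Assign a fitness to each solution by sorting on it.
		sort.Slice(pop, func(i, j int) bool { return fitness(pop[i]) < fitness(pop[j]) })
		// 3. Check if a new best solution has been found.
		if fitness(pop[0]) < fitness(best) {
			best = pop[0]
		}
		// 4. Apply genetic operators: replace the worse half with
		//    mutated copies of the better half.
		half := len(pop) / 2
		for i := half; i < len(pop); i++ {
			pop[i] = pop[i-half] + rng.NormFloat64()
		}
		// 5. Repeat until the stopping criterion (a fixed generation count).
	}
	return fitness(best)
}

func main() {
	fmt.Printf("best fitness: %f\n", evolve(42))
}
```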

This description is voluntarily vague. It is up to the user to define the problem and the genetic operators to use. Different categories of genetic operators exist:

  • Mutation operators modify an existing solution.
  • Crossover operators generate a new solution by combining two or more existing ones.
  • Selection operators select individuals that are to be evolved.
  • Migration swaps individuals between populations.
  • Speciation clusters individuals into subpopulations.

Popular stopping criteria include

  • a fixed number of generations,
  • a fixed duration,
  • an indicator that the population is stagnating.
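
As an illustration of the last criterion, here is a small stagnation check of the kind one could wire into an early-stopping hook. The stagnated helper and its parameters are hypothetical, not part of eaopt:

```go
package main

import "fmt"

// stagnated reports whether the best fitness hasn't improved by more than
// tol over the last k generations, given a history of best fitnesses
// (one entry per generation, minimization assumed).
func stagnated(history []float64, k int, tol float64) bool {
	if len(history) < k+1 {
		return false // not enough history to conclude anything
	}
	oldBest := history[len(history)-1-k]
	newBest := history[len(history)-1]
	return oldBest-newBest < tol // improvement lowers the fitness
}

func main() {
	history := []float64{-0.55, -0.92, -0.98, -0.98, -0.98, -0.98}
	fmt.Println(stagnated(history, 3, 1e-6)) // the last 3 generations brought no progress
}
```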

Genetic algorithms can be used via the GA struct. The necessary steps for using the GA struct are

  1. Implement the Genome interface to model your problem
  2. Instantiate a GA struct (preferably via the GAConfig struct)
  3. Call the GA's Minimize function and check the HallOfFame field

Implementing the Genome interface

To use the GA struct you first have to implement the Genome interface, which is used to define the logic that is specific to your problem (logic that eaopt doesn't know about). For example this is where you will define an Evaluate() method for evaluating a particular problem. The GA struct, on the other hand, contains context-agnostic information. For example this is where you can choose the number of individuals in a population (which is a separate concern from your particular problem). Apart from being a good design pattern, decoupling the problem definition from the optimization through the Genome interface means that eaopt can be used to optimize any kind of problem.

Let's have a look at the Genome interface.

type Genome interface {
    Evaluate() (float64, error)
    Mutate(rng *rand.Rand)
    Crossover(genome Genome, rng *rand.Rand)
    Clone() Genome
}

The Evaluate() method returns the fitness of a genome. The sweet thing is that you can do whatever you want in this method. Your struct that implements the interface doesn't necessarily have to be a slice. The Evaluate() method is your problem to deal with; eaopt only needs its output to be able to function. You can also return an error, which eaopt will catch and return to you when you call the GA's Minimize method.

The Mutate(rng *rand.Rand) method is where you can modify an existing solution by tinkering with its variables. The way in which you should mutate a solution essentially boils down to your particular problem. eaopt provides some common mutation methods that you can use instead of reinventing the wheel -- this is what is done in most of the examples.

The Crossover(genome Genome, rng *rand.Rand) method combines two individuals. The important thing to notice is that the type of the first argument differs from that of the struct implementing the method. Indeed, the first argument is a Genome that has to be cast into your struct before a crossover operator can be applied. This is because Go doesn't provide generics out of the box; it's easier to convince yourself by checking out the examples.

The Clone() method is there to produce independent copies of the struct you want to evolve. This is necessary for internal reasons and ensures that pointer fields are not pointing to identical memory addresses. Usually this is not too difficult to implement; you just have to make sure that the clones you produce are not shallow copies of the genome that is being cloned. This is also fairly easy to unit test.
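
To see why deep copies matter, here is a sketch with a hypothetical Grid genome (not from eaopt) whose only field is a slice; a naive struct copy would share the slice's backing array between the original and the clone:

```go
package main

import "fmt"

// Grid is a hypothetical genome whose Cells field is a slice, i.e. a
// reference type that a shallow copy would share between clones.
type Grid struct {
	Cells []int
}

// Clone returns an independent copy: the new Cells slice has its own
// backing array, so mutating the clone can't touch the original.
func (g Grid) Clone() Grid {
	cells := make([]int, len(g.Cells))
	copy(cells, g.Cells)
	return Grid{Cells: cells}
}

func main() {
	original := Grid{Cells: []int{1, 2, 3}}
	clone := original.Clone()
	clone.Cells[0] = 99
	fmt.Println(original.Cells[0]) // still 1: the clone is not shallow
}
```

A unit test for this is exactly the main function above: mutate the clone, then assert the original is untouched.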

Once you have implemented the Genome interface you have provided eaopt with all the information it couldn't guess for you.

Instantiate the GA struct

You can now instantiate a GA and use it to find an optimal solution to your problem. The GA struct has a lot of fields, hence the recommended way is to use the GAConfig struct and call its NewGA method.

Let's have a look at the GAConfig struct.

type GAConfig struct {
    // Required fields
    NPops        uint
    PopSize      uint
    NGenerations uint
    HofSize      uint
    Model        Model

    // Optional fields
    ParallelEval bool // Whether to evaluate Individuals in parallel or not
    Migrator     Migrator
    MigFrequency uint // Frequency at which migrations occur
    Speciator    Speciator
    Logger       *log.Logger
    Callback     func(ga *GA)
    EarlyStop    func(ga *GA) bool
    RNG          *rand.Rand
}

  • Required fields
    • NPops determines the number of populations that will be used.
    • PopSize determines the number of individuals inside each population.
    • NGenerations determines how many generations the populations will be evolved for.
    • HofSize determines how many of the best individuals should be recorded.
    • Model is a struct that determines how to evolve each population of individuals.
  • Optional fields
    • ParallelEval determines if a population is evaluated in parallel. The rule of thumb is to set this to true if your Evaluate method is expensive, if not it won't be worth the overhead. Refer to the section on parallelism for a more comprehensive explanation.
    • Migrator and MigFrequency should be provided if you want to exchange individuals between populations in case of a multi-population GA. If not the populations will be run independently. Again this is an advanced concept in the genetic algorithms field that you shouldn't deal with at first.
    • Speciator will split each population into distinct species at each generation. Each species is evolved separately from the others; once all the species have been evolved they are merged back together.
    • Logger can be used to record basic population statistics, you can read more about it in the logging section.
    • Callback will execute any piece of code you wish every time ga.Evolve() is called. Callback will also be called when ga.Initialize() is. Using a callback can be useful for many things:
      • Calculating specific population statistics that are not provided by the logger
      • Changing parameters of the GA after a certain number of generations
      • Monitoring convergence
    • EarlyStop will be called before each generation to check if the evolution should be stopped early.
    • RNG can be set to make results reproducible. If it is not provided then rand.New(rand.NewSource(time.Now().UnixNano())) will be used by default. To make your results reproducible use a constant source, e.g. rand.New(rand.NewSource(42)).
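
Putting the required fields together, a configuration might look like the fragment below. The Model and Selector values (ModGenerational, SelTournament) and the chosen numbers are illustrative assumptions about typical usage, not the package's defaults:

```go
// An illustrative configuration: one population of 100 individuals evolved
// for 50 generations with a generational model and a fixed RNG.
var conf = eaopt.GAConfig{
	NPops:        1,
	PopSize:      100,
	NGenerations: 50,
	HofSize:      1,
	Model: eaopt.ModGenerational{
		Selector: eaopt.SelTournament{NContestants: 3},
		MutRate:  0.5,
	},
	RNG: rand.New(rand.NewSource(42)),
}

// ga, err := conf.NewGA()
```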

Once you have instantiated a GAConfig you can call its NewGA method to obtain a GA. The GA struct has the following definition:

type GA struct {
    GAConfig

    Populations Populations
    HallOfFame  Individuals
    Age         time.Duration
    Generations uint
}

Naturally a GA stores a copy of the GAConfig that was used to instantiate it. Apart from this the following fields are available:

  • Populations is where all the current populations and individuals are kept.
  • HallOfFame contains the HofSize best individuals ever encountered. This slice is always sorted, meaning that the first element of the slice will be the best individual ever encountered.
  • Age indicates the duration the GA has spent evolving.
  • Generations indicates how many generations have gone by.

You could bypass the NewGA method and instantiate a GA directly, but this would leave the GAConfig's fields unchecked for input errors.

Calling the Minimize method

You are now all set to find an optimal solution to your problem. To do so you have to call the GA's Minimize function which has the following signature:

func (ga *GA) Minimize(newGenome func(rng *rand.Rand) Genome) error

You have to provide Minimize with a function which returns a Genome. It is recommended that the Genome thus produced contains random values. This is where the connection between the Genome interface and the GA struct is made.

The Minimize function will return an error (nil if everything went okay) once it is done. You can then access the first entry in the HallOfFame field to retrieve the best encountered solution.

Using the Slice interface

Classically GAs are used to optimize problems where the genome has a slice representation, e.g. a vector or a sequence of DNA code. Almost all the mutation and crossover algorithms available in eaopt are based on the Slice interface, which has the following definition.

type Slice interface {
    At(i int) interface{}
    Set(i int, v interface{})
    Len() int
    Swap(i, j int)
    Slice(a, b int) Slice
    Split(k int) (Slice, Slice)
    Append(Slice) Slice
    Replace(Slice)
    Copy() Slice
}

Internally IntSlice, Float64Slice and StringSlice implement this interface so that you can use the available operators for most use cases. If however you wish to use the operators with slices of a different type you will have to implement the Slice interface. Although there are many methods to implement, they are all trivial (have a look at slice.go and the TSP example).

Models

eaopt makes it easy to use different so-called models. Simply put, a model defines how a GA evolves a population of individuals through a sequence of genetic operators, without considering the intrinsics of the underlying operators. In a nutshell, an evolution model attempts to mimic evolution in the real world. Choosing a good model is extremely important because it usually has the greatest influence on the performance of a GA.

Generational model

The generational model is one of the most popular models, if not the most. Simply put, it generates n offspring from a population of size n and replaces the population with the offspring. The offspring are generated by selecting 2 individuals from the population and applying a crossover method to them, until n offspring have been generated. The newly generated offspring are then optionally mutated before replacing the original population. Each crossover generates two new individuals, so if the population size isn't an even number then the second individual from the last crossover (individual n+1) won't be included in the new population.

Steady state model

The steady state model differs from the generational model in that the entire population isn't replaced between generations. Instead of adding the children of the selected parents straight into the next generation, the 2 best individuals out of the two parents and two children are added back into the population, so that the population size remains constant. However, one may also replace the parents with the children regardless of their fitness. This has the advantage of not having to evaluate the newly generated offspring. What's more, crossover often generates individuals who are sub-par but who have a lot of potential; giving individuals generated from crossover a chance can be beneficial in the long run.

Select down to size model

The select down to size model uses two selection rounds. The first one is similar to the one used in the generational model: parents are selected to generate new individuals through crossover. However, the offspring are then merged with the original population and a second selection round occurs to determine which individuals will survive to the next generation. Formally, m offspring are generated from a population of n individuals; the n+m individuals are then "selected down to size" so that only n individuals remain.

Ring model

In the ring model, crossovers are applied to neighbours in a one-directional ring topology. Two by two, neighbours generate 2 offspring. The best out of the 4 individuals (2 parents + 2 offspring) replaces the first neighbour.

Mutation only

It's possible to run a GA without crossover, simply by mutating individuals. This can be done with the ModMutationOnly struct. At each generation each individual is mutated. ModMutationOnly has a Strict field to determine if the mutant should replace the initial individual only if its fitness is lower.

Speciation

Clusters, also called species in the literature, are a partitioning of individuals into smaller groups of similar individuals. Programmatically a cluster is a list of lists that each contain individuals. Individuals inside each species are supposed to be similar. The similarity depends on a metric; for example it could be based on the fitness of the individuals.

The purpose of partitioning individuals is to apply genetic operators to similar individuals. In biological terms this encourages "incest" and maintains isolated species. For example in nature animals usually breed with local mates and don't breed with different animal species.

Using speciation with genetic algorithms became popular when it was first applied to the optimization of neural network topologies. By mixing two neural networks during crossover, the resulting neural networks were often useless because the inherited weights were not optimized for the new topology. This meant that newly generated neural networks were not performing well and would likely disappear during the selection phase. Speciation was thus introduced so that neural networks evolved in similar groups, in order for new neural networks not to disappear immediately. Instead, similar neural networks would evolve among each other until they were good enough to be mixed with the other neural networks.

With eaopt it's possible to use speciation on top of all the rest. To do so, the Speciator field of the GA struct has to be specified.

Multiple populations and migration

Multi-population GAs run independent populations in parallel. They are not frequently used, however they are very easy to understand and to implement. In eaopt a GA struct contains a Populations field which stores each population in a slice. The number of populations is specified in the GAConfig's NPops field.

If Migrator and MigFrequency are not provided the populations will be run independently in parallel. However, if they are provided then at each generation number that is divisible by MigFrequency (for example 5 divides generation number 25) individuals will be exchanged between the populations following the Migrator.
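
A minimal sketch of what a migration step might do, assuming a hypothetical migrateRing helper (not an eaopt Migrator) that rotates each population's first individual around the ring:

```go
package main

import "fmt"

// migrateRing rotates the first individual of each population to the next
// population in a ring, a minimal take on what a migration step could do.
// Individuals are plain strings here to keep the sketch self-contained.
func migrateRing(pops [][]string) {
	if len(pops) < 2 {
		return
	}
	last := pops[len(pops)-1][0]
	for i := len(pops) - 1; i > 0; i-- {
		pops[i][0] = pops[i-1][0]
	}
	pops[0][0] = last
}

func main() {
	pops := [][]string{{"a1", "a2"}, {"b1", "b2"}, {"c1", "c2"}}
	migrateRing(pops)
	fmt.Println(pops[0][0], pops[1][0], pops[2][0]) // c1 a1 b1
}
```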

Using multi-populations can be an easy way to gain in diversity. Moreover, not using multi-populations on a multi-core architecture is a waste of resources.

With eaopt you can use multi-populations and speciation at the same time. The following flowchart shows what that would look like.

(flowchart: multi-population GA with speciation)
Logging population statistics

It's possible to log statistics for each population at every generation. To do so you simply have to provide the GA struct a Logger from the Go standard library. This is quite convenient because it allows you to decide where to write the log output, whether it be in a file or directly in the standard output.

ga.Logger = log.New(os.Stdout, "", log.Ldate|log.Ltime)

If a logger is provided, each row in the log output will include

  • the population ID,
  • the population minimum fitness,
  • the population maximum fitness,
  • the population average fitness,
  • the population's fitness standard deviation.

Particle swarm optimization

Description

Particle swarm optimization (PSO) can be used to optimize real-valued functions. It maintains a population of candidate solutions called particles. The particles move around the search space according to a mathematical formula that takes as input the particle's position and its velocity. Each particle's movement is influenced by its local best encountered position, as well as the best overall position in the search space (these values are updated after each generation). This is expected to move the swarm toward the best solutions.

As can be expected there are many variants of PSO. The SPSO struct implements the SPSO-2011 standard.

Example

In this example we're going to minimize the Styblinski-Tang function with two dimensions. The global minimum is about -39.16599 times the number of dimensions.

package main

import (
    "fmt"
    m "math"
    "math/rand"

    "github.com/MaxHalford/eaopt"
)

func StyblinskiTang(X []float64) (y float64) {
    for _, x := range X {
        y += m.Pow(x, 4) - 16*m.Pow(x, 2) + 5*x
    }
    return 0.5 * y
}

func main() {
    // Instantiate SPSO
    var spso, err = eaopt.NewDefaultSPSO()
    if err != nil {
        fmt.Println(err)
        return
    }

    // Fix random number generation
    spso.GA.RNG = rand.New(rand.NewSource(42))

    // Run minimization
    _, y, err := spso.Minimize(StyblinskiTang, 2)
    if err != nil {
        fmt.Println(err)
        return
    }

    // Output best encountered solution
    fmt.Printf("Found minimum of %.5f, the global minimum is %.5f\n", y, -39.16599*2)
}

This should produce the following output.

>>> Found minimum of -78.23783, the global minimum is -78.33198

Parameters

You can (and should) instantiate an SPSO with the NewSPSO method. You can also use the NewDefaultSPSO method as is done in the previous example.

func NewSPSO(nParticles, nSteps uint, min, max, w float64, parallel bool, rng *rand.Rand) (*SPSO, error)
  • nParticles is the number of particles to use
  • nSteps is the number of steps during which evolution occurs
  • min and max are the boundaries from which the initial values are sampled from
  • w is the velocity amplifier
  • parallel determines if the particles are evaluated in parallel or not
  • rng is a random number generator, you can set it to nil if you want it to be random

Differential evolution

Description

Differential evolution (DE) somewhat resembles PSO and is also used for optimizing real-valued functions. At each generation, each so-called agent is moved according to the position of 3 randomly sampled agents. If the new position is not better than the current one then it is discarded.

Example

In this example we're going to minimize the Ackley function with two dimensions. The global minimum is 0.

package main

import (
    "fmt"
    m "math"
    "math/rand"

    "github.com/MaxHalford/eaopt"
)

func Ackley(x []float64) float64 {
    var (
        a, b, c = 20.0, 0.2, 2 * m.Pi
        s1, s2  float64
        d       = float64(len(x))
    )
    for _, xi := range x {
        s1 += xi * xi
        s2 += m.Cos(c * xi)
    }
    return -a*m.Exp(-b*m.Sqrt(s1/d)) - m.Exp(s2/d) + a + m.Exp(1)
}

func main() {
    // Instantiate DiffEvo
    var de, err = eaopt.NewDefaultDiffEvo()
    if err != nil {
        fmt.Println(err)
        return
    }

    // Fix random number generation
    de.GA.RNG = rand.New(rand.NewSource(42))

    // Run minimization
    _, y, err := de.Minimize(Ackley, 2)
    if err != nil {
        fmt.Println(err)
        return
    }

    // Output best encountered solution
    fmt.Printf("Found minimum of %.5f, the global minimum is 0\n", y)
}

This should produce the following output.

>>> Found minimum of 0.00137, the global minimum is 0

Parameters

You can (and should) instantiate a DiffEvo with the NewDiffEvo method. You can also use the NewDefaultDiffEvo method as is done in the previous example.

func NewDiffEvo(nAgents, nSteps uint, min, max, cRate, dWeight float64, parallel bool, rng *rand.Rand) (*DiffEvo, error)
  • nAgents is the number of agents to use (it has to be at least 4)
  • nSteps is the number of steps during which evolution occurs
  • min and max are the boundaries from which the initial values are sampled from
  • cRate is the crossover rate
  • dWeight is the differential weight
  • parallel determines if the agents are evaluated in parallel or not
  • rng is a random number generator, you can set it to nil if you want it to be random

OpenAI evolution strategy

Description

OpenAI proposed a simple evolution strategy based on the use of natural gradients. The algorithm is dead simple:

  1. Choose a center mu at random
  2. Sample points around mu using a normal distribution
  3. Evaluate each point and obtain the natural gradient g
  4. Move mu along the natural gradient g using a learning rate
  5. Repeat from step 2 until satisfied

Example

In this example we're going to minimize the Rastrigin function with two dimensions. The global minimum is 0.

package main

import (
    "fmt"
    m "math"
    "math/rand"

    "github.com/MaxHalford/eaopt"
)

func Rastrigin(x []float64) (y float64) {
    y = 10 * float64(len(x))
    for _, xi := range x {
        y += m.Pow(xi, 2) - 10*m.Cos(2*m.Pi*xi)
    }
    return y
}

func main() {
    // Instantiate OES
    var oes, err = eaopt.NewDefaultOES()
    if err != nil {
        fmt.Println(err)
        return
    }

    // Fix random number generation
    oes.GA.RNG = rand.New(rand.NewSource(42))

    // Run minimization
    _, y, err := oes.Minimize(Rastrigin, 2)
    if err != nil {
        fmt.Println(err)
        return
    }

    // Output best encountered solution
    fmt.Printf("Found minimum of %.5f, the global minimum is 0\n", y)
}

This should produce the following output.

>>> Found minimum of 0.02270, the global minimum is 0

Parameters

You can (and should) instantiate an OES with the NewOES method. You can also use the NewDefaultOES method as is done in the previous example.

func NewOES(nPoints, nSteps uint, sigma, lr float64, parallel bool, rng *rand.Rand) (*OES, error)
  • nPoints is the number of points to use (it has to be at least 3)
  • nSteps is the number of steps during which evolution occurs
  • sigma determines the shape of the normal distribution used to sample new points
  • lr is the learning rate
  • parallel determines if the points are evaluated in parallel or not
  • rng is a random number generator, you can set it to nil if you want it to be random

A note on parallelism

Evolutionary algorithms are famous for being embarrassingly parallel. Most of the operations can be run independently of one another. For example individuals can be mutated in parallel because mutation doesn't have any side effects.

The Go language provides nice mechanisms to run things in parallel, provided you have more than one core available. However, parallelism is only worth it when the functions you want to run in parallel are heavy. If the functions are cheap then the overhead of spawning goroutines will be too high and not worth it. It's simply not worth using a goroutine for each individual because operations at an individual level are often not time-consuming enough.

By default eaopt will evolve populations in parallel. This is because evolving one population implies a lot of operations, so parallelism is worth it. If your Evaluate method is heavy then it might be worth evaluating individuals in parallel, which can be done by setting the GA's ParallelEval field to true. Evaluating individuals in parallel can be done regardless of whether you are using more than one population.
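
The idea behind ParallelEval can be illustrated with one goroutine per individual (a sketch of the concept, not eaopt's implementation):

```go
package main

import (
	"fmt"
	"sync"
	"time"
)

// evalParallel computes a costly fitness for every individual using one
// goroutine per individual and a WaitGroup to collect the results.
func evalParallel(pop []float64, fitness func(float64) float64) []float64 {
	out := make([]float64, len(pop))
	var wg sync.WaitGroup
	for i, x := range pop {
		wg.Add(1)
		go func(i int, x float64) {
			defer wg.Done()
			out[i] = fitness(x) // each goroutine writes its own slot: no race
		}(i, x)
	}
	wg.Wait()
	return out
}

func main() {
	costly := func(x float64) float64 {
		time.Sleep(10 * time.Millisecond) // stand-in for an expensive evaluation
		return x * x
	}
	start := time.Now()
	fits := evalParallel([]float64{1, 2, 3, 4}, costly)
	fmt.Println(fits, "in", time.Since(start).Round(time.Millisecond))
}
```

The four 10 ms evaluations overlap instead of running back to back, which is exactly the trade-off discussed above: worthwhile only when each evaluation dwarfs the goroutine overhead.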

FAQ

What if I don't want to use crossover?

Alas you still have to implement the Genome interface. You can however provide a blank Crossover method just to satisfy the interface.

type Vector []float64

func (X Vector) Crossover(Y eaopt.Genome, rng *rand.Rand) {}

Why aren't my Mutate and Crossover methods modifying my Genomes?

The Mutate and Crossover methods have to modify the values of the Genome in-place. The following code will work because the Vector is a slice; slices in Go are references to underlying data, hence modifying a slice modifies that data in-place.

type Vector []float64

func (X Vector) Mutate(rng *rand.Rand) {
    eaopt.MutNormalFloat64(X, 0.5, rng)
}

On the contrary, mutating other kinds of structs requires a pointer receiver so that the underlying value can be modified. Notice the *Name in the following example.

type Name string

func (n *Name) Mutate(rng *rand.Rand) {
    *n = randomName()
}

Are evolutionary optimization algorithms any good?

For real-valued, differentiable functions, evolutionary optimization algorithms will probably not fare well against methods based on gradient descent. Intuitively this is because evolutionary optimization algorithms ignore the shape and slope of the function. However gradient descent algorithms usually get stuck in local optima, whereas evolutionary optimization algorithms don't.

As mentioned earlier, some problems simply can't be written down as closed-form expressions. In this case methods based on gradient information can't be used, whereas evolutionary optimization algorithms still can. For example tuning the number of layers and of neurons per layer in a neural network is an open problem that doesn't yet have a reliable solution. Neural network architectures used in production are usually designed by human experts. The field of neuroevolution aims to train neural networks with evolutionary algorithms.

How can I contribute?

Feel free to implement your own operators or to make suggestions! Check out the CONTRIBUTING file for some guidelines. This repository has a long list of existing evolutionary algorithms.

Dependencies

You can see the list of dependencies here and the graph view here.

License

The MIT License (MIT). Please see the LICENSE file for more information.

Documentation

Constants

This section is empty.

Variables

This section is empty.

Functions

func CrossCX

func CrossCX(p1, p2 Slice)

CrossCX (Cycle Crossover). Cycles between the parents are identified, then copied alternately onto the offspring. The CX method is deterministic and preserves gene uniqueness.

func CrossCXFloat64

func CrossCXFloat64(s1 []float64, s2 []float64)

CrossCXFloat64 calls CrossCX on a float64 slice.

func CrossCXInt

func CrossCXInt(s1 []int, s2 []int)

CrossCXInt calls CrossCX on an int slice.

func CrossCXString

func CrossCXString(s1 []string, s2 []string)

CrossCXString calls CrossCX on a string slice.

func CrossERX

func CrossERX(p1, p2 Slice)

CrossERX (Edge Recombination Crossover).

func CrossERXFloat64

func CrossERXFloat64(s1 []float64, s2 []float64)

CrossERXFloat64 calls CrossERX on a float64 slice.

func CrossERXInt

func CrossERXInt(s1 []int, s2 []int)

CrossERXInt calls CrossERX on an int slice.

func CrossERXString

func CrossERXString(s1 []string, s2 []string)

CrossERXString calls CrossERX on a string slice.

func CrossGNX

func CrossGNX(p1 Slice, p2 Slice, n uint, rng *rand.Rand)

CrossGNX (Generalized N-point Crossover). Identical crossover points are chosen on both parents' genomes and the mirroring segments are swapped. n determines the number of crossovers (i.e. mirrored segments) to perform and has to be less than or equal to the number of genes in each parent.

func CrossGNXFloat64

func CrossGNXFloat64(s1 []float64, s2 []float64, n uint, rng *rand.Rand)

CrossGNXFloat64 calls CrossGNX on two float64 slices.

func CrossGNXInt

func CrossGNXInt(s1 []int, s2 []int, n uint, rng *rand.Rand)

CrossGNXInt calls CrossGNX on two int slices.

func CrossGNXString

func CrossGNXString(s1 []string, s2 []string, n uint, rng *rand.Rand)

CrossGNXString calls CrossGNX on two string slices.

func CrossOX

func CrossOX(p1 Slice, p2 Slice, rng *rand.Rand)

CrossOX (Ordered Crossover). Part of the first parent's genome is copied onto the first offspring's genome. Then the second parent's genome is iterated over, starting on the right of the part that was copied. Each gene of the second parent's genome is copied onto the next blank gene of the first offspring's genome if it wasn't already copied from the first parent. The OX method preserves gene uniqueness.

func CrossOXFloat64

func CrossOXFloat64(s1 []float64, s2 []float64, rng *rand.Rand)

CrossOXFloat64 calls CrossOX on a float64 slice.

func CrossOXInt

func CrossOXInt(s1 []int, s2 []int, rng *rand.Rand)

CrossOXInt calls CrossOX on an int slice.

func CrossOXString

func CrossOXString(s1 []string, s2 []string, rng *rand.Rand)

CrossOXString calls CrossOX on a string slice.

func CrossPMX

func CrossPMX(p1 Slice, p2 Slice, rng *rand.Rand)

CrossPMX (Partially Mapped Crossover). The offspring are generated by copying one of the parents and then copying the other parent's values up to a randomly chosen crossover point. Each gene that is replaced is permuted with the gene that is copied in the first parent's genome. Two offspring are generated this way (one per parent). The PMX method preserves gene uniqueness.

func CrossPMXFloat64

func CrossPMXFloat64(s1 []float64, s2 []float64, rng *rand.Rand)

CrossPMXFloat64 calls CrossPMX on a float64 slice.

func CrossPMXInt

func CrossPMXInt(s1 []int, s2 []int, rng *rand.Rand)

CrossPMXInt calls CrossPMX on an int slice.

func CrossPMXString

func CrossPMXString(s1 []string, s2 []string, rng *rand.Rand)

CrossPMXString calls CrossPMX on a string slice.

func CrossUniformFloat64

func CrossUniformFloat64(p1 []float64, p2 []float64, rng *rand.Rand)

CrossUniformFloat64 combines two individuals (the parents) into one (the offspring). Each parent's contribution to the genome is determined by the value of a probability p. Each offspring receives a proportion of both of its parents' genomes. The new values are located in the hyper-rectangle defined by both parents' positions in Cartesian space.

func InitJaggFloat64

func InitJaggFloat64(n uint, lower, upper []float64, rng *rand.Rand) (floats []float64)

InitJaggFloat64 generates random float64s x such that lower < x < upper, with jagged (per-dimension) bounds.

func InitNormFloat64

func InitNormFloat64(n uint, mean, std float64, rng *rand.Rand) (floats []float64)

InitNormFloat64 generates random float64s sampled from a normal distribution.

func InitUnifFloat64

func InitUnifFloat64(n uint, lower, upper float64, rng *rand.Rand) (floats []float64)

InitUnifFloat64 generates random float64s x such that lower < x < upper.

func InitUnifString

func InitUnifString(n uint, corpus []string, rng *rand.Rand) (strings []string)

InitUnifString generates random strings based on a given corpus. The strings are not necessarily distinct.

func InitUniqueString

func InitUniqueString(n uint, corpus []string, rng *rand.Rand) (strings []string)

InitUniqueString generates random string slices based on a given corpus, each element from the corpus is only represented once in each slice. The method starts by shuffling, it then assigns the elements of the corpus in increasing index order to an individual.

func MutNormalFloat64

func MutNormalFloat64(genome []float64, rate float64, rng *rand.Rand)

MutNormalFloat64 modifies a float64 gene if a coin toss is under a defined mutation rate. The new gene value is a random value sampled from a normal distribution centered on the gene's current value and with a standard deviation proportional to the current value. It does so for each gene.

func MutPermute

func MutPermute(genome Slice, n int, rng *rand.Rand)

MutPermute permutes two genes at random n times.

func MutPermuteFloat64

func MutPermuteFloat64(s []float64, n int, rng *rand.Rand)

MutPermuteFloat64 calls MutPermute on a float64 slice.

func MutPermuteInt

func MutPermuteInt(s []int, n int, rng *rand.Rand)

MutPermuteInt calls MutPermute on an int slice.

func MutPermuteString

func MutPermuteString(s []string, n int, rng *rand.Rand)

MutPermuteString calls MutPermute on a string slice.

func MutSplice

func MutSplice(genome Slice, rng *rand.Rand)

MutSplice splits a genome in 2 and glues the pieces back together in reverse order.

func MutSpliceFloat64

func MutSpliceFloat64(s []float64, rng *rand.Rand)

MutSpliceFloat64 calls MutSplice on a float64 slice.

func MutSpliceInt

func MutSpliceInt(s []int, rng *rand.Rand)

MutSpliceInt calls MutSplice on an int slice.

func MutSpliceString

func MutSpliceString(s []string, rng *rand.Rand)

MutSpliceString calls MutSplice on a string slice.

func MutUniformString

func MutUniformString(genome []string, corpus []string, n int, rng *rand.Rand)

MutUniformString picks a gene at random and replaces it with a random element from a provided corpus. It repeats this n times.

Types

type Agent added in v0.4.2

type Agent struct {
	DE *DiffEvo
	// contains filtered or unexported fields
}

An Agent is a candidate solution to a problem.

func (Agent) Clone added in v0.4.2

func (a Agent) Clone() Genome

Clone returns a deep copy of an Agent.

func (*Agent) Crossover added in v0.4.2

func (a *Agent) Crossover(q Genome, rng *rand.Rand)

Crossover doesn't do anything.

func (Agent) Evaluate added in v0.4.2

func (a Agent) Evaluate() (float64, error)

Evaluate the Agent by computing the value of the function at the current position.

func (*Agent) Mutate added in v0.4.2

func (a *Agent) Mutate(rng *rand.Rand)

Mutate the Agent.

type DiffEvo added in v0.4.2

type DiffEvo struct {
	Min, Max float64 // Boundaries for initial values
	CRate    float64 // Crossover rate
	DWeight  float64 // Differential weight
	NDims    uint
	F        func(x []float64) float64
	GA       *GA
}

DiffEvo implements differential evolution.

Example
// Instantiate DiffEvo
var de, err = NewDefaultDiffEvo()
if err != nil {
	fmt.Println(err)
	return
}

// Fix random number generation
de.GA.RNG = rand.New(rand.NewSource(42))

// Define function to minimize
var ackley = func(x []float64) float64 {
	var (
		a, b, c = 20.0, 0.2, 2 * math.Pi
		s1, s2  float64
		d       = float64(len(x))
	)
	for _, xi := range x {
		s1 += xi * xi
		s2 += math.Cos(c * xi)
	}
	return -a*math.Exp(-b*math.Sqrt(s1/d)) - math.Exp(s2/d) + a + math.Exp(1)
}

// Run minimization
x, y, err := de.Minimize(ackley, 2)
if err != nil {
	fmt.Println(err)
	return
}

// Output best encountered solution
fmt.Printf("Found minimum of %.5f in %v\n", y, x)
Output:

Found minimum of 0.00137 in [0.0004420129693826938 0.000195924625132926]

func NewDefaultDiffEvo added in v0.4.2

func NewDefaultDiffEvo() (*DiffEvo, error)

NewDefaultDiffEvo calls NewDiffEvo with default values.

func NewDiffEvo added in v0.4.2

func NewDiffEvo(nAgents, nSteps uint, min, max, cRate, dWeight float64,
	parallel bool, rng *rand.Rand) (*DiffEvo, error)

NewDiffEvo instantiates and returns a DiffEvo instance after having checked for input errors.

func (*DiffEvo) Minimize added in v0.4.2

func (de *DiffEvo) Minimize(f func([]float64) float64, nDims uint) ([]float64, float64, error)

Minimize finds the minimum of a given real-valued function.

type DistanceMemoizer

type DistanceMemoizer struct {
	Metric    Metric
	Distances map[string]map[string]float64
	// contains filtered or unexported fields
}

A DistanceMemoizer computes and stores Metric calculations.

func (*DistanceMemoizer) GetDistance

func (dm *DistanceMemoizer) GetDistance(a, b Individual) float64

GetDistance returns the distance between two Individuals based on the DistanceMemoizer's Metric field. If the two individuals share the same ID then GetDistance returns 0. DistanceMemoizer stores the calculated distances so that if GetDistance is called twice with the two same Individuals then the second call will return the stored distance instead of recomputing it.
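The caching pattern behind this can be sketched as follows. The metric and string IDs here are hypothetical stand-ins for the Metric and Individual IDs the library uses.

```go
package main

import "fmt"

// memoizer caches a symmetric distance metric keyed by a normalised pair
// of IDs, mirroring what DistanceMemoizer does for Individuals.
type memoizer struct {
	metric func(a, b string) float64
	cache  map[[2]string]float64
	calls  int // number of actual metric evaluations
}

func (m *memoizer) distance(a, b string) float64 {
	if a == b {
		return 0 // same ID: distance is zero by definition
	}
	key := [2]string{a, b}
	if a > b {
		key = [2]string{b, a} // the metric is symmetric, so order the key
	}
	if d, ok := m.cache[key]; ok {
		return d
	}
	m.calls++
	d := m.metric(a, b)
	m.cache[key] = d
	return d
}

func main() {
	m := &memoizer{
		metric: func(a, b string) float64 { return float64(len(a) + len(b)) },
		cache:  map[[2]string]float64{},
	}
	m.distance("ann", "bo")
	m.distance("bo", "ann") // served from the cache
	fmt.Println(m.calls)    // 1
}
```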

type Float64Slice

type Float64Slice []float64

Float64Slice attaches the methods of Slice to []float64

func (Float64Slice) Append

func (s Float64Slice) Append(t Slice) Slice

Append method from Slice

func (Float64Slice) At

func (s Float64Slice) At(i int) interface{}

At method from Slice

func (Float64Slice) Copy

func (s Float64Slice) Copy() Slice

Copy method from Slice

func (Float64Slice) Len

func (s Float64Slice) Len() int

Len method from Slice

func (Float64Slice) Replace

func (s Float64Slice) Replace(t Slice)

Replace method from Slice

func (Float64Slice) Set

func (s Float64Slice) Set(i int, v interface{})

Set method from Slice

func (Float64Slice) Slice

func (s Float64Slice) Slice(a, b int) Slice

Slice method from Slice

func (Float64Slice) Split

func (s Float64Slice) Split(k int) (Slice, Slice)

Split method from Slice

func (Float64Slice) Swap

func (s Float64Slice) Swap(i, j int)

Swap method from Slice

type GA

type GA struct {
	GAConfig `json:"-"`

	// Fields generated at runtime
	Populations Populations   `json:"populations"`
	HallOfFame  Individuals   `json:"hall_of_fame"` // Sorted best Individuals ever encountered
	Age         time.Duration `json:"duration"`     // Duration during which the GA has been evolved
	Generations uint          `json:"generations"`  // Number of generations the GA has been evolved
}

A GA contains populations which themselves contain individuals.

func (*GA) Minimize added in v0.4.2

func (ga *GA) Minimize(newGenome func(rng *rand.Rand) Genome) error

Minimize evolves the GA's Populations following the given evolutionary method. The GA's hall of fame is updated after each generation.

type GAConfig added in v0.4.2

type GAConfig struct {
	// Required fields
	NPops        uint
	PopSize      uint
	NGenerations uint
	HofSize      uint
	Model        Model

	// Optional fields
	ParallelEval bool // Whether to evaluate Individuals in parallel or not
	Migrator     Migrator
	MigFrequency uint // Frequency at which migrations occur
	Speciator    Speciator
	Logger       *log.Logger
	Callback     func(ga *GA)
	EarlyStop    func(ga *GA) bool
	RNG          *rand.Rand
}

GAConfig contains fields that are necessary to instantiate a GA.

func NewDefaultGAConfig added in v0.4.2

func NewDefaultGAConfig() GAConfig

NewDefaultGAConfig returns a valid GAConfig with default values.

func (GAConfig) NewGA added in v0.4.2

func (conf GAConfig) NewGA() (*GA, error)

NewGA returns a pointer to a GA instance and checks for configuration errors.

type Genome

type Genome interface {
	Evaluate() (float64, error)
	Mutate(rng *rand.Rand)
	Crossover(genome Genome, rng *rand.Rand)
	Clone() Genome
}

A Genome is an entity that can have any number and kinds of properties. As long as it can be evaluated, mutated, crossed over, and cloned, it can be evolved.

type Individual

type Individual struct {
	Genome    Genome  `json:"genome"`
	Fitness   float64 `json:"fitness"`
	Evaluated bool    `json:"-"`
	ID        string  `json:"id"`
}

An Individual wraps a Genome and contains the fitness assigned to the Genome.

func NewIndividual

func NewIndividual(genome Genome, rng *rand.Rand) Individual

NewIndividual returns a fresh individual.

func (Individual) Clone

func (indi Individual) Clone(rng *rand.Rand) Individual

Clone an individual to produce a new individual with a different pointer and a different ID.

func (*Individual) Crossover

func (indi *Individual) Crossover(mate Individual, rng *rand.Rand)

Crossover an individual by calling the Crossover method of its Genome.

func (*Individual) Evaluate

func (indi *Individual) Evaluate() error

Evaluate the fitness of an individual. Don't evaluate individuals that have already been evaluated.

func (*Individual) GetFitness

func (indi *Individual) GetFitness() float64

GetFitness returns the fitness of an Individual after making sure it has been evaluated.

func (Individual) IdxOfClosest

func (indi Individual) IdxOfClosest(indis Individuals, dm DistanceMemoizer) (i int)

IdxOfClosest returns the index of the closest individual from a slice of individuals based on the Metric field of a DistanceMemoizer.

func (*Individual) Mutate

func (indi *Individual) Mutate(rng *rand.Rand)

Mutate an individual by calling the Mutate method of its Genome.

func (Individual) String added in v0.4.2

func (indi Individual) String() string

String representation of an Individual. A tick (✔) or cross (✘) marker is added at the end to indicate if the Individual has been evaluated or not.

type Individuals

type Individuals []Individual

Individuals is a convenience type; methods that operate on a slice of Individuals can be called directly on it.

func (Individuals) Clone

func (indis Individuals) Clone(rng *rand.Rand) Individuals

Clone returns the exact same slice of individuals but with different pointers and ID fields.

func (Individuals) Evaluate

func (indis Individuals) Evaluate(parallel bool) error

Evaluate each Individual in a slice.

func (Individuals) FitAvg

func (indis Individuals) FitAvg() float64

FitAvg returns the average fitness of a slice of individuals.

func (Individuals) FitMax

func (indis Individuals) FitMax() float64

FitMax returns the worst fitness of a slice of individuals.

func (Individuals) FitMin

func (indis Individuals) FitMin() float64

FitMin returns the best fitness of a slice of individuals.

func (Individuals) FitStd

func (indis Individuals) FitStd() float64

FitStd returns the standard deviation of the fitness of a slice of individuals.

func (Individuals) IsSortedByFitness

func (indis Individuals) IsSortedByFitness() bool

IsSortedByFitness checks if individuals are ascendingly sorted by fitness.

func (Individuals) Mutate

func (indis Individuals) Mutate(mutRate float64, rng *rand.Rand)

Mutate each individual.

func (Individuals) SortByDistanceToMedoid

func (indis Individuals) SortByDistanceToMedoid(dm DistanceMemoizer)

SortByDistanceToMedoid sorts Individuals according to their distance to the medoid. The medoid is the Individual that has the lowest average distance to the rest of the Individuals.

func (Individuals) SortByFitness

func (indis Individuals) SortByFitness()

SortByFitness ascendingly sorts individuals by fitness.

func (Individuals) String added in v0.4.2

func (indis Individuals) String() string

String representation of a slice of Individuals.

type IntSlice

type IntSlice []int

IntSlice attaches the methods of Slice to []int

func (IntSlice) Append

func (s IntSlice) Append(t Slice) Slice

Append method from Slice

func (IntSlice) At

func (s IntSlice) At(i int) interface{}

At method from Slice

func (IntSlice) Copy

func (s IntSlice) Copy() Slice

Copy method from Slice

func (IntSlice) Len

func (s IntSlice) Len() int

Len method from Slice

func (IntSlice) Replace

func (s IntSlice) Replace(t Slice)

Replace method from Slice

func (IntSlice) Set

func (s IntSlice) Set(i int, v interface{})

Set method from Slice

func (IntSlice) Slice

func (s IntSlice) Slice(a, b int) Slice

Slice method from Slice

func (IntSlice) Split

func (s IntSlice) Split(k int) (Slice, Slice)

Split method from Slice

func (IntSlice) Swap

func (s IntSlice) Swap(i, j int)

Swap method from Slice

type Metric

type Metric func(a, b Individual) float64

A Metric returns the distance between two genomes.

type MigRing

type MigRing struct {
	NMigrants uint // Number of migrants per exchange between Populations
}

MigRing migration exchanges individuals between consecutive Populations in a random fashion. One by one, each population exchanges NMigrants individuals at random with the next population. NMigrants should not be higher than the number of individuals in each population, otherwise all the individuals will migrate and it will be as if nothing happened.

func (MigRing) Apply

func (mig MigRing) Apply(pops Populations, rng *rand.Rand)

Apply MigRing.

func (MigRing) Validate

func (mig MigRing) Validate() error

Validate MigRing fields.

type Migrator

type Migrator interface {
	Apply(pops Populations, rng *rand.Rand)
	Validate() error
}

Migrator applies crossover at the GA level; as such it doesn't require an independent random number generator and can use the global one.

type ModDownToSize

type ModDownToSize struct {
	NOffsprings uint
	SelectorA   Selector
	SelectorB   Selector
	MutRate     float64
	CrossRate   float64
}

ModDownToSize implements the select down to size model.

func (ModDownToSize) Apply

func (mod ModDownToSize) Apply(pop *Population) error

Apply ModDownToSize.

func (ModDownToSize) Validate

func (mod ModDownToSize) Validate() error

Validate ModDownToSize fields.

type ModGenerational

type ModGenerational struct {
	Selector  Selector
	MutRate   float64
	CrossRate float64
}

ModGenerational implements the generational model.

func (ModGenerational) Apply

func (mod ModGenerational) Apply(pop *Population) error

Apply ModGenerational.

func (ModGenerational) Validate

func (mod ModGenerational) Validate() error

Validate ModGenerational fields.

type ModMutationOnly

type ModMutationOnly struct {
	Strict bool
}

ModMutationOnly implements the mutation only model. Each generation, all the Individuals are mutated. If Strict is true then the individuals are only replaced if the mutation is favorable.

func (ModMutationOnly) Apply

func (mod ModMutationOnly) Apply(pop *Population) error

Apply ModMutationOnly.

func (ModMutationOnly) Validate

func (mod ModMutationOnly) Validate() error

Validate ModMutationOnly fields.

type ModRing

type ModRing struct {
	Selector Selector
	MutRate  float64
}

ModRing implements the island ring model.

func (ModRing) Apply

func (mod ModRing) Apply(pop *Population) error

Apply ModRing.

func (ModRing) Validate

func (mod ModRing) Validate() error

Validate ModRing fields.

type ModSteadyState

type ModSteadyState struct {
	Selector  Selector
	KeepBest  bool
	MutRate   float64
	CrossRate float64
}

ModSteadyState implements the steady state model.

func (ModSteadyState) Apply

func (mod ModSteadyState) Apply(pop *Population) error

Apply ModSteadyState.

func (ModSteadyState) Validate

func (mod ModSteadyState) Validate() error

Validate ModSteadyState fields.

type Model

type Model interface {
	Apply(pop *Population) error
	Validate() error
}

A Model specifies a protocol for applying genetic operators to a population at generation i in order to obtain better individuals at generation i+1.

type OES added in v0.4.2

type OES struct {
	Sigma        float64
	LearningRate float64
	Mu           []float64
	F            func([]float64) float64
	GA           *GA
}

OES implements a simple version of the evolution strategy proposed by OpenAI. Reference: https://arxiv.org/abs/1703.03864

Example
// Instantiate OES
var oes, err = NewDefaultOES()
if err != nil {
	fmt.Println(err)
	return
}

// Fix random number generation
oes.GA.RNG = rand.New(rand.NewSource(42))

// Define function to minimize
var rastrigin = func(x []float64) (y float64) {
	y = 10 * float64(len(x))
	for _, xi := range x {
		y += math.Pow(xi, 2) - 10*math.Cos(2*math.Pi*xi)
	}
	return y
}

// Run minimization
X, y, err := oes.Minimize(rastrigin, []float64{0, 0})
if err != nil {
	fmt.Println(err)
	return
}

// Output best encountered solution
fmt.Printf("Found minimum of %.5f in %v\n", y, X)
Output:

Found minimum of 0.02270 in [0.006807861794722094 -0.008251984117745246]

func NewDefaultOES added in v0.4.2

func NewDefaultOES() (*OES, error)

NewDefaultOES calls NewOES with default values.

func NewOES added in v0.4.2

func NewOES(nPoints, nSteps uint, sigma, lr float64, parallel bool, rng *rand.Rand) (*OES, error)

NewOES instantiates and returns a OES instance after having checked for input errors.

func (*OES) Minimize added in v0.4.2

func (oes *OES) Minimize(f func([]float64) float64, x []float64) ([]float64, float64, error)

Minimize finds the minimum of a given real-valued function.

type Particle added in v0.4.2

type Particle struct {
	CurrentX []float64
	CurrentY float64
	BestX    []float64
	BestY    float64
	Velocity []float64
	SPSO     *SPSO
}

A Particle is an element of a Swarm. It tracks its current position, the best position it has encountered, and a velocity vector. It also has a pointer to the SPSO that generated it so that it can access the function to minimize and the global best position.

func (Particle) Clone added in v0.4.2

func (p Particle) Clone() Genome

Clone returns a deep copy of the Particle.

func (*Particle) Crossover added in v0.4.2

func (p *Particle) Crossover(q Genome, rng *rand.Rand)

Crossover doesn't do anything.

func (*Particle) Evaluate added in v0.4.2

func (p *Particle) Evaluate() (float64, error)

Evaluate the Particle by computing the value of the function at the current position. If the position is better than the best position encountered by the Particle then it replaces it. Likewise, the global best position is replaced if the current position is better.

func (*Particle) Mutate added in v0.4.2

func (p *Particle) Mutate(rng *rand.Rand)

Mutate the Particle by modifying its velocity and its current position.

type Population

type Population struct {
	Individuals Individuals   `json:"indis"`
	Age         time.Duration `json:"age"`
	Generations uint          `json:"generations"`
	ID          string        `json:"id"`
	RNG         *rand.Rand
}

A Population contains individuals. Individuals mate within a population. Individuals can migrate from one population to another. Each population has a random number generator to bypass the global rand mutex.

func (Population) Log

func (pop Population) Log(logger *log.Logger)

Log a Population's current statistics with a provided log.Logger.

type Populations

type Populations []Population

Populations type is necessary for migration and speciation purposes.

func (Populations) Apply added in v0.4.2

func (pops Populations) Apply(f func(pop *Population) error) error

Apply a function to a slice of Populations.

type SPSO added in v0.4.2

type SPSO struct {
	Min, Max float64 // Boundaries for initial values
	W        float64
	NDims    uint
	BestX    []float64
	BestY    float64
	F        func([]float64) float64
	GA       *GA
	// contains filtered or unexported fields
}

SPSO implements the 2011 version of Standard Particle Swarm Optimization. It can optimize single-output real-valued functions. Reference: http://clerc.maurice.free.fr/pso/SPSO_descriptions.pdf

Example
// Instantiate SPSO
var spso, err = NewDefaultSPSO()
if err != nil {
	fmt.Println(err)
	return
}

// Fix random number generation
spso.GA.RNG = rand.New(rand.NewSource(42))

// Define function to minimize
var styblinskiTang = func(x []float64) (y float64) {
	for _, xi := range x {
		y += math.Pow(xi, 4) - 16*math.Pow(xi, 2) + 5*xi
	}
	return 0.5 * y
}

// Run minimization
x, y, err := spso.Minimize(styblinskiTang, 2)
if err != nil {
	fmt.Println(err)
	return
}

// Output best encountered solution
fmt.Printf("Found minimum of %.5f in %v\n", y, x)
Output:

Found minimum of -78.23783 in [-2.8586916496046983 -2.9619895273744623]

func NewDefaultSPSO added in v0.4.2

func NewDefaultSPSO() (*SPSO, error)

NewDefaultSPSO calls NewSPSO with default values.

func NewSPSO added in v0.4.2

func NewSPSO(nParticles, nSteps uint, min, max, w float64, parallel bool, rng *rand.Rand) (*SPSO, error)

NewSPSO instantiates and returns a SPSO instance after having checked for input errors.

func (*SPSO) Minimize added in v0.4.2

func (pso *SPSO) Minimize(f func([]float64) float64, nDims uint) ([]float64, float64, error)

Minimize finds the minimum of a given real-valued function.

type SelElitism

type SelElitism struct{}

SelElitism selection returns the n best individuals of a group.

func (SelElitism) Apply

func (sel SelElitism) Apply(n uint, indis Individuals, rng *rand.Rand) (Individuals, []int, error)

Apply SelElitism.

func (SelElitism) Validate

func (sel SelElitism) Validate() error

Validate SelElitism fields.

type SelRoulette

type SelRoulette struct{}

SelRoulette samples individuals through roulette wheel selection (also known as fitness proportionate selection).

func (SelRoulette) Apply

func (sel SelRoulette) Apply(n uint, indis Individuals, rng *rand.Rand) (Individuals, []int, error)

Apply SelRoulette.

func (SelRoulette) Validate

func (sel SelRoulette) Validate() error

Validate SelRoulette fields.

type SelTournament

type SelTournament struct {
	NContestants uint
}

SelTournament samples individuals through tournament selection. Each tournament is composed of randomly chosen individuals, and its winner is the contestant with the lowest fitness. The selected individuals are all distinct; in other words there are no repetitions.

func (SelTournament) Apply

func (sel SelTournament) Apply(n uint, indis Individuals, rng *rand.Rand) (Individuals, []int, error)

Apply SelTournament.

func (SelTournament) Validate

func (sel SelTournament) Validate() error

Validate SelTournament fields.

type Selector

type Selector interface {
	Apply(n uint, indis Individuals, rng *rand.Rand) (selected Individuals, indexes []int, err error)
	Validate() error
}

Selector chooses a subset of size n from a group of individuals. The group of individuals a Selector is applied to is expected to be sorted.

type Slice

type Slice interface {
	At(i int) interface{}
	Set(i int, v interface{})
	Len() int
	Swap(i, j int)
	Slice(a, b int) Slice
	Split(k int) (Slice, Slice)
	Append(Slice) Slice
	Replace(Slice)
	Copy() Slice
}

A Slice is a genome with a list-like structure.

type SpecFitnessInterval

type SpecFitnessInterval struct {
	K uint // Number of intervals
}

SpecFitnessInterval speciates a population based on the fitness of each individual where each species contains m = n/k (rounded to the closest upper integer) individuals with similar fitnesses. For example, with 4 species, 30 individuals would be split into 3 groups of 8 individuals and 1 group of 6 individuals (3*8 + 1*6 = 30). More generally each group is of size min(n-i, m) where i is a multiple of m.

func (SpecFitnessInterval) Apply

func (spec SpecFitnessInterval) Apply(indis Individuals, rng *rand.Rand) ([]Individuals, error)

Apply SpecFitnessInterval.

func (SpecFitnessInterval) Validate

func (spec SpecFitnessInterval) Validate() error

Validate SpecFitnessInterval fields.

type SpecKMedoids

type SpecKMedoids struct {
	K             uint // Number of medoids
	MinPerCluster uint
	Metric        Metric // Dissimilarity measure
	MaxIterations uint
}

SpecKMedoids (k-medoid clustering).

func (SpecKMedoids) Apply

func (spec SpecKMedoids) Apply(indis Individuals, rng *rand.Rand) ([]Individuals, error)

Apply SpecKMedoids.

func (SpecKMedoids) Validate

func (spec SpecKMedoids) Validate() error

Validate SpecKMedoids fields.

type Speciator

type Speciator interface {
	Apply(indis Individuals, rng *rand.Rand) ([]Individuals, error)
	Validate() error
}

A Speciator partitions a population into n smaller subpopulations. Each subpopulation shares the same random number generator inherited from the initial population.

type StringSlice

type StringSlice []string

StringSlice attaches the methods of Slice to []string

func (StringSlice) Append

func (s StringSlice) Append(t Slice) Slice

Append method from Slice

func (StringSlice) At

func (s StringSlice) At(i int) interface{}

At method from Slice

func (StringSlice) Copy

func (s StringSlice) Copy() Slice

Copy method from Slice

func (StringSlice) Len

func (s StringSlice) Len() int

Len method from Slice

func (StringSlice) Replace

func (s StringSlice) Replace(t Slice)

Replace method from Slice

func (StringSlice) Set

func (s StringSlice) Set(i int, v interface{})

Set method from Slice

func (StringSlice) Slice

func (s StringSlice) Slice(a, b int) Slice

Slice method from Slice

func (StringSlice) Split

func (s StringSlice) Split(k int) (Slice, Slice)

Split method from Slice

func (StringSlice) Swap

func (s StringSlice) Swap(i, j int)

Swap method from Slice
