af

Published: Oct 18, 2019 License: MIT Imports: 2 Imported by: 1

README


Activation functions for neural networks.

These activation functions are included (a usage sketch follows the list):

  • Swish (x / (1 + exp(-x)))
  • Sigmoid (1 / (1 + exp(-x)))
  • SoftPlus (log(1 + exp(x)))
  • Gaussian01 (exp(-(x * x) / 2.0))
  • Sin (math.Sin(math.Pi * x))
  • Cos (math.Cos(math.Pi * x))
  • Linear (x)
  • Inv (-x)
  • ReLU (x >= 0 ? x : 0)
  • Squared (x * x)
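
A minimal usage sketch (assuming the module is imported as github.com/xyproto/af; adjust the import path if it differs):

package main

import (
	"fmt"

	"github.com/xyproto/af" // assumed import path
)

func main() {
	// Evaluate a few of the activation functions at sample points.
	for _, x := range []float64{-2, -1, 0, 1, 2} {
		fmt.Printf("x=%5.2f  Swish=%7.4f  Sigmoid=%6.4f  ReLU=%4.1f\n",
			x, af.Swish(x), af.Sigmoid(x), af.ReLU(x))
	}
}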

These math functions are included just for convenience:

  • Abs (math.Abs)
  • Tanh (math.Tanh)

One function that takes two arguments is also included:

  • PReLU (x >= 0 ? x : x * a)

Requirements

  • Go 1.11 or later.


Documentation

Overview

Package af provides several activation functions that can be used in neural networks.

Index

Constants

This section is empty.

Variables

var (
	Sigmoid    = swish.Sigmoid
	Swish      = swish.Swish
	SoftPlus   = swish.SoftPlus
	Gaussian01 = swish.Gaussian01
	Linear     = func(x float64) float64 { return x }
	Inv        = func(x float64) float64 { return -x }
	Sin        = func(x float64) float64 { return math.Sin(math.Pi * x) }
	Cos        = func(x float64) float64 { return math.Cos(math.Pi * x) }
	Squared    = func(x float64) float64 { return x * x }
	Tanh       = math.Tanh
	Abs        = math.Abs
)

The swish package offers optimized Swish, Sigmoid, SoftPlus and Gaussian01 activation functions.
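
Since the exported names are ordinary func(float64) float64 values, they can be passed around like any other function. A small sketch (not part of the package) that applies an activation element-wise over a layer's outputs:

package main

import (
	"fmt"

	"github.com/xyproto/af" // assumed import path
)

// applyActivation maps an activation function over a slice of pre-activations.
func applyActivation(f func(float64) float64, values []float64) []float64 {
	out := make([]float64, len(values))
	for i, v := range values {
		out[i] = f(v)
	}
	return out
}

func main() {
	hidden := applyActivation(af.Swish, []float64{-1.5, 0.0, 1.5})
	fmt.Println(hidden)
}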

Functions

func PReLU

func PReLU(x, a float64) float64

PReLU is the parametric rectified linear unit. `x >= 0 ? x : a * x`
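
With a small slope this gives the common "leaky ReLU" behavior: for example, PReLU(2.0, 0.01) returns 2, while PReLU(-2.0, 0.01) returns -2 * 0.01 = -0.02.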

func ReLU

func ReLU(x float64) float64

ReLU is the "rectified linear unit" `x >= 0 ? x : 0`

func Step

func Step(x float64) float64

Step function
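
The doc comment does not state the convention used. A conventional step function that returns 1 for non-negative inputs and 0 otherwise would look roughly like this (a sketch based on that assumption; check the source for the exact behavior):

// Step is assumed here to map non-negative inputs to 1 and negative inputs to 0.
func Step(x float64) float64 {
	if x >= 0 {
		return 1
	}
	return 0
}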

Types

This section is empty.
