activations

package
v0.14.0
Published: Oct 24, 2024 License: Apache-2.0 Imports: 7 Imported by: 0

Documentation

Overview

Package activations implements several common activations, and includes a generic Apply function to apply an activation by its type.

There is also FromName, which converts an activation name (string) to its type, and ApplyFromContext, which applies an activation based on the hyperparameter ParamActivation defined in a context.
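
For example, a minimal sketch of using Apply and FromName inside a model-building function. The import paths are assumptions (adjust them to your module), and hiddenLayer is a hypothetical helper whose argument x is a graph *Node produced elsewhere:

	import (
		// Import paths are assumptions; adjust to where these packages live in your module.
		"github.com/gomlx/gomlx/graph"
		"github.com/gomlx/gomlx/ml/layers/activations"
	)

	// hiddenLayer is a hypothetical helper: x is a graph node built elsewhere
	// (the *Node type used in the signatures below).
	func hiddenLayer(x *graph.Node) *graph.Node {
		// Apply a fixed activation by its enum value.
		h := activations.Apply(activations.TypeSwish, x)

		// Or resolve the activation from a name (e.g. user configuration);
		// FromName panics with a helpful message if the name is invalid.
		return activations.Apply(activations.FromName("leaky_relu"), h)
	}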

Index

Constants

const (
	SeluAlpha = 1.67326324
	SeluScale = 1.05070098
)
const (
	// ParamActivation context hyperparameter defines the activation to use, for models using ApplyFromContext.
	// Available values are: `none`, `relu`, `leaky_relu`, `sigmoid`, `tanh` or `swish` (same as `silu`).
	// The default is `relu`.
	// See activations.TypeValues for complete list.
	ParamActivation = "activation"
)

Variables

This section is empty.

Functions

func Apply

func Apply(activation Type, x *Node) *Node

Apply the given activation type. The TypeNone activation is a no-op.

See TypeValues for valid values.

func ApplyFromContext

func ApplyFromContext(ctx *context.Context, x *Node) *Node

ApplyFromContext picks an activation function from the context using the ParamActivation hyperparameter and applies it to x.

It defaults to "relu".
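
A minimal sketch of driving the activation from the ParamActivation hyperparameter. It assumes ctx is a *context.Context, x is a graph *Node built elsewhere, and that the context exposes a SetParam method for hyperparameters (an assumption; check the context package for the exact setter):

	// Select the activation by name for every layer that uses ApplyFromContext.
	// SetParam as the hyperparameter setter is an assumption about the context API.
	ctx.SetParam(activations.ParamActivation, "swish")

	// Later, inside the model's graph-building function:
	h := activations.ApplyFromContext(ctx, x) // applies "swish"; defaults to "relu" if unset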

func Gelu added in v0.12.0

func Gelu(x *Node) *Node

Gelu implements the original (exact) GELU activation function.

It is defined as Gelu(x) = x * 0.5 * (1 + Erf(x / √2)).

The GELU activation function was introduced in "Gaussian Error Linear Units (GELUs)" [Hendrycks et al. 2016](https://arxiv.org/abs/1606.08415).

The exact version is slower on TPUs due to the "Erf" function, but some argue it is more stable. See discussion in: https://github.com/jax-ml/jax/issues/4428
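
For intuition, a scalar sketch of the exact formula above (the package function itself operates on graph nodes, not float64):

	import "math"

	// geluExact illustrates Gelu(x) = x * 0.5 * (1 + Erf(x / √2)) on a scalar.
	func geluExact(x float64) float64 {
		return x * 0.5 * (1 + math.Erf(x/math.Sqrt2))
	}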

func GeluApproximate added in v0.12.0

func GeluApproximate(x *Node) *Node

GeluApproximate is a close approximation to the original Gelu function.

It is defined as Gelu(x) = x * 0.5 * (1 + Tanh(Sqrt(2/Pi) * (x+0.044715*x^3))).

The GELU activation function was introduced in "Gaussian Error Linear Units (GELUs)" [Hendrycks et al. 2016](https://arxiv.org/abs/1606.08415).

The exact version is slower on TPUs, but some argue it is more stable. See discussion in: https://github.com/jax-ml/jax/issues/4428
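
Likewise, a scalar sketch of the tanh approximation above, for intuition only:

	// geluApprox illustrates Gelu(x) ≈ x * 0.5 * (1 + Tanh(Sqrt(2/Pi) * (x + 0.044715*x^3))) on a scalar.
	func geluApprox(x float64) float64 {
		return x * 0.5 * (1 + math.Tanh(math.Sqrt(2/math.Pi)*(x+0.044715*x*x*x)))
	}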

func LeakyRelu

func LeakyRelu(x *Node) *Node

LeakyRelu activation function. It allows a small gradient when the unit is not active (x < 0). The `alpha` parameter is fixed at 0.3.

It returns `x if x >= 0; alpha*x if x < 0`.

func LeakyReluWithAlpha

func LeakyReluWithAlpha(x *Node, alpha float64) *Node

LeakyReluWithAlpha activation function. It allows a small gradient when the unit is not active (x < 0).

It returns `x if x >= 0; alpha*x if x < 0`.
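
A scalar sketch of the piecewise definition above, for intuition only:

	// leakyRelu illustrates the formula on a scalar: x if x >= 0, alpha*x otherwise.
	func leakyRelu(x, alpha float64) float64 {
		if x >= 0 {
			return x
		}
		return alpha * x
	}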

func Relu

func Relu(x *Node) *Node

Relu activation function. It returns Max(x, 0), and is commonly used as an activation function in neural networks.

func Selu

func Selu(x *Node) *Node

Selu implements the Scaled Exponential Linear Unit (SELU) activation function, defined as:

	SeluScale * x                      if x > 0
	SeluScale * SeluAlpha * (e^x - 1)  if x < 0

Ideally, it should be matched with a "LecunNormal initializer" and the dropout variant called "AlphaDropout" -- TODO, neither is implemented yet.
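
A scalar sketch of the definition above, using the SeluScale and SeluAlpha constants from this package (import paths assumed as before):

	// selu illustrates the SELU formula on a scalar.
	func selu(x float64) float64 {
		if x > 0 {
			return activations.SeluScale * x
		}
		return activations.SeluScale * activations.SeluAlpha * (math.Exp(x) - 1)
	}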

func Swish

func Swish(x *Node) *Node

Swish activation (or SiLU) returns `x * Sigmoid(x)`.

The SiLU activation function was introduced in "Gaussian Error Linear Units (GELUs)" [Hendrycks et al. 2016](https://arxiv.org/abs/1606.08415) and in "Sigmoid-Weighted Linear Units for Neural Network Function Approximation in Reinforcement Learning" [Elfwing et al. 2017](https://arxiv.org/abs/1702.03118), and was independently discovered (and called swish) in "Searching for Activation Functions" [Ramachandran et al. 2017](https://arxiv.org/abs/1710.05941).

Here the beta parameter is fixed at 1.0.
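
A scalar sketch of the formula above with beta fixed at 1, for intuition only:

	// swish illustrates x * Sigmoid(x) = x / (1 + e^(-x)) on a scalar.
	func swish(x float64) float64 {
		return x / (1 + math.Exp(-x))
	}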

func TypeStrings

func TypeStrings() []string

TypeStrings returns a slice of all String values of the enum.

Types

type Type

type Type int

Type is an enum for the supported activation functions.

It is converted to snake-case strings (e.g.: TypeLeakyRelu -> "leaky_relu"), and can be converted from a string using FromName or TypeString.

const (
	TypeNone Type = iota
	TypeRelu
	TypeSigmoid
	TypeLeakyRelu
	TypeSelu
	TypeSwish

	// TypeSilu is an alias to TypeSwish
	TypeSilu

	TypeTanh

	TypeGelu
	TypeGeluApprox
)

func FromName

func FromName(activationName string) Type

FromName converts the name of an activation to its type. It panics with a helpful message if name is invalid.

An empty string is converted to TypeNone.

func TypeString

func TypeString(s string) (Type, error)

TypeString retrieves an enum value from the enum constants string name. It returns an error if s does not match any value of the enum.
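
A minimal sketch contrasting the two string-to-Type conversions above (import path assumed as before):

	t := activations.FromName("leaky_relu") // TypeLeakyRelu; panics on an invalid name
	n := activations.FromName("")           // an empty string maps to TypeNone

	// TypeString is the non-panicking variant: it returns an error instead.
	if t2, err := activations.TypeString("sigmoid"); err == nil {
		_ = t2 // TypeSigmoid
	}
	_, _ = t, n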

func TypeValues

func TypeValues() []Type

TypeValues returns all values of the enum.

func (Type) IsAType

func (i Type) IsAType() bool

IsAType returns true if the value is listed in the enum definition, false otherwise.

func (Type) MarshalJSON

func (i Type) MarshalJSON() ([]byte, error)

MarshalJSON implements the json.Marshaler interface for Type

func (Type) MarshalText

func (i Type) MarshalText() ([]byte, error)

MarshalText implements the encoding.TextMarshaler interface for Type

func (Type) MarshalYAML

func (i Type) MarshalYAML() (interface{}, error)

MarshalYAML implements a YAML Marshaler for Type

func (Type) String

func (i Type) String() string

func (*Type) UnmarshalJSON

func (i *Type) UnmarshalJSON(data []byte) error

UnmarshalJSON implements the json.Unmarshaler interface for Type
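
Because Type implements the standard json interfaces, it round-trips through encoding/json as its string name. A minimal sketch, assuming "encoding/json" is imported and that the serialized string follows the snake-case names listed under Type:

	type LayerConfig struct {
		Activation activations.Type `json:"activation"`
	}

	cfg := LayerConfig{Activation: activations.TypeGelu}
	data, _ := json.Marshal(cfg) // e.g. {"activation":"gelu"}

	var back LayerConfig
	_ = json.Unmarshal(data, &back) // back.Activation == activations.TypeGelu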

func (*Type) UnmarshalText

func (i *Type) UnmarshalText(text []byte) error

UnmarshalText implements the encoding.TextUnmarshaler interface for Type

func (*Type) UnmarshalYAML

func (i *Type) UnmarshalYAML(unmarshal func(interface{}) error) error

UnmarshalYAML implements a YAML Unmarshaler for Type

func (Type) Values

func (Type) Values() []string
