Documentation ¶
Overview ¶
Package af provides several activation functions that can be used in neural networks.
Index ¶
Constants ¶
This section is empty.
Variables ¶
var (
	Sigmoid    = swish.Sigmoid
	Swish      = swish.Swish
	SoftPlus   = swish.SoftPlus
	Gaussian01 = swish.Gaussian01
	Linear     = func(x float64) float64 { return x }
	Inv        = func(x float64) float64 { return -x }
	Sin        = func(x float64) float64 { return math.Sin(math.Pi * x) }
	Cos        = func(x float64) float64 { return math.Cos(math.Pi * x) }
	Squared    = func(x float64) float64 { return x * x }
	Tanh       = math.Tanh
	Abs        = math.Abs
)
The swish package offers optimized Swish, Sigmoid, SoftPlus and Gaussian01 activation functions.
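Since the activation functions are exposed as plain `func(float64) float64` variables, they can be stored, passed around, and evaluated like any other value. The sketch below mirrors the shape of the package's variable block with self-contained stand-ins (the standard sigmoid and swish formulas are assumed here in place of the optimized swish package implementations, so no external dependency is needed):

```go
package main

import (
	"fmt"
	"math"
)

// Stand-in definitions mirroring the af package's variables.
// Sigmoid and Swish use their textbook formulas here; the real
// package delegates them to the optimized swish package.
var (
	Sigmoid = func(x float64) float64 { return 1 / (1 + math.Exp(-x)) }
	Swish   = func(x float64) float64 { return x / (1 + math.Exp(-x)) }
	Linear  = func(x float64) float64 { return x }
	Squared = func(x float64) float64 { return x * x }
	Tanh    = math.Tanh
)

func main() {
	// Function values are first-class, so a slice of named
	// activations can be evaluated in a loop.
	activations := []struct {
		name string
		fn   func(float64) float64
	}{
		{"Sigmoid", Sigmoid},
		{"Swish", Swish},
		{"Linear", Linear},
		{"Squared", Squared},
		{"Tanh", Tanh},
	}
	for _, a := range activations {
		fmt.Printf("%-8s(0.5) = %.4f\n", a.name, a.fn(0.5))
	}
}
```

Because each entry is an ordinary variable, a network layer can accept any of them interchangeably through a `func(float64) float64` parameter.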
Functions ¶
This section is empty.
Types ¶
This section is empty.