Documentation ¶
Overview ¶
Package fnn implements a generic FNN (Feedforward Neural Network) with various configurations. It should suffice for the common cases and can be extended as needed.
It also supports configuration through hyperparameters, so the defaults can be given by the context parameters.
E.g.: A FNN for a multi-class classification model with NumClasses classes:
func MyModel(ctx *context.Context, inputs []*Node) (outputs []*Node) {
	x := inputs[0]
	logits := fnn.New(ctx.In("model"), x, NumClasses).
		NumHiddenLayers(3, 128). // 3 hidden layers of 128 nodes each.
		Activation(activations.TypeSwish).
		Dropout(0.3).
		Residual(true).
		Done()
	return []*Node{logits}
}
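The same configuration can instead come from hyperparameters set in the context, so the model code stays generic. A minimal sketch, assuming the context's SetParam method; the values mirror the builder calls above:

func MyModel(ctx *context.Context, inputs []*Node) (outputs []*Node) {
	// Typically set once when creating the root context (or from flags);
	// fnn.New reads them as defaults, so no builder calls are needed.
	ctx.SetParam(fnn.ParamNumHiddenLayers, 3)
	ctx.SetParam(fnn.ParamNumHiddenNodes, 128)
	ctx.SetParam(fnn.ParamDropoutRate, 0.3)
	ctx.SetParam(fnn.ParamResidual, true)

	x := inputs[0]
	logits := fnn.New(ctx.In("model"), x, NumClasses).Done()
	return []*Node{logits}
}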
Index ¶
- Constants
- type Config
- func New(ctx *context.Context, input *Node, outputDimensions ...int) *Config
- func (c *Config) Activation(activation activations.Type) *Config
- func (c *Config) Done() *Node
- func (c *Config) Dropout(ratio float64) *Config
- func (c *Config) Normalization(normalization string) *Config
- func (c *Config) NumHiddenLayers(numLayers, numHiddenNodes int) *Config
- func (c *Config) Regularizer(regularizer regularizers.Regularizer) *Config
- func (c *Config) Residual(useResidual bool) *Config
- func (c *Config) UseBias(useBias bool) *Config
Constants ¶
const (
	// ParamNumHiddenLayers is the hyperparameter that defines the default number of hidden layers.
	// The default is 0 (int), so no hidden layers.
	ParamNumHiddenLayers = "fnn_num_hidden_layers"

	// ParamNumHiddenNodes is the hyperparameter that defines the default number of hidden nodes for FNN hidden layers.
	// Default is 10 (int).
	ParamNumHiddenNodes = "fnn_num_hidden_nodes"

	// ParamResidual is the hyperparameter that defines whether to use residual connections between each pair of hidden layers.
	// If set, and the input's feature dimension (the last one) matches the hidden layers' dimension, it also adds a
	// residual connection to the input. The same applies to the output, if the outputDimensions match.
	// Default is false (bool).
	ParamResidual = "fnn_residual"

	// ParamNormalization is the name of the normalization to use in between layers.
	// It is only applied if there are hidden layers.
	// See layers.KnownNormalizer: "layer" and "batch" are the most common normalization strategies.
	//
	// Defaults to the parameter "normalization" (layers.ParamNormalization) and, if that is not set, to "none"
	// (same as ""), which is no normalization.
	ParamNormalization = "fnn_normalization"

	// ParamDropoutRate is the name of the dropout rate to use in between layers.
	// It is only applied if there are hidden layers.
	//
	// Defaults to the parameter "dropout_rate" (layers.ParamDropoutRate) and, if that is not set, to 0.0 (no dropout).
	ParamDropoutRate = "fnn_dropout_rate"
)
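Note the two-level fallback for normalization and dropout: the FNN-specific parameter wins over the generic layer parameter. A hedged sketch of that precedence, again assuming the context's SetParam method:

// Generic default, read by all layers that honor "dropout_rate"
// (layers.ParamDropoutRate):
ctx.SetParam(layers.ParamDropoutRate, 0.1)

// FNN-specific override: FNNs in this scope use 0.3, while other layer
// types keep the generic 0.1:
ctx.SetParam(fnn.ParamDropoutRate, 0.3)

// Same pattern for normalization: "normalization" is the generic fallback,
// "fnn_normalization" the FNN-specific override:
ctx.SetParam(fnn.ParamNormalization, "layer")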
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type Config ¶
type Config struct {
// contains filtered or unexported fields
}
Config is created with New and can be configured with its methods, or by simply setting the corresponding hyperparameters in the context.
func New ¶
func New(ctx *context.Context, input *Node, outputDimensions ...int) *Config
New creates a configuration for a FNN (Feedforward Neural Network). It can be further configured through various methods; when finished, call Done to actually add the FNN to the computation graph and get the output.
The input is expected to have shape `[<batch dimensions...>, featureDimension]`, the output will have shape `[<batch dimensions...>, <outputDimensions...>]`.
Configuration options have defaults, but can also be configured through hyperparameters set in the context. See corresponding configuration methods for details.
E.g.: A FNN for a multi-class classification model with NumClasses classes:
func MyModel(ctx *context.Context, inputs []*Node) (outputs []*Node) {
	x := inputs[0]
	logits := fnn.New(ctx.In("model"), x, NumClasses).
		NumHiddenLayers(3, 128). // 3 hidden layers of 128 nodes each.
		Activation(activations.TypeSwish).
		Dropout(0.3).
		Normalization("layer").
		Residual(true).
		Done()
	return []*Node{logits}
}
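Only the last (feature) axis is transformed; any leading batch axes pass through unchanged. A hypothetical shape walk-through (the scope names and dimensions are illustrative):

// x has shape [batchSize, seqLen, 64]: two batch dimensions plus features.
logits := fnn.New(ctx.In("readout"), x, NumClasses).Done()
// logits has shape [batchSize, seqLen, NumClasses].

// Multiple output dimensions are allowed, since outputDimensions is variadic:
pairs := fnn.New(ctx.In("pairs"), x, 8, 2).Done()
// pairs has shape [batchSize, seqLen, 8, 2].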
func (*Config) Activation ¶
func (c *Config) Activation(activation activations.Type) *Config
Activation sets the activation for the FNN, in between each layer. The input and output layers don't get an activation layer.
The default is "relu", but it can be overridden by setting the hyperparameter layers.ParamActivation (="activation") in the context.
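Either route works, as a sketch; the activations.TypeTanh constant name is an assumption here, only the activations.Type argument and the "activation" hyperparameter are documented above:

// Per model, via the builder (constant name assumed):
logits := fnn.New(ctx.In("model"), x, NumClasses).
	NumHiddenLayers(2, 64).
	Activation(activations.TypeTanh).
	Done()

// Or as a context-wide default for all layers that read "activation":
ctx.SetParam(layers.ParamActivation, "tanh")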
func (*Config) Done ¶
func (c *Config) Done() *Node
Done takes the configuration and applies the FNN as configured, returning the output Node.
func (*Config) Dropout ¶
func (c *Config) Dropout(ratio float64) *Config
Dropout sets the dropout ratio for the FNN, in between each layer. The output layer doesn't get dropout. It uses the normalized form of dropout (see layers.DropoutNormalize).
If set to 0.0, no dropout is used.
The default is 0.0, but it can be overridden by setting the hyperparameter layers.ParamDropoutRate (="dropout_rate") in the context.
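Normalized here means inverted dropout: survivors are scaled up during training so the expected activation magnitude is unchanged, and dropout becomes a no-op at inference. A conceptual sketch, not the library's code:

// With ratio = 0.3, each value is zeroed with probability 0.3 during training;
// survivors are multiplied by 1/(1-ratio) so E[output] == E[input].
ratio := 0.3
scale := 1.0 / (1.0 - ratio) // ≈ 1.43, applied to kept values
_ = scale                    // at inference time, dropout is disabled entirely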
func (*Config) Normalization ¶
func (c *Config) Normalization(normalization string) *Config
Normalization sets the normalization type to use in between layers. The input and output layers don't get a normalization layer.
The default is "none", but it can be overridden by setting the hyperparameter ParamNormalization (="fnn_normalization") in the context.
func (*Config) NumHiddenLayers ¶
func (c *Config) NumHiddenLayers(numLayers, numHiddenNodes int) *Config
NumHiddenLayers configures the number of hidden layers between the input and the output. Each hidden layer will have numHiddenNodes nodes.
The default is 0 (no hidden layers), but it will be overridden if the hyperparameter ParamNumHiddenLayers is set in the context (ctx). The value for numHiddenNodes can also be configured with the hyperparameter ParamNumHiddenNodes.
func (*Config) Regularizer ¶
func (c *Config) Regularizer(regularizer regularizers.Regularizer) *Config
Regularizer to be applied to the learned weights (but not the biases).
To use more than one type of Regularizer, use regularizers.Combine, and set the returned combined regularizer here.
The default is regularizers.FromContext, which is configured by the hyperparameters regularizers.ParamL1 and regularizers.ParamL2; if neither is set, no regularization is applied.
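A minimal sketch of both ways to set it; the regularizers.L1 and regularizers.L2 constructor names are assumptions, while regularizers.Combine, regularizers.ParamL1 and regularizers.ParamL2 are the names documented above:

// Explicitly, combining two penalties (constructor names assumed):
logits := fnn.New(ctx.In("model"), x, NumClasses).
	NumHiddenLayers(2, 64).
	Regularizer(regularizers.Combine(
		regularizers.L1(1e-5),
		regularizers.L2(1e-4),
	)).
	Done()

// Or via hyperparameters, picked up by the default regularizers.FromContext:
ctx.SetParam(regularizers.ParamL2, 1e-4)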