Documentation ¶
Overview ¶
Package gonet provides a computational-graph-based implementation of the neural-network forward- and backward-propagation algorithms.
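The computational-graph approach can be illustrated with a minimal, self-contained sketch. Note this is plain Go with a toy `node` type, not gonet's actual Node (whose fields are unexported): each node stores a value, a gradient, and a backward closure; the forward pass composes values, and backpropagation walks the graph in reverse, accumulating gradients.

```go
package main

import "fmt"

// node is a toy scalar computational-graph node; gonet's real Node is
// richer (names, batching, no-grad inputs) and has unexported fields.
type node struct {
	v, g     float64
	parents  []*node
	backward func() // accumulates this node's gradient into its parents
}

func input(v float64) *node { return &node{v: v} }

func plus(a, b *node) *node {
	out := &node{v: a.v + b.v, parents: []*node{a, b}}
	out.backward = func() {
		a.g += out.g
		b.g += out.g
	}
	return out
}

func mul(a, b *node) *node {
	out := &node{v: a.v * b.v, parents: []*node{a, b}}
	out.backward = func() {
		a.g += b.v * out.g
		b.g += a.v * out.g
	}
	return out
}

// backprop seeds the output gradient with 1 and runs backward passes
// from the output toward the inputs (simplified: this toy graph is a
// tree, so a plain recursive walk visits each node exactly once).
func backprop(out *node) {
	out.g = 1
	var walk func(n *node)
	walk = func(n *node) {
		if n.backward != nil {
			n.backward()
		}
		for _, p := range n.parents {
			walk(p)
		}
	}
	walk(out)
}

func main() {
	x, w, b := input(2), input(3), input(1)
	y := plus(mul(w, x), b) // y = w*x + b = 7
	backprop(y)
	fmt.Println(y.v, x.g, w.g) // 7 3 2  (dy/dx = w, dy/dw = x)
}
```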
Index ¶
- func NodeValues(ns []*Node) []float64
- func SingleLinear(inputSize int, bias bool) *singleLinear
- func Train(model Model, samples []util.Sample, cfg *util.TrainConfig, lf LossFunction) time.Duration
- type E2ELoss
- type E2EPredictLoss
- type Embedding
- type FeedForwarder
- type Layer
- func AttentionBlockLayer(embDim, headNum int, buildAttention func(int, int) Layer) Layer
- func EmbeddingLayer(vocabSize, dim int) Layer
- func EmbeddingLayerFrom(emb *Embedding) Layer
- func KQVLayer(embDim, headSize int) Layer
- func LayerNormLayer(dim int) Layer
- func LinearLayer(fanIn, fanOut int, bias bool) Layer
- func MaskedSelfAttentionLayer(embDim, headSize int) Layer
- func MultiHeadAttentionLayer(embDim, headNum int, buildAttention func(int, int) Layer) Layer
- func PositionalEmbeddingLayer(vocabSize, ctxLen, dim int) Layer
- func ReluLayer() Layer
- func SigmoidLayer() Layer
- func SoftmaxLayer(t float64) Layer
- func TanhLayer() Layer
- func UnembeddingLayer(dim, vocabSize int, bias bool) Layer
- func UnembeddingLayerFrom(emb *Embedding, bias bool) Layer
- type LossFunction
- type Model
- type Node
- func CrossEntropyLoss(actual, predicted []*Node) *Node
- func DotProduct(left, right []*Node) *Node
- func Identity(n *Node) *Node
- func LayerNorm(xs, gamma, beta []*Node, eps float64) (ys []*Node)
- func Linear(ws, xs []*Node, bias *Node) *Node
- func MaskedAttention(ks, qs, vs [][]*Node) []*Node
- func MaxMarginLoss(actual, predicted []*Node) *Node
- func Mean(xs ...*Node) *Node
- func MeanVariance(xs ...*Node) (mean, variance *Node)
- func Multiply(prev ...*Node) *Node
- func NewInputNode(v float64, name string) *Node
- func NewInputNodeBatch(size int, nameFmt string, noGrad bool) []*Node
- func NewInputNodeNoGrad(v float64, name string) *Node
- func NewNode(v float64, name string) *Node
- func Normalize(eps float64, xs ...*Node) (ys []*Node)
- func Plus(prev ...*Node) *Node
- func RawCrossEntropyLoss(actual, predicted []*Node) *Node
- func Relu(prev *Node) *Node
- func ResidualSumSquaredLoss(actual, predicted []*Node) *Node
- func Sigmoid(prev *Node) *Node
- func Softmax(t float64, prev ...*Node) []*Node
- func Tanh(prev *Node) *Node
- func VectorAdd(left, right []*Node) (out []*Node)
- func (n *Node) Backward()
- func (n *Node) Forward()
- func (n *Node) ForwardBackward()
- func (n *Node) G() float64
- func (n *Node) Learn(delta float64)
- func (n *Node) Name() string
- func (n *Node) SetName(name string)
- func (n *Node) SetV(v float64)
- func (n *Node) String() string
- func (n *Node) V() float64
- func (n *Node) ZeroG()
- type Sample
- type SampleBatch
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func NodeValues ¶
func SingleLinear ¶
func Train ¶
func Train(model Model, samples []util.Sample, cfg *util.TrainConfig, lf LossFunction) time.Duration
Types ¶
type E2ELoss ¶
func TrainLossFunc ¶
func TrainLossFunc(model FeedForwarder, lf LossFunction) E2ELoss
type E2EPredictLoss ¶
func PredictLossFunc ¶
func PredictLossFunc(model FeedForwarder, lf LossFunction) E2EPredictLoss
type Embedding ¶
type Embedding struct {
// contains filtered or unexported fields
}
func NewEmbedding ¶
func (*Embedding) EmbeddingFeed ¶
func (*Embedding) UnembeddingFeed ¶
type FeedForwarder ¶
type Layer ¶
type Layer interface {
FeedForwarder
Parameters() []util.Parameter
Name() string
}
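A hypothetical sketch of how a stateless activation layer might satisfy such an interface. The FeedForwarder method set and util.Parameter are not shown on this page, so the `Feed` signature below is an assumption, and values are plain float64 rather than graph nodes:

```go
package main

import (
	"fmt"
	"math"
)

// feedForwarder mirrors the role of gonet's FeedForwarder; its real
// method set is not documented here, so this signature is assumed.
type feedForwarder interface {
	Feed(xs []float64) []float64
}

// layer mirrors gonet's Layer: a FeedForwarder plus metadata.
// Parameters() is elided because util.Parameter is not shown here.
type layer interface {
	feedForwarder
	Name() string
}

// sigmoidLayer is a stateless activation layer, analogous in spirit
// to what SigmoidLayer() returns.
type sigmoidLayer struct{}

func (sigmoidLayer) Name() string { return "sigmoid" }

func (sigmoidLayer) Feed(xs []float64) []float64 {
	ys := make([]float64, len(xs))
	for i, x := range xs {
		ys[i] = 1 / (1 + math.Exp(-x)) // sigmoid(x)
	}
	return ys
}

func main() {
	var l layer = sigmoidLayer{}
	fmt.Println(l.Name(), l.Feed([]float64{0})) // sigmoid [0.5]
}
```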
func AttentionBlockLayer ¶
func EmbeddingLayer ¶
func EmbeddingLayerFrom ¶
func LayerNormLayer ¶
func LinearLayer ¶
func MultiHeadAttentionLayer ¶
func SigmoidLayer ¶
func SigmoidLayer() Layer
func SoftmaxLayer ¶
func UnembeddingLayer ¶
func UnembeddingLayerFrom ¶
type LossFunction ¶
type Model ¶
func DecoderOnlyTransformer ¶
func EmbeddingModel ¶
func LinearModel ¶
func SequentialModel ¶
type Node ¶
type Node struct {
// contains filtered or unexported fields
}
func CrossEntropyLoss ¶
CrossEntropyLoss already takes the softmax activation into account, so the values in the `predicted` nodes are actually logits rather than probabilities. If you would like to use the cross-entropy function without softmax fused into it, use RawCrossEntropyLoss instead. Also note that, unlike RawCrossEntropyLoss, the values in the `actual` nodes are ground-truth indexes, so expect `actual` (indexes) and `predicted` (logits) to have different lengths.
func DotProduct ¶
func MaskedAttention ¶
func MaxMarginLoss ¶
func MeanVariance ¶
func NewInputNode ¶
func NewInputNodeNoGrad ¶
func RawCrossEntropyLoss ¶
RawCrossEntropyLoss defines the cross-entropy loss function. `actual` represents the actual probability of each predefined class and should contain exactly one non-vanishing entry with value 1, meaning that class is observed in the dataset sample (hence its probability equals 1).
func ResidualSumSquaredLoss ¶
ResidualSumSquaredLoss is the Residual Sum of Squares (RSS), also known as the Sum of Squared Errors (SSE).
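The formula is simply Σ(actual[i] − predicted[i])². A numeric sketch (float64 instead of *Node):

```go
package main

import "fmt"

// rss computes the residual sum of squares: sum((actual-predicted)^2).
func rss(actual, predicted []float64) float64 {
	sum := 0.0
	for i := range actual {
		d := actual[i] - predicted[i]
		sum += d * d
	}
	return sum
}

func main() {
	// Residuals are 0, 1, -2, so RSS = 0 + 1 + 4 = 5.
	fmt.Println(rss([]float64{1, 2, 3}, []float64{1, 1, 5})) // 5
}
```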
func (*Node) ForwardBackward ¶
func (n *Node) ForwardBackward()
type SampleBatch ¶
type SampleBatch []*Sample
func NewSampleBatch ¶
func NewSampleBatch(inputSize, outputSize, batchSize int) SampleBatch
func (SampleBatch) Update ¶
func (sb SampleBatch) Update(samples []util.Sample)
Source Files ¶
Directories ¶

| Path | Synopsis |
|---|---|
| arrimpl | Package arrimpl provides an array based implementation of a fully connected neural network, which is much faster for large and deep networks. |
| examples | |
| binary_classifier (command) | |
| binary_classifier/data (command) | |
| digit_ocr (command) | |
| makemore | Package makemore defines the common utilities that are shared by all "makemore" examples. |
| makemore/nn_bigram (command) | |
| makemore/nn_quadgram (command) | |
| makemore/transformer (command) | |
| makemore/wavenet (command) | |
| word_embedding (command) | |
| util | Package util ... |