syntheticattention

package
v0.7.0
Published: May 24, 2021 License: BSD-2-Clause Imports: 7 Imported by: 0

Documentation

Overview

Package syntheticattention provides an implementation of the Synthetic Attention mechanism described in "SYNTHESIZER: Rethinking Self-Attention in Transformer Models" by Tay et al., 2020 (https://arxiv.org/pdf/2005.00743.pdf).
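The fields of Model below suggest that this package implements the paper's dense Synthesizer variant: rather than computing attention scores from query-key dot products, each input is passed through a feed-forward network, the result is projected by the learned matrix W into a vector of per-position scores, and the softmax of those scores is used to mix the value projections of the inputs into a context encoding.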

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Config

type Config struct {
	InputSize  int // size of each input vector
	HiddenSize int // size of the hidden layer of the internal FFN
	ValueSize  int // size of each value (and output) vector
	MaxLength  int // maximum sequence length the attention can span
}

Config provides configuration settings for a Synthetic Attention Model.
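A minimal sketch of a configuration; the sizes below are arbitrary placeholders, and the field interpretations are inferred from the model structure (see Model):

config := syntheticattention.Config{
	InputSize:  64,  // each input vector has 64 dimensions
	HiddenSize: 128, // hidden layer of the internal FFN
	ValueSize:  64,  // value (and output) vectors have 64 dimensions
	MaxLength:  100, // longest sequence the attention can span
}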

type ContextProb

type ContextProb struct {
	// Context contains the context encodings.
	Context []ag.Node
	// Prob contains the attention scores.
	Prob []mat.Matrix
}

ContextProb is a pair of Context encodings and Prob attention scores.

type Model

type Model struct {
	nn.BaseModel
	Config
	// FFN produces the hidden representation from which the
	// attention scores are synthesized.
	FFN *stack.Model
	// Value projects the inputs to value vectors.
	Value *linear.Model
	// W maps the FFN output to per-position attention scores.
	W nn.Param `spago:"type:weights"`
	// Attention holds the context encodings and attention scores
	// from the last forward step.
	Attention *ContextProb `spago:"scope:processor"`
}

Model contains the serializable parameters.
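Judging by the struct tags, W is registered as a weight parameter (spago:"type:weights"), while Attention is processor-scoped state (spago:"scope:processor"): it is filled in during a forward step rather than being part of the serialized parameters.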

func New

func New(config Config) *Model

New returns a new model with parameters initialized to zeros.
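For example, with the Config sketched above:

model := syntheticattention.New(config)

Since the parameters start at zero, you would normally apply an initializer from the surrounding library before training.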

func (*Model) Forward

func (m *Model) Forward(xs ...ag.Node) []ag.Node

Forward performs the forward step for each input node and returns the resulting context encodings. The attention scores computed during the step are stored in the model's Attention field.
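A sketch of a complete forward pass. Everything outside this package (the import paths, ag.NewGraph, g.NewVariable, nn.ReifyForTraining, and the mat constructors) reflects the spaGO v0.7 API as best recalled and should be treated as an assumption; consult the ag, nn, and mat32 package docs before relying on it.

package main

import (
	mat "github.com/nlpodyssey/spago/pkg/mat32"
	"github.com/nlpodyssey/spago/pkg/ml/ag"
	"github.com/nlpodyssey/spago/pkg/ml/nn"
	"github.com/nlpodyssey/spago/pkg/ml/nn/attention/syntheticattention"
)

func main() {
	model := syntheticattention.New(syntheticattention.Config{
		InputSize:  3, // matches the 3-dimensional inputs below
		HiddenSize: 4,
		ValueSize:  3,
		MaxLength:  10,
	})

	g := ag.NewGraph()
	// Bind the model's parameters to the graph; the exact reification
	// call is an assumption about the surrounding spaGO v0.7 nn API.
	proc := nn.ReifyForTraining(model, g).(*syntheticattention.Model)

	// A sequence of three input vectors, each of size InputSize.
	xs := []ag.Node{
		g.NewVariable(mat.NewVecDense([]mat.Float{0.1, 0.2, 0.3}), true),
		g.NewVariable(mat.NewVecDense([]mat.Float{0.4, 0.5, 0.6}), true),
		g.NewVariable(mat.NewVecDense([]mat.Float{0.7, 0.8, 0.9}), true),
	}

	ys := proc.Forward(xs...) // one context encoding per input
	_ = ys
	_ = proc.Attention // context encodings and attention scores from this step
}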
