syntheticattention

package
v0.1.0
Published: Dec 9, 2020 License: BSD-2-Clause Imports: 6 Imported by: 0

Documentation

Overview

Package syntheticattention implements the Synthetic Attention mechanism described in "SYNTHESIZER: Rethinking Self-Attention in Transformer Models" by Tay et al., 2020 (https://arxiv.org/pdf/2005.00743.pdf).
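
In the paper's Dense Synthesizer variant, the attention matrix is synthesized from the inputs alone, with no query-key dot products. A sketch of that formulation, written here with names chosen to echo the Model fields below (the correspondence is inferred from the field names, not stated by this package):

	b_i = W · σ(FFN(x_i))    // up to MaxLength scores for input position i
	Y   = softmax(B) · V(X)  // B stacks the rows b_i; V is a linear value projection

Each position's score vector is produced by a feed-forward network over that position alone, so computing the attention scores does not require pairwise token interactions.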

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Config

type Config struct {
	InputSize  int
	HiddenSize int
	ValueSize  int
	MaxLength  int
}

type ContextProb

type ContextProb struct {
	// contains filtered or unexported fields
}

type Model

type Model struct {
	Config
	FFN   *stack.Model
	Value *linear.Model
	W     *nn.Param `type:"weights"`
}

Model contains the serializable parameters.

func New

func New(config Config) *Model

New returns a new model with parameters initialized to zeros.
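
A minimal construction sketch, assuming the import path github.com/nlpodyssey/spago/pkg/ml/nn/syntheticattention; the field values are arbitrary and for illustration only:

	import "github.com/nlpodyssey/spago/pkg/ml/nn/syntheticattention"

	model := syntheticattention.New(syntheticattention.Config{
		InputSize:  4,  // size of each input vector
		HiddenSize: 8,  // hidden layer size of the score-producing FFN
		ValueSize:  4,  // size of the value projection
		MaxLength:  10, // maximum sequence length the scores can cover
	})

Since parameters start at zero, in practice you would initialize them randomly or load trained weights before use.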

func (*Model) NewProc

func (m *Model) NewProc(ctx nn.Context) nn.Processor

NewProc returns a new processor to execute the forward step.

type Processor

type Processor struct {
	nn.BaseProcessor

	Attention *ContextProb
	// contains filtered or unexported fields
}

func (*Processor) Forward

func (p *Processor) Forward(xs ...ag.Node) []ag.Node

Forward performs the forward step for each input and returns the result.
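
An end-to-end sketch of the forward step. It assumes spaGO v0.1.0's computation-graph API (ag.NewGraph, Graph.NewVariable, an nn.Context carrying the graph and a processing mode such as nn.Inference) and the pkg/mat package for dense vectors; check the module for the exact signatures:

	package main

	import (
		"fmt"

		"github.com/nlpodyssey/spago/pkg/mat"
		"github.com/nlpodyssey/spago/pkg/ml/ag"
		"github.com/nlpodyssey/spago/pkg/ml/nn"
		"github.com/nlpodyssey/spago/pkg/ml/nn/syntheticattention"
	)

	func main() {
		model := syntheticattention.New(syntheticattention.Config{
			InputSize: 3, HiddenSize: 5, ValueSize: 3, MaxLength: 10,
		})

		g := ag.NewGraph()
		proc := model.NewProc(nn.Context{Graph: g, Mode: nn.Inference})

		// Three input vectors of size InputSize, wrapped as graph variables
		// (requiresGrad is false since this is inference).
		xs := []ag.Node{
			g.NewVariable(mat.NewVecDense([]float64{0.1, 0.2, 0.3}), false),
			g.NewVariable(mat.NewVecDense([]float64{0.4, 0.5, 0.6}), false),
			g.NewVariable(mat.NewVecDense([]float64{0.7, 0.8, 0.9}), false),
		}

		ys := proc.Forward(xs...)
		fmt.Println(len(ys)) // one output node per input

		// The attention distribution is exposed on the concrete processor type.
		if sa, ok := proc.(*syntheticattention.Processor); ok {
			_ = sa.Attention // *ContextProb computed during Forward
		}
	}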
