Documentation ¶
Overview ¶
This package implements the Synthetic Attention mechanism described in "SYNTHESIZER: Rethinking Self-Attention in Transformer Models" by Tay et al., 2020 (https://arxiv.org/pdf/2005.00743.pdf).
Index ¶
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type ContextProb ¶
type ContextProb struct {
// contains filtered or unexported fields
}