pkg/ directory

Version: v0.7.0 (not the latest version of this module)
Published: May 24, 2021 License: BSD-2-Clause
Directories

Path Synopsis
ml
ag
ag/encoding/dot
Package dot creates a graphviz compatible version of the ag.Graph.
encoding/fofe
Package fofe provides an implementation of the Fixed-size Ordinally-Forgetting Encoding (FOFE) method.
nn
nn/attention/lshattention
Package lshattention provides an implementation of the LSH-Attention model, as described in `Reformer: The Efficient Transformer` by N. Kitaev, Ł. Kaiser, A. Levskaya (https://arxiv.org/pdf/2001.04451.pdf).
nn/attention/syntheticattention
Package syntheticattention provides an implementation of the Synthetic Attention described in: "SYNTHESIZER: Rethinking Self-Attention in Transformer Models" by Tay et al., 2020.
nn/birnncrf
Package birnncrf provides an implementation of a Bidirectional Recurrent Neural Network (BiRNN) with a Conditional Random Field (CRF) on top.
nn/bls
Package bls provides an implementation of the Broad Learning System (BLS) described in "Broad Learning System: An Effective and Efficient Incremental Learning System Without the Need for Deep Architecture" by C. L. Philip Chen and Zhulin Liu, 2017.
nn/conv1x1
Package conv1x1 implements a 1-dimensional 1-kernel convolution model.
nn/gmlp
Package gmlp implements a model composed of basic MLP layers with a gating mechanism.
nn/gnn/slstm
Package slstm implements a Sentence-State LSTM graph neural network.
nn/gnn/startransformer
Package startransformer provides a variant implementation of the Star-Transformer model introduced by Qipeng Guo, Xipeng Qiu et al.
nn/normalization/adanorm
Package adanorm implements the Adaptive Normalization (AdaNorm) method.
nn/normalization/fixnorm
Package fixnorm implements the fixnorm normalization method.
nn/normalization/layernorm
Package layernorm implements the Layer Normalization (LayerNorm) method.
nn/normalization/layernormsimple
Package layernormsimple implements a simple version of LayerNorm (LayerNorm-simple).
nn/normalization/rmsnorm
Package rmsnorm implements the Root Mean Square Layer Normalization method.
nn/rae
Package rae provides an implementation of the recursive auto-encoder strategy described in "Towards Lossless Encoding of Sentences" by Prato et al., 2019.
nn/rc
Package rc contains built-in Residual Connections (RC).
nn/recurrent/horn
Package horn provides an implementation of Higher Order Recurrent Neural Networks (HORN).
nn/recurrent/lstmsc
Package lstmsc provides an implementation of LSTM enriched with a PolicyGradient to enable Dynamic Skip Connections.
nn/recurrent/mist
Package mist provides an implementation of the MIST (MIxed hiSTory) recurrent network as described in "Analyzing and Exploiting NARX Recurrent Neural Networks for Long-Term Dependencies" by Di Pietro et al., 2018 (https://arxiv.org/pdf/1702.07805.pdf).
nn/recurrent/nru
Package nru provides an implementation of the NRU (Non-Saturating Recurrent Units) recurrent network as described in "Towards Non-Saturating Recurrent Units for Modelling Long-Term Dependencies" by Chandar et al., 2019.
nn/recurrent/rla
Package rla provides an implementation of RLA (Recurrent Linear Attention).
nn/recurrent/srnn
Package srnn implements the SRNN (Shuffling Recurrent Neural Networks) by Rotman and Wolf, 2020.
nn/sgu
Package sgu implements the Spatial Gating Unit (SGU).
nlp
charlm
Package charlm provides an implementation of a character-level language model that uses a recurrent neural network as its backbone.
contextualstringembeddings
Package contextualstringembeddings provides an implementation of the "Contextual String Embeddings" of words (Akbik et al., 2018).
evolvingembeddings
Package evolvingembeddings provides a word embedding model that evolves by dynamically aggregating contextual embeddings over time during inference.
sequencelabeler
Package sequencelabeler provides an implementation of a sequence labeling architecture composed of Embeddings -> BiRNN -> Scorer -> CRF.
stackedembeddings
Package stackedembeddings provides convenient types to stack multiple word embedding representations by concatenating them.
tokenizers
Package tokenizers is an interim solution while developing `gotokenizers` (https://github.com/nlpodyssey/gotokenizers).
tokenizers/basetokenizer
Package basetokenizer provides an implementation of a very simple tokenizer that splits by white-spaces (and alike) and punctuation symbols.
transformers/bart
Package bart implements the BART transformer model introduced by Lewis et al., 2019.
transformers/bert
Package bert provides an implementation of BERT model (Bidirectional Encoder Representations from Transformers).
transformers/generation
Package generation implements a generation search algorithm for conditional generation.
ner
