spago module

v1.1.0 Latest
Published: Oct 30, 2023 License: BSD-2-Clause

README





If you like the project, please ★ star this repository to show your support! 🤩

If you're interested in NLP-related functionalities, be sure to explore the Cybertron package!

Spago is a Machine Learning library written in pure Go designed to support relevant neural architectures in Natural Language Processing.

Spago is self-contained: it uses its own lightweight computational graph for both training and inference, and it is easy to understand from start to finish.

It provides:

  • Automatic differentiation via dynamic define-by-run execution
  • Feed-forward layers (Linear, Highway, Convolution...)
  • Recurrent layers (LSTM, GRU, BiLSTM...)
  • Attention layers (Self-Attention, Multi-Head Attention...)
  • Gradient descent optimizers (Adam, RAdam, RMS-Prop, AdaGrad, SGD)
  • Gob-compatible neural models for serialization

Usage

Requirements: a working Go toolchain (see the module's go.mod for the minimum supported Go version).

Clone this repo or get the library:

go get -u github.com/nlpodyssey/spago
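Spago uses Go modules, so alternatively you can add the import paths to your code and let the toolchain resolve them:

go mod tidy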

Getting Started

A good place to start is by looking at the implementation of built-in neural models, such as the LSTM.

Example 1

Here is an example of how to calculate the sum of two variables:

package main

import (
	"fmt"
	"log"

	"github.com/nlpodyssey/spago/ag"
	"github.com/nlpodyssey/spago/mat"
)

func main() {
	// define the type of the elements in the tensors
	type T = float32

	// create a new node of type variable with a scalar
	a := mat.Scalar(T(2.0), mat.WithGrad(true))

	// create another node of type variable with a scalar
	b := mat.Scalar(T(5.0), mat.WithGrad(true))

	// create an addition operator (the calculation is actually performed here)
	c := ag.Add(a, b)

	// print the result
	fmt.Printf("c = %v (float%d)\n", c.Value(), c.Value().Item().BitSize())

	// accumulate a gradient of 0.5 on the output node to seed backpropagation
	c.AccGrad(mat.Scalar(T(0.5)))

	if err := ag.Backward(c); err != nil {
		log.Fatalf("error during Backward(): %v", err)
	}

	fmt.Printf("ga = %v\n", a.Grad())
	fmt.Printf("gb = %v\n", b.Grad())
}

Output:

c = [7] (float32)
ga = [0.5]
gb = [0.5]
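Both gradients equal 0.5 because the derivative of a + b with respect to each operand is 1, so the gradient accumulated on c via AccGrad flows through to a and b unchanged.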

Example 2

Here is a simple implementation of the perceptron formula:

package main

import (
	"fmt"
	
	. "github.com/nlpodyssey/spago/ag"
	"github.com/nlpodyssey/spago/mat"
)

func main() {
	x := mat.Scalar(-0.8) // input
	w := mat.Scalar(0.4)  // weight
	b := mat.Scalar(-0.2) // bias

	// y = sigmoid(w*x + b)
	y := Sigmoid(Add(Mul(w, x), b))

	fmt.Printf("y = %0.3f\n", y.Value().Item())
}
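Since sigmoid(0.4 × (−0.8) − 0.2) = sigmoid(−0.52) ≈ 0.373, the program should print:

y = 0.373

To also obtain gradients for the parameters, the same pattern shown in Example 1 applies: create w and b with mat.WithGrad(true), accumulate a gradient on the output, and call ag.Backward. Here is a minimal sketch of that variation, using only the operators already introduced above:

package main

import (
	"fmt"
	"log"

	"github.com/nlpodyssey/spago/ag"
	"github.com/nlpodyssey/spago/mat"
)

func main() {
	x := mat.Scalar(-0.8)
	w := mat.Scalar(0.4, mat.WithGrad(true))
	b := mat.Scalar(-0.2, mat.WithGrad(true))

	// forward pass: y = sigmoid(w*x + b)
	y := ag.Sigmoid(ag.Add(ag.Mul(w, x), b))

	// seed the output gradient and backpropagate
	y.AccGrad(mat.Scalar(1.0))
	if err := ag.Backward(y); err != nil {
		log.Fatalf("error during Backward(): %v", err)
	}

	// dy/dw = x * y * (1 - y); dy/db = y * (1 - y)
	fmt.Printf("gw = %v\n", w.Grad())
	fmt.Printf("gb = %v\n", b.Grad())
}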

Contributing

If you think something is missing or could be improved, please open issues and pull requests.

To start contributing, check the Contributing Guidelines.

Contact

We highly encourage you to create an issue as it will contribute to the growth of the community. However, if you prefer to communicate with us privately, please feel free to email Matteo Grella with any questions or comments you may have.

Acknowledgments

Spago is part of the open-source NLP Odyssey initiative, started by members of the EXOP team (now part of Crisis24).

Sponsors

See our Open Collective page if you too are interested in becoming a sponsor.

Directories

Path Synopsis
cmd
embeddings
examples
graphviz module
mat
mattest
Package mattest provides utilities for testing code involving spaGO matrices.
internal/f32/asm32
Package asm32 provides float32 vector primitives.
internal/f64/asm64
Package asm64 provides float64 vector primitives.
internal/matfuncs/cpu
Package cpu implements processor feature detection for various CPU architectures.
internal/rand
Package rand implements pseudo-random number generators.
nn
convolution/conv1x1
Package conv1x1 implements a 1-dimensional 1-kernel convolution model
crf
gmlp
Package gmlp implements a model composed by basic MLP layers with gating mechanism.
gnn/slstm
Package slstm implements a Sentence-State LSTM graph neural network.
mlpmixer
Package mlpmixer implements the MLP-Mixer (Tolstikhin et al., 2021).
normalization/adanorm
Package adanorm implements the Adaptive Normalization (AdaNorm) method.
normalization/fixnorm
Package fixnorm implements the fixnorm normalization method.
normalization/layernorm
Package layernorm implements the Layer Normalization (LayerNorm) method.
normalization/layernormsimple
Package layernormsimple implements a simple version of LayerNorm (LayerNorm-simple).
normalization/rmsnorm
Package rmsnorm implements the Root Mean Square Layer Normalization method.
sgu
Package sgu implements the Spatial Gating Unit (SGU).
approxlinear module
sgd
