meta

package
v1.31.0 Latest
Warning

This package is not in the latest version of its module.

Published: Mar 28, 2026 License: Apache-2.0 Imports: 3 Imported by: 0

Documentation

Overview

Experimental — this package is not yet wired into the main framework.

Package meta provides meta-learning algorithms for few-shot adaptation. (Stability: alpha)

The primary algorithm is MAML (Model-Agnostic Meta-Learning), which learns initialization weights that can be rapidly adapted to new tasks with only a few gradient steps. MAML operates with two nested optimization loops:

  • Inner loop: task-specific adaptation via gradient descent on a small support set (few-shot learning).
  • Outer loop: meta-update that optimizes the initial weights so that inner-loop adaptation generalizes well across a distribution of tasks.

Usage:

config := meta.MAMLConfig{
    InnerLR:        0.01,
    OuterLR:        0.001,
    InnerSteps:     5,
    NTasksPerBatch: 4,
    MetaEpochs:     100,
}
maml, err := meta.NewMAML(config)
if err != nil {
    log.Fatal(err)
}
if err := maml.MetaTrain(tasks, config); err != nil {
    log.Fatal(err)
}
adapted := maml.Adapt(newTask, 5)

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type AdaptedModel

type AdaptedModel struct {
	// contains filtered or unexported fields
}

AdaptedModel represents a model adapted to a specific task.

func (*AdaptedModel) Predict

func (a *AdaptedModel) Predict(input []float64) (float64, error)

Predict runs inference on the adapted model for a single input.

type MAML

type MAML struct {
	// contains filtered or unexported fields
}

MAML implements Model-Agnostic Meta-Learning.

func NewMAML

func NewMAML(config MAMLConfig) (*MAML, error)

NewMAML creates a new MAML instance with randomly initialized meta-parameters.

func (*MAML) Adapt

func (m *MAML) Adapt(task Task, steps int) *AdaptedModel

Adapt takes the current meta-parameters and adapts them to a new task using the specified number of inner-loop gradient steps.

func (*MAML) MetaLoss

func (m *MAML) MetaLoss(tasks []Task) float64

MetaLoss computes the average loss across tasks after inner-loop adaptation.

func (*MAML) MetaTrain

func (m *MAML) MetaTrain(tasks []Task, config MAMLConfig) error

MetaTrain runs the MAML meta-training loop across the given tasks. It initializes the network dimensions from the first task's data and runs MetaEpochs outer-loop iterations, each sampling NTasksPerBatch tasks.

type MAMLConfig

type MAMLConfig struct {
	// InnerLR is the learning rate for task-specific adaptation (inner loop).
	InnerLR float64
	// OuterLR is the learning rate for meta-parameter updates (outer loop).
	OuterLR float64
	// InnerSteps is the number of gradient steps in the inner loop.
	InnerSteps int
	// NTasksPerBatch is the number of tasks sampled per meta-update.
	NTasksPerBatch int
	// MetaEpochs is the number of outer-loop iterations.
	MetaEpochs int
	// HiddenDims specifies the hidden layer sizes for the internal MLP.
	HiddenDims []int
	// Seed, when non-nil, seeds the random number generator for reproducible
	// weight initialization and task sampling. If nil, a non-deterministic
	// source is used.
	Seed *uint64
}

MAMLConfig holds hyperparameters for MAML meta-learning.

type Task

type Task interface {
	// TrainData returns the support set (inputs and targets) for adaptation.
	TrainData() (inputs [][]float64, targets []float64)
	// TestData returns the query set (inputs and targets) for evaluation.
	TestData() (inputs [][]float64, targets []float64)
}

Task provides train/test data splits for a single meta-learning task.
