package das

v0.13.4
Published: Apr 21, 2024 License: Apache-2.0 Imports: 20 Imported by: 2

Documentation

Overview

Package das contains the most important functionality provided by celestia-node. It contains logic for running data availability sampling (DAS) routines on block headers in the network. DAS is the process of verifying the availability of block data by sampling shares of those blocks.

Package das can confirm the availability of block data in the network via the Availability interface, which is implemented in both `full` and `light` mode. `Full` availability ensures full repair of a block's data square: the instance samples enough shares to be able to fully reconstruct the square. `Light` availability samples shares at random until it is sufficiently likely that all block data is available, relying on the assumption that enough `light` availability instances are actively sampling the same block on the network to collectively verify its availability.

The central component of this package is the `samplingCoordinator`. It launches parallel workers that perform DAS on new ExtendedHeaders in the network. The DASer kicks off this loop by loading its last-sampled-headers snapshot (the `checkpoint`) and spawning a worker pool to DAS all headers between the checkpoint and the current network head. It also subscribes to notifications about new ExtendedHeaders, received via gossipsub; newly received headers are handed to workers directly, without applying the concurrency limit.
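For illustration, a minimal lifecycle sketch from a caller's perspective, assuming a *DASer `d` has already been built with NewDASer (dependency wiring omitted):

```
import (
	"context"

	"github.com/celestiaorg/celestia-node/das"
)

// runDAS sketches the DASer lifecycle described above.
func runDAS(ctx context.Context, d *das.DASer) error {
	// Start subscribes to new ExtendedHeaders and spawns the sampling routines.
	if err := d.Start(ctx); err != nil {
		return err
	}
	// Block until all known headers between the stored checkpoint and the
	// current network head have been sampled (catch-up is done).
	if err := d.WaitCatchUp(ctx); err != nil {
		return err
	}
	// Shut the sampling routines down.
	return d.Stop(ctx)
}
```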

Index

Constants

This section is empty.

Variables

var ErrInvalidOption = errors.New("das: invalid option")

ErrInvalidOption is an error returned by Parameters.Validate when it is supplied with invalid values. This error is also returned by NewDASer when supplied with an invalid option.

Functions

This section is empty.

Types

type DASer

type DASer struct {
	// contains filtered or unexported fields
}

DASer continuously validates availability of data committed to headers.

func NewDASer

func NewDASer(
	da share.Availability,
	hsub libhead.Subscriber[*header.ExtendedHeader],
	getter libhead.Getter[*header.ExtendedHeader],
	dstore datastore.Datastore,
	bcast fraud.Broadcaster[*header.ExtendedHeader],
	shrexBroadcast shrexsub.BroadcastFn,
	options ...Option,
) (*DASer, error)

NewDASer creates a new DASer.
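For illustration, a construction sketch; the six dependencies are assumed to be wired up elsewhere (for example by the node's service constructor), and the option values are purely illustrative:

```
	// Construction sketch: the dependencies match the NewDASer signature above.
	daser, err := das.NewDASer(
		avail,      // share.Availability (full or light)
		hsub,       // libhead.Subscriber[*header.ExtendedHeader]
		getter,     // libhead.Getter[*header.ExtendedHeader]
		dstore,     // datastore.Datastore used for checkpoint persistence
		bcast,      // fraud.Broadcaster[*header.ExtendedHeader]
		shrexBcast, // shrexsub.BroadcastFn
		das.WithSamplingRange(100),   // headers per sampling job
		das.WithConcurrencyLimit(16), // parallel sampling workers
	)
	if err != nil {
		return err
	}
```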

func (*DASer) InitMetrics added in v0.4.0

func (d *DASer) InitMetrics() error

func (*DASer) SamplingStats added in v0.3.1

func (d *DASer) SamplingStats(ctx context.Context) (SamplingStats, error)

SamplingStats returns the current statistics over the DA sampling process.

func (*DASer) Start

func (d *DASer) Start(ctx context.Context) error

Start initiates subscription for new ExtendedHeaders and spawns a sampling routine.

func (*DASer) Stop

func (d *DASer) Stop(ctx context.Context) error

Stop stops sampling.

func (*DASer) WaitCatchUp added in v0.5.0

func (d *DASer) WaitCatchUp(ctx context.Context) error

WaitCatchUp waits for the DASer to indicate that catch-up is done.

type Option added in v0.5.0

type Option func(*DASer)

Option is a functional option applied to the DASer instance to configure its DASing parameters (the Parameters struct).
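Because an Option is just a function over *DASer, options compose naturally and can be collected before construction. A sketch (the `archival` flag and the concrete values are hypothetical; dependencies and standard-library imports as in the NewDASer example above):

```
	// Collect options conditionally, then expand them into NewDASer's
	// variadic options argument.
	opts := []das.Option{
		das.WithSampleTimeout(2 * time.Minute),
	}
	if archival {
		// Sample the chain starting from height 1.
		opts = append(opts, das.WithSampleFrom(1))
	}
	daser, err := das.NewDASer(avail, hsub, getter, dstore, bcast, shrexBcast, opts...)
```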

func WithBackgroundStoreInterval added in v0.5.0

func WithBackgroundStoreInterval(backgroundStoreInterval time.Duration) Option

WithBackgroundStoreInterval is a functional option to configure the DASer's `BackgroundStoreInterval` parameter. Refer to the WithSamplingRange documentation for a usage example.

func WithConcurrencyLimit added in v0.5.0

func WithConcurrencyLimit(concurrencyLimit int) Option

WithConcurrencyLimit is a functional option to configure the DASer's `ConcurrencyLimit` parameter. Refer to the WithSamplingRange documentation for a usage example.

func WithSampleFrom added in v0.5.0

func WithSampleFrom(sampleFrom uint64) Option

WithSampleFrom is a functional option to configure the DASer's `SampleFrom` parameter. Refer to the WithSamplingRange documentation for a usage example.

func WithSampleTimeout added in v0.6.3

func WithSampleTimeout(sampleTimeout time.Duration) Option

WithSampleTimeout is a functional option to configure the DASer's `SampleTimeout` parameter. Refer to the WithSamplingRange documentation for a usage example.

func WithSamplingRange added in v0.5.0

func WithSamplingRange(samplingRange uint64) Option

WithSamplingRange is a functional option to configure the DASer's `SamplingRange` parameter.

Usage:
```
	WithSamplingRange(10)(daser)
```

or

```
	option := WithSamplingRange(10)
	// shenanigans to create daser
	option(daser)
```

func WithSamplingWindow added in v0.12.2

func WithSamplingWindow(samplingWindow time.Duration) Option

WithSamplingWindow is a functional option to configure the DASer's `SamplingWindow` parameter.

type Parameters added in v0.5.0

type Parameters struct {
	// SamplingRange is the maximum number of headers processed in one job.
	SamplingRange uint64

	// ConcurrencyLimit defines the maximum number of sampling workers running in parallel.
	ConcurrencyLimit int

	// BackgroundStoreInterval is the period at which the background checkpointStore performs a
	// checkpoint backup.
	BackgroundStoreInterval time.Duration

	// SampleFrom is the height sampling will start from if no previous checkpoint was saved.
	SampleFrom uint64

	// SampleTimeout is the maximum amount of time sampling a single block may take before it is
	// canceled. A high ConcurrencyLimit may increase sampling time because node resources are
	// divided between parallel workers, so SampleTimeout should be adjusted proportionally to
	// ConcurrencyLimit.
	SampleTimeout time.Duration

	// SamplingWindow determines the time window that headers must fall into
	// in order to be sampled. If set to 0, the sampling window includes
	// all headers.
	SamplingWindow time.Duration
}

Parameters is the set of parameters that must be configured for the DASer.

func DefaultParameters added in v0.5.0

func DefaultParameters() Parameters

DefaultParameters returns the default configuration values for the DASer parameters.

func (*Parameters) Validate added in v0.5.0

func (p *Parameters) Validate() error

Validate validates the values in Parameters.

All parameters must be positive and non-zero, except:
	BackgroundStoreInterval = 0 disables the background storer,
	PriorityQueueSize = 0 disables prioritization of recently produced blocks for sampling.
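For illustration, a small sketch combining DefaultParameters, Validate, and ErrInvalidOption (the zeroed SampleFrom exists only to trigger the error; it is assumed that Validate wraps ErrInvalidOption so errors.Is matches):

```
	// Start from the defaults, tweak a field, then validate before use.
	params := das.DefaultParameters()
	params.SampleFrom = 0 // invalid: SampleFrom must be positive and non-zero

	if err := params.Validate(); err != nil {
		if errors.Is(err, das.ErrInvalidOption) {
			fmt.Println("invalid DAS parameters:", err)
		}
	}
```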

type SamplingStats added in v0.3.1

type SamplingStats struct {
	// SampledChainHead is the height before which all headers were successfully sampled.
	SampledChainHead uint64 `json:"head_of_sampled_chain"`
	// CatchupHead is the height before which all headers were submitted to sampling workers.
	// They may already be sampled, have failed, or still be in progress; for in-progress items,
	// check the Workers stat.
	CatchupHead uint64 `json:"head_of_catchup"`
	// NetworkHead is the height of the most recent header in the network.
	NetworkHead uint64 `json:"network_head_height"`
	// Failed contains the heights of all skipped headers with their corresponding try counts.
	Failed map[uint64]int `json:"failed,omitempty"`
	// Workers contains stats for each currently running worker.
	Workers []WorkerStats `json:"workers,omitempty"`
	// Concurrency is the number of currently running parallel workers.
	Concurrency int `json:"concurrency"`
	// CatchUpDone indicates whether all known headers have been sampled.
	CatchUpDone bool `json:"catch_up_done"`
	// IsRunning tracks whether the DASer service is running.
	IsRunning bool `json:"is_running"`
}

SamplingStats collects information about the DASer process.
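For illustration, a monitoring sketch that polls a running DASer (`d`) and prints its progress; `ctx` is assumed to be supplied by the caller:

```
	// Fetch a snapshot of the current sampling progress.
	stats, err := d.SamplingStats(ctx)
	if err != nil {
		return err
	}
	fmt.Printf("sampled chain head: %d, catchup head: %d, network head: %d, done: %v\n",
		stats.SampledChainHead, stats.CatchupHead, stats.NetworkHead, stats.CatchUpDone)

	// Per-worker progress for jobs that are still in flight.
	for _, w := range stats.Workers {
		fmt.Printf("worker (%v): sampling %d..%d, currently at %d\n", w.JobType, w.From, w.To, w.Curr)
	}
```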

type WorkerStats added in v0.3.1

type WorkerStats struct {
	// JobType identifies the kind of sampling job the worker is running.
	JobType jobType `json:"job_type"`
	// Curr is the header height the worker is currently sampling.
	Curr uint64 `json:"current"`
	// From and To bound the header range assigned to the worker.
	From uint64 `json:"from"`
	To   uint64 `json:"to"`

	// ErrMsg contains the error message if the worker encountered one.
	ErrMsg string `json:"error,omitempty"`
}
