package downloader

v0.0.0-...-ddee7fb
Published: Nov 9, 2021 License: Apache-2.0 Imports: 25 Imported by: 0

Documentation

Overview

Package downloader implements the pipeline to download file sets from an isolated server.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func CacheStats

func CacheStats(cache *cache.Cache) ([]byte, []byte, error)

CacheStats returns packed stats for cache miss/hit.
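
A minimal sketch of calling it (diskCache is a hypothetical *cache.Cache; mapping the two packed slices onto the Stats type documented below is an assumption, not stated by the package):

cold, hot, err := downloader.CacheStats(diskCache)
if err != nil {
	// handle error
}
// Assumed mapping of the packed miss/hit slices onto the Stats type below.
s := downloader.Stats{ItemsCold: cold, ItemsHot: hot}
_ = s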

Types

type Downloader

type Downloader struct {
	// contains filtered or unexported fields
}

Downloader is a high level interface to an isolatedclient.Client.

Downloader provides functionality to download full isolated trees.

func New

func New(ctx context.Context, c *isolatedclient.Client, hash isolated.HexDigest, outputDir string, options *Options) *Downloader

New returns a Downloader instance suitable for downloading one isolated tree.

ctx will be used for logging and clock.

The Client, hash and outputDir must be specified.

If options is nil, this will use defaults as described in the Options struct.
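
A minimal usage sketch, assuming an existing isolatedclient.Client (client), the root digest (hash), and a destination directory (outDir) are already in scope; passing nil uses the default Options:

d := downloader.New(ctx, client, hash, outDir, nil)
if err := d.Wait(); err != nil {
	// err aggregates the individual download failures.
}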

func (*Downloader) RootIsolated

func (d *Downloader) RootIsolated() (*isolated.Isolated, error)

RootIsolated returns the Isolated for the root hash passed to New.

func (*Downloader) Start

func (d *Downloader) Start()

Start begins downloading the isolated.

func (*Downloader) Wait

func (d *Downloader) Wait() error

Wait waits for the completion of the download, and returns either `nil` if no errors occurred during the operation, or an `errors.MultiError` otherwise.

This will Start() the Downloader, if it hasn't been started already.

Calling this multiple times is safe (and will always return the same result).
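
Since a non-nil result is an errors.MultiError, callers can inspect individual failures. A sketch, assuming errors is LUCI's common errors package (where MultiError is a slice of errors) and that the error is returned as that concrete type:

if err := d.Wait(); err != nil {
	if merr, ok := err.(errors.MultiError); ok {
		for _, e := range merr {
			log.Printf("download failure: %v", e)
		}
	}
}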

type FileStats

type FileStats struct {
	// These cover the files that the isolated file says to fetch.
	CountScheduled uint64
	CountCompleted uint64

	// These cover the bytes of the files that the isolated file describes, not
	// the bytes of the isolated files themselves.
	//
	// Note that these are potentially served from the local cache, and so you
	// could observe speeds much faster than the network speed :).
	BytesScheduled uint64
	BytesCompleted uint64
}

FileStats holds very basic statistics about the progress of a FetchIsolatedTracked operation.

func (*FileStats) StatLine

func (f *FileStats) StatLine(previous *FileStats, span time.Duration) string

StatLine calculates a simple statistics line suitable for logging.
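
A sketch of pairing StatLine with the FileStatsCallback option described below; the previous snapshot is kept so each log line covers only the elapsed interval (this pattern is illustrative, not prescribed by the package):

var prev downloader.FileStats
opts := &downloader.Options{
	FileStatsCallback: func(s downloader.FileStats, span time.Duration) {
		log.Print(s.StatLine(&prev, span))
		prev = s
	},
}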

type Options

type Options struct {
	// FileCallback allows you to set a callback function that will be called with
	// every file name and metadata which is extracted to disk by the Downloader.
	//
	// This callback should execute quickly (e.g. push to channel, append to list),
	// as it will partially block the process of the download.
	//
	// Tarball archives behave a bit differently. The callback will be called for
	// individual files in the tarball, but the 'Digest' field will be empty. The
	// Size and Mode fields will be populated, however. The callback will ALSO be
	// called for the tarfile as a whole (but the tarfile will not actually exist
	// on disk).
	FileCallback func(string, *isolated.File)

	// FileStatsCallback is a callback function that will be called at intervals
	// with relevant statistics (see MaxFileStatsInterval).
	//
	// This callback should execute quickly (e.g. push to channel, append to list,
	// etc.) as it will partially block the process of the download.  However,
	// since it's only called once every "interval" amount of time, being a bit
	// slow here (e.g. doing console IO) isn't the worst.
	//
	// To allow this callback to actuate meaningfully for small downloads, this
	// will be called more frequently at the beginning of the download, and will
	// taper off to MaxFileStatsInterval.
	FileStatsCallback func(FileStats, time.Duration)

	// MaxFileStatsInterval changes the maximum interval that the
	// FileStatsCallback will be called at.
	//
	// At the beginning of the download the interval is 100ms, but it will ramp up
	// to the provided maxInterval. If you specify a MaxFileStatsInterval
	// smaller than 100ms, there will be no ramp up, just a fixed interval at the
	// one you specify here.
	//
	// Default: 5 seconds
	MaxFileStatsInterval time.Duration

	// MaxConcurrentJobs is the number of parallel worker goroutines the
	// downloader will have.
	//
	// Default: 8
	MaxConcurrentJobs int

	// Cache is used to save/load isolated items to/from cache.
	Cache *cache.Cache
}

Options are some optional bits you can pass to New.
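
A sketch of a populated Options value (the callback body, interval, worker count, and diskCache are illustrative placeholders, not defaults):

names := make(chan string, 1024)
opts := &downloader.Options{
	FileCallback: func(name string, f *isolated.File) {
		names <- name // keep the callback fast: hand off and return
	},
	MaxFileStatsInterval: 10 * time.Second,
	MaxConcurrentJobs:    16,
	Cache:                diskCache, // hypothetical *cache.Cache
}
d := downloader.New(ctx, client, hash, outDir, opts)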

type Stats

type Stats struct {
	Duration time.Duration `json:"duration"`

	ItemsCold []byte `json:"items_cold"`
	ItemsHot  []byte `json:"items_hot"`
}

Stats holds statistics for FetchAndMap.

func FetchAndMap

func FetchAndMap(ctx context.Context, isolatedHash isolated.HexDigest, c *isolatedclient.Client, cache *cache.Cache, outDir string) (*isolated.Isolated, Stats, error)

FetchAndMap fetches an isolated tree, creates it under outDir, and returns the root isolated tree.
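
A minimal sketch of calling it (client, diskCache, hash, and outDir are hypothetical placeholders):

iso, stats, err := downloader.FetchAndMap(ctx, hash, client, diskCache, outDir)
if err != nil {
	// handle error
}
log.Printf("mapped isolated tree into %s in %s", outDir, stats.Duration)
_ = iso // root *isolated.Isolated of the fetched tree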
