scrape

package
v0.3.1
Warning

This package is not in the latest version of its module.

Published: Jul 5, 2022 License: Apache-2.0 Imports: 20 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func StatisticSeries added in v0.0.5

func StatisticSeries(rows []parser.Row, rc []*relabel.Config, result *StatisticsSeriesResult)

StatisticSeries computes series statistics from raw metrics data, applying the given relabel configs, and writes them to result

Types

type JobInfo

type JobInfo struct {
	// Config is the original scrape config from the config file
	Config *config.ScrapeConfig
	// Cli is the http.Client used for scraping.
	// All scraping requests will be proxied to the env SCRAPE_PROXY if it is not empty.
	Cli *http.Client
	// contains filtered or unexported fields
}

JobInfo contains the HTTP client for scraping targets and the original scrape config

type Manager

type Manager struct {
	// contains filtered or unexported fields
}

Manager manages all scrape jobs

func New

func New(keeAliveDisable bool, lg logrus.FieldLogger) *Manager

New creates a Manager with its Cli configured accordingly (keeAliveDisable controls whether HTTP keep-alive is disabled for the scraping clients)

func (*Manager) ApplyConfig

func (s *Manager) ApplyConfig(cfg *prom.ConfigInfo) error

ApplyConfig updates the Manager from the given config

func (*Manager) GetJob

func (s *Manager) GetJob(job string) *JobInfo

GetJob searches for a job by name; nil is returned if the job does not exist
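GetJob's nil-on-missing contract is the idiomatic Go map-of-pointers lookup. A minimal stdlib-only sketch of such a registry (the jobRegistry and jobInfo types here are hypothetical stand-ins, not the package's internals):

```go
package main

import "fmt"

// jobInfo is a placeholder for the package's JobInfo type.
type jobInfo struct {
	Name string
}

// jobRegistry is a hypothetical stand-in for Manager's internal job map.
type jobRegistry struct {
	jobs map[string]*jobInfo
}

// getJob returns the job with the given name, or nil if it does not exist,
// mirroring the documented Manager.GetJob contract: a missing map key
// yields the zero value, which is nil for a pointer type.
func (r *jobRegistry) getJob(name string) *jobInfo {
	return r.jobs[name]
}

func main() {
	r := &jobRegistry{jobs: map[string]*jobInfo{
		"node": {Name: "node"},
	}}
	fmt.Println(r.getJob("node") != nil) // true
	fmt.Println(r.getJob("missing"))     // <nil>
}
```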

type MetricSamplesInfo added in v0.3.0

type MetricSamplesInfo struct {
	// Total is the total number of samples that appeared in this scrape
	Total float64 `json:"total"`
	// Scraped is the number of samples remaining after relabeling
	Scraped float64 `json:"scraped"`
}

MetricSamplesInfo holds sample statistics for one metric

type Scraper added in v0.2.0

type Scraper struct {

	// HTTPResponse save the http response when RequestTo is called
	HTTPResponse *http.Response
	// contains filtered or unexported fields
}

Scraper performs one scrape; RequestTo must be called before ParseResponse

func NewScraper added in v0.2.0

func NewScraper(job *JobInfo, url string, log logrus.FieldLogger) *Scraper

NewScraper creates a new Scraper

func (*Scraper) ParseResponse added in v0.2.0

func (s *Scraper) ParseResponse(do func(rows []parser.Row) error) error

ParseResponse parses the scraped metrics; RequestTo must be called before ParseResponse

func (*Scraper) RequestTo added in v0.2.0

func (s *Scraper) RequestTo() error

RequestTo performs the HTTP request to the target; the response is saved to s.HTTPResponse. ParseResponse must be called if RequestTo returns a nil error

func (*Scraper) WithRawWriter added in v0.2.0

func (s *Scraper) WithRawWriter(w ...io.Writer)

WithRawWriter adds raw writers; data will be copied to the writers while ParseResponse is processing, and gzipped data is decoded before being written

type StatisticsSeriesResult added in v0.3.0

type StatisticsSeriesResult struct {

	// ScrapedTotal is the total number of samples remaining after relabeling
	ScrapedTotal float64 `json:"scrapedTotal"`
	// Total is the total number of samples that appeared in this scrape
	Total float64 `json:"total"`
	// MetricsTotal holds per-metric sample count info
	MetricsTotal map[string]*MetricSamplesInfo `json:"metricsTotal"`
	// contains filtered or unexported fields
}

StatisticsSeriesResult holds the sample counts for one scrape

func NewStatisticsSeriesResult added in v0.3.0

func NewStatisticsSeriesResult() *StatisticsSeriesResult

NewStatisticsSeriesResult returns an empty StatisticsSeriesResult
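The Total/Scraped split in StatisticsSeriesResult and MetricSamplesInfo can be pictured as a per-metric tally: every sample counts toward Total, and only samples surviving relabeling count toward Scraped. A stdlib-only illustration (the sampleStats type, tally helper, and keep predicate standing in for the relabel step are all hypothetical):

```go
package main

import (
	"fmt"
	"strings"
)

// sampleStats mirrors the shape of MetricSamplesInfo.
type sampleStats struct {
	Total   float64 // all samples seen for this metric
	Scraped float64 // samples kept after relabeling
}

// tally counts samples per metric name; keep plays the role of the
// relabel step, deciding whether a sample survives.
func tally(samples []string, keep func(string) bool) map[string]*sampleStats {
	out := map[string]*sampleStats{}
	for _, s := range samples {
		// The metric name is the part before the first space or '{'.
		name := s
		if i := strings.IndexAny(s, " {"); i >= 0 {
			name = s[:i]
		}
		st, ok := out[name]
		if !ok {
			st = &sampleStats{}
			out[name] = st
		}
		st.Total++
		if keep(s) {
			st.Scraped++
		}
	}
	return out
}

func main() {
	samples := []string{
		`up 1`,
		`http_requests_total{code="200"} 10`,
		`http_requests_total{code="500"} 2`,
	}
	// Hypothetical relabel rule: drop samples labeled code="500".
	stats := tally(samples, func(s string) bool {
		return !strings.Contains(s, `code="500"`)
	})
	fmt.Println(stats["http_requests_total"].Total)   // 2
	fmt.Println(stats["http_requests_total"].Scraped) // 1
}
```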
