package samplers

v2.0.0+incompatible
Published: Jan 9, 2018 License: MIT Imports: 13 Imported by: 0

Documentation


Constants

This section is empty.

Variables

var AggregatesLookup = map[string]Aggregate{
	"min":    AggregateMin,
	"max":    AggregateMax,
	"median": AggregateMedian,
	"avg":    AggregateAverage,
	"count":  AggregateCount,
	"sum":    AggregateSum,
	"hmean":  AggregateHarmonicMean,
}

Functions

func ValidMetric added in v1.5.2

func ValidMetric(sample UDPMetric) bool

ValidMetric takes in an SSF sample and determines if it is valid or not.

Types

type Aggregate

type Aggregate int

const (
	AggregateMin Aggregate = 1 << iota
	AggregateMax
	AggregateMedian
	AggregateAverage
	AggregateCount
	AggregateSum
	AggregateHarmonicMean
)
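Because each Aggregate value occupies its own bit, a set of aggregates can be combined with bitwise OR and tested with bitwise AND; AggregatesLookup resolves the configuration strings. A minimal self-contained sketch of that pattern (the type and constants are restated here rather than imported):

```go
package main

import "fmt"

// Aggregate mirrors the package's bit-flag type: one bit per aggregate.
type Aggregate int

const (
	AggregateMin Aggregate = 1 << iota
	AggregateMax
	AggregateMedian
	AggregateAverage
	AggregateCount
	AggregateSum
	AggregateHarmonicMean
)

// lookup mirrors AggregatesLookup, resolving config strings to flags.
var lookup = map[string]Aggregate{
	"min": AggregateMin, "max": AggregateMax, "median": AggregateMedian,
	"avg": AggregateAverage, "count": AggregateCount, "sum": AggregateSum,
	"hmean": AggregateHarmonicMean,
}

func main() {
	// OR the aggregates named in a config list into one bitmask.
	var agg Aggregate
	for _, name := range []string{"min", "max", "count"} {
		agg |= lookup[name]
	}
	// Membership is a bitwise AND against the flag.
	fmt.Println(agg&AggregateMax != 0)    // true
	fmt.Println(agg&AggregateMedian != 0) // false
}
```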

type Counter

type Counter struct {
	Name string
	Tags []string
	// contains filtered or unexported fields
}

Counter is an accumulator.

func NewCounter

func NewCounter(Name string, Tags []string) *Counter

NewCounter generates and returns a new Counter.

func (*Counter) Combine

func (c *Counter) Combine(other []byte) error

Combine merges this counter's value with another counter (marshalled as a byte slice).

func (*Counter) Export

func (c *Counter) Export() (JSONMetric, error)

Export converts a Counter into a JSONMetric which reports the rate.

func (*Counter) Flush

func (c *Counter) Flush(interval time.Duration) []InterMetric

Flush generates an InterMetric from the current state of this Counter.

func (*Counter) Sample

func (c *Counter) Sample(sample float64, sampleRate float32)

Sample adds a sample to the counter.
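A counter that receives client-sampled data must scale each sample by the inverse of its sample rate: at a rate of 0.1, each observed sample stands in for ten events. The package's internal fields are unexported, so this is an illustrative sketch of that convention, not the actual implementation:

```go
package main

import "fmt"

// counter is a minimal stand-in for the accumulator behind Counter;
// the field name is illustrative, not the package's internal one.
type counter struct {
	value float64
}

// sample adds a value scaled by the inverse of the client-side sample
// rate, so a partially sampled stream still reports its full magnitude.
func (c *counter) sample(v float64, sampleRate float32) {
	c.value += v * float64(1/sampleRate)
}

func main() {
	c := &counter{}
	c.sample(1, 0.1) // each sampled event stands in for 10
	c.sample(1, 0.1)
	fmt.Println(c.value) // 20
}
```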

type Gauge

type Gauge struct {
	Name string
	Tags []string
	// contains filtered or unexported fields
}

Gauge retains whatever the last value was.

func NewGauge

func NewGauge(Name string, Tags []string) *Gauge

NewGauge generates and returns a new Gauge.

func (*Gauge) Combine

func (g *Gauge) Combine(other []byte) error

Combine is pretty naïve for Gauges, as it just overwrites the value.

func (*Gauge) Export

func (g *Gauge) Export() (JSONMetric, error)

Export converts a Gauge into a JSONMetric.

func (*Gauge) Flush

func (g *Gauge) Flush() []InterMetric

Flush generates an InterMetric from the current state of this gauge.

func (*Gauge) Sample

func (g *Gauge) Sample(sample float64, sampleRate float32)

Sample takes on whatever value is passed in as a sample.

type Histo

type Histo struct {
	Name  string
	Tags  []string
	Value *tdigest.MergingDigest
	// these values are computed from only the samples that came through this
	// veneur instance, ignoring any histograms merged from elsewhere
	// we separate them because they're easy to aggregate on the backend without
	// loss of granularity, and having host-local information on them might be
	// useful
	LocalWeight        float64
	LocalMin           float64
	LocalMax           float64
	LocalSum           float64
	LocalReciprocalSum float64
}

Histo is a collection of values that generates max, min, count, and percentiles over time.

func NewHist

func NewHist(Name string, Tags []string) *Histo

NewHist generates a new Histo and returns it.

func (*Histo) Combine

func (h *Histo) Combine(other []byte) error

Combine merges the values of a histogram with another histogram (marshalled as a byte slice).

func (*Histo) Export

func (h *Histo) Export() (JSONMetric, error)

Export converts a Histogram into a JSONMetric.

func (*Histo) Flush

func (h *Histo) Flush(interval time.Duration, percentiles []float64, aggregates HistogramAggregates) []InterMetric

Flush generates InterMetrics for the current state of the Histo. percentiles indicates what percentiles should be exported from the histogram.

func (*Histo) Sample

func (h *Histo) Sample(sample float64, sampleRate float32)

Sample adds the supplied value to the histogram.
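Besides feeding the t-digest, Sample plausibly updates the exported Local* fields, which are cheap running statistics: a weight (inverse sample rate), min, max, sum, and a reciprocal sum that enables the harmonic-mean aggregate. A sketch of that bookkeeping, assuming weighting by inverse sample rate as with counters (the real method also inserts into Value, the MergingDigest):

```go
package main

import (
	"fmt"
	"math"
)

// histoLocal sketches the Local* fields on Histo; the real type also
// feeds every sample into a t-digest for percentile queries.
type histoLocal struct {
	weight, min, max, sum, reciprocalSum float64
}

func newHistoLocal() *histoLocal {
	// Start min/max at the identity values for Min/Max.
	return &histoLocal{min: math.Inf(1), max: math.Inf(-1)}
}

func (h *histoLocal) sample(v float64, sampleRate float32) {
	w := float64(1 / sampleRate) // weight by inverse sample rate
	h.weight += w
	h.min = math.Min(h.min, v)
	h.max = math.Max(h.max, v)
	h.sum += v * w
	h.reciprocalSum += (1 / v) * w // feeds the harmonic-mean aggregate
}

func main() {
	h := newHistoLocal()
	for _, v := range []float64{2, 4, 8} {
		h.sample(v, 1)
	}
	fmt.Println(h.min, h.max, h.sum) // 2 8 14
	// Harmonic mean: total weight over the sum of reciprocals.
	fmt.Println(h.weight / h.reciprocalSum)
}
```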

type HistogramAggregates

type HistogramAggregates struct {
	Value Aggregate
	Count int
}

type InterMetric added in v1.7.0

type InterMetric struct {
	Name      string
	Timestamp int64
	Value     float64
	Tags      []string
	Type      MetricType

	// Sinks, if non-nil, indicates which metric sinks a metric
	// should be inserted into. If nil, that means the metric is
	// meant to go to every sink.
	Sinks RouteInformation
}

InterMetric represents a metric that has been completed and is ready for flushing by sinks.

type InvalidMetrics added in v1.7.0

type InvalidMetrics interface {
	error

	// Samples returns any samples that couldn't be parsed or validated.
	Samples() []*ssf.SSFSample
}

InvalidMetrics is an error type returned if any metric could not be parsed.

type JSONMetric

type JSONMetric struct {
	MetricKey
	Tags []string `json:"tags"`
	// the Value is an internal representation of the metric's contents, eg a
	// gob-encoded histogram or hyperloglog.
	Value []byte `json:"value"`
}

JSONMetric is used to represent a metric that can be remarshaled with its internal state intact. It is used to send metrics from one Veneur to another.

type MetricKey

type MetricKey struct {
	Name       string `json:"name"`
	Type       string `json:"type"`
	JoinedTags string `json:"tagstring"` // tags in deterministic order, joined with commas
}

MetricKey is a struct used to key the metrics into the worker's map. All fields must be comparable types.
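Since every field of MetricKey is comparable, the struct itself can be used directly as a Go map key, and the deterministic, comma-joined JoinedTags field is what makes two differently ordered tag lists collapse into the same key. A sketch of that construction, with illustrative names:

```go
package main

import (
	"fmt"
	"sort"
	"strings"
)

// metricKey mirrors MetricKey: all fields are comparable, so the
// struct can key a map without any hashing of its own.
type metricKey struct {
	Name, Type, JoinedTags string
}

// newKey sorts the tags so the joined string is deterministic
// regardless of the order a client sent them in.
func newKey(name, typ string, tags []string) metricKey {
	sorted := append([]string(nil), tags...)
	sort.Strings(sorted)
	return metricKey{Name: name, Type: typ, JoinedTags: strings.Join(sorted, ",")}
}

func main() {
	counts := map[metricKey]int{}
	counts[newKey("http.requests", "counter", []string{"region:us", "env:prod"})]++
	counts[newKey("http.requests", "counter", []string{"env:prod", "region:us"})]++
	fmt.Println(len(counts)) // 1: tag order does not split the key
}
```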

func (*MetricKey) String added in v1.3.1

func (m *MetricKey) String() string

String returns a string representation of this MetricKey.

type MetricScope

type MetricScope int

const (
	MixedScope MetricScope = iota
	LocalOnly
	GlobalOnly
)

type MetricType added in v1.7.0

type MetricType int

MetricType defines what kind of metric this is, so that we or our upstream sinks can do the right thing with it.

const (
	// CounterMetric is a counter
	CounterMetric MetricType = iota
	// GaugeMetric is a gauge
	GaugeMetric
)

func (MetricType) String added in v1.7.0

func (i MetricType) String() string

type RouteInformation

type RouteInformation map[string]struct{}

RouteInformation is a key-only map indicating sink names that are supposed to receive a metric. A nil RouteInformation value corresponds to the "every sink" value; an entry in a non-nil RouteInformation means that the key should receive the metric.

func (RouteInformation) RouteTo

func (ri RouteInformation) RouteTo(name string) bool

RouteTo returns true if the named sink should receive a metric according to the route table. A nil route table causes any sink to be eligible for the metric.
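The documented semantics are easy to restate as code: a nil table means every sink is eligible, and a non-nil table admits only its members. A self-contained sketch (the type and method are restated here with lowercase names rather than imported):

```go
package main

import "fmt"

// routeInformation mirrors RouteInformation: a key-only set of sink names.
type routeInformation map[string]struct{}

// routeTo reproduces the documented semantics: a nil table routes a
// metric to every sink; a non-nil table routes only to its members.
func (ri routeInformation) routeTo(name string) bool {
	if ri == nil {
		return true
	}
	_, ok := ri[name]
	return ok
}

func main() {
	var everywhere routeInformation // nil: every sink is eligible
	only := routeInformation{"datadog": {}}
	fmt.Println(everywhere.routeTo("signalfx")) // true
	fmt.Println(only.routeTo("signalfx"))       // false
	fmt.Println(only.routeTo("datadog"))        // true
}
```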

type Set

type Set struct {
	Name string
	Tags []string
	Hll  *hyperloglog.Sketch
}

Set is a list of unique values seen.

func NewSet

func NewSet(Name string, Tags []string) *Set

NewSet generates a new Set and returns it.

func (*Set) Combine

func (s *Set) Combine(other []byte) error

Combine merges the values seen with another set (marshalled as a byte slice).

func (*Set) Export

func (s *Set) Export() (JSONMetric, error)

Export converts a Set into a JSONMetric which reports the Tags in the set.

func (*Set) Flush

func (s *Set) Flush() []InterMetric

Flush generates an InterMetric for the state of this Set.

func (*Set) Sample

func (s *Set) Sample(sample string, sampleRate float32)

Sample checks whether the supplied value has already been seen. If not, it adds the value to the set.
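The real Set uses a HyperLogLog sketch (the exported Hll field) to approximate the count of unique values in constant memory. An exact, map-based stand-in shows the same semantics, minus the approximation:

```go
package main

import "fmt"

// exactSet illustrates Set's semantics with an exact map; the real
// implementation uses a HyperLogLog sketch to estimate cardinality
// in constant memory instead of storing every value.
type exactSet struct {
	seen map[string]struct{}
}

func (s *exactSet) sample(v string) {
	if s.seen == nil {
		s.seen = map[string]struct{}{}
	}
	s.seen[v] = struct{}{} // duplicates don't grow the set
}

// flushValue returns the number of unique values seen, the quantity
// a Set reports at flush time.
func (s *exactSet) flushValue() int { return len(s.seen) }

func main() {
	s := &exactSet{}
	for _, v := range []string{"alice", "bob", "alice"} {
		s.sample(v)
	}
	fmt.Println(s.flushValue()) // 2 unique values
}
```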

type SplitBytes

type SplitBytes struct {
	// contains filtered or unexported fields
}

SplitBytes iterates over a byte buffer, returning chunks split by a given delimiter byte. It does not perform any allocations, and does not modify the buffer it is given. It is not safe for use by concurrent goroutines.

sb := NewSplitBytes(buf, '\n')
for sb.Next() {
    fmt.Printf("%q\n", sb.Chunk())
}

The sequence of chunks returned by SplitBytes is equivalent to calling bytes.Split, except without allocating an intermediate slice.
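That equivalence can be checked directly. Below is a minimal sketch of the iterator (the real type's fields are unexported, so this is a plausible reconstruction) compared against bytes.Split; each Next call reslices the buffer instead of allocating:

```go
package main

import (
	"bytes"
	"fmt"
)

// splitBytes is a minimal sketch of the SplitBytes iterator: it walks
// the buffer by reslicing, allocating nothing, unlike bytes.Split.
type splitBytes struct {
	buf   []byte
	delim byte
	chunk []byte
	done  bool
}

func newSplitBytes(buf []byte, delim byte) *splitBytes {
	return &splitBytes{buf: buf, delim: delim}
}

// Next advances to the next chunk, returning false once the buffer
// (including a final chunk after the last delimiter) is exhausted.
func (sb *splitBytes) Next() bool {
	if sb.done {
		return false
	}
	i := bytes.IndexByte(sb.buf, sb.delim)
	if i < 0 {
		sb.chunk, sb.buf, sb.done = sb.buf, nil, true
		return true
	}
	sb.chunk, sb.buf = sb.buf[:i], sb.buf[i+1:]
	return true
}

func (sb *splitBytes) Chunk() []byte { return sb.chunk }

func main() {
	buf := []byte("a\nbb\n\nccc")
	var got [][]byte
	sb := newSplitBytes(buf, '\n')
	for sb.Next() {
		got = append(got, sb.Chunk())
	}
	// Chunks match bytes.Split: delimiter removed, empty chunks kept.
	for i, want := range bytes.Split(buf, []byte("\n")) {
		fmt.Println(i, string(got[i]), bytes.Equal(got[i], want))
	}
}
```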

func NewSplitBytes

func NewSplitBytes(buf []byte, delim byte) *SplitBytes

NewSplitBytes initializes a SplitBytes struct with the provided buffer and delimiter.

func (*SplitBytes) Chunk

func (sb *SplitBytes) Chunk() []byte

Chunk returns the current chunk.

func (*SplitBytes) Next

func (sb *SplitBytes) Next() bool

Next advances SplitBytes to the next chunk, returning true if a new chunk actually exists and false otherwise.

type UDPEvent

type UDPEvent struct {
	Title       string   `json:"msg_title"`
	Text        string   `json:"msg_text"`
	Timestamp   int64    `json:"timestamp,omitempty"` // represented as a unix epoch
	Hostname    string   `json:"host,omitempty"`
	Aggregation string   `json:"aggregation_key,omitempty"`
	Priority    string   `json:"priority,omitempty"`
	Source      string   `json:"source_type_name,omitempty"`
	AlertLevel  string   `json:"alert_type,omitempty"`
	Tags        []string `json:"tags,omitempty"`
}

UDPEvent represents the structure of Datadog's undocumented /intake endpoint.

func ParseEvent

func ParseEvent(packet []byte) (*UDPEvent, error)

ParseEvent parses a packet that represents a UDPEvent.

type UDPMetric

type UDPMetric struct {
	MetricKey
	Digest     uint32
	Value      interface{}
	SampleRate float32
	Tags       []string
	Scope      MetricScope
}

UDPMetric is a representation of the sample provided by a client. The tag list should be deterministically ordered.

func ConvertIndicatorMetrics

func ConvertIndicatorMetrics(span *ssf.SSFSpan, timerName string) (metrics []UDPMetric, err error)

ConvertIndicatorMetrics takes a span that may be an "indicator" span and returns metrics that can be determined from that span. Currently, it converts the span to a timer metric for the duration of the span.

func ConvertMetrics added in v1.8.0

func ConvertMetrics(m *ssf.SSFSpan) ([]UDPMetric, error)

ConvertMetrics examines an SSF message and parses and returns a new slice containing any metrics contained in the message. If any parse error occurs while processing any of the metrics, ConvertMetrics collects them into the error type InvalidMetrics and returns this error alongside any valid metrics that could be parsed.

func ParseMetric

func ParseMetric(packet []byte) (*UDPMetric, error)

ParseMetric converts the incoming packet from the Datadog DogStatsD datagram format into a Metric. http://docs.datadoghq.com/guides/dogstatsd/#datagram-format
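The DogStatsD datagram format is "name:value|type" optionally followed by "|@rate" and "|#tag1,tag2" sections. A hedged sketch of that parse, with an illustrative result type (the real ParseMetric returns a *UDPMetric with a MetricKey, digest, and scope, and is more thorough about validation):

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// parsed is an illustrative result type, not the package's UDPMetric.
type parsed struct {
	Name       string
	Value      float64
	Type       string
	SampleRate float32
	Tags       []string
}

// parseDogStatsD sketches the datagram format
// "name:value|type|@rate|#tag1,tag2"; error handling is minimal.
func parseDogStatsD(packet string) (*parsed, error) {
	parts := strings.Split(packet, "|")
	nameValue := strings.SplitN(parts[0], ":", 2)
	if len(nameValue) != 2 || len(parts) < 2 {
		return nil, fmt.Errorf("invalid packet: %q", packet)
	}
	v, err := strconv.ParseFloat(nameValue[1], 64)
	if err != nil {
		return nil, err
	}
	p := &parsed{Name: nameValue[0], Value: v, Type: parts[1], SampleRate: 1}
	for _, section := range parts[2:] {
		switch {
		case strings.HasPrefix(section, "@"):
			rate, err := strconv.ParseFloat(section[1:], 32)
			if err != nil {
				return nil, err
			}
			p.SampleRate = float32(rate)
		case strings.HasPrefix(section, "#"):
			p.Tags = strings.Split(section[1:], ",")
		}
	}
	return p, nil
}

func main() {
	p, err := parseDogStatsD("page.views:1|c|@0.5|#env:prod,region:us")
	if err != nil {
		panic(err)
	}
	fmt.Println(p.Name, p.Value, p.Type, p.SampleRate, p.Tags)
}
```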

func ParseMetricSSF added in v1.5.0

func ParseMetricSSF(metric *ssf.SSFSample) (UDPMetric, error)

ParseMetricSSF converts an incoming SSF packet to a Metric.

type UDPServiceCheck

type UDPServiceCheck struct {
	Name      string   `json:"check"`
	Status    int      `json:"status"`
	Hostname  string   `json:"host_name"`
	Timestamp int64    `json:"timestamp,omitempty"` // represented as a unix epoch
	Tags      []string `json:"tags,omitempty"`
	Message   string   `json:"message,omitempty"`
}

UDPServiceCheck is a representation of a service check.

func ParseServiceCheck

func ParseServiceCheck(packet []byte) (*UDPServiceCheck, error)

ParseServiceCheck parses a packet that represents a UDPServiceCheck.
