delivery-metrics

Delivery metrics is a service designed to handle the delivery team's metrics. It implements a specialized Prometheus Pushgateway designed to be used from our CI jobs, as well as an API scraper for publishing metrics based on our pipelines.

It keeps track of histograms by implementing an HTTP API around Prometheus metrics.

Architecture

delivery-metrics is a Go program exposing an HTTP endpoint: the /metrics path serves standard Prometheus metrics, while the /api path provides custom handlers for adding data to those metrics. API scraping is implemented in dedicated goroutines.
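
A minimal sketch of this layout, not the actual implementation (the port follows the curl examples below, and promhttp is the standard library for exposing Prometheus metrics):

package main

import (
	"log"
	"net/http"

	"github.com/prometheus/client_golang/prometheus/promhttp"
)

func main() {
	mux := http.NewServeMux()

	// Standard Prometheus exposition on /metrics.
	mux.Handle("/metrics", promhttp.Handler())

	// Custom write handlers live under /api (placeholder shown here).
	mux.HandleFunc("/api/", func(w http.ResponseWriter, r *http.Request) {
		// ...dispatch to the metric-specific handler...
	})

	log.Fatal(http.ListenAndServe(":2112", mux))
}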

Deployment

delivery-metrics implements continuous delivery using a bridge job to trigger a tanka deployment when a change in this directory is detected, or when the BUILD_DELIVERY_METRICS variable is set to true on an OPS pipeline.

Logs

Logs for Delivery-metrics can be found at https://console.cloud.google.com/kubernetes/deployment/us-east1/ops-gitlab-gke/delivery/delivery-metrics/logs?project=gitlab-ops&pli=1.

Access control

Push Gateway

The /api path requires a token to allow write operations. The token must be provided as an HTTP header named X-Private-Token.

The software compares the user-provided token with the content of the AUTH_TOKEN environment variable.
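
A minimal sketch of that check, assuming a plain middleware wrapper (the package name, handler name, and constant-time comparison are illustrative, not the service's actual code):

package api // hypothetical package name

import (
	"crypto/subtle"
	"net/http"
	"os"
)

// requireToken rejects requests whose X-Private-Token header does not match
// the AUTH_TOKEN environment variable.
func requireToken(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		expected := os.Getenv("AUTH_TOKEN")
		provided := r.Header.Get("X-Private-Token")
		if expected == "" || subtle.ConstantTimeCompare([]byte(provided), []byte(expected)) != 1 {
			http.Error(w, "unauthorized", http.StatusUnauthorized)
			return
		}
		next.ServeHTTP(w, r)
	})
}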

API scraper

In order to perform API requests on ops.gitlab.net, delivery-metrics requires a read-only API token stored in the DELIVERY_METRICS_OPS_TOKEN environment variable.
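
As a sketch, a read against the ops.gitlab.net REST API would look roughly like this; the endpoint, project ID, and package name are placeholders, while the PRIVATE-TOKEN header and the environment variable come from the real setup:

package scraper // hypothetical package name

import (
	"fmt"
	"net/http"
	"os"
)

// fetchPipelines performs an authenticated, read-only request against the
// ops.gitlab.net API using the DELIVERY_METRICS_OPS_TOKEN token.
func fetchPipelines(projectID int) (*http.Response, error) {
	url := fmt.Sprintf("https://ops.gitlab.net/api/v4/projects/%d/pipelines", projectID)
	req, err := http.NewRequest(http.MethodGet, url, nil)
	if err != nil {
		return nil, err
	}
	req.Header.Set("PRIVATE-TOKEN", os.Getenv("DELIVERY_METRICS_OPS_TOKEN"))
	return http.DefaultClient.Do(req)
}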

Working with histograms

It is possible to add values to a histogram by making a POST request to the metric's observe endpoint. The request is form-encoded and requires two parameters:

  • value: the observed value
  • labels: a comma-separated list of label values. Label values are positional.

example

curl -X POST \
    -H "X-Private-Token: MYTEST" \
    -F value=18000 \
    -F "labels=coordinator_pipeline,success" \
    "http://127.0.0.1:2112/api/deployment_duration_seconds/observe"

Metric reset

It is possible to reset a metric by sending a DELETE request, without parameters, to the metric's endpoint.

example

curl -X DELETE \
    -H "X-Private-Token: MYTEST" \
    "http://127.0.0.1:2112/api/release_pressure"

Experiments

It is possible to add temporary experimental metrics under the experiments subsystem.

To add a new metric, create a new Go file in the package with an init function that appends the new handler to the pluggables slice, then add an acceptance test for the new metric (a sketch of such a test follows the example code below).

When the experiment is concluded, simply remove the Go file and the acceptance test.

The delivery_experiments_total counter is provided to track the number of experiments.

Example code to add a new metric (metrics/internal/experiments/foo.go):

package experiments

import (
	"gitlab.com/gitlab-org/release-tools/metrics/internal/handlers"
	"gitlab.com/gitlab-org/release-tools/metrics/internal/metrics"
	"gitlab.com/gitlab-org/release-tools/metrics/internal/metrics/labels"
)

func init() {
	fooCounter, err := metrics.NewCounterVec(
		metrics.WithName("foo_total"),
		metrics.WithSubsystem(subsystem),
		metrics.WithHelp("Number of foo"),
		metrics.WithLabel(labels.SuccessOrFailed("result")),
		metrics.WithCartesianProductLabelReset(),
	)
	if err != nil {
		panic(err)
	}

	pluggables = append(pluggables, handlers.NewCounter(fooCounter))
}
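
A hypothetical acceptance test for the example above might look like the following. It assumes a running instance on 127.0.0.1:2112 (as in the curl examples) and that the metric's fully qualified name is delivery_experiments_foo_total; the package name, file placement, and metric name should be checked against the real setup:

//go:build acceptance

package experiments_test

import (
	"io"
	"net/http"
	"strings"
	"testing"
)

// TestFooMetricRegistered checks that the experimental foo_total counter is
// exposed on the /metrics endpoint of a running delivery-metrics instance.
func TestFooMetricRegistered(t *testing.T) {
	resp, err := http.Get("http://127.0.0.1:2112/metrics")
	if err != nil {
		t.Fatalf("scraping /metrics: %v", err)
	}
	defer resp.Body.Close()

	body, err := io.ReadAll(resp.Body)
	if err != nil {
		t.Fatalf("reading /metrics: %v", err)
	}
	if !strings.Contains(string(body), "delivery_experiments_foo_total") {
		t.Errorf("expected delivery_experiments_foo_total to be exposed on /metrics")
	}
}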

Acceptance tests

Delivery metrics has acceptance tests running in CI against the Docker image compiled from the branch codebase.

It is possible to run those tests locally with ./scripts/delivery-metrics.sh build, ./scripts/delivery-metrics.sh run (from another shell), and ./scripts/delivery-metrics.sh acceptance-tests.

The above solution is designed for CI environments and runs on Docker (or nerdctl). It is also possible to run the tests locally without Docker, using the following commands from the metrics folder:

# start delivery-metrics and let it run
JOB_WEBHOOK_RELEASE_TOOLS_OPS_TOKEN="release-tools-token" JOB_WEBHOOK_DEPLOYER_OPS_TOKEN="deployer-token" AUTH_TOKEN="acceptance tests" go run .

# open another shell and run the acceptance tests
go test -tags=acceptance -v

Webhook Sources

Delivery-metrics receives job event webhooks from multiple projects for deployment pipeline metrics. These are all projects in which the deployment pipeline triggers downstream pipelines. Webhook events are received from the following projects:

Adding new secrets/tokens as environment variables to delivery-metrics

Each project's webhook events will contain a token. Delivery-metrics matches the received token against the expected token. Expected tokens are stored in Vault and pulled into K8s.

The tokens need to be added to:

Generating metrics using job event webhooks

job_webhook.go defines a handler for job event webhook requests. Requests are routed to different functions based on the project the request came from. The handler generates metrics using the metrics package (for example metrics.NewCounterVec) and updates them based on the data received from incoming requests (see examples in the file). Example MR.
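
As a rough, hypothetical sketch of that flow (the package name, struct, and token handling are illustrative; the X-Gitlab-Token header and the project_id/build_status fields come from GitLab's job event webhooks):

package webhook // hypothetical package name

import (
	"encoding/json"
	"net/http"
	"os"
)

// jobEvent holds a subset of the job event payload: project_id routes the
// request, build_status would feed the metrics.
type jobEvent struct {
	ProjectID   int    `json:"project_id"`
	BuildStatus string `json:"build_status"`
}

// handleJobWebhook verifies the webhook token, decodes the payload, and
// dispatches on the source project. Only one expected token is checked here
// for brevity; the real handler matches the token expected for each project.
func handleJobWebhook(w http.ResponseWriter, r *http.Request) {
	token := r.Header.Get("X-Gitlab-Token")
	if token == "" || token != os.Getenv("JOB_WEBHOOK_RELEASE_TOOLS_OPS_TOKEN") {
		http.Error(w, "unauthorized", http.StatusUnauthorized)
		return
	}

	var event jobEvent
	if err := json.NewDecoder(r.Body).Decode(&event); err != nil {
		http.Error(w, "bad payload", http.StatusBadRequest)
		return
	}

	switch event.ProjectID {
	// case <release-tools project ID>: update release-tools pipeline metrics
	// case <deployer project ID>: update deployer pipeline metrics
	default:
		// unknown project: ignore the event
	}
	w.WriteHeader(http.StatusOK)
}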
