kafka

Documentation

Overview

Package kafka defines the Kafka API of the Cilium network policy interface.

+groupName=policy

Constants

const (
	ProduceKey              = 0
	FetchKey                = 1
	OffsetsKey              = 2
	MetadataKey             = 3
	LeaderAndIsr            = 4
	StopReplica             = 5
	UpdateMetadata          = 6
	OffsetCommitKey         = 8
	OffsetFetchKey          = 9
	FindCoordinatorKey      = 10
	JoinGroupKey            = 11
	CreateTopicsKey         = 19
	DeleteTopicsKey         = 20
	DeleteRecordsKey        = 21
	OffsetForLeaderEpochKey = 23
	AddPartitionsToTxnKey   = 24
	WriteTxnMarkersKey      = 27
	TxnOffsetCommitKey      = 28
	AlterReplicaLogDirsKey  = 34
	DescribeLogDirsKey      = 35
	CreatePartitionsKey     = 37
)

List of Kafka apiKeys which have a topic in their request
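For example, these constants can back a check for whether a given request type needs topic-level filtering at all. A minimal sketch; the requestHasTopic helper and the import path are assumptions, not part of this package:

package main

import (
	"fmt"

	// Assumed import path; adjust to the module you actually vendor.
	"github.com/cilium/cilium/pkg/policy/api/kafka"
)

// requestHasTopic is a hypothetical helper that reports whether the given
// apiKey belongs to the set of request types listed above that carry a
// topic in their request.
func requestHasTopic(apiKey int16) bool {
	switch apiKey {
	case kafka.ProduceKey, kafka.FetchKey, kafka.OffsetsKey, kafka.MetadataKey,
		kafka.LeaderAndIsr, kafka.StopReplica, kafka.UpdateMetadata,
		kafka.OffsetCommitKey, kafka.OffsetFetchKey, kafka.FindCoordinatorKey,
		kafka.JoinGroupKey, kafka.CreateTopicsKey, kafka.DeleteTopicsKey,
		kafka.DeleteRecordsKey, kafka.OffsetForLeaderEpochKey,
		kafka.AddPartitionsToTxnKey, kafka.WriteTxnMarkersKey,
		kafka.TxnOffsetCommitKey, kafka.AlterReplicaLogDirsKey,
		kafka.DescribeLogDirsKey, kafka.CreatePartitionsKey:
		return true
	}
	return false
}

func main() {
	fmt.Println(requestHasTopic(kafka.ProduceKey))   // true
	fmt.Println(requestHasTopic(kafka.HeartbeatKey)) // false
}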

const (
	HeartbeatKey   = 12
	LeaveGroupKey  = 13
	SyncgroupKey   = 14
	APIVersionsKey = 18
)

List of Kafka apiKeys which are not associated with any topic

const (
	ProduceRole = "produce"
	ConsumeRole = "consume"
)

List of Kafka Roles

const (
	MaxTopicLen = 255
)

MaxTopicLen is the maximum character length of a topic name. Older Kafka versions allowed topic names of up to 255 characters; in Kafka 0.10 the limit was reduced from 255 to 249. For compatibility reasons we are using 255.

Variables

var APIKeyMap = map[string]int16{
	"produce":              0,
	"fetch":                1,
	"offsets":              2,
	"metadata":             3,
	"leaderandisr":         4,
	"stopreplica":          5,
	"updatemetadata":       6,
	"controlledshutdown":   7,
	"offsetcommit":         8,
	"offsetfetch":          9,
	"findcoordinator":      10,
	"joingroup":            11,
	"heartbeat":            12,
	"leavegroup":           13,
	"syncgroup":            14,
	"describegroups":       15,
	"listgroups":           16,
	"saslhandshake":        17,
	"apiversions":          18,
	"createtopics":         19,
	"deletetopics":         20,
	"deleterecords":        21,
	"initproducerid":       22,
	"offsetforleaderepoch": 23,
	"addpartitionstotxn":   24,
	"addoffsetstotxn":      25,
	"endtxn":               26,
	"writetxnmarkers":      27,
	"txnoffsetcommit":      28,
	"describeacls":         29,
	"createacls":           30,
	"deleteacls":           31,
	"describeconfigs":      32,
	"alterconfigs":         33,
}

APIKeyMap maps all allowed Kafka API key names to their numeric key values. Reference: https://kafka.apache.org/protocol#protocol_api_keys
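For instance, the APIKey string from a rule can be resolved to its numeric value with a case-insensitive lookup. A minimal sketch, assuming the import path noted in the comment:

package main

import (
	"fmt"
	"strings"

	// Assumed import path; adjust to the module you actually vendor.
	"github.com/cilium/cilium/pkg/policy/api/kafka"
)

func main() {
	// APIKey strings in rules are case-insensitive, so normalize first.
	name := "Fetch"
	if key, ok := kafka.APIKeyMap[strings.ToLower(name)]; ok {
		fmt.Printf("%s -> apiKey %d\n", name, key) // Fetch -> apiKey 1
	} else {
		fmt.Printf("%q is not an allowed Kafka API key\n", name)
	}
}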

var ReverseAPIKeyMap = map[int16]string{
	0:  "produce",
	1:  "fetch",
	2:  "offsets",
	3:  "metadata",
	4:  "leaderandisr",
	5:  "stopreplica",
	6:  "updatemetadata",
	7:  "controlledshutdown",
	8:  "offsetcommit",
	9:  "offsetfetch",
	10: "findcoordinator",
	11: "joingroup",
	12: "heartbeat",
	13: "leavegroup",
	14: "syncgroup",
	15: "describegroups",
	16: "listgroups",
	17: "saslhandshake",
	18: "apiversions",
	19: "createtopics",
	20: "deletetopics",
	21: "deleterecords",
	22: "initproducerid",
	23: "offsetforleaderepoch",
	24: "addpartitionstotxn",
	25: "addoffsetstotxn",
	26: "endtxn",
	27: "writetxnmarkers",
	28: "txnoffsetcommit",
	29: "describeacls",
	30: "createacls",
	31: "deleteacls",
	32: "describeconfigs",
	33: "alterconfigs",
}

ReverseAPIKeyMap maps the numeric values of all allowed Kafka API keys back to their names. Reference: https://kafka.apache.org/protocol#protocol_api_keys

var TopicValidChar = regexp.MustCompile(`^[a-zA-Z0-9\._\-]+$`)

TopicValidChar is a one-time compiled regex of all allowed characters in a Kafka topic name.
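Together with MaxTopicLen, the regex can be used to pre-validate a topic name before building a rule. A minimal sketch; validateTopic is our own illustrative helper, not an API of this package:

package main

import (
	"fmt"

	// Assumed import path; adjust to the module you actually vendor.
	"github.com/cilium/cilium/pkg/policy/api/kafka"
)

// validateTopic mirrors the checks a Kafka rule sanitizer is expected to
// perform on a topic name: length limit and allowed character set.
func validateTopic(topic string) error {
	if len(topic) > kafka.MaxTopicLen {
		return fmt.Errorf("topic %q exceeds maximum length %d", topic, kafka.MaxTopicLen)
	}
	if !kafka.TopicValidChar.MatchString(topic) {
		return fmt.Errorf("topic %q contains characters outside a-z, A-Z, 0-9, '-', '.', '_'", topic)
	}
	return nil
}

func main() {
	fmt.Println(validateTopic("orders.v1"))      // <nil>
	fmt.Println(validateTopic("orders/invalid")) // error: invalid characters
}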

Functions

func ApiKeyToString

func ApiKeyToString(apiKey int16) string
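ApiKeyToString converts a numeric apiKey back to its lower-case name, presumably backed by ReverseAPIKeyMap. A short usage sketch; the import path is assumed and the outputs are expectations, not asserted by this page:

package main

import (
	"fmt"

	// Assumed import path; adjust to the module you actually vendor.
	"github.com/cilium/cilium/pkg/policy/api/kafka"
)

func main() {
	fmt.Println(kafka.ApiKeyToString(kafka.FetchKey))    // expected: "fetch"
	fmt.Println(kafka.ApiKeyToString(kafka.MetadataKey)) // expected: "metadata"
}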

Types

type PortRule

type PortRule struct {
	// Role is a case-insensitive string and describes a group of API keys
	// necessary to perform certain higher-level Kafka operations such as "produce"
	// or "consume". A Role automatically expands into all APIKeys required
	// to perform the specified higher-level operation.
	//
	// The following values are supported:
	//  - "produce": Allow producing to the topics specified in the rule
	//  - "consume": Allow consuming from the topics specified in the rule
	//
	// This field is incompatible with the APIKey field, i.e. APIKey and Role
	// cannot both be specified in the same rule.
	//
	// If omitted or empty, and if APIKey is not specified, then all keys are
	// allowed.
	//
	// +kubebuilder:validation:Enum=produce;consume
	// +kubebuilder:validation:Optional
	Role string `json:"role,omitempty"`

	// APIKey is a case-insensitive string matched against the key of a
	// request, e.g. "produce", "fetch", "createtopic", "deletetopic", et al
	// Reference: https://kafka.apache.org/protocol#protocol_api_keys
	//
	// If omitted or empty, and if Role is not specified, then all keys are allowed.
	//
	// +kubebuilder:validation:Optional
	APIKey string `json:"apiKey,omitempty"`

	// APIVersion is the version matched against the api version of the
	// Kafka message. If set, it has to be a string representing a positive
	// integer.
	//
	// If omitted or empty, all versions are allowed.
	//
	// +kubebuilder:validation:Optional
	APIVersion string `json:"apiVersion,omitempty"`

	// ClientID is the client identifier as provided in the request.
	//
	// From Kafka protocol documentation:
	// This is a user supplied identifier for the client application. The
	// user can use any identifier they like and it will be used when
	// logging errors, monitoring aggregates, etc. For example, one might
	// want to monitor not just the requests per second overall, but the
	// number coming from each client application (each of which could
	// reside on multiple servers). This id acts as a logical grouping
	// across all requests from a particular client.
	//
	// If omitted or empty, all client identifiers are allowed.
	//
	// +kubebuilder:validation:Optional
	ClientID string `json:"clientID,omitempty"`

	// Topic is the topic name contained in the message. If a Kafka request
	// contains multiple topics, then all topics must be allowed or the
	// message will be rejected.
	//
	// This constraint is ignored if the matched request message type
	// doesn't contain any topic. The maximum length of Topic is 249
	// characters per the current Kafka spec, and the allowed characters
	// are a-z, A-Z, 0-9, -, . and _.
	//
	// Older Kafka versions allowed topic names of up to 255 characters;
	// in Kafka 0.10 the limit was reduced from 255 to 249. For
	// compatibility reasons we are using 255.
	//
	// If omitted or empty, all topics are allowed.
	//
	// +kubebuilder:validation:MaxLength=255
	// +kubebuilder:validation:Optional
	Topic string `json:"topic,omitempty"`
}

PortRule is a list of Kafka protocol constraints. All fields are optional; if all fields are empty or missing, the rule matches all Kafka messages.
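For illustration, two rules built from these fields: one using a Role, one pinning a specific APIKey, APIVersion, and ClientID. A minimal sketch; the topic and client names are made up, and the import path is the same assumption as in the earlier examples:

package main

import (
	"fmt"

	// Assumed import path; adjust to the module you actually vendor.
	"github.com/cilium/cilium/pkg/policy/api/kafka"
)

func main() {
	// Allow consuming from the "orders" topic.
	consume := kafka.PortRule{
		Role:  kafka.ConsumeRole,
		Topic: "orders",
	}

	// Allow only Produce requests at API version 2 from one client application.
	produce := kafka.PortRule{
		APIKey:     "produce",
		APIVersion: "2",
		ClientID:   "billing-service",
		Topic:      "invoices",
	}

	fmt.Println(consume, produce)
}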

func (*PortRule) DeepEqual

func (in *PortRule) DeepEqual(other *PortRule) bool

DeepEqual is an autogenerated deepequal function, deeply comparing the receiver with other. in must be non-nil.

func (*PortRule) Exists

func (k *PortRule) Exists(rules []PortRule) bool

Exists returns true if the Kafka rule already exists in the list of rules
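A typical use is deduplicating rules before appending. A minimal sketch, assuming the same import path:

package main

import (
	"fmt"

	// Assumed import path; adjust to the module you actually vendor.
	"github.com/cilium/cilium/pkg/policy/api/kafka"
)

func main() {
	rules := []kafka.PortRule{
		{APIKey: "produce", Topic: "orders"},
	}
	candidate := kafka.PortRule{APIKey: "produce", Topic: "orders"}

	// Only append the rule if an identical one is not already present.
	if !candidate.Exists(rules) {
		rules = append(rules, candidate)
	}
	fmt.Println(len(rules)) // expected: 1
}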

func (*PortRule) GetAPIKeys

func (kr *PortRule) GetAPIKeys() []int32

GetAPIKeys returns the slice of numeric API keys for the PortRule.
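A sketch of reading the keys for a role-based rule. It assumes Sanitize must run first so the rule's internal numeric fields are populated, and it does not assert which keys a role expands to, since that is implementation-defined:

package main

import (
	"fmt"

	// Assumed import path; adjust to the module you actually vendor.
	"github.com/cilium/cilium/pkg/policy/api/kafka"
)

func main() {
	rule := kafka.PortRule{Role: kafka.ConsumeRole, Topic: "orders"}

	// Sanitize is assumed to resolve the role into its numeric API keys.
	if err := rule.Sanitize(); err != nil {
		panic(err)
	}

	for _, key := range rule.GetAPIKeys() {
		fmt.Printf("allowed apiKey %d (%s)\n", key, kafka.ApiKeyToString(int16(key)))
	}
}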

func (*PortRule) GetAPIVersion

func (kr *PortRule) GetAPIVersion() int32

GetAPIVersion returns the numeric API version for the PortRule.
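Similarly for the version: APIVersion is a string in the rule and a number from this accessor, so Sanitize is assumed to do the parsing. A minimal sketch:

package main

import (
	"fmt"

	// Assumed import path; adjust to the module you actually vendor.
	"github.com/cilium/cilium/pkg/policy/api/kafka"
)

func main() {
	rule := kafka.PortRule{APIKey: "produce", APIVersion: "3"}

	if err := rule.Sanitize(); err != nil {
		panic(err)
	}
	fmt.Println(rule.GetAPIVersion()) // expected: 3
}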

func (*PortRule) Sanitize

func (kr *PortRule) Sanitize() error

Sanitize sanitizes Kafka rules. TODO: add support for wildcard and prefix/suffix matching later on.
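Error handling follows the usual Go pattern. The field docs state that APIKey and Role are mutually exclusive, so a rule that sets both is expected (though not guaranteed by this page alone) to be rejected. A minimal sketch:

package main

import (
	"fmt"

	// Assumed import path; adjust to the module you actually vendor.
	"github.com/cilium/cilium/pkg/policy/api/kafka"
)

func main() {
	good := kafka.PortRule{APIKey: "fetch", Topic: "orders"}
	if err := good.Sanitize(); err != nil {
		fmt.Println("unexpected:", err)
	}

	// APIKey and Role cannot both be specified in the same rule, so this
	// rule is expected to fail validation.
	bad := kafka.PortRule{APIKey: "produce", Role: kafka.ProduceRole}
	if err := bad.Sanitize(); err != nil {
		fmt.Println("rejected:", err)
	}
}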
