package module
Version: v0.0.0-...-1dbe30f
Published: Oct 9, 2020 License: MIT Imports: 6 Imported by: 0



A generic web frontend to view, search, explore and visualize keys, headers and payloads flying through your Kafka broker. The fancy project name was derived from the words Kafka + X-Ray.


Experimental, born from a hackathon project. Mostly useful as a personal development tool, but we have also been running it in a 'test' environment for a couple of years.

Idea & history

  1. We needed a better tool to get insight into what's going on with Kafka during development or testing (maybe production too):

    • troubleshoot basic stuff (why is the data bad, what's getting written to a topic)
    • examine topics (especially when debugging Kafka Streams: why do we have 20x more messages than the source?)
    • onboard new team members quicker by letting them explore visually
    • everything else :)
  2. Standalone command line tools are never enough:

    • you have to combine them to get anything meaningful about the data,
    • you end up writing custom python/ruby/bash/whatever in 90% of real cases,
    • it's hard to explain the 'black voodoo magic' to other people.

Just take a look at how bad it can become very quickly:

kafkacat -L -b internal-test-cluster-alpha-17312321.us-east-1.elb.amazonaws.com:9092 \
  -X security.protocol=SSL \
  -X ssl.ca.location=/var/certs/kafka-test-cluster/ca.cer \
  | grep topic \
  | awk '{print $2}' \
  | sed 's/\"//g' \
  | awk '/summary.daily|summary.60m|metrics.daily|metrics.60m/' \
  | xargs -I {} /usr/local/kafka/bin/kafka-topics.sh \
      --zookeeper,, --delete --topic {}

So, wouldn't it be nice to have a tool with a modern web UI where you can explore topics, messages and all the other details during your development work?


We started the tool with the following goals in mind:

  1. Single binary (drop & run) server; can run locally or on a broker node, no complex installation.
  2. Docker friendly (runs anywhere).
  3. Broker auto-discovery, zero configuration.
  4. Slick UI with the ability to view structured payloads as JSON or Avro, plus different representations for binary payloads (hex dumps, integers, etc.).
  5. Extensible enrichments (plugins) to provide special visualization meaning for your specific data.


The main idea is to capture every single message from the Kafka broker and store it in an embeddable key-value database. The project leverages the awesome BadgerDB by https://dgraph.io/ as its persistence layer: https://github.com/dgraph-io/badger
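Storing an ordered stream in a key-value store usually comes down to key layout: if keys sort in (partition, offset) order, a plain key scan replays the topic in order. A self-contained sketch of one such layout (this is a hypothetical scheme for illustration; k-ray's actual key schema may differ):

```go
package main

import (
	"encoding/binary"
	"fmt"
	"sort"
)

// messageKey builds a byte key of the form topic | 0x00 | partition | offset,
// with fixed-width big-endian numbers so that lexicographic order of keys
// matches (partition, offset) order within a topic.
func messageKey(topic string, partition int32, offset int64) []byte {
	key := make([]byte, 0, len(topic)+1+4+8)
	key = append(key, topic...)
	key = append(key, 0) // separator between topic name and numeric part
	key = binary.BigEndian.AppendUint32(key, uint32(partition))
	key = binary.BigEndian.AppendUint64(key, uint64(offset))
	return key
}

func main() {
	keys := [][]byte{
		messageKey("orders", 0, 42),
		messageKey("orders", 0, 7),
		messageKey("orders", 1, 1),
	}
	// Sorting byte keys lexicographically recovers (partition, offset) order.
	sort.Slice(keys, func(i, j int) bool {
		return string(keys[i]) < string(keys[j])
	})
	for _, k := range keys {
		fmt.Printf("%q\n", k)
	}
}
```

A key-ordered store like BadgerDB can then serve "all messages of a topic from offset X" as a single prefix scan.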

A bird's-eye view is shown in the diagram below:


The approach opens up a number of interesting opportunities, like:

  1. time travel: search and explore messages from any time range
  2. examine what's going on on the wire with system topics or kafka-streams internal ones
  3. fix and re-stream broker messages
  4. backup / restore a development broker to any point in time
  5. and countless other possibilities :)

What does it look like?

Please check out the UI screenshots in the frontend part of the project: https://github.com/dvsekhvalnov/k-ray-ui

How to build it


The project consists of 3 different repos:

  1. Go sync library for kafka: https://github.com/dvsekhvalnov/sync4kafka
  2. Backend engine, storage and REST API (given project): https://github.com/dvsekhvalnov/k-ray
  3. Frontend SPA part living at: https://github.com/dvsekhvalnov/k-ray-ui
Build steps
  1. clone backend repo (https://github.com/dvsekhvalnov/k-ray)
  2. clone frontend repo from (https://github.com/dvsekhvalnov/k-ray-ui)
  3. install the dependencies the build complains about with go get -u
  4. run ./build.sh to produce combined backend + frontend binary
  5. adjust ./k-ray.yaml as needed
  6. start it with go run main/main.go
  7. access it on http://localhost:8080/
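Strung together, the steps above come out roughly like this (commands are taken from the list; the frontend build details live in k-ray-ui's own README):

```sh
git clone https://github.com/dvsekhvalnov/k-ray
git clone https://github.com/dvsekhvalnov/k-ray-ui
cd k-ray
go get -u ./...        # fetch whatever dependencies the build complains about
./build.sh             # produces the combined backend + frontend binary
# adjust ./k-ray.yaml as needed, then:
go run main/main.go    # serves http://localhost:8080/
```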

Maintainers wanted

As with many former hackathon projects, nobody is actively working on it full-time. If somebody from the community is interested in supporting or evolving it, please drop me a message; I would be happy if the project got a second chance.

We would also appreciate a fancy mascot for the project ;)






func Persist

func Persist(in <-chan *sarama.ConsumerMessage, engine *Engine) <-chan *sarama.ConsumerMessage
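The signature suggests Persist is a pass-through pipeline stage: it drains the inbound channel, writes each message to the engine's datastore, and re-emits it downstream. A self-contained sketch of that pattern, with a string channel and an in-memory slice standing in for sarama messages and BadgerDB:

```go
package main

import "fmt"

// persist drains in, stores each message, and re-emits it on the returned
// channel so further stages (e.g. enrichment) can keep processing the stream.
func persist(in <-chan string, store *[]string) <-chan string {
	out := make(chan string)
	go func() {
		defer close(out)
		for msg := range in {
			*store = append(*store, msg) // stand-in for a datastore write
			out <- msg
		}
	}()
	return out
}

func main() {
	in := make(chan string, 3)
	in <- "a"
	in <- "b"
	in <- "c"
	close(in)

	var stored []string
	for msg := range persist(in, &stored) {
		fmt.Println("forwarded:", msg)
	}
	fmt.Println("stored:", stored)
}
```

Closing the output channel when the input drains lets the whole pipeline shut down cleanly with a plain range loop.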


type Config

type Config struct {
	Agent *agent.Config

	// Path to working dir where database files will be stored,
	// if not exists will be auto-created on start
	DataDir string

	// HTTP port to start embedded web interface
	Port string

	// Internal message processing spool configuration
	Spool struct {

		// Spool queue size
		Size int

		// Number of concurrent workers processing spool queue
		MaxWorkers int

func NewConfig

func NewConfig() *Config
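The build steps mention a ./k-ray.yaml file; a hypothetical fragment mirroring the Config fields could look like this (the key names here are assumptions, so check the sample config shipped with the repo):

```yaml
# Hypothetical k-ray.yaml mirroring the Config struct; actual key names
# may differ -- consult the sample config in the repo.
dataDir: ./data      # DataDir: database files, auto-created on start if missing
port: "8080"         # Port: embedded web interface
spool:
  size: 1000         # Spool.Size: queue size
  maxWorkers: 4      # Spool.MaxWorkers: concurrent workers
```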

type Engine

type Engine struct {
	Db         db.Datastore
	Enrichment EnrichmentPipeline
	// contains filtered or unexported fields
}

func NewEngine

func NewEngine() *Engine

func (*Engine) Close

func (engine *Engine) Close()

func (*Engine) Start

func (engine *Engine) Start(cfg *Config) (<-chan *sarama.ConsumerMessage, error)
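Pieced together from the signatures in this package, a plausible wiring of the engine with the Persist stage might look like the sketch below. This is not runnable standalone: the import path and flow are inferred, not verified against the repo, though sarama's ConsumerMessage does expose Topic, Partition and Offset.

```go
// Sketch only: import path and wiring are assumptions.
cfg := kray.NewConfig()
cfg.DataDir = "./data"
cfg.Port = "8080"

engine := kray.NewEngine()
defer engine.Close()

msgs, err := engine.Start(cfg) // begins consuming from the broker
if err != nil {
	log.Fatal(err)
}

// Persist stores each message and forwards it for further processing.
for msg := range kray.Persist(msgs, engine) {
	log.Printf("captured %s/%d@%d", msg.Topic, msg.Partition, msg.Offset)
}
```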

type Enrichment

type Enrichment interface {
	Process(msg *db.Message) *db.Message
}

type EnrichmentPipeline

type EnrichmentPipeline struct {
	// contains filtered or unexported fields
}

func (*EnrichmentPipeline) Process

func (p *EnrichmentPipeline) Process(msg *db.Message) *db.Message

func (*EnrichmentPipeline) Register

func (p *EnrichmentPipeline) Register(e Enrichment)
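An Enrichment transforms a message, and the pipeline presumably applies registered enrichments in order. A self-contained sketch of that pattern, with a plain string standing in for *db.Message:

```go
package main

import (
	"fmt"
	"strings"
)

// enrichment mirrors the package's Enrichment interface, with a string
// standing in for *db.Message.
type enrichment interface {
	Process(msg string) string
}

type upcase struct{}

func (upcase) Process(msg string) string { return strings.ToUpper(msg) }

type tag struct{ label string }

func (t tag) Process(msg string) string { return t.label + ":" + msg }

// pipeline applies registered enrichments in registration order.
type pipeline struct{ steps []enrichment }

func (p *pipeline) Register(e enrichment) { p.steps = append(p.steps, e) }

func (p *pipeline) Process(msg string) string {
	for _, e := range p.steps {
		msg = e.Process(msg)
	}
	return msg
}

func main() {
	var p pipeline
	p.Register(upcase{})
	p.Register(tag{label: "json"})
	fmt.Println(p.Process("payload")) // upcase runs first, then tag
}
```

Because each step takes and returns the same message type, new visualizations plug in without touching the capture path.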

type LookupTable

type LookupTable struct {
	// contains filtered or unexported fields
}

