batchqueue

command v1.0.0

Published: Mar 19, 2026 License: MIT Imports: 1 Imported by: 0

README
BatchQueue

Queue it up. Process it later.

BatchQueue provides async request queuing with configurable concurrency control. Submit jobs, then retrieve results via polling or a completion webhook.
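The submit-now, process-later flow can be sketched as a bounded worker pool. Everything below (class and field names, the result payload shape) is illustrative only, not BatchQueue's actual internals:

```python
import queue
import threading
import uuid

class BatchQueueSketch:
    """Toy model of an async job queue: submit() returns immediately,
    and a fixed pool of workers drains the queue in the background."""

    def __init__(self, max_concurrent=5):
        self.jobs = queue.Queue()
        self.results = {}
        for _ in range(max_concurrent):
            threading.Thread(target=self._worker, daemon=True).start()

    def submit(self, payload):
        job_id = str(uuid.uuid4())
        self.jobs.put((job_id, payload))  # fire and forget
        return job_id                     # caller polls for the result later

    def _worker(self):
        while True:
            job_id, payload = self.jobs.get()
            # stand-in for the actual upstream LLM call
            self.results[job_id] = {"status": "done", "echo": payload}
            self.jobs.task_done()

    def status(self, job_id):
        return self.results.get(job_id, {"status": "pending"})
```

With `max_concurrent=2`, at most two jobs run at once regardless of how many are queued; the rest wait their turn, which is the essence of the `max_concurrent` setting below.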

Quickstart

```shell
export OPENAI_API_KEY=sk-...
npx @stockyard/batchqueue

# Your app:   http://localhost:5000/v1/chat/completions
# Dashboard:  http://localhost:5000/ui
```

What You Get

  • Async request queue
  • Configurable concurrency limits
  • Job status polling API
  • Webhook on completion
  • Priority levels
  • Dashboard with queue depth
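The status-polling bullet usually amounts to a bounded poll loop with a deadline. This helper is a generic sketch: the status values (`pending`/`done`/`failed`) and the idea of passing in a fetch callable are assumptions for illustration, not BatchQueue's documented API:

```python
import time

def poll_until_done(fetch_status, interval=0.5, max_wait=120.0):
    """Call fetch_status() repeatedly until the job leaves the queue.

    fetch_status: callable returning a dict such as {"status": "pending"}.
    Returns the final status dict once status is "done" or "failed",
    or raises TimeoutError after max_wait seconds.
    """
    deadline = time.monotonic() + max_wait
    while time.monotonic() < deadline:
        job = fetch_status()
        if job.get("status") in ("done", "failed"):
            return job
        time.sleep(interval)
    raise TimeoutError("job did not finish within max_wait seconds")
```

In practice `fetch_status` would wrap an HTTP GET against the job-status endpoint; a fixed `interval` can also be swapped for exponential backoff to reduce load on deep queues.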

Config

```yaml
# batchqueue.yaml
port: 5000
providers:
  openai:
    api_key: ${OPENAI_API_KEY}
queue:
  max_concurrent: 5
  max_queue_depth: 1000
  timeout: 120s
  webhook_on_complete: ""  # optional: URL notified when a job completes
```
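The `${OPENAI_API_KEY}` placeholder above implies environment-variable substitution when the config is loaded. How BatchQueue itself performs this is not specified here; a common pattern is a regex pass over the raw config text that fails loudly on unset variables:

```python
import os
import re

def expand_env(text):
    """Expand ${VAR} placeholders, as in `api_key: ${OPENAI_API_KEY}`.

    Raises KeyError for an unset variable, so a missing key fails at
    startup instead of reaching the provider as a literal "${...}" string.
    """
    return re.sub(
        r"\$\{([A-Za-z_][A-Za-z0-9_]*)\}",
        lambda m: os.environ[m.group(1)],
        text,
    )
```

Run over the file contents before YAML parsing, this keeps secrets like API keys out of the config file itself.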

Docker

```shell
docker run -p 5000:5000 -e OPENAI_API_KEY=sk-... stockyard/batchqueue
```

Part of Stockyard

BatchQueue is part of Stockyard — an open-source LLM proxy and control plane. MIT licensed.

Documentation

Overview

BatchQueue — "Fire-and-forget LLM requests with automatic retries."
