llmkit

package module
v0.2.1
Published: Jul 12, 2025 License: MIT Imports: 5 Imported by: 0

README

LLM Kit

Minimal Go library for calling LLM APIs using only the standard library - no external dependencies required.

Providers

  • Anthropic Claude - Chat completions and structured output
  • OpenAI GPT - Chat completions and structured output
  • Google Gemini - Chat completions and structured output

Structure

├── cmd/                    # Command-line interfaces
│   ├── llmkit-anthropic/   # Anthropic CLI
│   ├── llmkit-openai/      # OpenAI CLI
│   └── llmkit-google/      # Google CLI
├── anthropic/              # Anthropic (Claude) API package
│   ├── prompt.go           # API implementation
│   └── README.md           # Usage examples
├── openai/                 # OpenAI API package
│   ├── prompt.go           # API implementation
│   └── README.md           # Usage examples
├── google/                 # Google (Gemini) API package
│   ├── prompt.go           # API implementation
│   └── README.md           # Usage examples
├── docs/                   # API documentation
├── examples/               # Example JSON schemas
└── errors.go               # Structured error types

Installation

Install using Homebrew:

brew install aktagon/llmkit/llmkit

This installs the llmkit binary and all provider-specific CLI tools (llmkit-anthropic, llmkit-openai, llmkit-google).

Install CLI Tools

Install the command-line tools globally:

# Install Anthropic CLI
go install github.com/aktagon/llmkit/cmd/llmkit-anthropic@latest

# Install OpenAI CLI
go install github.com/aktagon/llmkit/cmd/llmkit-openai@latest

# Install Google CLI
go install github.com/aktagon/llmkit/cmd/llmkit-google@latest

Make sure your $GOPATH/bin is in your $PATH to use the installed binaries:

export PATH=$PATH:$(go env GOPATH)/bin

Check installation location:

# See where Go installs binaries
echo $(go env GOPATH)/bin

# List installed llmkit tools
ls -la $(go env GOPATH)/bin/llmkit-*

Use as Library

Add to your Go project:

go get github.com/aktagon/llmkit

Quick Start

Anthropic

Using installed CLI:

export ANTHROPIC_API_KEY="your-key"
llmkit-anthropic "You are helpful" "Hello Claude"

Using go run:

export ANTHROPIC_API_KEY="your-key"
go run cmd/llmkit-anthropic/main.go "You are helpful" "Hello Claude"

Structured output:

llmkit-anthropic \
  "You are an expert at structured data extraction." \
  "What's the weather like in San Francisco? I prefer Celsius." \
  "$(cat examples/openai/schemas/weather-schema.json)"

OpenAI

Using installed CLI:

export OPENAI_API_KEY="your-key"
llmkit-openai "You are helpful" "Hello GPT"

Structured output:

llmkit-openai \
  "You are an expert at structured data extraction." \
  "What's the weather like in San Francisco? I prefer Celsius." \
  "$(cat examples/openai/schemas/weather-schema.json)"

Google

Using installed CLI:

export GOOGLE_API_KEY="your-key"
llmkit-google "You are helpful" "Hello Gemini"

Structured output:

llmkit-google \
  "You are an expert at structured data extraction." \
  "What's the weather like in San Francisco? I prefer Celsius." \
  "$(cat examples/google/schemas/weather-schema.json)"

Programmatic Usage

As a Library

package main

import (
    "fmt"
    "log"
    "os"

    "github.com/aktagon/llmkit/anthropic"
    "github.com/aktagon/llmkit/openai"
    "github.com/aktagon/llmkit/google"
)

func main() {
    // Anthropic example
    anthropicKey := os.Getenv("ANTHROPIC_API_KEY")
    response, err := anthropic.Prompt(
        "You are a helpful assistant",
        "What is the capital of France?",
        "", // no schema for simple prompt
        anthropicKey,
    )
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println("Anthropic:", response)

    // OpenAI example
    openaiKey := os.Getenv("OPENAI_API_KEY")
    response, err = openai.Prompt(
        "You are a helpful assistant",
        "What is the capital of France?",
        "", // no schema for simple prompt
        openaiKey,
    )
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println("OpenAI:", response)

    // Google example
    googleKey := os.Getenv("GOOGLE_API_KEY")
    response, err = google.Prompt(
        "You are a helpful assistant",
        "What is the capital of France?",
        "", // no schema for simple prompt
        googleKey,
    )
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println("Google:", response)
}

Structured Output Example

package main

import (
    "fmt"
    "log"
    "os"

    "github.com/aktagon/llmkit/openai"
)

func main() {
    schema := `{
        "name": "weather_info",
        "description": "Weather information extraction",
        "strict": true,
        "schema": {
            "type": "object",
            "properties": {
                "location": {"type": "string"},
                "temperature": {"type": "number"},
                "unit": {"type": "string", "enum": ["C", "F"]}
            },
            "required": ["location", "temperature", "unit"],
            "additionalProperties": false
        }
    }`

    response, err := openai.Prompt(
        "You are a weather assistant.",
        "What's the weather in Tokyo? Use Celsius.",
        schema,
        os.Getenv("OPENAI_API_KEY"),
    )
    if err != nil {
        log.Fatal(err)
    }

    fmt.Println(response)
}

Features

  • Standard chat completions
  • Structured output with JSON schema validation
  • Pure Go standard library implementation
  • Command-line interfaces for each provider
  • Structured error types for better error handling
  • Programmatic API for library usage

Error Handling

The library provides structured error types:

  • APIError - Errors from LLM APIs
  • ValidationError - Input validation errors
  • RequestError - Request building/sending errors
  • SchemaError - JSON schema validation errors
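
Handle them with a type switch:
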
response, err := openai.Prompt(systemPrompt, userPrompt, schema, apiKey)
if err != nil {
    switch e := err.(type) {
    case *errors.APIError:
        fmt.Printf("API error: %s (status %d)\n", e.Message, e.StatusCode)
    case *errors.SchemaError:
        fmt.Printf("Schema validation error: %s\n", e.Message)
    case *errors.ValidationError:
        fmt.Printf("Input validation error: %s\n", e.Message)
    default:
        fmt.Printf("Unknown error: %v\n", err)
    }
    return
}

Each provider directory contains detailed examples and usage instructions.

Support

Commercial support is available. Contact christian@aktagon.com.

License

MIT

Documentation

Constants

This section is empty.

Variables

This section is empty.

Functions

func Prompt added in v0.2.1

func Prompt(opts PromptOptions) (string, error)

Prompt sends a prompt request to the specified LLM provider
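
A minimal sketch of calling it through the root package (assuming the root package name matches the module, llmkit):

package main

import (
    "fmt"
    "log"
    "os"

    "github.com/aktagon/llmkit"
)

func main() {
    // Route the request through the provider named in PromptOptions.
    response, err := llmkit.Prompt(llmkit.PromptOptions{
        Provider:     llmkit.ProviderAnthropic,
        SystemPrompt: "You are a helpful assistant",
        UserPrompt:   "What is the capital of France?",
        JSONSchema:   "", // optional; leave empty for a plain chat completion
        APIKey:       os.Getenv("ANTHROPIC_API_KEY"),
    })
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println(response)
}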

func PromptAnthropic added in v0.2.1

func PromptAnthropic(systemPrompt, userPrompt, jsonSchema, apiKey string) (string, error)

PromptAnthropic is a convenience function for Anthropic prompts

func PromptGoogle added in v0.2.1

func PromptGoogle(systemPrompt, userPrompt, jsonSchema, apiKey string) (string, error)

PromptGoogle is a convenience function for Google prompts

func PromptOpenAI added in v0.2.1

func PromptOpenAI(systemPrompt, userPrompt, jsonSchema, apiKey string) (string, error)

PromptOpenAI is a convenience function for OpenAI prompts
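
These wrappers skip PromptOptions; a short sketch for OpenAI (the other two take the same four arguments), presumably equivalent to Prompt with ProviderOpenAI:

// The empty schema string requests a plain chat completion (no structured output).
response, err := llmkit.PromptOpenAI(
    "You are a helpful assistant",
    "What is the capital of France?",
    "",
    os.Getenv("OPENAI_API_KEY"),
)
if err != nil {
    log.Fatal(err)
}
fmt.Println(response)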

Types

type PromptOptions added in v0.2.1

type PromptOptions struct {
	Provider     Provider // Which LLM provider to use
	SystemPrompt string   // System prompt for the request
	UserPrompt   string   // User prompt for the request
	JSONSchema   string   // Optional JSON schema for structured output
	APIKey       string   // API key for the provider
}

PromptOptions configures the prompt request

type Provider added in v0.2.1

type Provider string

Provider represents the LLM provider type

const (
	ProviderOpenAI    Provider = "openai"
	ProviderAnthropic Provider = "anthropic"
	ProviderGoogle    Provider = "google"
)
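
Provider is a plain string type, so a value can be chosen at runtime; a small sketch, assuming the provider name arrives via a hypothetical LLMKIT_PROVIDER environment variable:

// Pick the provider from the environment and pass it straight into PromptOptions.
provider := llmkit.Provider(os.Getenv("LLMKIT_PROVIDER")) // e.g. "openai", "anthropic", "google"
response, err := llmkit.Prompt(llmkit.PromptOptions{
    Provider:     provider,
    SystemPrompt: "You are a helpful assistant",
    UserPrompt:   "Hello",
    APIKey:       os.Getenv("API_KEY"), // hypothetical variable; supply the key for the chosen provider
})
if err != nil {
    log.Fatal(err)
}
fmt.Println(response)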

Directories

Path Synopsis
cmd
llmkit command
llmkit-google command
llmkit-openai command
examples
anthropic/files command
workflows/simple_workflow command
Example program demonstrating the workflow package
Package workflow provides a minimal API for creating and executing workflows consisting of tasks with conditional transitions.
