go-llms

Module v0.3.3
Published: Jun 12, 2025 • License: MIT


Go-LLMs: Unified Go Library for LLM Integration

A lightweight Go library providing a simplified, unified interface for interacting with various LLM providers, along with robust data validation, agent tooling, and multi-agent orchestration via workflows and state management.

Features

  • Unified API across OpenAI, Anthropic, Google Gemini, Vertex AI, Ollama, OpenRouter, and compatible providers (sketched after this list)
  • Structured outputs with JSON schema validation and type coercion
  • Agent system with state management, hooks, and workflow patterns
  • 32 built-in tools for web, file, system, data, datetime, and feed operations
  • Tool enhancement with LLM guidance metadata and MCP (Model Context Protocol) support
  • Multimodal content support for text, images, files, videos, and audio
  • Multi-provider strategies including fastest, primary, and consensus approaches
  • Type-safe configuration with interface-based provider options
  • Minimal dependencies leveraging Go's standard library
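
Because every provider is reached through the same generation calls (see Quick Start below), application code can be written once against a small interface of your own and reused with any configured provider. The sketch below is purely illustrative: TextGenerator and summarize are names invented here, and the (string, error) return shape is assumed from the Basic Usage snippet; they are not part of the go-llms API.

// Illustrative sketch, not go-llms types: a caller-defined interface around
// the Generate call lets the same code run against any configured provider.
// The (string, error) signature is an assumption based on the Basic Usage example.
type TextGenerator interface {
    Generate(ctx context.Context, prompt string) (string, error)
}

func summarize(ctx context.Context, g TextGenerator, text string) (string, error) {
    return g.Generate(ctx, "Summarize the following text:\n"+text)
}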

What's New in v0.3.3

See CHANGELOG.md for the complete version history.

v0.3.3 (January 11, 2025) - Major Provider Expansion
  • Ollama: Local model hosting with GPU acceleration and model management
  • OpenRouter: Access to 400+ models (68 free) from multiple providers via unified API
  • Vertex AI: Enterprise Google Cloud deployment with Gemini and partner models (Claude)
  • Full integration with utilities, CLI, and configuration systems
  • Enhanced documentation for all new providers
v0.3.2 (January 11, 2025) - Documentation Update
  • Complete documentation restructuring for better user experience
  • Modularized API documentation with dedicated files for each component
  • Improved user guide following natural learning progression
  • Enhanced technical documentation with new guides for providers and tools
  • Consolidated redundant content and improved cross-linking
v0.3.1 (January 10, 2025) - Tool System Enhancement
  • Enhanced ToolBuilder pattern for all 32 built-in tools
  • Comprehensive LLM guidance metadata (examples, constraints, error handling)
  • MCP (Model Context Protocol) compatibility
  • Advanced authentication support for web tools
  • Performance improvements

Installation

go get github.com/lexlapax/go-llms

Quick Start

Basic Usage
// Create an OpenAI provider
llmProvider := provider.NewOpenAIProvider(
    os.Getenv("OPENAI_API_KEY"),
    "gpt-4o",
)

// Generate text from a prompt
response, err := llmProvider.Generate(context.Background(), "Explain quantum computing")
if err != nil {
    log.Fatal(err)
}
fmt.Println(response)
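
For reference, a complete, self-contained version of the Basic Usage snippet might look like the program below. This is a minimal sketch: the import path is assumed to follow the pkg/llm/provider layout shown in the Directories section, and error handling is kept to log.Fatal.

package main

import (
    "context"
    "fmt"
    "log"
    "os"

    // Assumed import path, derived from the module path plus the pkg/llm/provider directory.
    "github.com/lexlapax/go-llms/pkg/llm/provider"
)

func main() {
    // Build an OpenAI-backed provider from an API key in the environment.
    llmProvider := provider.NewOpenAIProvider(os.Getenv("OPENAI_API_KEY"), "gpt-4o")

    // Request a plain-text completion.
    response, err := llmProvider.Generate(context.Background(), "Explain quantum computing")
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println(response)
}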
Using Agents with Tools
// Create an agent with built-in tools
agent, err := core.NewAgentFromString("assistant", "openai/gpt-4o")
if err != nil {
    log.Fatal(err)
}

// Add built-in tools
agent.AddTool(web.WebSearch())
agent.AddTool(file.FileRead())

// Execute with state
state := domain.NewState()
state.Set("prompt", "Search for Go programming tutorials and save the results")
result, err := agent.Run(context.Background(), state)
if err != nil {
    log.Fatal(err)
}
fmt.Println(result)
Structured Output
// Define a schema
schema := &domain.Schema{
    Type: "object",
    Properties: map[string]domain.Property{
        "name":  {Type: "string"},
        "age":   {Type: "integer"},
        "email": {Type: "string", Format: "email"},
    },
    Required: []string{"name", "email"},
}

// Generate structured data that conforms to the schema
result, err := llmProvider.GenerateWithSchema(
    context.Background(),
    "Generate a person's information",
    schema,
)
if err != nil {
    log.Fatal(err)
}
fmt.Println(result)
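
If the structured result needs to be consumed as a typed value, one option is a round-trip through encoding/json. This is a hedged sketch: the README does not show the concrete return type of GenerateWithSchema, so it assumes the result is a Go value (such as a map) that marshals to JSON matching the schema, and the Person struct is defined here only for illustration.

// Hypothetical follow-up (requires "encoding/json" in the imports): map the
// schema-validated result onto a caller-defined struct. Person and the JSON
// round-trip are assumptions of this sketch, not part of the go-llms API.
type Person struct {
    Name  string `json:"name"`
    Age   int    `json:"age"`
    Email string `json:"email"`
}

raw, err := json.Marshal(result)
if err != nil {
    log.Fatal(err)
}
var person Person
if err := json.Unmarshal(raw, &person); err != nil {
    log.Fatal(err)
}
fmt.Printf("%s <%s>\n", person.Name, person.Email)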

Documentation

Supported Providers

  • OpenAI - GPT-4o, GPT-4o-mini, GPT-4 Turbo, GPT-3.5 Turbo
  • Anthropic - Claude 3.5 Sonnet, Claude 3 Opus, Claude 3 Haiku
  • Google Gemini - Gemini 2.0 Flash Lite, Gemini Pro, Gemini Pro Vision
  • Google Vertex AI - Enterprise Gemini models, Claude (partner models), regional deployment
  • Ollama - Llama 3.2, Mistral, Phi-3, CodeLlama, and more (local hosting)
  • OpenRouter - Access to 400+ models from various providers (68 free models)
  • OpenAI-Compatible - LM Studio, vLLM, and any OpenAI-compatible API

Examples

The cmd/examples/ directory contains 40+ examples demonstrating various features:

  • Provider examples: OpenAI, Anthropic, Gemini, OpenRouter, Ollama, multi-provider strategies
  • Agent examples: Tool usage, workflows, state management, sub-agents
  • Built-in tools: Web search, file operations, API client, data processing
  • Advanced patterns: Structured output, multimodal content, custom agents

Architecture

Go-LLMs follows a clean architecture with vertical feature slicing:

pkg/
├── schema/      # JSON schema validation
├── llm/         # Provider implementations
├── structured/  # Output processing
└── agent/       # Agent system with tools and workflows
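
Each slice maps to an import path under the module root. For example, the packages referenced in this README would be imported roughly as follows (directory names are taken from the Directories listing below, and the comments paraphrase their package synopses); aliases are used because both the agent and schema slices export a package named domain:

import (
    agentdomain "github.com/lexlapax/go-llms/pkg/agent/domain"   // core domain models and interfaces for agents
    schemadomain "github.com/lexlapax/go-llms/pkg/schema/domain" // schema validation models and interfaces

    "github.com/lexlapax/go-llms/pkg/agent/tools"  // implementations of agent tools
    "github.com/lexlapax/go-llms/pkg/llm/provider" // LLM provider implementations
)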

Contributing

See CONTRIBUTING.md for development guidelines.

License

MIT License - see LICENSE file for details.

Acknowledgments

Special thanks to the LLM-based coding tools that helped with documentation and testing: Aider, Claude Code, ChatGPT, Claude Desktop, and Gemini Code.

Directories

Path: Synopsis

cmd
  examples/simple: command
pkg
  agent/domain: Package domain defines the core domain models and interfaces for agents.
  agent/tools: Package tools provides implementations of agent tools.
  internal/debug: Package debug provides conditional debug logging that is only compiled when the -tags debug build flag is used.
  llm/provider: Package provider implements various LLM providers.
  schema/domain: Package domain defines the core domain models and interfaces for schema validation.
  structured/domain: Package domain defines core domain models and interfaces for structured LLM outputs.
  structured/processor: Package processor implements structured output processing functionality.
  testutils: Package testutils provides testing utilities for the Go-LLMs library.
  util/json: Package json provides an optimized JSON implementation with multiple backends.
  util/llmutil: Package llmutil provides utility functions for common LLM operations.
  util/llmutil/modelinfo: Package modelinfo provides structures and functions to fetch, aggregate, and cache information about available Large Language Models (LLMs) from various providers.
  util/llmutil/modelinfo/cache: Package cache provides file-based caching functionality for the model inventory.
  util/llmutil/modelinfo/domain: Package domain contains the core data structures representing the model inventory, including details about models, their capabilities, pricing, and metadata.
  util/llmutil/modelinfo/fetchers: Package fetchers provides specific implementations for fetching model information from different LLM providers like OpenAI, Google, and Anthropic.
  util/llmutil/modelinfo/service: Package service contains the aggregation logic for model information from multiple fetchers, combining them into a single inventory.
  util/metrics: Package metrics provides utilities for collecting and reporting performance metrics in the Go-LLMs project.
  util/profiling: Package profiling provides utilities for CPU and memory profiling in the Go-LLMs project.
tests
