flowllm

README

FlowLLM


NOTICE: This is still a work-in-progress. The interfaces provided by this project are still subject to change.

FlowLLM is a Go framework for developing applications that leverage the power of language models. It uses composability and chain of responsibility patterns, and offers tools to access LLMs, build prompts, and chain calls together. The package also includes parsers and database integrations, making it an ideal solution for developers working with language models.

FlowLLM is heavily inspired by the LangChain project.

Usage

In the example below, we use FlowLLM to build a simple chain that generates a company name and slogan based on a product name. The application uses two different LLMs: one generates the company name and the other generates the slogan. The slogan LLM is a chat model, so we use a different template to interact with it. The LLMs are called in parallel, and the results are then combined into a single output.

package main

import (
    "context"
    "fmt"

    . "github.com/deluan/flowllm"
    "github.com/deluan/flowllm/llms/openai"
)

func main() {
    // Build a chain that will generate a company name and slogan, and then use them 
    // to generate a sentence. Calls to the OpenAI API are made in parallel, and the 
    // results are merged into a single result.
    chain := Chain(
        ParallelChain(
            2,
            Chain(
                Template("What is a good name for a company that makes {product}?"),
                LLM(openai.NewCompletionModel(openai.Options{Model: "text-davinci-003", Temperature: 1})),
                MapOutputTo("name"),
            ),
            Chain(
                ChatTemplate{UserMessage("What is a good slogan for a company that makes {product}?")},
                ChatLLM(openai.NewChatModel(openai.Options{Model: "gpt-3.5-turbo", Temperature: 1})),
                MapOutputTo("slogan"),
            ),
        ),
        // You can modify the LLMs outputs using some string transformation handlers
        TrimSpace("name", "slogan"),
        TrimSuffix(".", "name"),
        Template("The company {name} makes {product} and their slogan is {slogan}."),
    )

    // Run the chain
    res, err := chain(context.Background(), Values{"product": "colorful socks"})
    fmt.Println(res, err)

    // Output:
    // The company Rainbow Socks Co makes colorful socks and their slogan is "Life is too short for boring socks – let us add some color to your steps!". <nil>
}

For more features and advanced usage, please check the examples folder.

Installation

To install FlowLLM, use the following command:

go get -u github.com/deluan/flowllm

Features

  • Access to LLMs and their capabilities
  • Tools to build prompts and parse outputs
  • Database integrations for seamless data storage and retrieval
  • Inspired by the langchain project, but striving to stay true to Go idioms and patterns

Usage

For examples and detailed usage instructions, please refer to the documentation (WIP). Also check the examples folder.

Contributing

We welcome contributions from the community! Please read our contributing guidelines for more information on how to get started.

License

FlowLLM is released under the MIT License.

Documentation

Index

Constants

const (
	DefaultKey     = "text"
	DefaultChatKey = "_chat_messages"
)

Variables

This section is empty.

Functions

This section is empty.

Types

type ChatLanguageModel

type ChatLanguageModel interface {
	Chat(ctx context.Context, msgs []ChatMessage) (string, error)
}

ChatLanguageModel interface is implemented by all chat language models.

type ChatMessage

type ChatMessage struct {
	Role    string
	Content string
}

ChatMessage is a struct that represents a message in a chat conversation.

type ChatMessages

type ChatMessages []ChatMessage

ChatMessages is a list of ChatMessage.

func (ChatMessages) Last

func (m ChatMessages) Last(size int) ChatMessages

Last returns the last N messages from the list.

func (ChatMessages) String

func (m ChatMessages) String() string

type ChatTemplate

type ChatTemplate []MessageTemplate

ChatTemplate is a prompt that can be used with Chat-style LLMs. It will format a list of messages, each with a role and a prompt.

func (ChatTemplate) Call

func (t ChatTemplate) Call(_ context.Context, values ...Values) (Values, error)

type Document

type Document struct {
	ID          string
	PageContent string
	Metadata    map[string]any
}

Document represents a document to be stored in a VectorStore.

func LoadDocs

func LoadDocs(n int, loader DocumentLoader) ([]Document, error)

LoadDocs loads the next n documents from the given DocumentLoader.

type DocumentLoader

type DocumentLoader interface {
	LoadNext(ctx context.Context) (Document, error)
}

DocumentLoader is the interface implemented by types that can load documents. The LoadNext method should return the next available document, or io.EOF if there are no more documents.

type DocumentLoaderFunc

type DocumentLoaderFunc func(ctx context.Context) (Document, error)

DocumentLoaderFunc is an adapter to allow the use of ordinary functions as DocumentLoaders.
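As a hedged sketch, a DocumentLoaderFunc can serve documents from an in-memory slice and signal the end of the stream with io.EOF, which LoadDocs uses to stop; the sample documents below are purely illustrative:

package main

import (
	"context"
	"fmt"
	"io"

	. "github.com/deluan/flowllm"
)

func main() {
	docs := []Document{{PageContent: "first"}, {PageContent: "second"}}
	i := 0
	loader := DocumentLoaderFunc(func(ctx context.Context) (Document, error) {
		if i >= len(docs) {
			return Document{}, io.EOF
		}
		d := docs[i]
		i++
		return d, nil
	})
	// Ask for up to 10 documents; the loader above only has two.
	loaded, err := LoadDocs(10, loader)
	fmt.Println(len(loaded), err)
}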

func (DocumentLoaderFunc) LoadNext

func (f DocumentLoaderFunc) LoadNext(ctx context.Context) (Document, error)

type Embeddings

type Embeddings interface {
	// EmbedString returns the embedding for the given string
	EmbedString(context.Context, string) ([]float32, error)

	// EmbedStrings returns the embeddings for multiple strings
	EmbedStrings(context.Context, []string) ([][]float32, error)
}

Embeddings can be used to create a numerical representation of textual data. This numerical representation is useful when searching for similar documents.
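Concrete implementations usually wrap an embedding model or API; the stub below is purely illustrative and only shows the shape of the interface (it does not compute a real embedding):

// fakeEmbeddings is an illustrative stub; a real implementation would call an embedding model.
type fakeEmbeddings struct{}

func (fakeEmbeddings) EmbedString(_ context.Context, s string) ([]float32, error) {
	return []float32{float32(len(s))}, nil
}

func (e fakeEmbeddings) EmbedStrings(ctx context.Context, ss []string) ([][]float32, error) {
	out := make([][]float32, len(ss))
	for i, s := range ss {
		v, err := e.EmbedString(ctx, s)
		if err != nil {
			return nil, err
		}
		out[i] = v
	}
	return out, nil
}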

type Handler

type Handler interface {
	Call(ctx context.Context, values ...Values) (Values, error)
}

Handler is the interface implemented by all composable modules in the library.

type HandlerFunc

type HandlerFunc func(context.Context, ...Values) (Values, error)

HandlerFunc is a function that implements the Handler interface.
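Any function with this signature can be dropped into a chain as a custom step. A hedged sketch, assuming the dot import of flowllm used in the README (plus "context", "fmt", and "strings") and that Template stores its output under DefaultKey:

uppercase := HandlerFunc(func(ctx context.Context, values ...Values) (Values, error) {
	vals := Values{}.Merge(values...)
	vals[DefaultKey] = strings.ToUpper(vals.Get(DefaultKey))
	return vals, nil
})

// The custom handler composes like any built-in one:
chain := Chain(
	Template("hello, {name}!"),
	uppercase,
)
res, err := chain(context.Background(), Values{"name": "world"})
fmt.Println(res, err)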

func Chain

func Chain(handlers ...Handler) HandlerFunc

Chain is a special handler that executes a list of handlers in sequence. The output of each handler is passed as input to the next one, and the output of the last handler is returned as the output of the chain.

func ChatLLM

func ChatLLM(model ChatLanguageModel) HandlerFunc

ChatLLM is a handler that can be used to add a chat model to a chain. It is similar to the LLM handler, but it uses the value of the DefaultChatKey key (usually set by a ChatTemplate) as input to the model, if available; otherwise it falls back to the value of the DefaultKey key.
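A hedged sketch of a minimal chat chain, assuming the imports from the README example and an OpenAI API key configured in the environment as expected by the llms/openai subpackage:

chat := Chain(
	ChatTemplate{
		SystemMessage("You are a concise assistant."),
		UserMessage("Summarize in one sentence: {text}"),
	},
	ChatLLM(openai.NewChatModel(openai.Options{Model: "gpt-3.5-turbo"})),
)
res, err := chat(context.Background(), Values{"text": "FlowLLM is a Go framework for composing LLM calls."})
fmt.Println(res, err)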

func LLM

func LLM(model LanguageModel) HandlerFunc

LLM is a handler that can be used to add a language model to a chain.

func MapOutputTo

func MapOutputTo(key string) HandlerFunc

MapOutputTo renames the output of the chain (DefaultKey) to the given key.

func ParallelChain

func ParallelChain(maxParallel int, handlers ...Handler) HandlerFunc

ParallelChain executes a list of handlers in parallel, up to a maximum number of concurrent executions. If any of the handlers returns an error, the execution is stopped and the error is returned. The results of all handlers are merged into a single Values object.

func TrimSpace

func TrimSpace(keys ...string) HandlerFunc

TrimSpace trims all spaces from the values of the given keys.

func TrimSuffix

func TrimSuffix(suffix string, keys ...string) HandlerFunc

TrimSuffix trims the given suffix from the values of the given keys.

func WithMemory

func WithMemory(memory Memory, handler Handler) HandlerFunc

WithMemory is a wrapper that loads the previous conversation from the memory, injects it into the chain as the value of the DefaultChatKey key, calls the wrapped handler, and adds the last question/answer to the memory.
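A hedged sketch of a memory-backed chat chain. The bufferMemory type below is an illustrative in-memory Memory implementation (the role names are assumptions), and MessageHistoryPlaceholder is keyed to DefaultChatKey because that is where WithMemory injects the loaded history. The imports are the same as in the README example:

// bufferMemory is an illustrative in-memory Memory implementation.
type bufferMemory struct{ msgs ChatMessages }

func (m *bufferMemory) Load(context.Context) (ChatMessages, error) { return m.msgs, nil }

func (m *bufferMemory) Save(_ context.Context, input, output string) error {
	// The "user"/"assistant" role names are assumptions for illustration.
	m.msgs = append(m.msgs,
		ChatMessage{Role: "user", Content: input},
		ChatMessage{Role: "assistant", Content: output},
	)
	return nil
}

func main() {
	mem := &bufferMemory{}
	chat := WithMemory(mem, Chain(
		ChatTemplate{
			SystemMessage("You are a helpful assistant."),
			MessageHistoryPlaceholder(DefaultChatKey),
			UserMessage("{question}"),
		},
		ChatLLM(openai.NewChatModel(openai.Options{Model: "gpt-3.5-turbo"})),
	))
	res, err := chat(context.Background(), Values{"question": "What does FlowLLM do?"})
	fmt.Println(res, err)
}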

func (HandlerFunc) Call

func (f HandlerFunc) Call(ctx context.Context, values ...Values) (Values, error)

type LanguageModel

type LanguageModel interface {
	Call(ctx context.Context, input string) (string, error)
}

LanguageModel interface is implemented by all language models.

type Memory

type Memory interface {
	// Load returns previous conversations from the memory
	Load(context.Context) (ChatMessages, error)

	// Save last question/answer to the memory
	Save(ctx context.Context, input, output string) error
}

Memory is an interface that can be used to store and retrieve previous conversations.

type MessageTemplate

type MessageTemplate struct {
	Template Handler
	Role     string
}

MessageTemplate is a prompt template that can be used with Chat-style LLMs. It is similar to Template, but it also specifies the role of the message.

func AssistantMessage

func AssistantMessage(template string) MessageTemplate

func MessageHistoryPlaceholder

func MessageHistoryPlaceholder(variableName string) MessageTemplate

MessageHistoryPlaceholder is a special message template that can be used with the WithMemory handler and Chat-style LLMs. It will be replaced with the history of messages in the conversation.

func SystemMessage

func SystemMessage(template string) MessageTemplate

func UserMessage

func UserMessage(template string) MessageTemplate

type ScoredDocument

type ScoredDocument struct {
	Document
	Score float32
}

ScoredDocument represents a document along with its similarity score.

type Splitter

type Splitter = func(string) ([]string, error)

Splitter is a function that splits a string into a slice of strings.

func MarkdownSplitter

func MarkdownSplitter(opts SplitterOptions) Splitter

MarkdownSplitter returns a Splitter that splits a document into chunks using a set of Markdown-specific separators. It is a recursive splitter, meaning that it will split each chunk into smaller chunks using the same separators.

func RecursiveTextSplitter

func RecursiveTextSplitter(opts SplitterOptions) Splitter

RecursiveTextSplitter splits a text into chunks of a given size, trying to split at the given separators. If the text is smaller than the chunk size, it is returned as a single chunk; otherwise it is split into chunks of the given size, preferring the given separators. If the text cannot be split at any of the given separators, it is split at the last separator.
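A hedged sketch (longText stands for any string to be chunked, and the "log" and "fmt" imports are assumed; zero-value options are assumed to fall back to the library's defaults):

splitter := RecursiveTextSplitter(SplitterOptions{
	ChunkSize:    200,
	ChunkOverlap: 20,
})
chunks, err := splitter(longText)
if err != nil {
	log.Fatal(err)
}
for _, chunk := range chunks {
	fmt.Println(len(chunk))
}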

type SplitterOptions

type SplitterOptions struct {
	// ChunkSize is the maximum size of each chunk
	ChunkSize int
	// ChunkOverlap is the number of characters that will be repeated in each chunk
	ChunkOverlap int
	// LenFunc is the length function to be used to calculate the chunk size
	LenFunc func(string) int
	// Separators is a list of strings that will be used to split the text
	Separators []string
}

SplitterOptions for the RecursiveTextSplitter splitter

type Template

type Template string

Template can be used to format a string with variables. Useful for creating prompts. It uses a simple template syntax, where variables are enclosed in curly braces.
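A hedged sketch; the assumption is that the rendered prompt is returned under DefaultKey ("text"), which is how downstream handlers such as LLM consume it:

prompt := Template("What is a good name for a company that makes {product}?")
out, err := prompt.Call(context.Background(), Values{"product": "colorful socks"})
fmt.Println(out.Get(DefaultKey), err)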

func (Template) Call

func (t Template) Call(_ context.Context, values ...Values) (Values, error)

type Values

type Values map[string]any

Values is a map of string to any value. This is the type used to pass values between handlers.

func (Values) Get

func (value Values) Get(key string) string

Get returns the value for a given key as a string. If the key does not exist, it returns an empty string.

func (Values) Keys

func (value Values) Keys() []string

Keys returns the keys of the Values object.

func (Values) Merge

func (value Values) Merge(values ...Values) Values

Merge merges multiple Values into one.

func (Values) String

func (value Values) String() string

String returns a string representation of the Values object. If the Values object has only one key, it returns the value of that key. If the Values object has multiple keys, it returns a JSON representation.
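Putting the Values helpers together, a hedged sketch (assuming Merge combines the receiver with its arguments; the output comments are descriptive, not exact):

vals := Values{"name": "Rainbow Socks Co"}.Merge(Values{"slogan": "Add some color to your steps"})
fmt.Println(vals.Keys())      // the keys of the merged map
fmt.Println(vals.Get("name")) // the value for "name", as a string
fmt.Println(vals.String())    // a JSON representation, since there is more than one key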

type VectorStore

type VectorStore interface {
	// AddDocuments adds the given documents to the store
	AddDocuments(context.Context, ...Document) error
	// SimilaritySearch returns the k most similar documents to the query
	SimilaritySearch(ctx context.Context, query string, k int) ([]Document, error)
	// SimilaritySearchVectorWithScore returns the k most similar documents to the query, along with their similarity score
	SimilaritySearchVectorWithScore(ctx context.Context, query []float32, k int) ([]ScoredDocument, error)
}

VectorStore is a particular type of database optimized for storing documents and their embeddings, and then fetching the most relevant documents for a particular query, i.e. those whose embeddings are most similar to the embedding of the query.
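A hedged usage sketch against any VectorStore implementation (obtaining a concrete store is out of scope here, and the documents are illustrative):

func indexAndSearch(ctx context.Context, store VectorStore) error {
	err := store.AddDocuments(ctx,
		Document{ID: "1", PageContent: "FlowLLM is a Go framework for language models."},
		Document{ID: "2", PageContent: "Go is a statically typed, compiled language."},
	)
	if err != nil {
		return err
	}
	// Fetch the single most similar document to the query.
	docs, err := store.SimilaritySearch(ctx, "What is FlowLLM?", 1)
	if err != nil {
		return err
	}
	for _, d := range docs {
		fmt.Println(d.PageContent)
	}
	return nil
}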

Directories

Path        Synopsis
llms
pl          Package pl implements some Data Pipeline helper functions.
tiktoken    Package tiktoken implements a wrapper around the github.com/tiktoken-go/tokenizer library.
