splitans

package
v0.2.0

Warning: this package is not in the latest version of its module.
Published: Jan 18, 2026 License: GPL-3.0 Imports: 11 Imported by: 0

Documentation

Overview

Package splitans provides a public API for parsing and exporting ANSI art files.

This package provides functions to:

  • Convert between character encodings (CP437, CP850, ISO-8859-1, UTF-8)
  • Tokenize ANSI and Neotex format files
  • Export to various formats (ANSI, plain text, Neotex)
  • Process tokens through a virtual terminal

Example usage:

import (
	"os"

	"github.com/badele/splitans/pkg/splitans"
)

data, _ := os.ReadFile("art.ans")
utf8Data, _ := splitans.ConvertToUTF8(data, "cp437")
tokenizer := splitans.NewANSITokenizer(utf8Data)
tokens := tokenizer.Tokenize()
output, _, _ := splitans.ExportFlattenedANSI(80, 25, tokens, "utf8", true, nil)


Constants

const (
	TokenText          = types.TokenText
	TokenC0            = types.TokenC0
	TokenC1            = types.TokenC1
	TokenCSI           = types.TokenCSI
	TokenCSIInterupted = types.TokenCSIInterupted
	TokenSGR           = types.TokenSGR
	TokenDCS           = types.TokenDCS
	TokenOSC           = types.TokenOSC
	TokenEscape        = types.TokenEscape
	TokenSauce         = types.TokenSauce
	TokenUnknown       = types.TokenUnknown
)

Token type constants

const (
	ColorDefault  = types.ColorDefault
	ColorStandard = types.ColorStandard
	ColorIndexed  = types.ColorIndexed
	ColorRGB      = types.ColorRGB
)

Color type constants

Variables

var C0Names = types.C0Names

C0Names maps C0 control codes to their names

var VGAPalette = types.VGAPalette

VGAPalette contains the 16 standard VGA colors

Functions

func ConvertToEncoding

func ConvertToEncoding(data []byte, targetEncoding string) ([]byte, error)

ConvertToEncoding converts UTF-8 data to the target encoding. Supported encodings: "utf8", "cp437", "cp850", "iso-8859-1".

func ConvertToUTF8

func ConvertToUTF8(data []byte, sourceEncoding string) ([]byte, error)

ConvertToUTF8 converts byte data from a source encoding to UTF-8. Supported encodings: "utf8", "cp437", "cp850", "iso-8859-1". The UTF-8 BOM (Byte Order Mark) is stripped automatically if present.

func DiffSGRToNeotex

func DiffSGRToNeotex(current, previous *SGR) []string

DiffSGRToNeotex generates minimal neotex codes to transition between SGR states.

func ExportFlattenedANSI

func ExportFlattenedANSI(width, nblines int, tokens []Token, outputEncoding string, useVGAColors bool, crop *CropRegion) (string, int, error)

ExportFlattenedANSI exports tokens to a flattened ANSI string. This processes tokens through a virtual terminal to resolve cursor positioning and produces clean ANSI output. If crop is non-nil, the output will be cropped to the specified region. Returns (output, effectiveWidth, error) where effectiveWidth is the VT width after crop.

func ExportFlattenedANSIInline

func ExportFlattenedANSIInline(width, nblines int, tokens []Token, outputEncoding string, useVGAColors bool, crop *CropRegion) (string, int, error)

ExportFlattenedANSIInline exports tokens to a single-line ANSI string. Returns (output, effectiveWidth, error) where effectiveWidth is the VT width after crop.

func ExportFlattenedNeotex

func ExportFlattenedNeotex(width, nblines int, tokens []Token, crop *CropRegion) (string, string, int, error)

ExportFlattenedNeotex exports tokens to Neotex format. Returns (text, sequences, effectiveWidth, error) where:

  • text is the plain text content
  • sequences is the neotex format sequences with positions
  • effectiveWidth is the VT width after crop

If crop is non-nil, the output will be cropped to the specified region.

func ExportFlattenedNeotexInline

func ExportFlattenedNeotexInline(width, nblines int, tokens []Token, crop *CropRegion) (string, string, int, error)

ExportFlattenedNeotexInline exports tokens to inline Neotex format. This flattens all lines into a single line and adjusts sequence positions. Returns (text, sequences, effectiveWidth, error) where effectiveWidth is the VT width after crop.

func ExportFlattenedText

func ExportFlattenedText(width, nblines int, tokens []Token, outputEncoding string, crop *CropRegion) (string, int, error)

ExportFlattenedText exports tokens to plain text without ANSI codes. This processes tokens through a virtual terminal and outputs only the text content. If crop is non-nil, the output will be cropped to the specified region. Returns (text, effectiveWidth, error) where effectiveWidth is the VT width after crop.

func ExportFlattenedTextInline

func ExportFlattenedTextInline(width, nblines int, tokens []Token, outputEncoding string, crop *CropRegion) (string, int, error)

ExportFlattenedTextInline exports tokens to plain text on a single line. Returns (text, effectiveWidth, error) where effectiveWidth is the VT width after crop.

func ExportToInlineNeotex

func ExportToInlineNeotex(vt *VirtualTerminal) (string, string)

ExportToInlineNeotex exports VirtualTerminal buffer to inline neotex format. All lines are flattened into a single line with adjusted sequence positions. Returns (text, sequences) where text is plain content and sequences contains position-based style codes.

func ExportToNeotex

func ExportToNeotex(vt *VirtualTerminal) (string, string)

ExportToNeotex exports VirtualTerminal buffer to neotex format with differential encoding. Returns (text, sequences) where text is plain content and sequences contains position-based style codes.

func NormalizeANSIUTF8Input

func NormalizeANSIUTF8Input(data []byte, width int) []byte

NormalizeANSIUTF8Input cleans UTF-8 ANSI data by stripping carriage returns when a width is provided. When width is zero or negative, the data is returned untouched.

func SGRToNeotex

func SGRToNeotex(sgr *SGR) []string

SGRToNeotex converts an SGR struct to neotex format strings.

Types

type ANSITokenizer

type ANSITokenizer = ansi.Tokenizer

ANSITokenizer is the tokenizer for ANSI format files

func NewANSITokenizer

func NewANSITokenizer(input []byte) *ANSITokenizer

NewANSITokenizer creates a new tokenizer for ANSI format data. The input should be UTF-8 encoded (use ConvertToUTF8 if needed).

type ColorType

type ColorType = types.ColorType

ColorType represents the type of color encoding

type ColorValue

type ColorValue = types.ColorValue

ColorValue represents a color (standard, indexed, or RGB)

type ContentBounds

type ContentBounds = processor.ContentBounds

ContentBounds represents the bounding box of actual content in the buffer

type CropRegion

type CropRegion = types.CropRegion

CropRegion defines a rectangular region for cropping

func ParseCropRegion

func ParseCropRegion(s string) (*CropRegion, error)

ParseCropRegion parses a crop string in format "x,y:x1,y1". Returns nil if the string is empty.

type LineWithSequences

type LineWithSequences = types.LineWithSequences

LineWithSequences contains a line of text and all SGR changes within that line

type NeotexTokenizer

type NeotexTokenizer = neotex.Tokenizer

NeotexTokenizer is the tokenizer for Neotex format files

func NewNeotexTokenizer

func NewNeotexTokenizer(data []byte, width int) (int, *NeotexTokenizer)

NewNeotexTokenizer creates a new tokenizer for Neotex format data. The width parameter specifies the expected line width. Returns the parsed width (a !TWxx/yy directive in the data, when present, overrides the width argument) and the tokenizer.

type SGR

type SGR = types.SGR

SGR represents Select Graphic Rendition attributes (colors, styles)

func NewSGR

func NewSGR() *SGR

NewSGR creates a new SGR with default values.

type SGRSequence

type SGRSequence = types.SGRSequence

SGRSequence represents a style change at a specific position

type Token

type Token = types.Token

Token represents a parsed ANSI token (text, control code, escape sequence, etc.)

type TokenStats

type TokenStats = types.TokenStats

TokenStats contains statistics about parsed tokens

type TokenType

type TokenType = types.TokenType

TokenType represents the type of a token

type Tokenizer

type Tokenizer = types.Tokenizer

Tokenizer is the interface for all tokenizers

type TokenizerWithStats

type TokenizerWithStats = types.TokenizerWithStats

TokenizerWithStats is a tokenizer that also provides statistics

type VirtualTerminal

type VirtualTerminal = processor.VirtualTerminal

VirtualTerminal provides a virtual terminal buffer for processing tokens

func NewVirtualTerminal

func NewVirtualTerminal(width, height int, outputEncoding string, useVGAColors bool) *VirtualTerminal

NewVirtualTerminal creates a new virtual terminal with the specified dimensions. outputEncoding specifies the output encoding ("utf8", "cp437", "cp850", "iso-8859-1"). useVGAColors enables true VGA colors (not affected by terminal themes).
