cryptography

package module
v1.0.0
Published: Apr 27, 2021 License: MIT Imports: 2 Imported by: 2

README

Go classical cryptography

Go implementations of classical cryptography algorithms and cryptanalysis techniques.

Documentation

Index

Constants

const (
	// EntropyLow is the threshold below which the Shannon Entropy implies very little variation in the input
	EntropyLow = 0.5
	// EntropyEnglishStart is the lower-bound for the Shannon Entropy of typical English Text.
	EntropyEnglishStart = 3.5
	// EntropyEnglishEnd is the upper-bound for the Shannon Entropy of typical English Text.
	EntropyEnglishEnd = 5
	// EntropyCompressed is the threshold above which the Shannon Entropy implies the data is random/encrypted/compressed.
	EntropyCompressed = 7.5
)
const IndexOfCoincidenceEnglish = float64(1.73) / 26

IndexOfCoincidenceEnglish is the expected Index of Coincidence for English text.

Variables

This section is empty.

Functions

func CaesarShift

func CaesarShift(input []byte, count uint8) []byte

CaesarShift performs a Caesar shift of the given amount on all a-z/A-Z characters. All other characters are left intact.
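
A minimal usage sketch, assuming only the signature above; the example.com import path is a placeholder for the module's real path:

package main

import (
	"fmt"

	"example.com/cryptography" // placeholder path; substitute this module's actual import path
)

func main() {
	plaintext := []byte("Attack at dawn!")

	// Shift every a-z/A-Z character forward by 3; the spaces and "!" are left intact.
	ciphertext := cryptography.CaesarShift(plaintext, 3)

	fmt.Printf("%s\n", ciphertext) // Dwwdfn dw gdzq!
}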

func CaesarShifts

func CaesarShifts(input []byte) [26][]byte

CaesarShifts performs all 25 possible Caesar shifts on the input.
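
A brute-force sketch using the same placeholder import path; it assumes the returned 26-entry array is indexed by shift amount, which the signature suggests but the comment does not state:

package main

import (
	"fmt"

	"example.com/cryptography" // placeholder path; substitute this module's actual import path
)

func main() {
	ciphertext := []byte("Dwwdfn dw gdzq!")

	// Print every candidate; a human, or a letter-frequency score, can then
	// pick out the readable decryption.
	for shift, candidate := range cryptography.CaesarShifts(ciphertext) {
		fmt.Printf("shift %2d: %s\n", shift, candidate)
	}
}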

func IndexOfCoincidence

func IndexOfCoincidence(text []byte) float64

IndexOfCoincidence calculates the Index of Coincidence of the given text. The IoC measures how likely two randomly drawn letters from the text are to be identical.

Some sources include a normalising factor for comparing different alphabets. This is not included in this implementation, but can be trivially obtained by multiplying the result by 26.
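
A sketch comparing a sample against the English constant, again with a placeholder import path:

package main

import (
	"fmt"

	"example.com/cryptography" // placeholder path; substitute this module's actual import path
)

func main() {
	sample := []byte("Defend the east wall of the castle; attack at dawn.")

	ioc := cryptography.IndexOfCoincidence(sample)

	fmt.Printf("sample IoC:     %.4f\n", ioc)
	fmt.Printf("English target: %.4f\n", cryptography.IndexOfCoincidenceEnglish)

	// The normalised value, comparable across alphabets, is the result multiplied by 26.
	fmt.Printf("normalised IoC: %.4f\n", ioc*26)
}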

func LetterDistribution

func LetterDistribution(input []byte) [26]int

LetterDistribution counts the number of occurrences of each English letter (ignoring case).
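
A sketch printing the non-zero counts; it assumes index 0 holds the count for 'a'/'A' and index 25 for 'z'/'Z', and uses the placeholder import path:

package main

import (
	"fmt"

	"example.com/cryptography" // placeholder path; substitute this module's actual import path
)

func main() {
	counts := cryptography.LetterDistribution([]byte("Hello, World!"))

	// Assumed layout: index 0 counts 'a'/'A', index 25 counts 'z'/'Z'.
	for i, n := range counts {
		if n > 0 {
			fmt.Printf("%c: %d\n", 'a'+i, n)
		}
	}
}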

func ShannonEntropy

func ShannonEntropy(input []byte) float64

ShannonEntropy calculates the Shannon Entropy of the input. The Shannon Entropy is a measure of how much "information" is represented by the input.
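
A rough classification sketch using the exported thresholds; the buckets follow the constant comments above rather than anything the package itself provides, and the import path is a placeholder:

package main

import (
	"fmt"

	"example.com/cryptography" // placeholder path; substitute this module's actual import path
)

// classify buckets data according to the package's entropy thresholds.
func classify(data []byte) string {
	h := cryptography.ShannonEntropy(data)
	switch {
	case h < cryptography.EntropyLow:
		return fmt.Sprintf("%.2f: very little variation", h)
	case h >= cryptography.EntropyEnglishStart && h <= cryptography.EntropyEnglishEnd:
		return fmt.Sprintf("%.2f: in the range of typical English text", h)
	case h > cryptography.EntropyCompressed:
		return fmt.Sprintf("%.2f: likely random, encrypted, or compressed", h)
	default:
		return fmt.Sprintf("%.2f: inconclusive", h)
	}
}

func main() {
	fmt.Println(classify([]byte("aaaaaaaaaaaaaaaa")))
	fmt.Println(classify([]byte("The quick brown fox jumps over the lazy dog.")))
}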

Types

This section is empty.
