confusion

package
v1.4.31
Published: Sep 27, 2023 License: BSD-3-Clause Imports: 7 Imported by: 3

README


Package confusion implements a confusion matrix: it records output responses for discrete categories / classes.

  • Rows (outer dimension) represent each class as the ground truth (the correct answer).

  • Columns (inner dimension) represent the response generated for each ground-truth class.

The main result is in the Prob field, computed from the Sum and N values added incrementally.

Main API:

  • InitFromLabels to initialize with a list of class labels and a display font size.
  • Incr on each trial with the network's response index and the correct target index.
  • Probs when done, to compute probabilities from the accumulated data.
  • SaveCSV / OpenCSV for saving / loading data (for nogui usage).

The TFPN matrix keeps a record of true / false positives (tp / fp) and true / false negatives (tn / fn) for each category / class. This table is used to calculate F1 scores, either by class or across classes.

A beginner’s guide on how to calculate Precision, Recall, F1-score for a multi-class classification problem can be found at https://towardsdatascience.com/confusion-matrix-for-your-multi-class-machine-learning-model-ff9aa3bf7826

API:

  • SumTFPN to calculate the tp, fp, fn and tn scores for each class.
  • ScoreClass to calculate the precision and recall scores needed for the F1 score.
  • ScoreMatrix to use the values calculated by ScoreClass to generate three different F1 scores for the entire matrix:
    • F1 Micro
    • F1 Macro
    • F1 Weighted

Documentation

Index

Constants

This section is empty.

Variables

var KiT_Matrix = kit.Types.AddType(&Matrix{}, MatrixProps)
var MatrixProps = ki.Props{
	"ToolBar": ki.PropSlice{
		{"SaveCSV", ki.Props{
			"label": "Save CSV...",
			"icon":  "file-save",
			"desc":  "Save CSV-formatted confusion probabilities (Probs)",
			"Args": ki.PropSlice{
				{"CSV File Name", ki.Props{
					"ext": ".csv",
				}},
			},
		}},
		{"OpenCSV", ki.Props{
			"label": "Open CSV...",
			"icon":  "file-open",
			"desc":  "Open CSV-formatted confusion probabilities (Probs)",
			"Args": ki.PropSlice{
				{"Weights File Name", ki.Props{
					"ext": ".csv",
				}},
			},
		}},
	},
}

Functions

This section is empty.

Types

type Matrix

type Matrix struct {

	// [view: no-inline] normalized probability of confusion: Row = ground truth class, Col = actual response for that class.
	Prob etensor.Float64 `view:"no-inline" desc:"normalized probability of confusion: Row = ground truth class, Col = actual response for that class."`

	// [view: no-inline] incremental sums
	Sum etensor.Float64 `view:"no-inline" desc:"incremental sums"`

	// [view: no-inline] counts per ground truth (rows)
	N etensor.Float64 `view:"no-inline" desc:"counts per ground truth (rows)"`

	// [view: no-inline] visualization using SimMat
	Vis simat.SimMat `view:"no-inline" desc:"visualization using SimMat"`

	// [view: no-inline] true pos/neg, false pos/neg for each class, generated from the confusion matrix
	TFPN etensor.Float64 `view:"no-inline" desc:"true pos/neg, false pos/neg for each class, generated from the confusion matrix"`

	// [view: no-inline] precision, recall and F1 score by class
	ClassScores etensor.Float64 `view:"no-inline" desc:"precision, recall and F1 score by class"`

	// [view: no-inline] micro F1, macro F1 and weighted F1 scores for entire matrix ignoring class
	MatrixScores etensor.Float64 `view:"no-inline" desc:"micro F1, macro F1 and weighted F1 scores for entire matrix ignoring class"`
}

Matrix computes the confusion matrix, with rows representing the ground truth correct class, and columns representing the actual answer produced. Correct answers are along the diagonal.

func (*Matrix) Incr

func (cm *Matrix) Incr(class, resp int)

Incr increments the data for given class ground truth and response.

func (*Matrix) Init

func (cm *Matrix) Init(n int)

Init initializes the Matrix for given number of classes, and resets the data to zero.

func (*Matrix) InitFromLabels added in v1.1.57

func (cm *Matrix) InitFromLabels(lbls []string, fontSize int)

InitFromLabels does initialization based on the given labels. Calls Init on len(lbls) and SetLabels. The default fontSize of 12 is used if 0 or -1 is passed.

func (*Matrix) OpenCSV

func (cm *Matrix) OpenCSV(filename gi.FileName)

OpenCSV opens the Prob result from a comma-separated CSV file.

func (*Matrix) Probs

func (cm *Matrix) Probs()

Probs computes the probabilities based on accumulated data

func (*Matrix) Reset added in v1.1.57

func (cm *Matrix) Reset()

Reset resets the data to zero

func (*Matrix) SaveCSV

func (cm *Matrix) SaveCSV(filename gi.FileName)

SaveCSV saves Prob result to a CSV file, comma separated

func (*Matrix) ScoreClass added in v1.3.0

func (cm *Matrix) ScoreClass(class int)

func (*Matrix) ScoreMatrix added in v1.3.0

func (cm *Matrix) ScoreMatrix()

func (*Matrix) SetLabels added in v1.1.34

func (cm *Matrix) SetLabels(lbls []string)

SetLabels sets the class labels, for visualization in Vis

func (*Matrix) SumTFPN added in v1.3.0

func (cm *Matrix) SumTFPN(class int)
