github.com/FePhyFoFum/gophy

Package gophy

v0.0.0-...-b0eb792

Published: Aug 5, 2020 | License: GPL3 | Module: github.com/FePhyFoFum/gophy

Index

func ADPoissonTreeLoglike

func ADPoissonTreeLoglike(nodels []*Node, lam float64) float64

ADPoissonTreeLoglike calculates the ancestor-descendant log-likelihood for a set of stratigraphic ranges

func AdjustBLNR

func AdjustBLNR(node *Node, x *DNAModel, patternvals []float64, t *Tree, wks int, threshold float64)

AdjustBLNR performs a single-edge Newton-Raphson (NR) branch length adjustment

func AdjustBLNRMS

func AdjustBLNRMS(node *Node, x StateModel, patternvals []float64, t *Tree, wks int, threshold float64)

AdjustBLNRMS performs a single-edge Newton-Raphson (NR) branch length adjustment

func AdjustBLNRMSMul

func AdjustBLNRMSMul(node *Node, models []StateModel, nodemodels map[*Node]int, patternvals []float64, t *Tree, wks int, threshold float64)

AdjustBLNRMSMul performs a single-edge Newton-Raphson (NR) branch length adjustment

func AdjustBLNRMult

func AdjustBLNRMult(node *Node, models []*DNAModel, nodemodels map[*Node]int, patternvals []float64, t *Tree, wks int, threshold float64)

AdjustBLNRMult performs a single-edge Newton-Raphson (NR) branch length adjustment

func AncTritomyML

func AncTritomyML(tree *Node, sites []int)

AncTritomyML will calculate the MLEs for the branch lengths of a trifurcating 3-taxon tree, assuming that direct ancestors may be in the tree

func AssertUnrootedTree

func AssertUnrootedTree(tree *Node)

AssertUnrootedTree is a quick check to make sure the tree passed is unrooted

func BMCalcLensBackFront

func BMCalcLensBackFront(t *Tree, sites []int)

BMCalcLensBackFront will do one pass of the EM branch length estimation

func BMOptimBLEM

func BMOptimBLEM(t *Tree, niter int)

BMOptimBLEM will calculate the BM branch lengths using an iterative EM calculation that imputes missing data using PICs

func BMPruneRooted

func BMPruneRooted(n *Node)

BMPruneRooted will prune BM branch lengths and PICs down to a rooted node. The root node should be a real (i.e. bifurcating) root.

func BMPruneRootedSingle

func BMPruneRootedSingle(n *Node, i int)

BMPruneRootedSingle will prune BM branch lengths and calculate the PIC of a single trait down to a rooted node. The root node should be a real (i.e. bifurcating) root.
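
The pruning/PIC step these functions describe can be sketched for a single cherry (two tips joined at one ancestor). This is an illustrative standalone version of the textbook contrasts calculation, not gophy's code; `prunePIC` and its signature are our own.

```go
package main

import "fmt"

// prunePIC collapses a cherry (two tips with trait values x1, x2 and
// branch lengths v1, v2) into a single ancestral tip. It returns the
// raw contrast (x1 - x2), the weighted-average ancestral trait value,
// and the extra branch length to add to the ancestor's own edge.
// Names and shapes here are illustrative, not gophy's.
func prunePIC(x1, x2, v1, v2 float64) (contrast, anc, extraBL float64) {
	contrast = x1 - x2 // standardize by sqrt(v1+v2) to get the usual PIC
	anc = (x1/v1 + x2/v2) / (1/v1 + 1/v2)
	extraBL = v1 * v2 / (v1 + v2)
	return
}

func main() {
	c, a, e := prunePIC(1.0, 3.0, 1.0, 1.0)
	fmt.Println(c, a, e) // -2 2 0.5
}
```

Repeating this collapse from the tips toward the root is the "pruning" the BM functions refer to.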

func BipartSliceContains

func BipartSliceContains(bps []Bipart, bp Bipart) (ind int)

BipartSliceContains checks to see if the bipart slice contains the bipart and returns the index

func CalcAIC

func CalcAIC(ln float64, k float64) (x float64)

CalcAIC k=numparams

func CalcAICC

func CalcAICC(lnl float64, k float64, n int) (x float64)

CalcAICC k=numparams,n=samplesize

func CalcAncStates

func CalcAncStates(x *DNAModel, tree *Tree, patternval []float64) (retstates map[*Node][][]float64)

CalcAncStates calculates ancestral states for each node based on the calculations above

func CalcAncStatesMS

func CalcAncStatesMS(x StateModel, tree *Tree, patternval []float64) (retstates map[*Node][][]float64)

CalcAncStatesMS calculates ancestral states for each node based on the calculations above

func CalcAncStatesMSMUL

func CalcAncStatesMSMUL(models []StateModel, nodemodels map[*Node]int, tree *Tree, patternval []float64) (retstates map[*Node][][]float64)

CalcAncStatesMSMUL calculates ancestral states for each node based on the calculations above

func CalcAscBiasVal

func CalcAscBiasVal(t *Tree, x StateModel) float64

CalcAscBiasVal creates dummy invariant characters for each possible state and sums their likelihoods to yield a Prob(invariant). Used in ascertainment bias correction like: L_marginal = L_unconditioned / Prob(invariant)

func CalcBIC

func CalcBIC(ln float64, k float64, n int) (x float64)

CalcBIC k=numparams, n=samplesize

func CalcExpectedTraits

func CalcExpectedTraits(tree *Node)

CalcExpectedTraits will plug in the expected values for missing traits under BM using the pruning/PIC ancestral state estimation approach

func CalcLikeFrontBack

func CalcLikeFrontBack(x *DNAModel, tree *Tree, patternval []float64)

CalcLikeFrontBack ...

func CalcLikeFrontBackMS

func CalcLikeFrontBackMS(x StateModel, tree *Tree, patternval []float64)

CalcLikeFrontBackMS ...

func CalcLikeFrontBackMSMUL

func CalcLikeFrontBackMSMUL(models []StateModel, nodemodels map[*Node]int, tree *Tree, patternval []float64)

CalcLikeFrontBackMSMUL ...

func CalcLikeFrontBackMult

func CalcLikeFrontBackMult(models []*DNAModel, nodemodels map[*Node]int, tree *Tree, patternval []float64)

CalcLikeFrontBackMult ...

func CalcLikeNode

func CalcLikeNode(nd *Node, model *DNAModel, site int)

CalcLikeNode calculates the likelihood of a node

func CalcLikeNodeMS

func CalcLikeNodeMS(nd *Node, model StateModel, site int)

CalcLikeNodeMS calculates the likelihood of a node for multistate

func CalcLikeNodeMSMUL

func CalcLikeNodeMSMUL(nd *Node, models []StateModel, nodemodels map[*Node]int, site int)

CalcLikeNodeMSMUL calculates the likelihood of a node for multistate

func CalcLikeOneSite

func CalcLikeOneSite(t *Tree, x *DNAModel, site int) float64

CalcLikeOneSite calculates the likelihood of just one site

func CalcLikeOneSiteMS

func CalcLikeOneSiteMS(t *Tree, x StateModel, site int) float64

CalcLikeOneSiteMS is used to calculate the likelihood of one site; probably used to populate the PDict in the DNA model so that the calculations can be reused

func CalcLikeOneSiteMSMUL

func CalcLikeOneSiteMSMUL(t *Tree, models []StateModel, nodemodels map[*Node]int, site int) float64

CalcLikeOneSiteMSMUL is used to calculate the likelihood of one site; probably used to populate the PDict in the DNA model so that the calculations can be reused

func CalcLikeOneSiteMarked

func CalcLikeOneSiteMarked(t *Tree, x *DNAModel, site int) float64

CalcLikeOneSiteMarked uses the marked machinery to recalculate

func CalcLikeOneSiteMarkedMS

func CalcLikeOneSiteMarkedMS(t *Tree, x StateModel, site int) float64

CalcLikeOneSiteMarkedMS calculates likelihood for one site using CalcLikeNodeMS

This is outside of the worker pool (probably doing this first to populate the P matrix)

func CalcLikeOneSiteMarkedMul

func CalcLikeOneSiteMarkedMul(t *Tree, models []*DNAModel, nodemodels map[*Node]int, site int) float64

CalcLikeOneSiteMarkedMul uses the marked machinery to recalculate

func CalcLikeWork

func CalcLikeWork(t *Tree, x *DNAModel, jobs <-chan int, results chan<- LikeResult)

CalcLikeWork is the worker

func CalcLikeWorkMS

func CalcLikeWorkMS(t *Tree, x StateModel, jobs <-chan int, results chan<- LikeResult)

CalcLikeWorkMS calculates likelihood for one site using CalcLikeNodeMS

Work refers to this being part of the worker pool

func CalcLikeWorkMSMUL

func CalcLikeWorkMSMUL(t *Tree, models []StateModel, nodemodels map[*Node]int, jobs <-chan int, results chan<- LikeResult)

CalcLikeWorkMSMUL calculates likelihood for one site using CalcLikeNodeMS

Work refers to this being part of the worker pool

func CalcLikeWorkMarked

func CalcLikeWorkMarked(t *Tree, x *DNAModel, jobs <-chan int, results chan<- LikeResult)

CalcLikeWorkMarked is intended to calculate only on the marked nodes back to the root

func CalcLikeWorkMarkedMS

func CalcLikeWorkMarkedMS(t *Tree, x StateModel, jobs <-chan int, results chan<- LikeResult)

CalcLikeWorkMarkedMS this should only calculate like of the marked nodes back to the root

Work refers to this being part of the worker pool

func CalcLikeWorkMarkedMul

func CalcLikeWorkMarkedMul(t *Tree, models []*DNAModel, nodemodels map[*Node]int, jobs <-chan int, results chan<- LikeResult)

CalcLikeWorkMarkedMul is intended to calculate only on the marked nodes back to the root

func CalcLogLikeNode

func CalcLogLikeNode(nd *Node, model *DNAModel, site int)

CalcLogLikeNode calculates likelihood for node

func CalcLogLikeNodeMS

func CalcLogLikeNodeMS(nd *Node, model StateModel, site int)

CalcLogLikeNodeMS calculates log likelihood for node for multistate

func CalcLogLikeNodeMSMUL

func CalcLogLikeNodeMSMUL(nd *Node, models []StateModel, nodemodels map[*Node]int, site int)

CalcLogLikeNodeMSMUL calculates log likelihood for node for multistate

func CalcLogLikeOneSite

func CalcLogLikeOneSite(t *Tree, x *DNAModel, site int) float64

CalcLogLikeOneSite calculates the log-likelihood of one site; probably used to populate the PDict in the DNA model so that the calculations can be reused

func CalcLogLikeOneSiteBack

func CalcLogLikeOneSiteBack(t *Tree, nb *Node, x *DNAModel, site int) float64

CalcLogLikeOneSiteBack is like CalcLogLikeOneSite but calculates from nb to the root only

func CalcLogLikeOneSiteBackMS

func CalcLogLikeOneSiteBackMS(t *Tree, nb *Node, x StateModel, site int) float64

CalcLogLikeOneSiteBackMS is used to calculate the log-likelihood of one site; probably used to populate the PDict in the DNA model so that the calculations can be reused. This starts at node nb and goes to the root.

func CalcLogLikeOneSiteMS

func CalcLogLikeOneSiteMS(t *Tree, x StateModel, site int) float64

CalcLogLikeOneSiteMS is used to calculate the log-likelihood of one site; probably used to populate the PDict in the DNA model so that the calculations can be reused

func CalcLogLikeOneSiteMSMUL

func CalcLogLikeOneSiteMSMUL(t *Tree, models []StateModel, nodemodels map[*Node]int, site int) float64

CalcLogLikeOneSiteMSMUL is used to calculate the log-likelihood of one site; probably used to populate the PDict in the DNA model so that the calculations can be reused

func CalcLogLikeOneSiteMarked

func CalcLogLikeOneSiteMarked(t *Tree, x *DNAModel, site int) float64

CalcLogLikeOneSiteMarked uses the marked machinery to recalculate

func CalcLogLikeOneSiteMarkedMS

func CalcLogLikeOneSiteMarkedMS(t *Tree, x StateModel, site int) float64

CalcLogLikeOneSiteMarkedMS calculates log likelihood for one site using CalcLogLikeNodeMS

This is outside of the worker pool (probably doing this first to populate the P matrix)

func CalcLogLikeOneSiteMul

func CalcLogLikeOneSiteMul(t *Tree, models []*DNAModel, nodemodels map[*Node]int, site int) float64

CalcLogLikeOneSiteMul calculates the log-likelihood of just one site

func CalcLogLikeWork

func CalcLogLikeWork(t *Tree, x *DNAModel, jobs <-chan int, results chan<- LikeResult)

CalcLogLikeWork is intended for a worker that will be executing this per site

func CalcLogLikeWorkBack

func CalcLogLikeWorkBack(t *Tree, nb *Node, x *DNAModel, jobs <-chan int, results chan<- float64)

CalcLogLikeWorkBack is intended for a worker that will be executing this per site

func CalcLogLikeWorkBackMS

func CalcLogLikeWorkBackMS(t *Tree, nb *Node, x StateModel, jobs <-chan int, results chan<- float64)

CalcLogLikeWorkBackMS only calculates the log likelihood back to the root for multistate

Work refers to this being part of the worker pool

func CalcLogLikeWorkMS

func CalcLogLikeWorkMS(t *Tree, x StateModel, jobs <-chan int, results chan<- LikeResult)

CalcLogLikeWorkMS calculates log likelihood for one site using CalcLogLikeNodeMS

Work refers to this being part of the worker pool

func CalcLogLikeWorkMSMUL

func CalcLogLikeWorkMSMUL(t *Tree, models []StateModel, nodemodels map[*Node]int, jobs <-chan int, results chan<- LikeResult)

CalcLogLikeWorkMSMUL calculates log likelihood for one site using CalcLogLikeNodeMS

Work refers to this being part of the worker pool

func CalcLogLikeWorkMarked

func CalcLogLikeWorkMarked(t *Tree, x *DNAModel, jobs <-chan int, results chan<- float64)

CalcLogLikeWorkMarked is intended to calculate only on the marked nodes back to the root

func CalcLogLikeWorkMarkedMS

func CalcLogLikeWorkMarkedMS(t *Tree, x StateModel, jobs <-chan int, results chan<- float64)

CalcLogLikeWorkMarkedMS should only calculate the log-likelihood of the marked nodes back to the root

Work refers to this being part of the worker pool

func CalcLogLikeWorkMul

func CalcLogLikeWorkMul(t *Tree, models []*DNAModel, nodemodels map[*Node]int, jobs <-chan int, results chan<- LikeResult)

CalcLogLikeWorkMul is the worker

func CalcRootedLogLike

func CalcRootedLogLike(n *Node, nlikes *float64, startFresh bool)

CalcRootedLogLike will return the BM likelihood of a tree assuming that no data are missing from the tips.

func CalcSankParsNode

func CalcSankParsNode(nd *Node, site int)

CalcSankParsNode ...

func CalcSankParsNodeMultState

func CalcSankParsNodeMultState(nd *Node, numstates int, site int)

CalcSankParsNodeMultState ...

func CalcSankParsWork

func CalcSankParsWork(t *Tree, jobs <-chan int, results chan<- ParsResult)

CalcSankParsWork ...

func CalcSankParsWorkMultState

func CalcSankParsWorkMultState(t *Tree, numstates int, jobs <-chan int, results chan<- ParsResult)

CalcSankParsWorkMultState ...

func CalcSiteMeans

func CalcSiteMeans(nodes []*Node) (siteSum []float64)

CalcSiteMeans will calculate the mean value for all the sites in the matrix for which the site is not missing

func CalcSliceIntDifference

func CalcSliceIntDifference(a, b []int) []int

CalcSliceIntDifference calculates the (set) difference between two int slices

func CalcSliceIntDifferenceInt

func CalcSliceIntDifferenceInt(a, b []int) int

CalcSliceIntDifferenceInt calculates the size of the (set) difference between two int slices
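
The set-difference idea behind these two helpers can be sketched as follows. This is a standalone illustration (our own `sliceIntDifference`, not gophy's implementation): build a lookup set from `b`, then keep the elements of `a` that are absent from it.

```go
package main

import "fmt"

// sliceIntDifference returns the elements of a that are not in b,
// i.e. the set difference a \ b, preserving a's order.
func sliceIntDifference(a, b []int) []int {
	inB := make(map[int]bool, len(b))
	for _, v := range b {
		inB[v] = true
	}
	out := []int{}
	for _, v := range a {
		if !inB[v] {
			out = append(out, v)
		}
	}
	return out
}

func main() {
	fmt.Println(sliceIntDifference([]int{1, 2, 3, 4}, []int{2, 4})) // [1 3]
}
```

The "Int" variant would simply return `len(out)` instead of the slice.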

func CalcStochMapMS

func CalcStochMapMS(x StateModel, tree *Tree, patternval []float64, time bool, from int, to int) (retstates map[*Node][][]float64)

CalcStochMapMS calculates stochastic maps for each node based on the calculations above

func CalcStochMapMSMUL

func CalcStochMapMSMUL(models []StateModel, nodemodels map[*Node]int, tree *Tree, patternval []float64, time bool, from int, to int) (retstates map[*Node][][]float64)

CalcStochMapMSMUL calculates stochastic maps for each node based on the calculations above

func CalcUnrootedLogLike

func CalcUnrootedLogLike(tree *Node, startFresh bool) (chll float64)

CalcUnrootedLogLike will calculate the log-likelihood of an unrooted tree, while assuming that no sites have missing data.

func ClusterCalcExpectedTraits

func ClusterCalcExpectedTraits(tree *Node, sites []int)

ClusterCalcExpectedTraits will plug in the expected values for missing traits under BM using the pruning/PIC ancestral state estimation approach

func ClusterMissingTraitsEM

func ClusterMissingTraitsEM(t *Tree, cluster *Cluster, niter int)

ClusterMissingTraitsEM will calculate the BM branch lengths using an iterative EM calculation that imputes missing data using PICs using the traits in a single cluster

func CompareTreeToBiparts

func CompareTreeToBiparts(bps []Bipart, comptreebps []Bipart, workers int, mapints map[int]string, verbose bool, treeverbose bool)

CompareTreeToBiparts takes biparts from a set, comptreebps, and compares them to another set, bps. This one is complicated, so keep with it.

func ConfInt95TF

func ConfInt95TF(nums []float64) (float64, float64)

ConfInt95TF returns the 95% confidence interval using the t-statistic

func EstParsBL

func EstParsBL(t *Tree, patternval []float64, totalsites int)

EstParsBL estimates the parsimony branch lengths

func EstParsBLMultState

func EstParsBLMultState(t *Tree, numstates int, patternval []float64, totalsites int)

EstParsBLMultState estimates the parsimony branch lengths for multistate characters.

When there are equivalent reconstructions, this will just pick one.

func GetEmpiricalBaseFreqs

func GetEmpiricalBaseFreqs(seqs map[string]string) (bf []float64)

GetEmpiricalBaseFreqs gets the empirical base freqs from the seqs

func GetEmpiricalBaseFreqsMS

func GetEmpiricalBaseFreqsMS(seqs []MSeq, numstates int) (bf []float64)

GetEmpiricalBaseFreqsMS gets the empirical base freqs from the seqs

func GetGammaCats

func GetGammaCats(alpha float64, cats int, median bool) []float64

GetGammaCats gets the gamma rate categories for likelihood calculators

func GetMap

func GetMap(numStates int) (charMap map[string][]int)

GetMap gets the character map based on the number of states, without the MultStateModel struct

func GetNucMap

func GetNucMap() (charMap map[string][]int)

GetNucMap gets the int map for DNA with ambiguities

func GetRevNucMap

func GetRevNucMap() (charMap map[int]string)

GetRevNucMap ...

func GetSitePatterns

func GetSitePatterns(seqs map[string]string, nsites int, seqnames []string) (patterns map[string][]int,
	patternsint map[int]float64, gapsites []int, constant []int, uninformative []int, fullpattern []int)

GetSitePatterns returns site patterns when the datatype for the alignment is a map[string]string
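
The core idea of site-pattern collapsing is to count identical alignment columns once, so per-site likelihoods are computed per unique pattern and weighted by their counts. Here is a simplified standalone sketch (our own `countSitePatterns`; no gap/ambiguity handling, and a reduced return shape compared to GetSitePatterns):

```go
package main

import "fmt"

// countSitePatterns collapses alignment columns into unique patterns
// and their counts. seqs maps taxon name -> sequence; names fixes the
// taxon order so columns from different sites are comparable.
func countSitePatterns(seqs map[string]string, nsites int, names []string) map[string]int {
	patterns := map[string]int{}
	for i := 0; i < nsites; i++ {
		col := ""
		for _, n := range names {
			col += string(seqs[n][i])
		}
		patterns[col]++
	}
	return patterns
}

func main() {
	seqs := map[string]string{"a": "AAGT", "b": "AAGA"}
	fmt.Println(countSitePatterns(seqs, 4, []string{"a", "b"}))
}
```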

func GetSitePatternsMS

func GetSitePatternsMS(seqs []MSeq, charMap map[string][]int, numstates int) (patterns map[string][]int,
	patternsint map[int]float64, gapsites []int, constant []int, uninformative []int, fullpattern []int)

GetSitePatternsMS returns site patterns when the datatype for the alignment is a []MSeq

func GreedyIterateLengthsMissing

func GreedyIterateLengthsMissing(t *Tree, sites []int, niter int)

GreedyIterateLengthsMissing will calculate the BM branch lengths using an iterative EM calculation that imputes missing data using PICs using the traits in a single cluster

func InitMissingValues

func InitMissingValues(tree []*Node)

InitMissingValues will find the missing sites in a data matrix and plug in values corresponding to the mean of the remaining sites

func IntMapDifferenceRet

func IntMapDifferenceRet(a, b map[int]bool) []int

IntMapDifferenceRet calculates the (set) difference between two map[int]bool sets

func IntMapIntersects

func IntMapIntersects(s1 map[int]bool, s2 map[int]bool) (in bool)

IntMapIntersects checks to see if the two map[int]bool intersect (in the set sense)

func IntMapIntersects2

func IntMapIntersects2(s1 map[int]bool, s2 map[int]bool) (in bool)

IntMapIntersects2 checks to see if the two map[int]bool intersect (in the set sense) with at least 2 matches

func IntMapIntersectsRet

func IntMapIntersectsRet(s1, s2 map[int]bool) (r []int)

IntMapIntersectsRet checks to see if the two map[int]bool intersect and returns the intersection (in the set sense)

func IntMapSetString

func IntMapSetString(intmap map[int]bool) (s string)

IntMapSetString gets a string for printing off a set

func IntSliceContains

func IntSliceContains(is []int, s int) (rb bool)

IntSliceContains checks to see if the int slice contains an int and returns the bool

func IntSliceIntersects

func IntSliceIntersects(a, b []int) (rb bool)

IntSliceIntersects checks to see whether two int slices intersect

func IterateLengthsWeighted

func IterateLengthsWeighted(tree *Tree, cluster *Cluster, niter int)

IterateLengthsWeighted will iteratively calculate the ML branch lengths for a particular topology and cluster when doing the greedy site clustering procedure.

func Log1exp

func Log1exp(x float64) float64

Log1exp returns log(1 + e^x) when e^x is in range

func LogFact

func LogFact(k float64) float64

LogFact calculates the log factorial; this is based on Stirling's approximation and is faster than LogFactorial

func LogFactorial

func LogFactorial(val int) (x float64)

LogFactorial is a slow method to calculate the log factorial: log(1) + log(2) + ... + log(n)

func MakeMissingMeansTip

func MakeMissingMeansTip(n *Node, means []float64)

MakeMissingMeansTip will replace missing values with the mean across all tips for a single tip

func MakeStratHeights

func MakeStratHeights(tree *Tree)

MakeStratHeights assigns the strat heights

func MapContinuous

func MapContinuous(t *Tree, traitfl string)

MapContinuous maps the traits from a file to the tips of a tree and initializes slices of the same length for the internal nodes

func MaxClustLab

func MaxClustLab(l map[int]*Cluster) (biggest int)

MaxClustLab returns the maximum key in a map of clusters used like a set

func MaxF

func MaxF(n []float64) float64

MaxF returns the maximum value of a float64 slice

func MedianF

func MedianF(n []float64) float64

MedianF calculates the median value
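
For reference, the usual definition of the median, sketched as a standalone function (our own `medianF`, not gophy's source): sort a copy and take the middle element, or the mean of the two middle elements for even lengths.

```go
package main

import (
	"fmt"
	"sort"
)

// medianF returns the median of a float64 slice.
func medianF(n []float64) float64 {
	s := append([]float64{}, n...) // copy so the caller's slice isn't reordered
	sort.Float64s(s)
	m := len(s) / 2
	if len(s)%2 == 1 {
		return s[m]
	}
	return (s[m-1] + s[m]) / 2
}

func main() {
	fmt.Println(medianF([]float64{3, 1, 2}), medianF([]float64{4, 1, 2, 3})) // 2 2.5
}
```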

func MinF

func MinF(n []float64) float64

MinF returns the minimum value of a float64 slice

func NNIMoves

func NNIMoves(tr *Tree) [][]*Node

NNIMoves looks at the root and returns the NNIs

func NW

func NW(seqs []Seq, in1 int, in2 int)

NW is a toy Needleman-Wunsch example; scores are all default
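
For context, a minimal textbook Needleman-Wunsch scorer looks like this. The toy scores here (match +1, mismatch -1, gap -1) are our own choice of "defaults" and the function shape is illustrative; gophy's NW operates on its Seq type instead.

```go
package main

import "fmt"

// nwScore computes a global alignment score between two sequences via
// the Needleman-Wunsch dynamic program, keeping only two DP rows.
func nwScore(a, b string) int {
	m, n := len(a), len(b)
	prev := make([]int, n+1)
	cur := make([]int, n+1)
	for j := 0; j <= n; j++ {
		prev[j] = -j // leading gaps in a
	}
	for i := 1; i <= m; i++ {
		cur[0] = -i // leading gaps in b
		for j := 1; j <= n; j++ {
			s := -1 // mismatch
			if a[i-1] == b[j-1] {
				s = 1 // match
			}
			cur[j] = max3(prev[j-1]+s, prev[j]-1, cur[j-1]-1)
		}
		prev, cur = cur, prev
	}
	return prev[n]
}

func max3(a, b, c int) int {
	if b > a {
		a = b
	}
	if c > a {
		a = c
	}
	return a
}

func main() {
	fmt.Println(nwScore("GATTACA", "GATCA")) // 3: five matches, two gaps
}
```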

func NodeNamesSliceIntersects

func NodeNamesSliceIntersects(a, b []*Node) (rb bool)

NodeNamesSliceIntersects checks to see whether two node slices intersect by name

func NodeSliceContains

func NodeSliceContains(s []*Node, e *Node) bool

NodeSliceContains tells you whether the node e is in the slice

func NodeSlicePosition

func NodeSlicePosition(sl []*Node, nd *Node) (x int)

NodeSlicePosition takes a []*Node slice and gets the index of the element node

func OldestChildAge

func OldestChildAge(node *Node) float64

OldestChildAge returns the age of the oldest child

func OptimizeBF

func OptimizeBF(t *Tree, x *DNAModel, patternvals []float64, log bool, wks int)

OptimizeBF optimizes the basefreq model

func OptimizeBFRMSubClade

func OptimizeBFRMSubClade(t *Tree, n *Node, excl bool, x *DNAModel, patternvals []float64, wks int)

OptimizeBFRMSubClade optimizes the basefreq model for a subclade

func OptimizeBFSubClade

func OptimizeBFSubClade(t *Tree, n *Node, excl bool, x *DNAModel, patternvals []float64, log bool, wks int)

OptimizeBFSubClade optimizes the basefreq model for a subclade

func OptimizeBL

func OptimizeBL(nd *Node, t *Tree, x *DNAModel, patternvals []float64, wks int)

OptimizeBL uses the standard gonum optimizers. Not great.

func OptimizeBLNR

func OptimizeBLNR(t *Tree, x *DNAModel, patternvals []float64, wks int)

OptimizeBLNR runs Newton-Raphson for each branch. Does 4 passes.

func OptimizeBLNRGN

func OptimizeBLNRGN(t *Tree, x *DNAModel, patternvals []float64, wks int)

func OptimizeBLNRMS

func OptimizeBLNRMS(t *Tree, x StateModel, patternvals []float64, wks int)

OptimizeBLNRMS runs Newton-Raphson for each branch. Does 4 passes.

func OptimizeBLNRMSMul

func OptimizeBLNRMSMul(t *Tree, models []StateModel, nodemodels map[*Node]int, patternvals []float64, wks int)

OptimizeBLNRMSMul runs Newton-Raphson for each branch. Does 4 passes.

func OptimizeBLNRMult

func OptimizeBLNRMult(t *Tree, models []*DNAModel, nodemodels map[*Node]int, patternvals []float64, wks int)

OptimizeBLNRMult runs Newton-Raphson for each branch. Does 4 passes.

func OptimizeBLS

func OptimizeBLS(t *Tree, x *DNAModel, patternvals []float64, wks int)

OptimizeBLS optimizes all branch lengths

func OptimizeBLSMul

func OptimizeBLSMul(t *Tree, models []*DNAModel, nodemodels map[*Node]int, patternvals []float64, wks int) float64

OptimizeBLSMul optimizes all branch lengths

func OptimizeGTR

func OptimizeGTR(t *Tree, x *DNAModel, patternvals []float64, sup bool, wks int)

OptimizeGTR optimizes the GTR model

func OptimizeGTRBPMSMul

func OptimizeGTRBPMSMul(t *Tree, models []StateModel, nodemodels map[*Node]int, patternvals []float64, wks int)

OptimizeGTRBPMSMul optimizes GTR and base composition for the different parts

func OptimizeGTRBPMul

func OptimizeGTRBPMul(t *Tree, models []*DNAModel, nodemodels map[*Node]int, usemodelvals bool,
	patternvals []float64, log bool, wks int)

OptimizeGTRBPMul optimizes GTR and base composition for the different parts

func OptimizeGTRCompSharedRM

func OptimizeGTRCompSharedRM(t *Tree, models []*DNAModel, nodemodels map[*Node]int,
	usemodelvals bool, patternvals []float64, log bool, wks int)

OptimizeGTRCompSharedRM optimizes GTR base composition but shares the rate matrix for the different parts

func OptimizeGTRCompSharedRMSingleModel

func OptimizeGTRCompSharedRMSingleModel(t *Tree, models []*DNAModel,
	nodemodels map[*Node]int, usemodelvals bool, whichmodel int,
	patternvals []float64, log bool, wks int)

OptimizeGTRCompSharedRMSingleModel optimizes GTR base composition but shares the rate matrix for the different parts.

You send an int identifying which model can be adjusted; only that one will be.

func OptimizeGTRCompSharedRMSubClade

func OptimizeGTRCompSharedRMSubClade(t *Tree, n *Node, excl bool, models []*DNAModel, nodemodels map[*Node]int,
	usemodelvals bool, patternvals []float64, wks int)

OptimizeGTRCompSharedRMSubClade optimizes GTR base composition but shares the rate matrix for the different parts

func OptimizeGTRMSCompSharedRM

func OptimizeGTRMSCompSharedRM(t *Tree, models []StateModel, nodemodels map[*Node]int, patternvals []float64, wks int)

OptimizeGTRMSCompSharedRM optimizes GTR base composition but shares the rate matrix for the different parts

func OptimizeGTRMSMul

func OptimizeGTRMSMul(t *Tree, models []StateModel, nodemodels map[*Node]int, patternvals []float64, wks int)

OptimizeGTRMSMul optimizes GTR

func OptimizeGTRMul

func OptimizeGTRMul(t *Tree, models []*DNAModel, nodemodels map[*Node]int, patternvals []float64, wks int)

OptimizeGTRMul optimizes GTR

func OptimizeGTRSubClade

func OptimizeGTRSubClade(t *Tree, n *Node, excl bool, x *DNAModel, patternvals []float64, wks int)

OptimizeGTRSubClade optimizes the GTR model for a subclade

func OptimizeMKMS

func OptimizeMKMS(t *Tree, x StateModel, startv float64, patternvals []float64, sym bool, wks int)

OptimizeMKMS optimizes "unscaled" GTR rates for the Mk model.

Symmetrical; scales the last rate to 1.

func OptimizeMKMSMul

func OptimizeMKMSMul(t *Tree, models []StateModel, nodemodels map[*Node]int, startv float64, patternvals []float64, sym bool, wks int)

OptimizeMKMSMul optimizes "unscaled" GTR rates for multiple Mk models.

Symmetrical; scales the last rate to 1.

func OptimizeMS1R

func OptimizeMS1R(t *Tree, x StateModel, patternvals []float64, wks int)

OptimizeMS1R ...

func OptimizeMS1RMul

func OptimizeMS1RMul(t *Tree, models []StateModel, nodemodels map[*Node]int, patternvals []float64, wks int)

OptimizeMS1RMul optimizes a multistate one-rate (JC-like) model for multiple-model reconstruction

func OptimizePreservationLam

func OptimizePreservationLam(tree *Tree) (float64, float64)

OptimizePreservationLam will optimize the Poisson rate parameter in the preservation model

func OutputEdges

func OutputEdges(mapints map[int]string, bps []Bipart, ntrees int, verb bool)

OutputEdges just prints the edges. mapints: int-to-string names for the taxa; bps: the list of biparts; ntrees: the number of trees.

func PBMLogLikeRt

func PBMLogLikeRt(tree *Node, startFresh bool, workers int) (sitelikes float64)

PBMLogLikeRt will calculate the BM log like on a rooted tree

func PCalcLike

func PCalcLike(t *Tree, x *DNAModel, nsites int, wks int) (fl float64)

PCalcLike calculates the likelihood in parallel

func PCalcLikeMS

func PCalcLikeMS(t *Tree, x StateModel, nsites int, wks int) (fl float64)

PCalcLikeMS will calculate the log-likelihood in parallel

It will make a thread pool then calculate the LogLike for the first site
with CalcLikeOneSiteMS and the rest in the pool with CalcLikeWorkMS
This is without patterns.
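
The jobs/results pattern these PCalc* functions describe is Go's standard worker-pool fan-out: workers range over a jobs channel of site indices and send back per-site results. The sketch below is self-contained; the `LikeResult` fields and the dummy per-site computation are our guesses at the shape implied by the docs, not gophy's definitions.

```go
package main

import "fmt"

// LikeResult pairs a site index with its computed value
// (an illustrative stand-in for gophy's LikeResult).
type LikeResult struct {
	Site  int
	Value float64
}

// worker drains site indices from jobs and sends one result per site.
// A real worker would call something like CalcLikeOneSiteMS here.
func worker(jobs <-chan int, results chan<- LikeResult) {
	for site := range jobs {
		results <- LikeResult{Site: site, Value: float64(site) * 0.5}
	}
}

func main() {
	nsites, wks := 8, 3
	jobs := make(chan int, nsites)
	results := make(chan LikeResult, nsites)
	for w := 0; w < wks; w++ { // start the pool
		go worker(jobs, results)
	}
	for i := 0; i < nsites; i++ { // dispatch all sites
		jobs <- i
	}
	close(jobs)
	sum := 0.0
	for i := 0; i < nsites; i++ { // collect (order is nondeterministic)
		sum += (<-results).Value
	}
	fmt.Println(sum)
}
```

Because results arrive in arbitrary order, each carries its site index; summing is order-independent.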

func PCalcLikeMSMUL

func PCalcLikeMSMUL(t *Tree, models []StateModel, nodemodels map[*Node]int, nsites int, wks int) (fl float64)

PCalcLikeMSMUL will calculate the log-likelihood in parallel

It will make a thread pool then calculate the LogLike for the first site
with CalcLikeOneSiteMS and the rest in the pool with CalcLikeWorkMS
This is without patterns.

func PCalcLikePatterns

func PCalcLikePatterns(t *Tree, x *DNAModel, patternval []float64, wks int) (fl float64)

PCalcLikePatterns parallel calculation of likelihood with patterns

func PCalcLikePatternsMS

func PCalcLikePatternsMS(t *Tree, x StateModel, patternval []float64, wks int) (fl float64)

PCalcLikePatternsMS will calculate the likelihood in parallel

It will make a thread pool then calculate the LogLike for the first site
with CalcLikeOneSiteMS and the rest in the pool with CalcLikeWorkMS
This is with patterns.

func PCalcLikePatternsMSMUL

func PCalcLikePatternsMSMUL(t *Tree, models []StateModel, nodemodels map[*Node]int, patternval []float64, wks int) (fl float64)

PCalcLikePatternsMSMUL will calculate the likelihood in parallel

It will make a thread pool then calculate the LogLike for the first site
with CalcLikeOneSiteMS and the rest in the pool with CalcLikeWorkMS
This is with patterns.

func PCalcLikePatternsMarked

func PCalcLikePatternsMarked(t *Tree, x *DNAModel, patternval []float64, wks int) (fl float64)

PCalcLikePatternsMarked parallel likelihood calculation with patterns, just updating the values

func PCalcLikePatternsMarkedMS

func PCalcLikePatternsMarkedMS(t *Tree, x StateModel, patternval []float64, wks int) (fl float64)

PCalcLikePatternsMarkedMS will calculate the likelihood in parallel

It will make a thread pool then calculate the LogLike for the first site
with CalcLikeOneSiteMarkedMS and the rest in the pool with CalcLikeWorkMarkedMS
This is with patterns. It will assume that nodes are going to be marked in between
calls to this function.

func PCalcLikePatternsMarkedMul

func PCalcLikePatternsMarkedMul(t *Tree, models []*DNAModel, nodemodels map[*Node]int, patternval []float64, wks int) (fl float64)

PCalcLikePatternsMarkedMul parallel likelihood calculation with patterns, just updating the values

func PCalcLikePatternsMul

func PCalcLikePatternsMul(t *Tree, models []*DNAModel, nodemodels map[*Node]int,
	patternval []float64, wks int) (fl float64)

PCalcLikePatternsMul parallel calculation of likelihood with patterns

func PCalcLikePatternsMulSubClade

func PCalcLikePatternsMulSubClade(t *Tree, n *Node, excl bool, models []*DNAModel,
	nodemodels map[*Node]int, patternval []float64, wks int) (fl float64)

PCalcLikePatternsMulSubClade parallel log-likelihood calculation including patterns

func PCalcLikePatternsSubClade

func PCalcLikePatternsSubClade(t *Tree, n *Node, excl bool, x *DNAModel, patternval []float64, wks int) (fl float64)

PCalcLikePatternsSubClade parallel log-likelihood calculation including patterns

func PCalcLogLike

func PCalcLogLike(t *Tree, x *DNAModel, nsites int, wks int) (fl float64)

PCalcLogLike will calculate the log-likelihood in parallel

func PCalcLogLikeBack

func PCalcLogLikeBack(t *Tree, n *Node, x *DNAModel, nsites int, wks int) (fl float64)

PCalcLogLikeBack is a bit of a shortcut. Could do better, but walks back from node n to the root.

func PCalcLogLikeBackMS

func PCalcLogLikeBackMS(t *Tree, n *Node, x StateModel, nsites int, wks int) (fl float64)

PCalcLogLikeBackMS will calculate the log-likelihood in parallel

It will make a thread pool then calculate the LogLike for the first site
with CalcLogLikeOneSiteBackMS and the rest in the pool with CalcLogLikeWorkBackMS
This is without patterns. It will start at a node n and go back to the root.

func PCalcLogLikeMS

func PCalcLogLikeMS(t *Tree, x StateModel, nsites int, wks int) (fl float64)

PCalcLogLikeMS will calculate the log-likelihood in parallel

It will make a thread pool then calculate the LogLike for the first site
with CalcLogLikeOneSiteMS and the rest in the pool with CalcLogLikeWorkMS
This is without patterns.

func PCalcLogLikeMSMUL

func PCalcLogLikeMSMUL(t *Tree, models []StateModel, nodemodels map[*Node]int, nsites int, wks int) (fl float64)

PCalcLogLikeMSMUL will calculate the log-likelihood in parallel

It will make a thread pool then calculate the LogLike for the first site
with CalcLogLikeOneSiteMS and the rest in the pool with CalcLogLikeWorkMS
This is without patterns.

func PCalcLogLikeMarked

func PCalcLogLikeMarked(t *Tree, x *DNAModel, nsites int, wks int) (fl float64)

PCalcLogLikeMarked parallel calculation of the log-likelihood, just updating the values

func PCalcLogLikeMarkedMS

func PCalcLogLikeMarkedMS(t *Tree, x StateModel, nsites int, wks int) (fl float64)

PCalcLogLikeMarkedMS will calculate the log-likelihood in parallel

It will make a thread pool then calculate the LogLike for the first site
with CalcLogLikeOneSiteMarkedMS and the rest in the pool with CalcLogLikeWorkMarkedMS
This is without patterns. It will assume that nodes are going to be marked in between
calls to this function.

func PCalcLogLikePatterns

func PCalcLogLikePatterns(t *Tree, x *DNAModel, patternval []float64, wks int) (fl float64)

PCalcLogLikePatterns parallel log-likelihood calculation including patterns

func PCalcLogLikePatternsMS

func PCalcLogLikePatternsMS(t *Tree, x StateModel, patternval []float64, wks int) (fl float64)

PCalcLogLikePatternsMS will calculate the log-likelihood in parallel

It will make a thread pool then calculate the LogLike for the first site
with CalcLogLikeOneSiteMS and the rest in the pool with CalcLogLikeWorkMS
This is with patterns. Probably use this one.

func PCalcLogLikePatternsMSMUL

func PCalcLogLikePatternsMSMUL(t *Tree, models []StateModel, nodemodels map[*Node]int, patternval []float64, wks int) (fl float64)

PCalcLogLikePatternsMSMUL will calculate the log-likelihood in parallel

It will make a thread pool then calculate the LogLike for the first site
with CalcLogLikeOneSiteMS and the rest in the pool with CalcLogLikeWorkMS
This is with patterns. This is probably the one to use.

func PCalcLogLikePatternsMul

func PCalcLogLikePatternsMul(t *Tree, models []*DNAModel, nodemodels map[*Node]int, patternval []float64, wks int) (fl float64)

PCalcLogLikePatternsMul ...

func PCalcLogLikePatternsSubClade

func PCalcLogLikePatternsSubClade(t *Tree, n *Node, excl bool, x *DNAModel, patternval []float64, wks int) (fl float64)

PCalcLogLikePatternsSubClade parallel log likelihood calculation including patterns

func PCalcRFDistancesPartial

func PCalcRFDistancesPartial(bpts map[int][]int, bps []Bipart, jobs <-chan []int, results chan<- []int)

PCalcRFDistancesPartial calculates the partial RF distances; bpts is the tree index -> bipart list map, and bps is the list of biparts

func PCalcRFDistancesPartialWeighted

func PCalcRFDistancesPartialWeighted(bpts map[int][]int, tippenalty bool, bps []Bipart, jobs <-chan []int, results chan<- Rfwresult)

PCalcRFDistancesPartialWeighted includes the branch lengths

func PCalcSankParsPatterns

func PCalcSankParsPatterns(t *Tree, patternval []float64, wks int) (fl float64)

PCalcSankParsPatterns parallel calculation of Sankoff parsimony costs with patterns

func PCalcSankParsPatternsMultState

func PCalcSankParsPatternsMultState(t *Tree, numstates int, patternval []float64, wks int) (fl float64)

PCalcSankParsPatternsMultState parallel calculation of Sankoff parsimony costs with patterns

func PCalcSliceIntDifferenceInt

func PCalcSliceIntDifferenceInt(bpts map[int][]int, jobs <-chan []int, results chan<- []int)

PCalcSliceIntDifferenceInt calculates the size of the set difference between two int slices in parallel. Used for RF distance, where bpts is the tree index -> bipart index list map.

func PConcordance

func PConcordance(bps []Bipart, jobs <-chan []int, results chan<- []int)

PConcordance is similar to the other comparison code but for concordance. The input jobs are the i, j pairs for the bipart comparisons. The results are i, j, and 0 for not concordant or 1 for concordant.

func PConcordanceTwoSets

func PConcordanceTwoSets(comp []Bipart, bps []Bipart, jobs <-chan []int, results chan<- []int)

PConcordanceTwoSets same as the one above but where there are two sets

func PConflicts

func PConflicts(bps []Bipart, jobs <-chan []int, results chan<- []int)

PConflicts is a parallel conflict check. The Bipart slice is sent; the jobs are the two indices to check, and the results are the two indices plus an int: 1 for conflict, 0 for no conflict.
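
These P* comparison functions share a worker-pool shape: index pairs arrive on a jobs channel and each worker sends back the pair plus a flag. A self-contained sketch of that pattern, with a trivial stand-in for the real conflict test (pConflictsSketch is hypothetical):

```go
package main

import (
	"fmt"
	"sync"
)

// pConflictsSketch mimics the jobs/results shape of PConflicts: each
// job is a pair of indices {i, j}; each result is {i, j, flag}.
func pConflictsSketch(vals []int, jobs <-chan []int, results chan<- []int) {
	for j := range jobs {
		i, k := j[0], j[1]
		flag := 0
		if vals[i] != vals[k] { // stand-in for a real conflict test
			flag = 1
		}
		results <- []int{i, k, flag}
	}
}

func main() {
	vals := []int{1, 1, 2}
	jobs := make(chan []int, 3)
	results := make(chan []int, 3)
	var wg sync.WaitGroup
	for w := 0; w < 2; w++ { // spin up a small pool of workers
		wg.Add(1)
		go func() { defer wg.Done(); pConflictsSketch(vals, jobs, results) }()
	}
	jobs <- []int{0, 1}
	jobs <- []int{0, 2}
	jobs <- []int{1, 2}
	close(jobs)
	wg.Wait()
	close(results)
	n := 0
	for r := range results {
		n += r[2]
	}
	fmt.Println(n) // two of the three pairs differ
}
```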

func PConflictsCompTree

func PConflictsCompTree(bps []Bipart, comptreebps []Bipart, jobs <-chan []int, results chan<- []int)

PConflictsCompTree is similar to PConflicts but the first index refers to the first Bipart slice and the second refers to the second Bipart slice

func PLCalcLogLike

func PLCalcLogLike(rates []float64, durations []float64, chardurations []float64,
	lgFacCharDurations []float64) (ll float64)

PLCalcLogLike penalized likelihood calculator

func PNW

func PNW(seqs []Seq, jobs <-chan []int, results chan<- float32)

PNW is NW (Needleman-Wunsch alignment) but parallel

func PoissonTreeLoglike

func PoissonTreeLoglike(tree *Tree) float64

PoissonTreeLoglike calculates the Poisson log-likelihood based on stratigraphic ranges

func PreparePatternVecs

func PreparePatternVecs(t *Tree, patternsint map[int]float64, seqs map[string]string) (patternval []float64, patternvec []int)

PreparePatternVecs for tree calculations

func PreparePatternVecsMS

func PreparePatternVecsMS(t *Tree, patternsint map[int]float64, seqs map[string][]string,
	charMap map[string][]int, numstates int) (patternval []float64, patternvec []int)

PreparePatternVecsMS for tree calculations

func PrintMatrix

func PrintMatrix(d *mat.Dense, diag bool)

func RTMultconditionals

func RTMultconditionals(models []*DNAModel, nodemodels map[*Node]int, node *Node, patternval []float64)

RTMultconditionals ...

func RTconditionals

func RTconditionals(x *DNAModel, node *Node, patternval []float64)

RTconditionals calculates the tip conditionals toward the root (including the branch length)

func RTconditionalsMS

func RTconditionalsMS(x StateModel, node *Node, patternval []float64)

RTconditionalsMS calculate RtConds vector for multistate

func RTconditionalsMSMUL

func RTconditionalsMSMUL(models []StateModel, nodemodels map[*Node]int, node *Node, patternval []float64)

RTconditionalsMSMUL calculate RtConds vector for multistate

func RVMultconditionals

func RVMultconditionals(models []*DNAModel, nodemodels map[*Node]int, node *Node, patternval []float64)

RVMultconditionals ...

func RVTPconditionals

func RVTPconditionals(node *Node, patternval []float64)

RVTPconditionals calculate RvConds vector

func RVTPconditionalsMS

func RVTPconditionalsMS(x StateModel, node *Node, patternval []float64)

RVTPconditionalsMS calculate RvConds vector multistate

func RVTPconditionalsMSMUL

func RVTPconditionalsMSMUL(models []StateModel, nodemodels map[*Node]int, node *Node, patternval []float64)

RVTPconditionalsMSMUL calculate RvConds vector multistate

func RVconditionals

func RVconditionals(x *DNAModel, node *Node, patternval []float64)

RVconditionals calculate the RvTpConds vector for the Parent of node, relying on the Par RvConds (including the branch length)

func RVconditionalsMS

func RVconditionalsMS(x StateModel, node *Node, patternval []float64)

RVconditionalsMS calculate RvTpConds vector for the Parent of node, relying on the Par RvConds, for multistate

func RVconditionalsMSMUL

func RVconditionalsMSMUL(models []StateModel, nodemodels map[*Node]int, node *Node, patternval []float64)

RVconditionalsMSMUL calculate RvTpConds vector for the Parent of node, relying on the Par RvConds, for multistate

func ReadLine

func ReadLine(path string) (ln []string)

ReadLine is like the Python readline() and readlines()

func ReadMCLoutput

func ReadMCLoutput(clfl string) (clusters map[int]*Cluster)

func ReadPatternsMSeqsFromFile

func ReadPatternsMSeqsFromFile(sfn string) (seqs map[string][]string,
	patternsint map[int]float64, nsites int, bf []float64, numstates int)

ReadPatternsMSeqsFromFile return the seqs and patternsint

func ReadPatternsSeqsFromFile

func ReadPatternsSeqsFromFile(sfn string) (seqs map[string]string, patternsint map[int]float64, nsites int, bf []float64)

ReadPatternsSeqsFromFile return the seqs and patternsint

func ReadStrat

func ReadStrat(stratfl string, t *Tree)

ReadStrat reads stratigraphic ranges from a file and assigns those data to a tree

func Reroot

func Reroot(inroot *Node, tr *Tree)

Reroot basic reroot function

func Round

func Round(val float64, roundOn float64, places int) (newVal float64)

Round rounds val at the given decimal place; typical usage is roundOn = 0.5, places = 5
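
A plausible reading of this signature, as a self-contained sketch (assuming roundOn is the fraction at which the digit rounds up; gophy's exact behavior, especially for negative values, may differ):

```go
package main

import (
	"fmt"
	"math"
)

// Round rounds val at the given decimal place: the fractional part at
// that place is compared against roundOn (0.5 gives ordinary rounding).
func Round(val float64, roundOn float64, places int) float64 {
	pow := math.Pow(10, float64(places))
	digit := pow * val
	_, frac := math.Modf(digit)
	if frac >= roundOn {
		return math.Ceil(digit) / pow
	}
	return math.Floor(digit) / pow
}

func main() {
	fmt.Println(Round(2.345678, 0.5, 2)) // 2.35
	fmt.Println(Round(2.344, 0.5, 2))    // 2.34
}
```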

func SetHeights

func SetHeights(tree *Tree)

SetHeights set tree height

func SingleSiteLL

func SingleSiteLL(tree *Node, site int) (sitelike float64)

SingleSiteLL will return the likelihood of a single site

func StochasticNNI

func StochasticNNI()

StochasticNNI should make NNI moves

func StringSliceContains

func StringSliceContains(s []string, e string) bool

StringSliceContains tells you whether the e string is in the slice

func SubUnrootedLogLikeParallel

func SubUnrootedLogLikeParallel(tree *Node, sites []int, workers int) (sitelikes float64)

SubUnrootedLogLikeParallel will calculate the log-likelihood of an unrooted tree, while assuming that some sites have missing data. This can be used to calculate the likelihoods of trees that have complete trait sampling, but it will be slower than CalcRootedLogLike.

func SumFloatVec

func SumFloatVec(x []float64) (s float64)

SumFloatVec sums a float vector

func SumLogExp

func SumLogExp(a, b float64) float64

SumLogExp adds two log-scale numbers: returns log(exp(a) + exp(b))

func SwapBranch

func SwapBranch(nd1 *Node, nd2 *Node) bool

SwapBranch swaps two branches; to get the branches that would be NNIs, use the NNI function and then send the relevant nodes here

func TPconditionals

func TPconditionals(node *Node, patternval []float64)
TPconditionals regular tip conditionals; calculates the conditionals for ancestral calculations or branch lengths.

	toward tip

	tpcond X
	       | |  rvcond
	       | ^
	       | |
	       v |
	       | |  rvtpcond
	rtcond x

	toward root

func TPconditionalsMS

func TPconditionalsMS(x StateModel, node *Node, patternval []float64)

TPconditionalsMS calculate TpConds vector for multistate

func TPconditionalsMSMUL

func TPconditionalsMSMUL(models []StateModel, nodemodels map[*Node]int, node *Node, patternval []float64)

TPconditionalsMSMUL calculate TpConds vector for multistate

func TritomyRoot

func TritomyRoot(tr *Tree)

TritomyRoot makes the root a tritomy

func WeightedUnrootedLogLikeParallel

func WeightedUnrootedLogLikeParallel(tree *Node, startFresh bool, weights []float64, workers int) (sitelikes float64)

WeightedUnrootedLogLikeParallel will calculate the log-likelihood of an unrooted tree, while assuming that some sites have missing data. This can be used to calculate the likelihoods of trees that have complete trait sampling, but it will be slower than CalcRootedLogLike.

type Bipart

type Bipart struct {
	Lt          map[int]bool
	Rt          map[int]bool
	Ct          int           // counts
	TreeIndices []int         // index of which trees this is in
	Nds         []*Node       // nodes associated with the bipart
	NdsM        map[int]*Node // nodes associated with the bipart by treeindex
	Index       int           // just a unique id
}

Biparts are represented as two map[int]bool sets, one for the left side and one for the right

func (Bipart) CompatibleWith

func (b Bipart) CompatibleWith(ib Bipart) (con bool)

CompatibleWith checks that it isn't conflicting but can be nested

func (Bipart) ConcordantWith

func (b Bipart) ConcordantWith(ib Bipart) (con bool)

ConcordantWith tests whether something is concordant (not conflicting or nested, etc)

func (Bipart) ConflictsWith

func (b Bipart) ConflictsWith(ib Bipart) (con bool)

ConflictsWith checks whether two biparts conflict
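
A common way to test bipartition conflict is the four-intersection rule: two biparts conflict exactly when every side of one overlaps every side of the other. A self-contained sketch of that rule (gophy's internal logic may differ):

```go
package main

import "fmt"

// intersects reports whether two map[int]bool sets share any member.
func intersects(a, b map[int]bool) bool {
	for k := range a {
		if b[k] {
			return true
		}
	}
	return false
}

// conflicts reports whether biparts lt1|rt1 and lt2|rt2 conflict:
// they do exactly when all four cross-intersections are non-empty.
func conflicts(lt1, rt1, lt2, rt2 map[int]bool) bool {
	return intersects(lt1, lt2) && intersects(lt1, rt2) &&
		intersects(rt1, lt2) && intersects(rt1, rt2)
}

func set(xs ...int) map[int]bool {
	m := map[int]bool{}
	for _, x := range xs {
		m[x] = true
	}
	return m
}

func main() {
	// {1,2}|{3,4} conflicts with {1,3}|{2,4} but not with itself
	fmt.Println(conflicts(set(1, 2), set(3, 4), set(1, 3), set(2, 4))) // true
	fmt.Println(conflicts(set(1, 2), set(3, 4), set(1, 2), set(3, 4))) // false
}
```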

func (Bipart) Equals

func (b Bipart) Equals(ib Bipart) (eq bool)

Equals tests equality (written to be fast)

func (Bipart) NewickWithNames

func (b Bipart) NewickWithNames(nmmap map[int]string) (ret string)

NewickWithNames does similar things to StringWithNames but sends a newick back

func (Bipart) StringWithNames

func (b Bipart) StringWithNames(nmmap map[int]string) (ret string)

StringWithNames converts the ints to the strings from nmmap

type Cluster

type Cluster struct {
	Sites         []int // this stores all of the sites with a preference for this cluster
	BranchLengths []float64
	LogLike       float64
	SiteWeights   map[int]float64 // this will store the probability that each site in the MATRIX belongs here.
}

Cluster structure for sealiontomo stuff

func (*Cluster) CalcLL

func (c *Cluster) CalcLL(tree *Node)

CalcLL calculate the ll

func (*Cluster) WriteClusterPhylip

func (c *Cluster) WriteClusterPhylip(nodes []*Node) string

WriteClusterPhylip write the cluster file

type DNAModel

type DNAModel struct {
	BF      []float64 // base frequencies
	R       *mat.Dense
	Q       *mat.Dense // common use
	CharMap map[string][]int
	//sync.RWMutex
	Ps  map[float64]*mat.Dense
	PsL map[float64]*mat.Dense
	X   *mat.Dense
	P   mat.Dense
	//for decomposing
	QS         *mat.Dense
	EigenVals  []float64  // to be exponentiated
	EigenVecs  *mat.Dense //
	EigenVecsI *mat.Dense
	X1         *mat.Dense
	X2         *mat.Dense
	NumStates  int
}

DNAModel standard DNA struct

func NewDNAModel

func NewDNAModel() *DNAModel

NewDNAModel get new DNAModel pointer

func (*DNAModel) DecomposeQ

func (d *DNAModel) DecomposeQ()

DecomposeQ this is just for NR optimization for branch lengths

func (*DNAModel) DeepCopyDNAModel

func (d *DNAModel) DeepCopyDNAModel() *DNAModel

DeepCopyDNAModel ...

func (*DNAModel) EmptyPDict

func (d *DNAModel) EmptyPDict()

EmptyPDict save memory perhaps?

func (*DNAModel) EmptyPLDict

func (d *DNAModel) EmptyPLDict()

EmptyPLDict the logged one

func (*DNAModel) ExpValue

func (d *DNAModel) ExpValue(iv []float64, blen float64)

ExpValue used for the matrix exponential

func (*DNAModel) ExpValueFirstD

func (d *DNAModel) ExpValueFirstD(blen float64) (x *mat.Dense)

ExpValueFirstD get the first derivative for NR

func (*DNAModel) ExpValueSecondD

func (d *DNAModel) ExpValueSecondD(blen float64) (x *mat.Dense)

ExpValueSecondD get the second derivative for NR

func (*DNAModel) GetBF

func (d *DNAModel) GetBF() []float64

func (*DNAModel) GetCharMap

func (d *DNAModel) GetCharMap() map[string][]int

func (*DNAModel) GetNumStates

func (d *DNAModel) GetNumStates() int

func (*DNAModel) GetPCalc

func (d *DNAModel) GetPCalc(blen float64) *mat.Dense

GetPCalc calculate P matrix

func (*DNAModel) GetPMap

func (d *DNAModel) GetPMap(blen float64) *mat.Dense

GetPMap get the Ps from the dictionary

func (*DNAModel) GetPMapLogged

func (d *DNAModel) GetPMapLogged(blen float64) *mat.Dense

GetPMapLogged get the Ps from the dictionary

func (*DNAModel) GetStochMapMatrices

func (d *DNAModel) GetStochMapMatrices(dur float64, from int, to int) (summed *mat.Dense, summedR *mat.Dense)

func (*DNAModel) SetBaseFreqs

func (d *DNAModel) SetBaseFreqs(basefreq []float64)

SetBaseFreqs needs to be done before doing SetupQGTR

func (*DNAModel) SetMap

func (d *DNAModel) SetMap()

SetMap for getting the position in the array

func (*DNAModel) SetP

func (d *DNAModel) SetP(blen float64)

SetP use the standard spectral decomposition

func (*DNAModel) SetPSimple

func (d *DNAModel) SetPSimple(blen float64)

SetPSimple use the gonum matrix exponential (seems faster)

func (*DNAModel) SetRateMatrix

func (d *DNAModel) SetRateMatrix(params []float64)

SetRateMatrix needs to be done before doing SetupQGTR; just send along the 5 rates and this will make them the whole matrix

func (*DNAModel) SetScaledRateMatrix

func (d *DNAModel) SetScaledRateMatrix(params []float64, sym bool)

SetScaledRateMatrix needs to be done before SetupQGTR. Send along the rates and this will fill out the whole matrix; "scaled" means the last rate is assumed to be 1.

func (*DNAModel) SetupQGTR

func (d *DNAModel) SetupQGTR()

SetupQGTR setup Q matrix

func (*DNAModel) SetupQJC

func (d *DNAModel) SetupQJC()

SetupQJC setup Q matrix

This is scaled so that change is reflected in the branch lengths
You don't need to use the SetScaledRateMatrix

func (*DNAModel) SetupQJC1Rate

func (d *DNAModel) SetupQJC1Rate(rt float64)

SetupQJC1Rate setup Q matrix with one rate, probably for anc multi state

These are unscaled so the branch lengths are going to be time or something else
and not relative to these rates
Will take BF from something else

func (*DNAModel) SetupQMk

func (d *DNAModel) SetupQMk(rt []float64, sym bool)

SetupQMk setup Q matrix

This is unscaled (so the branch lengths will be proportional to some other change
and not to these rates)
Will take the BF from something else

type HCSearch

type HCSearch struct {
	Tree            *Tree
	PreorderNodes   []*Node
	Clusters        map[int]*Cluster
	SiteAssignments map[int]int
	Gen             int
	Threads         int
	Workers         int
	RunName         string
	LogOutFile      string
	K               int
	PrintFreq       int
	CurrentAIC      float64
	NumTraits       float64
	Criterion       int
	SavedConfig     []*SiteConfiguration
	CurBestAIC      float64
	JoinLikes       map[int]map[int]float64
	SplitGen        int
	Alpha           float64
	NumPoints       float64
	ExpandPenalty   float64
	MinK            int
}

func InitEMSearch

func InitEMSearch(tree *Tree, gen int, k int, pr int, alpha float64) *HCSearch

func InitGreedyHC

func InitGreedyHC(tree *Tree, gen int, pr int, crit int, rstart bool, k int, runName string, splitgen int, alpha float64, minK int) *HCSearch

func TransferGreedyHC

func TransferGreedyHC(tree *Tree, gen int, pr int, crit int, clus map[int]*Cluster, siteAssign map[int]int, runName string, splitgen int, alpha float64) *HCSearch

func (*HCSearch) CalcRelLikes

func (s *HCSearch) CalcRelLikes() (denom float64)

func (*HCSearch) CheckCluster

func (s *HCSearch) CheckCluster(checkConfig *SiteConfiguration) (keep bool)

func (*HCSearch) ClusterString

func (s *HCSearch) ClusterString() string

ClusterString will return a string of the current set of clusters

func (*HCSearch) NewSiteConfig

func (s *HCSearch) NewSiteConfig() *SiteConfiguration

func (*HCSearch) PerturbedRun

func (s *HCSearch) PerturbedRun()

func (*HCSearch) RefineSavedClusterings

func (s *HCSearch) RefineSavedClusterings()

func (*HCSearch) RunEM

func (s *HCSearch) RunEM()

func (*HCSearch) RunSingleHC

func (s *HCSearch) RunSingleHC()

func (*HCSearch) SplitEM

func (s *HCSearch) SplitEM()

func (*HCSearch) WriteBestClusters

func (s *HCSearch) WriteBestClusters()

func (*HCSearch) WriteClusterTrees

func (s *HCSearch) WriteClusterTrees()

type LikeResult

type LikeResult struct {
	// contains filtered or unexported fields
}

LikeResult likelihood value and site

type MSeq

type MSeq struct {
	NM  string
	SQ  string
	SQs []string //SQ seperated by spaces
}

MSeq minimal seq struct

func ReadMSeqsFromFile

func ReadMSeqsFromFile(filen string) (seqs []MSeq, numstates int)

ReadMSeqsFromFile reads multistate seqs; the file should be fasta format with states separated by spaces, numbered starting at 0

type MultStateModel

type MultStateModel struct {
	BF        []float64 // base frequencies
	EBF       []float64 //empirical base freqs
	R         *mat.Dense
	Q         *mat.Dense // common use
	CharMap   map[string][]int
	NumStates int
	//sync.RWMutex
	Ps map[float64]*mat.Dense
	X  *mat.Dense
	P  mat.Dense
	//for decomposing
	QS         *mat.Dense
	EigenVals  []float64  // to be exponentiated
	EigenVecs  *mat.Dense //
	EigenVecsI *mat.Dense
	X1         *mat.Dense
	X2         *mat.Dense
}

MultStateModel multistate model struct

func NewMultStateModel

func NewMultStateModel() *MultStateModel

NewMultStateModel get new MultStateModel pointer

func (*MultStateModel) DecomposeQ

func (d *MultStateModel) DecomposeQ()

DecomposeQ this is just for NR optimization for branch lengths

func (*MultStateModel) EmptyPDict

func (d *MultStateModel) EmptyPDict()

EmptyPDict save memory perhaps?

func (*MultStateModel) ExpValue

func (d *MultStateModel) ExpValue(iv []float64, blen float64)

ExpValue used for the matrix exponential

func (*MultStateModel) ExpValueFirstD

func (d *MultStateModel) ExpValueFirstD(blen float64) (x *mat.Dense)

ExpValueFirstD get the first derivative for NR

func (*MultStateModel) ExpValueSecondD

func (d *MultStateModel) ExpValueSecondD(blen float64) (x *mat.Dense)

ExpValueSecondD get the second derivative for NR

func (*MultStateModel) GetBF

func (d *MultStateModel) GetBF() []float64

func (*MultStateModel) GetCharMap

func (d *MultStateModel) GetCharMap() map[string][]int

func (*MultStateModel) GetNumStates

func (d *MultStateModel) GetNumStates() int

func (*MultStateModel) GetPCalc

func (d *MultStateModel) GetPCalc(blen float64) *mat.Dense

GetPCalc calculate P matrix

func (*MultStateModel) GetPMap

func (d *MultStateModel) GetPMap(blen float64) *mat.Dense

GetPMap get the Ps from the dictionary

func (*MultStateModel) GetStochMapMatrices

func (d *MultStateModel) GetStochMapMatrices(dur float64, from int, to int) (summed *mat.Dense, summedR *mat.Dense)

func (*MultStateModel) SetBaseFreqs

func (d *MultStateModel) SetBaseFreqs(basefreq []float64)

SetBaseFreqs needs to be done before doing other things

func (*MultStateModel) SetEmpiricalBF

func (d *MultStateModel) SetEmpiricalBF()

SetEmpiricalBF set all to empirical

func (*MultStateModel) SetEqualBF

func (d *MultStateModel) SetEqualBF()

SetEqualBF set all equal

func (*MultStateModel) SetMap

func (d *MultStateModel) SetMap()

SetMap for getting the position in the array

func (*MultStateModel) SetP

func (d *MultStateModel) SetP(blen float64)

SetP use the standard spectral decomposition

func (*MultStateModel) SetPSimple

func (d *MultStateModel) SetPSimple(blen float64)

SetPSimple use the gonum matrix exponential (seems faster)

func (*MultStateModel) SetRateMatrix

func (d *MultStateModel) SetRateMatrix(params []float64)

SetRateMatrix length is (((numstates*numstates)-numstates)/2) - 1 or (numstates*numstates) - numstates; this is for scaled branch lengths and matrices. TODO: change this to unscaled.

func (*MultStateModel) SetScaledRateMatrix

func (d *MultStateModel) SetScaledRateMatrix(params []float64, sym bool)

SetScaledRateMatrix needs to be done before SetupQGTR. Send along the rates and this will fill out the whole matrix; "scaled" means the last rate is assumed to be 1. TODO: is this the same as SetRateMatrix?

func (*MultStateModel) SetupQGTR

func (d *MultStateModel) SetupQGTR()

SetupQGTR setup Q matrix

This is scaled (so the branch lengths are going to be proportional to these changes)
Use SetScaledRateMatrix and then do this

func (*MultStateModel) SetupQJC

func (d *MultStateModel) SetupQJC()

SetupQJC setup Q matrix

This is scaled so that change is reflected in the branch lengths
You don't need to use the SetScaledRateMatrix

func (*MultStateModel) SetupQJC1Rate

func (d *MultStateModel) SetupQJC1Rate(rt float64)

SetupQJC1Rate setup Q matrix with one rate, probably for anc multi state

These are unscaled so the branch lengths are going to be time or something else
and not relative to these rates
Will take BF from something else

func (*MultStateModel) SetupQMk

func (d *MultStateModel) SetupQMk(rt []float64, sym bool)

SetupQMk setup Q matrix

This is unscaled (so the branch lengths will be proportional to some other change
and not to these rates)
Will take the BF from something else

type Node

type Node struct {
	Par       *Node   //parent
	Chs       []*Node //children
	Nam       string  //name
	SData     map[string]string
	FData     map[string]float64
	IData     map[string]int
	Num       int
	Len       float64     //branch length
	Data      [][]float64 // [site][states]
	BData     [][]*SupFlo //[site][states]
	ContData  []float64   //[site] cont
	Mis       []bool      //[site] missing site
	Marked    bool        //just for like calculations
	Height    float64
	MarkedMap map[float64]bool //the float is for an id for the query
	//anc+bl
	// X------>--X
	// TP        RT
	//
	// X--<------X
	// RvTp      RV
	TpConds   [][]float64 //[site][states]
	RtConds   [][]float64
	RvConds   [][]float64
	RvTpConds [][]float64
	//TODO: need comments for these below
	//FAD         float64
	//LAD         float64
	//FINDS       float64
	//TimeLen     float64
	PruneLen    float64
	ConPruneLen []float64 // prevent race condition when calculating BM likelihood
	BMLen       float64
	Anc         bool
	ClustLen    map[int]float64
}

Node minimal node struct

func GetMrca

func GetMrca(nds []*Node, root *Node) (mrca *Node)

GetMrca get the mrca
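
With parent pointers, the MRCA can be found by intersecting root paths: take the first node's path to the root, then for each other node climb to the first shared ancestor and keep the one nearest the root. A self-contained sketch (the lowercase names are hypothetical, not gophy's types):

```go
package main

import "fmt"

type node struct {
	par *node
	nam string
}

// isAncestor reports whether a is b or an ancestor of b.
func isAncestor(a, b *node) bool {
	for n := b; n != nil; n = n.par {
		if n == a {
			return true
		}
	}
	return false
}

// mrca finds the most recent common ancestor of nds via parent pointers.
func mrca(nds []*node) *node {
	if len(nds) == 0 {
		return nil
	}
	onPath := map[*node]bool{} // root path of the first node
	for n := nds[0]; n != nil; n = n.par {
		onPath[n] = true
	}
	m := nds[0]
	for _, nd := range nds[1:] {
		n := nd
		for !onPath[n] { // climb until hitting the first node's path
			n = n.par
		}
		if isAncestor(n, m) { // keep the ancestor closest to the root
			m = n
		}
	}
	return m
}

func main() {
	// tree ((A,B)X,C)R
	r := &node{nam: "R"}
	x := &node{par: r, nam: "X"}
	a := &node{par: x, nam: "A"}
	b := &node{par: x, nam: "B"}
	c := &node{par: r, nam: "C"}
	fmt.Println(mrca([]*node{a, b}).nam, mrca([]*node{a, c}).nam) // X R
}
```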

func GetSubtreeRoot

func GetSubtreeRoot(n *Node, higherNode *Node) (subtreeRoot *Node)

GetSubtreeRoot returns the root of the subtree containing n that descends from higherNode (up to the root)

func ReadNewickString

func ReadNewickString(ts string) (root *Node)

ReadNewickString given a string it will return a pointer to the root node

func TimeTraverse

func TimeTraverse(preNodes []*Node, internalOnly bool) (ret []*Node)

TimeTraverse will visit all descendant nodes in order of their heights (earliest -> latest)

func (Node) BMPhylogram

func (n Node) BMPhylogram() (ret string)

BMPhylogram returns a string newick with brownian motion branch lengths

func (*Node) GetBackbone

func (n *Node) GetBackbone(higherNode *Node) (backbone []*Node)

GetBackbone TODO: what is this

func (*Node) GetSib

func (n *Node) GetSib() *Node

GetSib returns the sibling of a node

func (Node) GetTipNames

func (n Node) GetTipNames() (tips []string)

GetTipNames returns a slice of tip names

func (Node) GetTips

func (n Node) GetTips() (tips []*Node)

GetTips returns a slice with node pointers

func (Node) Newick

func (n Node) Newick(bl bool) (ret string)

Newick returns a string newick

func (Node) NewickFData

func (n Node) NewickFData(bl bool, FD string) (ret string)

NewickFData returns a string newick

func (Node) NewickFloatBL

func (n Node) NewickFloatBL(fl string) (ret string)

NewickFloatBL returns a string newick with branch lengths of the data in FData[fl]

func (Node) NewickPaint

func (n Node) NewickPaint(bl bool, rid float64) (ret string)

NewickPaint returns a string newick

func (*Node) NumIntNodes

func (n *Node) NumIntNodes(count *int)

NumIntNodes is a helper method that will return the number of internal nodes descending from n (including n)

func (*Node) PostorderArray

func (n *Node) PostorderArray() (ret []*Node)

PostorderArray will return a postordered array of all the nodes starting at n

func (*Node) PostorderArrayExcl

func (n *Node) PostorderArrayExcl(x *Node) (ret []*Node)

PostorderArrayExcl will return a postordered array of all the nodes starting at n, excluding node x

func (*Node) PreorderArray

func (n *Node) PreorderArray() (ret []*Node)

PreorderArray will return a preordered array of all the nodes in a tree

func (*Node) Reroot

func (n *Node) Reroot(oldroot *Node) *Node

Reroot reroots all the nodes represented in a graph on n

func (*Node) RerootBM

func (n *Node) RerootBM(oldroot *Node) *Node

RerootBM reroots and moves the BMLen

func (Node) String

func (n Node) String() string

type NodeStack

type NodeStack struct {
	// contains filtered or unexported fields
}

NodeStack makes a node stack for pushing and popping.

func NewNodeStack

func NewNodeStack() *NodeStack

NewNodeStack returns a pointer to a node stack

func (*NodeStack) Empty

func (s *NodeStack) Empty() (ret bool)

Empty is the node stack empty?

func (*NodeStack) Pop

func (s *NodeStack) Pop() (*Node, error)

Pop a node off the stack

func (*NodeStack) Push

func (s *NodeStack) Push(v *Node)

Push push the node into the stack

type PLObj

type PLObj struct {
	Rates                []float64
	Durations            []float64
	Dates                []float64
	Mins                 []float64
	PenMins              []float64
	PenMaxs              []float64
	Maxs                 []float64
	CharDurations        []float64
	LogFactCharDurations []float64
	ParentsNdsInts       []int
	ChildrenVec          [][]int
	NumNodes             int
	LogPen               bool
	PenaltyBoundary      float64
	Smoothing            float64
}

PLObj things for PL

func (*PLObj) CalcPL

func (p *PLObj) CalcPL(params []float64) float64

CalcPL ...

func (*PLObj) CalcPenalty

func (p *PLObj) CalcPenalty() (rkt float64)

CalcPenalty ...

func (*PLObj) CalcRateLike

func (p *PLObj) CalcRateLike() (ll float64)

CalcRateLike ...

func (*PLObj) CalcRoughnessPenalty

func (p *PLObj) CalcRoughnessPenalty() (su float64)

CalcRoughnessPenalty ...

func (*PLObj) OptimizePL

func (p *PLObj) OptimizePL(params []float64)

OptimizePL PL optimization

func (*PLObj) SetDurations

func (p *PLObj) SetDurations() (success bool)

SetDurations for nodes

func (*PLObj) SetValues

func (p *PLObj) SetValues(t Tree, numsites float64, minmap map[*Node]float64,
	maxmap map[*Node]float64)

SetValues ...

type ParsResult

type ParsResult struct {
	// contains filtered or unexported fields
}

ParsResult ...

type Quartet

type Quartet struct {
	Lt    map[int]bool
	Rt    map[int]bool
	Lts   []map[int]bool
	Rts   []map[int]bool
	Len   float64
	Index int
}

Quartets are represented as map[int]bool sets, one for the left and one for the right

func GetQuartet

func GetQuartet(nd *Node, tree Tree, maptips map[string]int) (Quartet, error)

GetQuartet for node

func GetQuartets

func GetQuartets(tree Tree, maptips map[string]int) (bps []Quartet)

GetQuartets return slice of quartets

func (Quartet) ConflictsWith

func (b Quartet) ConflictsWith(ib Quartet) (con bool)

ConflictsWith checks whether two quartets conflict

func (Quartet) Match

func (b Quartet) Match(inq Quartet) (mat bool)

Match for matching quartets

func (Quartet) StringWithNames

func (b Quartet) StringWithNames(nmmap map[int]string) (ret string)

StringWithNames converts the ints to the strings from nmmap

type Rfwresult

type Rfwresult struct {
	Tree1  int
	Tree2  int
	Weight float64
	MaxDev float64
}

Rfwresult struct for holding info from rfwp analysis

type Seq

type Seq struct {
	NM string
	SQ string
}

Seq minimal seq struct

func ReadSeqsFromFile

func ReadSeqsFromFile(filen string) (seqs []Seq)

ReadSeqsFromFile reads seqs from a (fasta) file

func (Seq) GetFasta

func (s Seq) GetFasta() (ret string)

GetFasta return a fasta string

type SiteConfiguration

type SiteConfiguration struct {
	Sites         map[int]map[int]bool
	AIC           float64
	ClusterTrees  map[int]string
	ClusterSizes  map[int]int
	ClusterString string
}

func (*SiteConfiguration) CalcClusterSizes

func (c *SiteConfiguration) CalcClusterSizes()

func (*SiteConfiguration) Equals

func (c *SiteConfiguration) Equals(check *SiteConfiguration) (equal bool)

type SortedIntIdxSlice

type SortedIntIdxSlice struct {
	sort.IntSlice
	Idx []int
}

SortedIntIdxSlice for sorting indices of ints

func NewSortedIdxSlice

func NewSortedIdxSlice(n []int) *SortedIntIdxSlice

NewSortedIdxSlice usage

func NewSortedIdxSliceD

func NewSortedIdxSliceD(n ...int) *SortedIntIdxSlice

NewSortedIdxSliceD usage

s := NewSortedIdxSliceD(1, 25, 3, 5, 4)

sort.Sort(s) will give s.IntSlice = [1 3 4 5 25] and s.Idx = [0 2 4 3 1]

func (SortedIntIdxSlice) Swap

func (s SortedIntIdxSlice) Swap(i, j int)

Swap for sorting indices

type StateModel

type StateModel interface {
	GetPCalc(float64) *mat.Dense
	GetPMap(float64) *mat.Dense
	SetMap()
	GetNumStates() int
	GetBF() []float64
	EmptyPDict()
	GetStochMapMatrices(float64, int, int) (*mat.Dense, *mat.Dense)
	SetupQJC1Rate(float64)
	SetupQMk([]float64, bool)            // bool = false is AsyMK
	SetScaledRateMatrix([]float64, bool) // before setupQGTR
	SetupQGTR()                          //
	DecomposeQ()                         //
	ExpValueFirstD(float64) *mat.Dense   //BL
	ExpValueSecondD(float64) *mat.Dense  //BL
	GetCharMap() map[string][]int
	SetRateMatrix([]float64)
	SetBaseFreqs([]float64)
}

StateModel ...

type SupFlo

type SupFlo struct {
	// contains filtered or unexported fields
}

SupFlo super float

func NewSupFlo

func NewSupFlo(m float64, e int) *SupFlo

NewSupFlo get a new one
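
SupFlo's fields are unexported, but GetMant/GetExp suggest a mantissa-and-exponent number that keeps products of many tiny likelihoods from underflowing. A minimal illustrative reimplementation of that idea (an assumption about the representation, not gophy's code):

```go
package main

import (
	"fmt"
	"math"
)

// supFlo stores a float as mantissa * 10^exp, renormalizing the
// mantissa into [1, 10) so products of many tiny values never underflow.
type supFlo struct {
	mant float64
	exp  int
}

func (s *supFlo) adjust() {
	for s.mant != 0 && math.Abs(s.mant) >= 10 {
		s.mant /= 10
		s.exp++
	}
	for s.mant != 0 && math.Abs(s.mant) < 1 {
		s.mant *= 10
		s.exp--
	}
}

func newSupFlo(m float64, e int) *supFlo {
	s := &supFlo{m, e}
	s.adjust()
	return s
}

// Mul multiplies two supFlos: mantissas multiply, exponents add.
func (s *supFlo) Mul(x *supFlo) *supFlo {
	r := &supFlo{s.mant * x.mant, s.exp + x.exp}
	r.adjust()
	return r
}

// Float64 collapses the pair back to an ordinary float64.
func (s *supFlo) Float64() float64 {
	return s.mant * math.Pow(10, float64(s.exp))
}

func main() {
	c := newSupFlo(2, 0).Mul(newSupFlo(5, 0)) // 2 * 5 = 10
	fmt.Println(c.mant, c.exp, c.Float64())   // 1 1 10
}
```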

func (*SupFlo) Abs

func (s *SupFlo) Abs() *SupFlo

Abs absolute value

func (*SupFlo) Add

func (s *SupFlo) Add(x *SupFlo) *SupFlo

Add s+x return result

func (*SupFlo) AddEq

func (s *SupFlo) AddEq(x *SupFlo)

AddEq +=

func (*SupFlo) Div

func (s *SupFlo) Div(x *SupFlo) *SupFlo

Div s/x return sup

func (*SupFlo) DivEq

func (s *SupFlo) DivEq(x *SupFlo)

DivEq /=

func (*SupFlo) Float64

func (s *SupFlo) Float64() float64

Float64 just get it

func (*SupFlo) GetExp

func (s *SupFlo) GetExp() int

GetExp just get it

func (*SupFlo) GetLn

func (s *SupFlo) GetLn() *SupFlo

GetLn ln

func (*SupFlo) GetMant

func (s *SupFlo) GetMant() float64

GetMant just get it

func (SupFlo) Greater

func (s SupFlo) Greater(x SupFlo) bool

Greater >

func (SupFlo) GreaterEq

func (s SupFlo) GreaterEq(x SupFlo) bool

GreaterEq >=

func (SupFlo) Less

func (s SupFlo) Less(x SupFlo) bool

Less <

func (SupFlo) LessEq

func (s SupFlo) LessEq(x SupFlo) bool

LessEq <=

func (*SupFlo) MinMin

func (s *SupFlo) MinMin()

MinMin --

func (*SupFlo) Mul

func (s *SupFlo) Mul(x *SupFlo) *SupFlo

Mul s * x return result

func (*SupFlo) MulEq

func (s *SupFlo) MulEq(x *SupFlo)

MulEq *=

func (*SupFlo) MulEqFloat

func (s *SupFlo) MulEqFloat(x float64)

MulEqFloat *=

func (*SupFlo) MulFloat

func (s *SupFlo) MulFloat(x float64) *SupFlo

MulFloat s * x return result and x is a float64

func (*SupFlo) PlusPlus

func (s *SupFlo) PlusPlus()

PlusPlus ++

func (*SupFlo) SetFloat64

func (s *SupFlo) SetFloat64(m float64)

SetFloat64 set it to a float64 value

func (*SupFlo) Sub

func (s *SupFlo) Sub(x *SupFlo) *SupFlo

Sub s-x return result

func (*SupFlo) SubEq

func (s *SupFlo) SubEq(x SupFlo)

SubEq -=

func (*SupFlo) SwitchSign

func (s *SupFlo) SwitchSign()

SwitchSign change the sign

type Tree

type Tree struct {
	Rt    *Node
	Post  []*Node
	Pre   []*Node
	Tips  []*Node
	Index int // if there is an identifying index
}

Tree minimal phylogenetic tree struct. You don't need this (you can use the root Node), but it is here if you want the convenience functions below.

func NewTree

func NewTree() *Tree

NewTree return a tree

func ReadTreeFromFile

func ReadTreeFromFile(tfn string) (tree *Tree)

ReadTreeFromFile read a single tree from a file

func ReadTreesFromFile

func ReadTreesFromFile(tfn string) (trees []*Tree)

ReadTreesFromFile reads multiple trees from a file

func (*Tree) GetTipByName

func (t *Tree) GetTipByName(name string) (*Node, error)

GetTipByName gets a tip node by its name

func (*Tree) Instantiate

func (t *Tree) Instantiate(rt *Node)

Instantiate will set up the preorder and postorder node slices


Documentation was rendered with GOOS=linux and GOARCH=amd64.
