Documentation ¶
Overview ¶
Various utilities, including methods for plaintext matrices, model definitions, and activation functions.
Index ¶
- func Predict(Y []int, labels int, result [][]float64) (int, float64, []int)
- func ReLU(x float64) float64
- func SetupDirectory()
- func SiLU(x float64) float64
- func Sigmoid(x float64) float64
- func SoftReLu(x float64) float64
- func ThrowErr(err error)
- type ApproxParam
- type ApproxParams
- type Bias
- type ChebyPolyApprox
- type Kernel
- type Layer
- type PolyApprox
- type Stats
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func SetupDirectory ¶
func SetupDirectory()
Types ¶
type ApproxParam ¶
Stores the interval and degree of the approximation in the Chebyshev basis. Deg is set via SetDegOfParam.
type ApproxParams ¶
type ApproxParams struct {
Params []ApproxParam `json:"intervals"`
}
Approximation parameters for each layer.
func SetDegOfParam ¶
func SetDegOfParam(Params ApproxParams) ApproxParams
Decides the degree of approximation for each Param.
type ChebyPolyApprox ¶
type ChebyPolyApprox struct {
	PolyApprox
	A, B      float64
	Degree    int
	Poly      *ckks.Polynomial
	ChebyBase bool
	F         func(x float64) float64
}
Polynomial approximation built with the ckks Approximate routine.
func InitActivationCheby ¶
func InitActivationCheby(act string, a, b float64, deg int) *ChebyPolyApprox
Initializes an activation layer with the function to approximate.
func InitReLU ¶
func InitReLU(deg int) *ChebyPolyApprox
Initializes ReLU with coefficients (not in Chebyshev form) from Matlab; used by cryptonet.
func (*ChebyPolyApprox) ActivatePlain ¶
func (activation *ChebyPolyApprox) ActivatePlain(X *mat.Dense)
Applies the activation function elementwise. Needs rescaling first.
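ActivatePlain amounts to evaluating the approximated function on every matrix entry. A minimal slice-based sketch of elementwise application (the package itself operates on gonum's *mat.Dense, which is omitted here to keep the example dependency-free):

```go
package main

import "fmt"

// applyElementwise applies f to every entry of X in place.
func applyElementwise(X [][]float64, f func(float64) float64) {
	for i := range X {
		for j := range X[i] {
			X[i][j] = f(X[i][j])
		}
	}
}

func main() {
	relu := func(x float64) float64 {
		if x < 0 {
			return 0
		}
		return x
	}
	X := [][]float64{{-1, 2}, {3, -4}}
	applyElementwise(X, relu)
	fmt.Println(X) // negative entries are zeroed
}
```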
func (*ChebyPolyApprox) LevelsOfAct ¶
func (approx *ChebyPolyApprox) LevelsOfAct() int
Computes how many levels are consumed by the activation function.
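For a degree-d polynomial evaluated with a balanced (Paterson-Stockmeyer style) strategy, the multiplicative depth is commonly estimated as ceil(log2(d+1)). A sketch of that usual estimate; it is an assumption, not necessarily the exact formula LevelsOfAct uses (e.g. a basis change to or from Chebyshev form can cost an extra level):

```go
package main

import (
	"fmt"
	"math"
)

// levelsForDegree returns ceil(log2(deg+1)), the usual depth estimate for
// homomorphically evaluating a degree-deg polynomial.
func levelsForDegree(deg int) int {
	return int(math.Ceil(math.Log2(float64(deg + 1))))
}

func main() {
	for _, d := range []int{3, 7, 15, 63} {
		fmt.Printf("degree %d -> %d levels\n", d, levelsForDegree(d))
	}
}
```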
type Kernel ¶
Matrix M such that X @ M = conv(X, layer).flatten(), where X is a row-flattened data sample. Clearly this generalizes to a simple dense layer.
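A sketch of this construction (illustrating the idea, not this package's actual builder): for a valid 2D convolution, M can be assembled column by column so that the row-flattened input times M equals the row-flattened convolution output.

```go
package main

import "fmt"

// convMatrix builds M (H*W rows, OH*OW columns) such that xFlat @ M equals
// the row-flattened valid convolution of the H x W input with the kernel.
func convMatrix(kernel [][]float64, H, W int) [][]float64 {
	kh, kw := len(kernel), len(kernel[0])
	OH, OW := H-kh+1, W-kw+1
	M := make([][]float64, H*W)
	for i := range M {
		M[i] = make([]float64, OH*OW)
	}
	for oi := 0; oi < OH; oi++ {
		for oj := 0; oj < OW; oj++ {
			// Output position (oi, oj) reads input (oi+ki, oj+kj),
			// weighted by kernel[ki][kj].
			for ki := 0; ki < kh; ki++ {
				for kj := 0; kj < kw; kj++ {
					M[(oi+ki)*W+(oj+kj)][oi*OW+oj] = kernel[ki][kj]
				}
			}
		}
	}
	return M
}

// rowVecMat computes xFlat @ M (a 1 x H*W row vector times M).
func rowVecMat(x []float64, M [][]float64) []float64 {
	out := make([]float64, len(M[0]))
	for i, xi := range x {
		for j, m := range M[i] {
			out[j] += xi * m
		}
	}
	return out
}

func main() {
	// 3x3 input (row-flattened) convolved with a 2x2 kernel: 2x2 output.
	x := []float64{1, 2, 3, 4, 5, 6, 7, 8, 9}
	k := [][]float64{{1, 0}, {0, -1}}
	fmt.Println(rowVecMat(x, convMatrix(k, 3, 3)))
}
```

Treating M as an ordinary weight matrix is exactly why the same code path handles a dense layer: a dense layer is just an arbitrary M.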
type Layer ¶
A Kernel (a convolution in Toeplitz form, or a dense layer) and a Bias.