inference


Documentation

Use help.sh to serve the documentation in your browser. If the script does not work, make sure you have godoc and Python installed. The page should open at localhost:6060.

Dependencies

The code has been tested with go1.18.3. See the official Go documentation for installation instructions.

The main library used by the package is Lattigo.

Examples

In /cryptonet [1] and /nn [2] you can find *_test.go files that show many examples of how to use our framework.

Tutorial: CryptoNet [1]

Have a look at TestCryptonet_EvalBatchEncrypted here
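
You can also run it directly from the repository root (assuming the test lives in the cryptonet package):

go test -v -run TestCryptonet_EvalBatchEncrypted ./cryptonet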

Defining your own network

Let's have a look at how you can define your own network by looking at some examples:

// CNLoader wraps the generic network loader.
type CNLoader struct {
	network.NetworkLoader
}

// CryptoNet wraps the cleartext network.
type CryptoNet struct {
	network.Network
}

// CryptoNetHE wraps the encoded/encrypted network for the HE pipeline.
type CryptoNetHE struct {
	*network.HENetwork
}

First we define a Loader wrapper for our network, plus wrapper structs for the cleartext network and for its encoded/encrypted version used in the HE pipeline.

// Initialize the activation functions: two degree-3 approximations of ReLU.
func InitActivations(args ...interface{}) []utils.ChebyPolyApprox {
	approx := utils.InitReLU(3)
	return []utils.ChebyPolyApprox{*approx, *approx}
}

Now we define a custom method to initialize the activation functions. In this case it is very simple: all the approximations are degree-3 approximations of the ReLU function, obtained by approximating a smooth version of ReLU.

// Taken from nn/nn.go

// Initialize the activation functions: load the approximation intervals
// and degrees from a JSON file and build one Chebyshev approximant per layer.
func InitActivations(args ...interface{}) []utils.ChebyPolyApprox {
	layers := args[0].(int)
	HEtrain := args[1].(bool)
	var suffix string
	var act string
	activations := make([]utils.ChebyPolyApprox, layers)
	if HEtrain {
		suffix = "_poly"
		act = "silu"
	} else {
		suffix = ""
		act = "soft relu"
	}
	jsonFile, err := os.Open(fmt.Sprintf("nn%d%s_intervals.json", layers, suffix))
	utils.ThrowErr(err)
	defer jsonFile.Close()
	byteValue, err := ioutil.ReadAll(jsonFile)
	utils.ThrowErr(err)
	var intervals utils.ApproxParams
	err = json.Unmarshal(byteValue, &intervals)
	utils.ThrowErr(err)
	intervals = utils.SetDegOfParam(intervals)
	for i := range intervals.Params {
		interval := intervals.Params[i]
		activations[i] = *utils.InitActivationCheby(act, interval.A, interval.B, interval.Deg)
	}
	return activations
}

In this case things get more tricky! Here we load information about the approximation intervals and degrees from custom .json files. We also use different activation functions depending on the model we are loading (NN20 or NN50).

Remember: whatever custom method you write, it should always return a slice of utils.ChebyPolyApprox to match the method signature!

// Load reads the network description from a JSON file, builds the
// cleartext CryptoNet, and attaches the activation functions.
func (l *CNLoader) Load(path string, initActivations network.Initiator) network.NetworkI {
	jsonFile, err := os.Open(path)
	utils.ThrowErr(err)
	defer jsonFile.Close()
	byteValue, err := ioutil.ReadAll(jsonFile)
	utils.ThrowErr(err)

	nj := new(network.NetworkJ)
	err = json.Unmarshal(byteValue, nj)
	utils.ThrowErr(err)
	cn := new(CryptoNet)
	cn.SetLayers(nj.Layers)
	activations := initActivations()
	cn.SetActivations(activations)

	return cn
}

Finally, here is our custom Loader.Load() method. The structure is always the same, whatever network you define:

cn := new(CryptoNet)
cn.SetLayers(nj.Layers)
activations := initActivations()
cn.SetActivations(activations)

First you create a new network, set the layers from the JSON wrapper, get the activations from your custom InitActivations method, and set them on the network with the setter. That's it!
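
Putting it together, a minimal usage sketch (the JSON filename is illustrative):

loader := new(CNLoader)
// Load builds the cleartext CryptoNet from the JSON description and the
// activations returned by InitActivations.
net := loader.Load("cryptonet_weights.json", InitActivations)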

Advanced:

Have a look at TestCryptonet_EvalBatchClearModelEnc here and at the NN examples for more advanced usage.

Methods

Understand how things work under the hood.

Matrix Multiplication

The algorithm consists of evaluating the matrix multiplication between a transposed input in row-major order and a weight matrix in diagonal order: every diagonal is multiplied elementwise with a rotated version of the input ciphertext. The algorithm is very efficient, as it uses hoisted rotations. Moreover, it produces an output that is fully compatible with subsequent multiplications, with no need for repacking.
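
As a minimal plaintext sketch of the diagonal method (no encryption; rotations are simulated with index shifts, and the matrix is assumed square for brevity):

// Plaintext sketch: y = W*x computed diagonal by diagonal. The i-th
// diagonal diag_i[j] = W[j][(j+i)%d] multiplies x rotated by i positions.
func diagMatVec(W [][]float64, x []float64) []float64 {
	d := len(x)
	y := make([]float64, d)
	for i := 0; i < d; i++ { // one elementwise product per diagonal
		for j := 0; j < d; j++ {
			y[j] += W[j][(j+i)%d] * x[(j+i)%d]
		}
	}
	return y
}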

Input packing

It follows from the matrix multiplication algorithm (see the sketch after this list):

  • each image in the batch (a tensor) is row-flattened, appending the channels next to each other
  • this yields an NxD matrix
  • the matrix is transposed (DxN) and flattened
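
A plaintext sketch of this packing, assuming each sample has already been row-flattened into a slice of D features:

// Pack a batch of N samples with D features each: build the NxD matrix,
// transpose it to DxN, and flatten it row-major into N*D slots.
func packInput(batch [][]float64) []float64 {
	n, d := len(batch), len(batch[0])
	packed := make([]float64, 0, n*d)
	for j := 0; j < d; j++ { // row j of the transposed matrix
		for i := 0; i < n; i++ {
			packed = append(packed, batch[i][j])
		}
	}
	return packed
}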


Additionally, we use the complex trick to halve the ciphertext size: adjacent columns of the input matrix are compressed into one, exploiting the imaginary part of the complex numbers in the slots. The ciphertext is then replicated replicaFactor times in order to simulate cyclic rotations.
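
A sketch of the compression step on the packed layout above, assuming an even number of columns (the replication is omitted):

// Compress pairs of adjacent input columns into complex slots:
// column 2k goes to the real part, column 2k+1 to the imaginary part.
func compressComplex(packed []float64, n, d int) []complex128 {
	out := make([]complex128, 0, n*d/2)
	for j := 0; j < d; j += 2 {
		for i := 0; i < n; i++ {
			out = append(out, complex(packed[j*n+i], packed[(j+1)*n+i]))
		}
	}
	return out
}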


Weight packing

The weight matrix is represented in diagonal form; we use generalized diagonals for non-square matrices. Additionally, each element in the diagonals is replicated a number of times equal to the number of rows in the input matrix. We use the complex trick again, where pairs of diagonals are compressed into one. If the weight matrix has an odd number of rows, the last diagonal carries 0 in the imaginary part.
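
A plaintext sketch of extracting the generalized diagonals of a q x p weight matrix, with every entry replicated rows times (the complex pairing of diagonals is omitted, and the generalized-diagonal convention shown here is one common choice):

// Generalized diagonals of a q x p matrix: diag_i[j] = W[j][(j+i)%p],
// with every entry repeated `rows` times to align with the packed input.
func weightDiagonals(W [][]float64, rows int) [][]float64 {
	q, p := len(W), len(W[0])
	diags := make([][]float64, p)
	for i := 0; i < p; i++ {
		diag := make([]float64, 0, q*rows)
		for j := 0; j < q; j++ {
			for r := 0; r < rows; r++ {
				diag = append(diag, W[j][(j+i)%p])
			}
		}
		diags[i] = diag
	}
	return diags
}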


Block Matrices

Every matrix is represented in block form, i.e., it is split into sub-matrices and operations are carried out independently between blocks, following regular block-matrix arithmetic. This has two benefits:

  • it allows the encryption/encoding of "big" matrices (which would not fit in the number of slots of the CKKS parameters)
  • it allows for parallelization, i.e., operations between blocks can be performed in parallel


As an example, when evaluating a dense or convolutional layer, the input matrix is split into blocks, each packed following the input-packing approach, whereas the weight matrix is a block matrix where each block is packed in diagonal form. The multiplication algorithm is then used to multiply the blocks in parallel, as sketched below.
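
A rough illustration of the parallelism, with mul and add standing in for the HE block operations (the Block alias is a placeholder for a packed block):

import "sync"

// Block is a placeholder for a packed (or encrypted) sub-matrix.
type Block = [][]float64

// blockMatMul computes C = A*B over blocks: C[i][j] accumulates
// mul(A[i][k], B[k][j]); every output block is computed in its own goroutine.
func blockMatMul(A, B [][]Block, mul, add func(x, y Block) Block) [][]Block {
	var wg sync.WaitGroup
	C := make([][]Block, len(A))
	for i := range A {
		C[i] = make([]Block, len(B[0]))
		for j := range B[0] {
			wg.Add(1)
			go func(i, j int) {
				defer wg.Done()
				acc := mul(A[i][0], B[0][j])
				for k := 1; k < len(B); k++ {
					acc = add(acc, mul(A[i][k], B[k][j]))
				}
				C[i][j] = acc
			}(i, j)
		}
	}
	wg.Wait()
	return C
}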

Experiments

We tested our framework in 4 experiments, representing 3 scenarios.

Scenarios
  • Scenario 1: The querier encrypts its data with its own public key and sends the data to a server holding the neural network in the clear. Inference is evaluated homomorphically, and the encrypted result is sent back to the querier for decryption.

  • Scenario 2: In a setup phase, we assume that a cohort of nodes generates a collective public key and a private key of which each node holds a share. We assume that the model itself has been trained by the cohort under encryption, so as to preserve the confidentiality of each node's training data and of the model itself. After training, the cohort uses the model to perform oblivious prediction as a service:

    • the querier encrypts its data under the cohort public key
    • the master node performs the computation, invoking a distributed bootstrapping protocol if needed
    • the master node invokes a distributed key-switch protocol to obtain a prediction encrypted under the querier's public key
    • the prediction is sent back to the querier for decryption


  • Scenario 3: We assume that in a setup phase, the model owner sends the model, encrypted, to the client. The client can thus use the model to perform inference on its own data (in cleartext!). Finally, the client sends the result, masked with a 128-bit random mask, back to the model owner, who offers an oblivious decryption service.
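
As a rough outline of the scenario 2 flow, every identifier below is a placeholder for illustration, not the actual API of this package or of Lattigo:

// Hypothetical outline of scenario 2 (all names are placeholders).
ctIn := Encrypt(querierData, cohortPK)              // querier encrypts under the cohort key
ctOut := master.EvalModel(model, ctIn)              // HE inference, with distributed bootstrapping when needed
ctQ := master.CollectiveKeySwitch(ctOut, querierPK) // cohort switches to the querier's key
prediction := querier.Decrypt(ctQ)                  // only the querier can read the result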

Experiments
  • Experiment 1: Evaluation of CryptoNet following scenario 1
  • Experiment 2: Evaluation of CryptoNet following scenario 3
  • Experiment 3: Evaluation of NN50 following scenario 1; centralized bootstrapping is needed
  • Experiment 4: Evaluation of NN20 modified to be trained under encryption; distributed bootstrapping and key switch are used
Results
Hardware Specifications
  • 2x Intel Xeon E5-2680 v3, 48 vCPUs
  • 16GB DDR4 RAM
Network Setting
  • Localhost interface wrapped with the latency Go package: simulated LAN with 1500 B MTU (Ethernet) and 200 ms latency
  • LAN of the remote cluster
How to set up and run on the cluster
  • Step 0: go to /cluster and populate config.json as needed
  • Step 1: run ip_scan.sh -- this collects the IP addresses of the servers in the cluster
  • Step 2: run setup.sh -- this uploads the necessary files and configures the servers
  • Step 3: before running experiments, run remote.sh -- this starts the servers listening for the experiment configuration
  • Step 4: when the experiment ends (or fails), run cleanup.sh -- this kills the processes holding the ports
Experiment 1
Batch  LogN  LogQP  Latency (s)
83     14    365    5.3

Experiment 2 -- Cluster LAN
Batch  LogN  LogQP  Latency (s)
83     14    365    7.1

Experiment 3
Batch  LogN  LogQP  Latency (s)
525    16    1546   2562

Experiment 4 -- Cluster LAN
Batch  LogN  LogQP  Parties  Latency (s)
292    15    874    10       273
Notes
  • Experiment 1 runs in 3.516 s for a single sample
  • Experiment 3: accuracy 0.8964 (-1.1%)
  • Experiment 4: accuracy 95.6% (-1.2%)

[1] R. Gilad-Bachrach, N. Dowlin, K. Laine, K. Lauter, M. Naehrig, and J. Wernsing. Cryptonets: Applying neural networks to encrypted data with high throughput and accuracy. In ICML, 2016.

[2] I. Chillotti, M. Joye, and P. Paillier. Programmable bootstrapping enables efficient homomorphic inference of deep neural networks. Cryptology ePrint Archive, Paper 2021/091, 2021. https://eprint.iacr.org/2021/091.


Directories

  • Contains the logic for all operations between ciphertexts and plaintexts
  • Contains configuration variables for tests on the iccluster
  • Contains declaration and experiments for the CryptoNet model
  • Handles data providing
  • Defines all protocols and entities for distributed refresh, key switch and oblivious decryption
  • Implements the neural network and HE neural network interfaces
  • Definitions and experiments for the Zama NN models
  • Utils for plaintext operations and a PoC for block matrices
  • Various utils, including methods for matrices in plaintext, model definitions and activation functions
