Published: May 17, 2020 License: NCSA Imports: 18 Imported by: 3




Go binding for TensorRT C predict API. This is used by the TensorRT agent in MLModelScope to perform model inference in Go.


Download and install go-tensorrt:

go get -v github.com/rai-project/go-tensorrt

The binding requires TensorRT and other Go packages.


TensorRT currently works only on Linux and requires a GPU. Please refer to Installing TensorRT to install TensorRT on your system.

Note: TensorRT is expected to be installed in either the system path or /opt/tensorrt. See lib.go for details.

If you get an error about not being able to write to /opt, then perform the following:

sudo mkdir -p /opt/tensorrt
sudo chown -R `whoami` /opt/tensorrt

If you are using TensorRT Docker images or other library paths, change the CGO_CFLAGS, CGO_CXXFLAGS, and CGO_LDFLAGS environment variables. Refer to Using cgo with the go command.

For example,

    export CGO_CFLAGS="${CGO_CFLAGS} -I/tmp/tensorrt/include"
    export CGO_CXXFLAGS="${CGO_CXXFLAGS} -I/tmp/tensorrt/include"
    export CGO_LDFLAGS="${CGO_LDFLAGS} -L/tmp/tensorrt/lib"
Go Packages

You can install the dependencies through go get.

cd $GOPATH/src/github.com/rai-project/go-tensorrt
go get -u -v ./...

Or use Dep.

dep ensure -v

This installs the dependencies into vendor/.

Configure Environment Variables

Configure the linker environment variables, since the TensorRT C library is under a non-system directory. Place the following in either your ~/.bashrc or ~/.zshrc file:


export LIBRARY_PATH=$LIBRARY_PATH:/opt/tensorrt/lib
export LD_LIBRARY_PATH=/opt/tensorrt/lib:$LD_LIBRARY_PATH

Check the Build

Run go build to check that the dependencies are installed and the library paths are set up correctly.

Note: The cgo interface passes Go pointers to the C API, which the cgo runtime reports as an error. Disable this check by placing

export GODEBUG=cgocheck=0

in your ~/.bashrc or ~/.zshrc file, and then run either source ~/.bashrc or source ~/.zshrc.


The example shows how to use the MLModelScope tracer to profile inference. Refer to Set up the external services to start the tracer.

If running on a GPU, you can use nvprof to verify the profiling results. Refer to the Profiler User's Guide for using nvprof.






var (
	Version   = "0.4.0"
	BuildDate = "undefined"
	GitCommit = "undefined"
)




type DType

type DType C.TensorRT_DType

DType tensor scalar data type

const (
	UnknownType DType = C.TensorRT_Unknown
	// Byte byte tensors (go type uint8)
	Byte DType = C.TensorRT_Byte
	// Char char tensor (go type int8)
	Char DType = C.TensorRT_Char
	// Int int tensor (go type int32)
	Int DType = C.TensorRT_Int
	// Long long tensor (go type int64)
	Long DType = C.TensorRT_Long
	// Float tensor (go type float32)
	Float DType = C.TensorRT_Float
	// Double tensor  (go type float64)
	Double DType = C.TensorRT_Double
)

type ModelFormat

type ModelFormat int

ModelFormat ...

const (
	ModelFormatCaffe   ModelFormat = 1
	ModelFormatOnnx    ModelFormat = 2
	ModelFormatUff     ModelFormat = 3
	ModelFormatUnknown ModelFormat = 999
)

func ClassifyModelFormat

func ClassifyModelFormat(path string) ModelFormat
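ClassifyModelFormat presumably inspects the model file's path to pick one of the formats above. A minimal self-contained sketch of that idea, classifying by file extension (the extension rules here are assumptions, not the package's actual logic):

```go
package main

import (
	"fmt"
	"path/filepath"
	"strings"
)

type ModelFormat int

const (
	ModelFormatCaffe   ModelFormat = 1
	ModelFormatOnnx    ModelFormat = 2
	ModelFormatUff     ModelFormat = 3
	ModelFormatUnknown ModelFormat = 999
)

// classifyModelFormat is an illustrative stand-in for
// ClassifyModelFormat, guessing the format from the file extension.
func classifyModelFormat(path string) ModelFormat {
	switch strings.ToLower(filepath.Ext(path)) {
	case ".caffemodel", ".prototxt":
		return ModelFormatCaffe
	case ".onnx":
		return ModelFormatOnnx
	case ".uff":
		return ModelFormatUff
	default:
		return ModelFormatUnknown
	}
}

func main() {
	fmt.Println(classifyModelFormat("resnet50.onnx") == ModelFormatOnnx) // true
}
```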

type Predictor

type Predictor struct {
	// contains filtered or unexported fields
}

func New

func New(ctx context.Context, opts ...options.Option) (*Predictor, error)

func (*Predictor) Close

func (p *Predictor) Close()

func (*Predictor) EndProfiling

func (p *Predictor) EndProfiling() error

func (*Predictor) GetOptions

func (p *Predictor) GetOptions() *options.Options

func (*Predictor) Predict

func (p *Predictor) Predict(ctx context.Context, data []float32) error

func (*Predictor) ReadPredictionOutput

func (p *Predictor) ReadPredictionOutput(name string) []float32

ReadPredictionOutput ...

func (*Predictor) ReadPredictionOutputs

func (p *Predictor) ReadPredictionOutputs(ctx context.Context) ([][]float32, error)

func (*Predictor) ReadProfile

func (p *Predictor) ReadProfile() (string, error)

func (*Predictor) StartProfiling

func (p *Predictor) StartProfiling(name, metadata string) error

