Documentation ¶
Index ¶
- Variables
- type BenchmarkError
- func (*BenchmarkError) Descriptor() ([]byte, []int) (deprecated)
- func (x *BenchmarkError) GetErrorCode() *ErrorCode
- func (x *BenchmarkError) GetErrorMessage() string
- func (*BenchmarkError) ProtoMessage()
- func (x *BenchmarkError) ProtoReflect() protoreflect.Message
- func (x *BenchmarkError) Reset()
- func (x *BenchmarkError) String() string
- type BenchmarkEventType
- func (BenchmarkEventType) Descriptor() protoreflect.EnumDescriptor
- func (x BenchmarkEventType) Enum() *BenchmarkEventType
- func (BenchmarkEventType) EnumDescriptor() ([]byte, []int) (deprecated)
- func (x BenchmarkEventType) Number() protoreflect.EnumNumber
- func (x BenchmarkEventType) String() string
- func (BenchmarkEventType) Type() protoreflect.EnumType
- type BenchmarkMetric
- func (*BenchmarkMetric) Descriptor() ([]byte, []int) (deprecated)
- func (x *BenchmarkMetric) GetName() string
- func (x *BenchmarkMetric) GetValue() float32
- func (*BenchmarkMetric) ProtoMessage()
- func (x *BenchmarkMetric) ProtoReflect() protoreflect.Message
- func (x *BenchmarkMetric) Reset()
- func (x *BenchmarkMetric) String() string
- type ErrorCode
- func (*ErrorCode) Descriptor() ([]byte, []int) (deprecated)
- func (x *ErrorCode) GetTfliteError() int32
- func (*ErrorCode) ProtoMessage()
- func (x *ErrorCode) ProtoReflect() protoreflect.Message
- type LatencyCriteria
- func (*LatencyCriteria) Descriptor() ([]byte, []int) (deprecated)
- func (x *LatencyCriteria) GetAverageInferenceMaxRegressionPercentageAllowed() float32
- func (x *LatencyCriteria) GetStartupOverheadMaxRegressionPercentageAllowed() float32
- func (*LatencyCriteria) ProtoMessage()
- func (x *LatencyCriteria) ProtoReflect() protoreflect.Message
- func (x *LatencyCriteria) Reset()
- func (x *LatencyCriteria) String() string
- type LatencyResults
- func (*LatencyResults) Descriptor() ([]byte, []int) (deprecated)
- func (x *LatencyResults) GetError() *BenchmarkError
- func (x *LatencyResults) GetEventType() BenchmarkEventType
- func (x *LatencyResults) GetMetrics() []*BenchmarkMetric
- func (*LatencyResults) ProtoMessage()
- func (x *LatencyResults) ProtoReflect() protoreflect.Message
- func (x *LatencyResults) Reset()
- func (x *LatencyResults) String() string
Constants ¶
This section is empty.
Variables ¶
var (
	BenchmarkEventType_name = map[int32]string{
		0: "BENCHMARK_EVENT_TYPE_UNDEFINED",
		1: "BENCHMARK_EVENT_TYPE_START",
		2: "BENCHMARK_EVENT_TYPE_END",
		3: "BENCHMARK_EVENT_TYPE_ERROR",
	}
	BenchmarkEventType_value = map[string]int32{
		"BENCHMARK_EVENT_TYPE_UNDEFINED": 0,
		"BENCHMARK_EVENT_TYPE_START":     1,
		"BENCHMARK_EVENT_TYPE_END":       2,
		"BENCHMARK_EVENT_TYPE_ERROR":     3,
	}
)
Enum value maps for BenchmarkEventType.
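The enum maps above translate between wire values and symbolic names in both directions. A minimal sketch of how they are used (the maps are copied here verbatim so the snippet runs without the generated package):

```go
package main

import "fmt"

// Local copies of the generated enum value maps for BenchmarkEventType.
var benchmarkEventTypeName = map[int32]string{
	0: "BENCHMARK_EVENT_TYPE_UNDEFINED",
	1: "BENCHMARK_EVENT_TYPE_START",
	2: "BENCHMARK_EVENT_TYPE_END",
	3: "BENCHMARK_EVENT_TYPE_ERROR",
}

var benchmarkEventTypeValue = map[string]int32{
	"BENCHMARK_EVENT_TYPE_UNDEFINED": 0,
	"BENCHMARK_EVENT_TYPE_START":     1,
	"BENCHMARK_EVENT_TYPE_END":       2,
	"BENCHMARK_EVENT_TYPE_ERROR":     3,
}

func main() {
	// Number-to-name lookup, e.g. when logging a raw event value.
	fmt.Println(benchmarkEventTypeName[2]) // BENCHMARK_EVENT_TYPE_END
	// Name-to-number lookup, e.g. when parsing a textual config.
	fmt.Println(benchmarkEventTypeValue["BENCHMARK_EVENT_TYPE_ERROR"]) // 3
}
```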
var File_tensorflow_lite_tools_benchmark_experimental_delegate_performance_android_proto_delegate_performance_proto protoreflect.FileDescriptor
Functions ¶
This section is empty.
Types ¶
type BenchmarkError ¶
type BenchmarkError struct {
	// Handled tflite error.
	ErrorCode *ErrorCode `protobuf:"bytes,1,opt,name=error_code,json=errorCode,proto3,oneof" json:"error_code,omitempty"`
	ErrorMessage *string `protobuf:"bytes,2,opt,name=error_message,json=errorMessage,proto3,oneof" json:"error_message,omitempty"`
	// contains filtered or unexported fields
}
An error that occurred during benchmarking.
Used with event type ERROR.
Next ID: 3
func (*BenchmarkError) Descriptor (deprecated)
func (*BenchmarkError) Descriptor() ([]byte, []int)
Deprecated: Use BenchmarkError.ProtoReflect.Descriptor instead.
func (*BenchmarkError) GetErrorCode ¶
func (x *BenchmarkError) GetErrorCode() *ErrorCode
func (*BenchmarkError) GetErrorMessage ¶
func (x *BenchmarkError) GetErrorMessage() string
func (*BenchmarkError) ProtoMessage ¶
func (*BenchmarkError) ProtoMessage()
func (*BenchmarkError) ProtoReflect ¶
func (x *BenchmarkError) ProtoReflect() protoreflect.Message
func (*BenchmarkError) Reset ¶
func (x *BenchmarkError) Reset()
func (*BenchmarkError) String ¶
func (x *BenchmarkError) String() string
type BenchmarkEventType ¶
type BenchmarkEventType int32
Which stage of benchmarking the event is for.
const (
	BenchmarkEventType_BENCHMARK_EVENT_TYPE_UNDEFINED BenchmarkEventType = 0
	// Benchmark start. A start without an end can be interpreted as a test that
	// has crashed or hung.
	BenchmarkEventType_BENCHMARK_EVENT_TYPE_START BenchmarkEventType = 1
	// Benchmarking completion. A model was successfully loaded, acceleration
	// configured and inference run without errors. There may still be an issue
	// with correctness of results, or with performance.
	BenchmarkEventType_BENCHMARK_EVENT_TYPE_END BenchmarkEventType = 2
	// Benchmark was not completed due to an error. The error may be a handled
	// error (e.g., failure in a delegate), or a crash.
	BenchmarkEventType_BENCHMARK_EVENT_TYPE_ERROR BenchmarkEventType = 3
)
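The constant comments above imply a simple interpretation rule for an event stream: a START with no subsequent END or ERROR suggests the test crashed or hung. A hedged, self-contained sketch of that rule (the type and constants are redefined locally with the same values; `classifyRun` is a hypothetical helper, not part of this package):

```go
package main

import "fmt"

type BenchmarkEventType int32

// Values mirror the generated BenchmarkEventType constants.
const (
	EventUndefined BenchmarkEventType = 0
	EventStart     BenchmarkEventType = 1
	EventEnd       BenchmarkEventType = 2
	EventError     BenchmarkEventType = 3
)

// classifyRun applies the interpretation from the enum documentation:
// END means the benchmark completed, ERROR means it failed in a handled
// way, and a START with no terminal event suggests a crash or hang.
func classifyRun(events []BenchmarkEventType) string {
	sawStart := false
	for _, e := range events {
		switch e {
		case EventStart:
			sawStart = true
		case EventEnd:
			return "completed"
		case EventError:
			return "errored"
		}
	}
	if sawStart {
		return "crashed or hung"
	}
	return "not started"
}

func main() {
	fmt.Println(classifyRun([]BenchmarkEventType{EventStart, EventEnd})) // completed
	fmt.Println(classifyRun([]BenchmarkEventType{EventStart}))           // crashed or hung
}
```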
func (BenchmarkEventType) Descriptor ¶
func (BenchmarkEventType) Descriptor() protoreflect.EnumDescriptor
func (BenchmarkEventType) Enum ¶
func (x BenchmarkEventType) Enum() *BenchmarkEventType
func (BenchmarkEventType) EnumDescriptor (deprecated)
func (BenchmarkEventType) EnumDescriptor() ([]byte, []int)
Deprecated: Use BenchmarkEventType.Descriptor instead.
func (BenchmarkEventType) Number ¶
func (x BenchmarkEventType) Number() protoreflect.EnumNumber
func (BenchmarkEventType) String ¶
func (x BenchmarkEventType) String() string
func (BenchmarkEventType) Type ¶
func (BenchmarkEventType) Type() protoreflect.EnumType
type BenchmarkMetric ¶
type BenchmarkMetric struct {
	// Metric name, for example inference_latency_average_us.
	Name *string `protobuf:"bytes,1,opt,name=name,proto3,oneof" json:"name,omitempty"`
	// Metric value, for example 180000.
	Value *float32 `protobuf:"fixed32,2,opt,name=value,proto3,oneof" json:"value,omitempty"`
	// contains filtered or unexported fields
}
A metric from a benchmark, for example an average inference time in microseconds.
func (*BenchmarkMetric) Descriptor (deprecated)
func (*BenchmarkMetric) Descriptor() ([]byte, []int)
Deprecated: Use BenchmarkMetric.ProtoReflect.Descriptor instead.
func (*BenchmarkMetric) GetName ¶
func (x *BenchmarkMetric) GetName() string
func (*BenchmarkMetric) GetValue ¶
func (x *BenchmarkMetric) GetValue() float32
func (*BenchmarkMetric) ProtoMessage ¶
func (*BenchmarkMetric) ProtoMessage()
func (*BenchmarkMetric) ProtoReflect ¶
func (x *BenchmarkMetric) ProtoReflect() protoreflect.Message
func (*BenchmarkMetric) Reset ¶
func (x *BenchmarkMetric) Reset()
func (*BenchmarkMetric) String ¶
func (x *BenchmarkMetric) String() string
type ErrorCode ¶
type ErrorCode struct {
	// What the TF Lite level error is.
	// See TfLiteStatus in tensorflow/lite/core/c/c_api_types.h for the meaning of
	// the values.
	TfliteError *int32 `protobuf:"varint,1,opt,name=tflite_error,json=tfliteError,proto3,oneof" json:"tflite_error,omitempty"`
	// contains filtered or unexported fields
}
A handled error.
Next ID: 2
func (*ErrorCode) Descriptor (deprecated)
func (*ErrorCode) Descriptor() ([]byte, []int)
Deprecated: Use ErrorCode.ProtoReflect.Descriptor instead.
func (*ErrorCode) GetTfliteError ¶
func (x *ErrorCode) GetTfliteError() int32
func (*ErrorCode) ProtoMessage ¶
func (*ErrorCode) ProtoMessage()
func (*ErrorCode) ProtoReflect ¶
func (x *ErrorCode) ProtoReflect() protoreflect.Message
type LatencyCriteria ¶
type LatencyCriteria struct {
	// The maximum regression (%) of startup overhead time that is allowed.
	// If startup_overhead_max_regression_percentage_allowed is 5 and the
	// startup overhead time of the test delegate is slower than the reference
	// delegate's by more than 5%, the threshold is breached.
	//
	// Startup overhead time is calculated as:
	// initialization time + average warmup time - average inference time
	StartupOverheadMaxRegressionPercentageAllowed *float32 `` /* 212-byte string literal not displayed */
	// The maximum regression (%) of inference time that is allowed.
	// If average_inference_max_regression_percentage_allowed is 5 and the
	// inference time of the test delegate is slower than the reference delegate
	// by more than 5%, the threshold is breached.
	AverageInferenceMaxRegressionPercentageAllowed *float32 `` /* 215-byte string literal not displayed */
	// contains filtered or unexported fields
}
Parameters for latency thresholds of the delegate latency benchmarking results. The delegate performance benchmark app generates a "PASS"/"PASS_WITH_WARNING"/"FAIL" result for each pairing of the test target delegate with a reference delegate, plus an overall result obtained by aggregating the per-pair results. Computing a result involves measuring the test target delegate's performance regressions and checking whether the regression thresholds specified by LatencyCriteria are breached.
Please see tensorflow/lite/tools/benchmark/experimental/delegate_performance/android/src/main/java/org/tensorflow/lite/benchmark/delegateperformance/BenchmarkResultType.java for the meanings of "PASS", "PASS_WITH_WARNING" and "FAIL".
The latency criteria are designed to be model-specific.
Next ID: 3
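The threshold semantics and the startup overhead formula described above can be sketched in a few lines. This is an illustration of the documented criteria, not the app's actual implementation; the helper names (`regressionPct`, `startupOverhead`, `breached`) are hypothetical:

```go
package main

import "fmt"

// regressionPct returns how much slower `test` is than `reference`, in percent.
func regressionPct(test, reference float32) float32 {
	return (test - reference) / reference * 100
}

// startupOverhead follows the formula in the LatencyCriteria field comments:
// initialization time + average warmup time - average inference time.
func startupOverhead(initUs, avgWarmupUs, avgInferenceUs float32) float32 {
	return initUs + avgWarmupUs - avgInferenceUs
}

// breached reports whether the test delegate's regression against the
// reference exceeds the allowed percentage from the criteria.
func breached(test, reference, maxRegressionPctAllowed float32) bool {
	return regressionPct(test, reference) > maxRegressionPctAllowed
}

func main() {
	// Test delegate inference is 6% slower than the reference;
	// a 5% criterion is breached.
	fmt.Println(breached(106, 100, 5)) // true
	// A 4% slowdown stays within a 5% criterion.
	fmt.Println(breached(104, 100, 5)) // false
	// Startup overhead in microseconds for hypothetical timings.
	fmt.Println(startupOverhead(2000, 300, 180)) // 2120
}
```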
func (*LatencyCriteria) Descriptor (deprecated)
func (*LatencyCriteria) Descriptor() ([]byte, []int)
Deprecated: Use LatencyCriteria.ProtoReflect.Descriptor instead.
func (*LatencyCriteria) GetAverageInferenceMaxRegressionPercentageAllowed ¶
func (x *LatencyCriteria) GetAverageInferenceMaxRegressionPercentageAllowed() float32
func (*LatencyCriteria) GetStartupOverheadMaxRegressionPercentageAllowed ¶ added in v2.13.0
func (x *LatencyCriteria) GetStartupOverheadMaxRegressionPercentageAllowed() float32
func (*LatencyCriteria) ProtoMessage ¶
func (*LatencyCriteria) ProtoMessage()
func (*LatencyCriteria) ProtoReflect ¶
func (x *LatencyCriteria) ProtoReflect() protoreflect.Message
func (*LatencyCriteria) Reset ¶
func (x *LatencyCriteria) Reset()
func (*LatencyCriteria) String ¶
func (x *LatencyCriteria) String() string
type LatencyResults ¶
type LatencyResults struct {
	// Type of the benchmark event.
	EventType *BenchmarkEventType `` /* 142-byte string literal not displayed */
	// Metrics that are intended to be compared against other acceleration
	// configurations, used when type is END.
	Metrics []*BenchmarkMetric `protobuf:"bytes,2,rep,name=metrics,proto3" json:"metrics,omitempty"`
	// Error during benchmark, used when type is ERROR.
	Error *BenchmarkError `protobuf:"bytes,3,opt,name=error,proto3,oneof" json:"error,omitempty"`
	// contains filtered or unexported fields
}
Outcome of a latency benchmark run. If the run completed successfully, the message contains the latency metrics, which are intended to be compared against other candidate acceleration configurations. If the run failed, the message contains the handled errors so that callers can investigate further.
Next ID: 4
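A consumer of this message typically branches on the event type: metrics are meaningful for END events, the error field for ERROR events. A self-contained sketch of that pattern, using simplified local stand-ins for the generated types (field shapes mirror LatencyResults; `summarize` is a hypothetical helper):

```go
package main

import "fmt"

// Simplified local stand-ins so the sketch runs without the generated package.
type benchmarkMetric struct {
	Name  string
	Value float32
}

type benchmarkError struct {
	TfliteError  int32
	ErrorMessage string
}

type latencyResults struct {
	EventType int32 // 2 = END, 3 = ERROR, as in BenchmarkEventType
	Metrics   []benchmarkMetric
	Error     *benchmarkError
}

// summarize reads the result the way the message docs describe: metrics
// for END events, the error field for ERROR events.
func summarize(r latencyResults) string {
	switch r.EventType {
	case 2: // BENCHMARK_EVENT_TYPE_END
		s := "completed:"
		for _, m := range r.Metrics {
			s += fmt.Sprintf(" %s=%g", m.Name, m.Value)
		}
		return s
	case 3: // BENCHMARK_EVENT_TYPE_ERROR
		return fmt.Sprintf("failed: tflite_error=%d %s",
			r.Error.TfliteError, r.Error.ErrorMessage)
	default:
		return "incomplete"
	}
}

func main() {
	ok := latencyResults{
		EventType: 2,
		Metrics:   []benchmarkMetric{{"inference_latency_average_us", 180000}},
	}
	fmt.Println(summarize(ok)) // completed: inference_latency_average_us=180000
}
```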
func (*LatencyResults) Descriptor (deprecated)
func (*LatencyResults) Descriptor() ([]byte, []int)
Deprecated: Use LatencyResults.ProtoReflect.Descriptor instead.
func (*LatencyResults) GetError ¶
func (x *LatencyResults) GetError() *BenchmarkError
func (*LatencyResults) GetEventType ¶
func (x *LatencyResults) GetEventType() BenchmarkEventType
func (*LatencyResults) GetMetrics ¶
func (x *LatencyResults) GetMetrics() []*BenchmarkMetric
func (*LatencyResults) ProtoMessage ¶
func (*LatencyResults) ProtoMessage()
func (*LatencyResults) ProtoReflect ¶
func (x *LatencyResults) ProtoReflect() protoreflect.Message
func (*LatencyResults) Reset ¶
func (x *LatencyResults) Reset()
func (*LatencyResults) String ¶
func (x *LatencyResults) String() string