Documentation ¶
Overview ¶
Package bqutil contains utility functions for BigQuery.
Index ¶
- Constants
- Variables
- func Client(ctx context.Context, gcpProject string) (*bigquery.Client, error)
- func GenerateSchema(fdset *desc.FileDescriptorSet, message string) (schema bigquery.Schema, err error)
- func NewWriterClient(ctx context.Context, gcpProject string) (*managedwriter.Client, error)
- func RowSize(row proto.Message) int
- type Writer
Constants ¶
const InternalDatasetID = "internal"
InternalDatasetID is the name of the BigQuery dataset which is intended for internal service use only.
const RowMaxBytes = 9 * 1024 * 1024 // 9 MB
RowMaxBytes is the maximum number of row bytes to send in one BigQuery Storage Write API AppendRows request. As of writing, the request size limit for this RPC is 10 MB: https://cloud.google.com/bigquery/quotas#write-api-limits. The total size of rows must stay below this limit because each request carries some overhead.
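For illustration, a minimal sketch of size-based batching under this limit. `batchBySize` is a hypothetical helper, not part of bqutil; the int sizes stand in for values that `RowSize` would return for real rows:

```go
package main

import "fmt"

// rowMaxBytes mirrors bqutil.RowMaxBytes: the per-request byte budget,
// kept below the 10 MB AppendRows limit to leave room for request overhead.
const rowMaxBytes = 9 * 1024 * 1024

// batchBySize groups row sizes (e.g. values returned by bqutil.RowSize)
// so that the rows in each batch sum to at most maxBytes. A single row
// larger than maxBytes still gets its own batch.
func batchBySize(sizes []int, maxBytes int) [][]int {
	var batches [][]int
	var cur []int
	curBytes := 0
	for _, s := range sizes {
		if curBytes+s > maxBytes && len(cur) > 0 {
			batches = append(batches, cur)
			cur, curBytes = nil, 0
		}
		cur = append(cur, s)
		curBytes += s
	}
	if len(cur) > 0 {
		batches = append(batches, cur)
	}
	return batches
}

func main() {
	// Three 4 MB rows: the first two fit in one 9 MB batch, the third spills over.
	mb := 1024 * 1024
	batches := batchBySize([]int{4 * mb, 4 * mb, 4 * mb}, rowMaxBytes)
	fmt.Println(len(batches)) // 2
}
```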
Variables ¶
var InvalidRowTagKey = errors.NewTagKey("InvalidRow")
Functions ¶
func Client ¶
TODO (nqmtuan): Share the code with LUCI Analysis. Client returns a new BigQuery client for use with the given GCP project, authenticating as ResultDB itself.
func GenerateSchema ¶
func NewWriterClient ¶
NewWriterClient returns a new BigQuery managedwriter client for use with the given GCP project, authenticating as ResultDB itself.
Types ¶
type Writer ¶
type Writer struct {
// contains filtered or unexported fields
}
Writer is used to export rows to a BigQuery table.
func NewWriter ¶
func NewWriter(client *managedwriter.Client, tableName string, tableSchemaDescriptor *descriptorpb.DescriptorProto) *Writer
NewWriter creates a writer for exporting rows to the provided BigQuery table via the provided managedwriter client.
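A sketch of how these pieces might fit together, assuming the package's identifiers are in scope (the import path is not shown on this page). `exportWriter`, the project ID, the message name, and the parameters are hypothetical placeholders, not part of bqutil:

```go
// exportWriter is a hypothetical helper showing the intended wiring:
// create a managedwriter client, optionally derive the table schema
// from a compiled FileDescriptorSet, and construct a Writer for the table.
func exportWriter(ctx context.Context, fdset *desc.FileDescriptorSet, tableName string, rowDescriptor *descriptorpb.DescriptorProto) (*Writer, error) {
	// "my-gcp-project" is a placeholder project ID.
	client, err := NewWriterClient(ctx, "my-gcp-project")
	if err != nil {
		return nil, err
	}
	// GenerateSchema is useful when creating or updating the destination
	// table; "my.proto.RowMessage" is a placeholder fully-qualified
	// message name within fdset.
	if _, err := GenerateSchema(fdset, "my.proto.RowMessage"); err != nil {
		return nil, err
	}
	return NewWriter(client, tableName, rowDescriptor), nil
}
```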