Documentation ¶
Index ¶
- func List(client *golangsdk.ServiceClient, clusterId string, opts ListOptsBuilder) pagination.Pager
- type CreateOpts
- type CreateOptsBuilder
- type CreateResp
- type CreateResult
- type DeleteOpts
- type DeleteOptsBuilder
- type DeleteResult
- type GetResult
- type Job
- type JobPage
- type JobResp
- type ListOpts
- type ListOptsBuilder
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func List ¶
func List(client *golangsdk.ServiceClient, clusterId string, opts ListOptsBuilder) pagination.Pager
List is a method to obtain a paginated list of MapReduce jobs matching the query parameters.
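A minimal usage sketch, assuming this package is imported as jobs, client is an authenticated *golangsdk.ServiceClient for the MRS v2 endpoint, and clusterId holds a valid cluster ID (all of these are assumptions, not shown in this documentation):

```go
// Sketch: list finished SparkSubmit jobs of a cluster and print them.
// "client" and "clusterId" are assumed to exist; the pagination package
// is the SDK's own (github.com/.../pagination).
opts := jobs.ListOpts{
	JobType:  "SparkSubmit",
	JobState: "FINISHED",
	Limit:    20,
}
err := jobs.List(client, clusterId, opts).EachPage(
	func(page pagination.Page) (bool, error) {
		jobList, err := jobs.ExtractJobs(page)
		if err != nil {
			return false, err
		}
		for _, j := range jobList {
			fmt.Printf("%s (%s): %s\n", j.JobName, j.JobId, j.JobState)
		}
		return true, nil
	})
if err != nil {
	// handle error
}
```

Because JobPage embeds pagination.SinglePageBase, the handler runs at most once per call.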
Types ¶
type CreateOpts ¶
type CreateOpts struct {
	// Type of a job, and the valid values are as follows:
	// MapReduce
	// SparkSubmit
	// HiveScript
	// HiveSql
	// DistCp, importing and exporting data
	// SparkScript
	// SparkSql
	// Flink
	// NOTE:
	// Spark, Hive, and Flink jobs can be added only to clusters that include the Spark, Hive, and Flink components.
	JobType string `json:"job_type" required:"true"`
	// Job name. It contains 1 to 64 characters. Only letters, digits, hyphens (-), and underscores (_) are allowed.
	// NOTE:
	// Identical job names are allowed but not recommended.
	JobName string `json:"job_name" required:"true"`
	// Key parameter for program execution.
	// The parameter is specified by the function of the user's program.
	// MRS is only responsible for loading the parameter.
	// The parameter contains a maximum of 4,096 characters, excluding special characters such as ;|&>'<$,
	// and can be left blank.
	// NOTE:
	// If you enter a parameter with sensitive information (such as the login password), the parameter may be exposed
	// in the job details display and log printing. Exercise caution when performing this operation.
	// For MRS 1.9.2 or later, a file path on OBS can start with obs://. To use this format to submit HiveScript or
	// HiveSQL jobs, choose Components > Hive > Service Configuration on the cluster details page, set Type to All,
	// and search for core.site.customized.configs. Add the endpoint configuration item (fs.obs.endpoint) of OBS and
	// enter the endpoint corresponding to OBS in Value. Obtain the value from Regions and Endpoints.
	// For MRS 3.0.2 or later, a file path on OBS can start with obs://. To use this format to submit HiveScript or
	// HiveSQL jobs, choose Components > Hive > Service Configuration on Manager. Switch Basic to All, and search for
	// core.site.customized.configs. Add the endpoint configuration item (fs.obs.endpoint) of OBS and enter the
	// endpoint corresponding to OBS in Value. Obtain the value from Regions and Endpoints.
	Arguments []string `json:"arguments,omitempty"`
	// Program system parameter.
	// The parameter contains a maximum of 2,048 characters, excluding special characters such as ><|'`&!\, and can be
	// left blank.
	Properties map[string]string `json:"properties,omitempty"`
}
CreateOpts is a structure containing the options for job creation.
func (CreateOpts) ToJobCreateMap ¶
func (opts CreateOpts) ToJobCreateMap() (map[string]interface{}, error)
ToJobCreateMap is a method that builds a request body from the CreateOpts.
type CreateOptsBuilder ¶
CreateOptsBuilder is an interface that supports building the request body for job creation.
type CreateResp ¶
type CreateResp struct {
	// Job execution result
	JobSubmitResult JobResp `json:"job_submit_result"`
	// Error message
	ErrorMsg string `json:"error_msg"`
	// Error code
	ErrorCode string `json:"error_code"`
}
CreateResp is a struct that represents the result of the Create method.
type CreateResult ¶
type CreateResult struct {
golangsdk.Result
}
CreateResult represents a result of the Create method.
func Create ¶
func Create(client *golangsdk.ServiceClient, clusterId string, opts CreateOptsBuilder) (r CreateResult)
Create is a method to create a new MapReduce job.
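A hedged sketch of submitting a job and extracting the response, assuming this package is imported as jobs and client/clusterId are already set up; the job name and OBS path are placeholders, not values from this documentation:

```go
// Sketch: submit a SparkSubmit job and read the submission result.
// "client" and "clusterId" are assumed; the JAR path is a placeholder.
createOpts := jobs.CreateOpts{
	JobType: "SparkSubmit",
	JobName: "sparkpi-demo",
	Arguments: []string{
		"--class", "org.apache.spark.examples.SparkPi",
		"obs://demo-bucket/jars/spark-examples.jar",
	},
}
resp, err := jobs.Create(client, clusterId, createOpts).Extract()
if err != nil {
	// handle error
}
fmt.Println(resp.JobSubmitResult.JobId, resp.JobSubmitResult.State)
```

A State of COMPLETE in JobSubmitResult means the submission succeeded; on failure, ErrorCode and ErrorMsg carry the server-side details.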
func (CreateResult) Extract ¶
func (r CreateResult) Extract() (*CreateResp, error)
Extract is a method that extracts the creation response from the result.
type DeleteOpts ¶
type DeleteOpts struct {
JobIds []string `json:"job_id_list,omitempty"`
}
DeleteOpts is a structure containing the options for the job delete operation.
func (DeleteOpts) ToJobDeleteMap ¶
func (opts DeleteOpts) ToJobDeleteMap() (map[string]interface{}, error)
ToJobDeleteMap is a method that builds a request body from the DeleteOpts.
type DeleteOptsBuilder ¶
DeleteOptsBuilder is an interface that supports building the request body for the job delete operation.
type DeleteResult ¶
type DeleteResult struct {
golangsdk.ErrResult
}
DeleteResult represents a result of the Delete method.
func Delete ¶
func Delete(client *golangsdk.ServiceClient, clusterId string, opts DeleteOptsBuilder) (r DeleteResult)
Delete is a method to delete an existing MapReduce job.
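A minimal sketch of a batch delete, assuming this package is imported as jobs and client/clusterId are already set up; the job ID is a placeholder:

```go
// Sketch: delete one or more jobs by ID in a single batch request.
// "client" and "clusterId" are assumed; the job ID is a placeholder.
deleteOpts := jobs.DeleteOpts{
	JobIds: []string{"431b135e-c090-489f-b1db-0abe3822b855"},
}
if err := jobs.Delete(client, clusterId, deleteOpts).ExtractErr(); err != nil {
	// handle error
}
```

DeleteResult embeds golangsdk.ErrResult, so ExtractErr is the conventional way to check the outcome of a call that returns no body.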
type GetResult ¶
type GetResult struct {
// contains filtered or unexported fields
}
GetResult represents a result of the Get method.
type Job ¶
type Job struct {
	// Job ID.
	JobId string `json:"job_id"`
	// Name of the user who submits a job.
	User string `json:"user"`
	// Job name. It contains 1 to 64 characters. Only letters, digits, hyphens (-), and underscores (_) are allowed.
	JobName string `json:"job_name"`
	// Final result of a job.
	// FAILED: indicates that the job fails to be executed.
	// KILLED: indicates that the job is manually terminated during execution.
	// UNDEFINED: indicates that the job is being executed.
	// SUCCEEDED: indicates that the job has been successfully executed.
	JobResult string `json:"job_result"`
	// Execution status of a job.
	// FAILED: indicates that the job fails to be executed.
	// KILLED: indicates that the job is terminated.
	// NEW: indicates that the job is created.
	// NEW_SAVING: indicates that the job has been created and is being saved.
	// SUBMITTED: indicates that the job is submitted.
	// ACCEPTED: indicates that the job is accepted.
	// RUNNING: indicates that the job is running.
	// FINISHED: indicates that the job is completed.
	JobState string `json:"job_state"`
	// Job execution progress.
	JobProgress float32 `json:"job_progress"`
	// Type of a job, and the valid values are as follows:
	// MapReduce
	// SparkSubmit
	// HiveScript
	// HiveSql
	// DistCp, importing and exporting data
	// SparkScript
	// SparkSql
	// Flink
	JobType string `json:"job_type"`
	// Start time to run a job. Unit: ms.
	StartedTime int `json:"started_time"`
	// Time when a job is submitted. Unit: ms.
	SubmittedTime int `json:"submitted_time"`
	// End time to run a job. Unit: ms.
	FinishedTime int `json:"finished_time"`
	// Running duration of a job. Unit: ms.
	ElapsedTime int `json:"elapsed_time"`
	// Running parameter. The parameter contains a maximum of 4,096 characters,
	// excluding special characters such as ;|&>'<$, and can be left blank.
	Arguments string `json:"arguments"`
	// Configuration parameter, which is used to configure -d parameters.
	// The parameter contains a maximum of 2,048 characters, excluding special characters such as ><|'`&!\,
	// and can be left blank.
	Properties string `json:"properties"`
	// Launcher job ID.
	LauncherId string `json:"launcher_id"`
	// Actual job ID.
	AppId string `json:"app_id"`
}
Job is a struct that represents the details of a MapReduce job.
func ExtractJobs ¶
func ExtractJobs(r pagination.Page) ([]Job, error)
ExtractJobs is a method that extracts a job list from the paged query results.
type JobPage ¶
type JobPage struct {
pagination.SinglePageBase
}
JobPage represents the response pages of the List operation.
type JobResp ¶
type JobResp struct {
	// Job ID
	JobId string `json:"job_id"`
	// Job submission status.
	// COMPLETE: The job is submitted.
	// JOBSTAT_SUBMIT_FAILED: Failed to submit the job.
	State string `json:"state"`
}
JobResp is a struct that represents the details of the job submission result.
type ListOpts ¶
type ListOpts struct {
	// Job name. It contains 1 to 64 characters. Only letters, digits, hyphens (-), and underscores (_) are allowed.
	JobName string `q:"job_name"`
	// Type of a job, and the valid values are as follows:
	// MapReduce
	// SparkSubmit
	// HiveScript
	// HiveSql
	// DistCp, importing and exporting data
	// SparkScript
	// SparkSql
	// Flink
	JobType string `q:"job_type"`
	// Execution status of a job.
	// FAILED: indicates that the job fails to be executed.
	// KILLED: indicates that the job is terminated.
	// NEW: indicates that the job is created.
	// NEW_SAVING: indicates that the job has been created and is being saved.
	// SUBMITTED: indicates that the job is submitted.
	// ACCEPTED: indicates that the job is accepted.
	// RUNNING: indicates that the job is running.
	// FINISHED: indicates that the job is completed.
	JobState string `q:"job_state"`
	// Execution result of a job.
	// FAILED: indicates that the job fails to be executed.
	// KILLED: indicates that the job is manually terminated during execution.
	// UNDEFINED: indicates that the job is being executed.
	// SUCCEEDED: indicates that the job has been successfully executed.
	JobResult string `q:"job_result"`
	// Number of records displayed on each page in the returned result. The default value is 10.
	Limit int `q:"limit"`
	// Offset.
	// The default offset from which the job list starts to be queried is 1.
	Offset int `q:"offset"`
	// Ranking mode of returned results. The default value is desc.
	// asc: indicates that the returned results are ranked in ascending order.
	// desc: indicates that the returned results are ranked in descending order.
	SortBy string `q:"sort_by"`
	// UTC timestamp after which a job is submitted, in milliseconds. For example, 1562032041362.
	SubmittedTimeBegin int `q:"submitted_time_begin"`
	// UTC timestamp before which a job is submitted, in milliseconds. For example, 1562032041362.
	SubmittedTimeEnd int `q:"submitted_time_end"`
}
ListOpts is a structure containing the query parameters for the job list operation.
func (ListOpts) ToListQuery ¶
ToListQuery is a method that builds a request query string from the ListOpts.
type ListOptsBuilder ¶
ListOptsBuilder is an interface that supports building the request query for the job list operation.