paws.compute (version 0.7.0)

batch_submit_job: Submits a Batch job from a job definition

Description

Submits a Batch job from a job definition. Parameters specified during submit_job override parameters defined in the job definition. The exception is the vCPU and memory requirements specified in the resourceRequirements objects of the job definition: they can't be overridden using the memory and vcpus parameters. Instead, you must specify the updated values in a resourceRequirements object included in the containerOverrides parameter.

See https://www.paws-r-sdk.com/docs/batch_submit_job/ for full documentation.

Usage

batch_submit_job(
  jobName,
  jobQueue,
  shareIdentifier = NULL,
  schedulingPriorityOverride = NULL,
  arrayProperties = NULL,
  dependsOn = NULL,
  jobDefinition,
  parameters = NULL,
  containerOverrides = NULL,
  nodeOverrides = NULL,
  retryStrategy = NULL,
  propagateTags = NULL,
  timeout = NULL,
  tags = NULL,
  eksPropertiesOverride = NULL,
  ecsPropertiesOverride = NULL
)
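A minimal call might look like the following sketch. The queue and job definition names are placeholders for resources assumed to exist in your account:

```r
# Build the arguments as a plain list so the structure can be inspected
# before any API call is made. All names below are illustrative.
args <- list(
  jobName       = "example-job",   # up to 128 characters
  jobQueue      = "my-queue",      # queue name or ARN (assumed to exist)
  jobDefinition = "my-job-def"     # latest active revision is used
)

# With a configured client (e.g. svc <- paws.compute::batch()), the job
# would be submitted with: resp <- do.call(svc$submit_job, args)
# The response then contains jobArn, jobName, and jobId.
```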

Arguments

jobName

[required] The name of the job. It can be up to 128 characters long. The first character must be alphanumeric; the name can contain uppercase and lowercase letters, numbers, hyphens (-), and underscores (_).

jobQueue

[required] The job queue where the job is submitted. You can specify either the name or the Amazon Resource Name (ARN) of the queue.

shareIdentifier

The share identifier for the job. Don't specify this parameter if the job queue doesn't have a scheduling policy. If the job queue has a scheduling policy, then this parameter must be specified.

This string is limited to 255 alphanumeric characters, and can be followed by an asterisk (*).

schedulingPriorityOverride

The scheduling priority for the job. This only affects jobs in job queues with a fair share policy. Jobs with a higher scheduling priority are scheduled before jobs with a lower scheduling priority. This overrides any scheduling priority in the job definition and works only within a single share identifier.

The minimum supported value is 0 and the maximum supported value is 9999.

arrayProperties

The array properties for the submitted job, such as the size of the array. The array size can be between 2 and 10,000. If you specify array properties for a job, it becomes an array job. For more information, see Array Jobs in the Batch User Guide.
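For instance, an array job of 100 children could be requested as follows (the size is illustrative):

```r
# An array size of 100 creates child jobs with indexes 0 through 99; each
# child can read its index from the AWS_BATCH_JOB_ARRAY_INDEX environment
# variable that Batch sets in the container.
array_properties <- list(size = 100)

# svc$submit_job(..., arrayProperties = array_properties)
```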

dependsOn

A list of dependencies for the job. A job can depend upon a maximum of 20 jobs. You can specify a SEQUENTIAL type dependency without specifying a job ID for array jobs so that each child array job completes sequentially, starting at index 0. You can also specify an N_TO_N type dependency with a job ID for array jobs. In that case, each index child of this job must wait for the corresponding index child of each dependency to complete before it can begin.
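As a sketch, the dependency types described above translate into nested lists like these; the job IDs are placeholders:

```r
# Two hypothetical dependencies: a plain job dependency, and an N_TO_N
# dependency on another array job so that each child index waits for the
# matching index of that dependency.
depends_on <- list(
  list(jobId = "11111111-2222-3333-4444-555555555555"),
  list(jobId = "66666666-7777-8888-9999-000000000000", type = "N_TO_N")
)

# For an array job, a SEQUENTIAL dependency needs no jobId at all:
sequential <- list(list(type = "SEQUENTIAL"))

# svc$submit_job(..., dependsOn = depends_on)
```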

jobDefinition

[required] The job definition used by this job. This value can be one of definition-name, definition-name:revision, or the Amazon Resource Name (ARN) for the job definition, with or without the revision (arn:aws:batch:region:account:job-definition/definition-name:revision, or arn:aws:batch:region:account:job-definition/definition-name).

If the revision is not specified, then the latest active revision is used.

parameters

Additional parameters passed to the job that replace parameter substitution placeholders that are set in the job definition. Parameters are specified as a key and value pair mapping. Parameters in a submit_job request override any corresponding parameter defaults from the job definition.
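For example, if the job definition's command contains substitution placeholders such as Ref::inputFile and Ref::mode (the names here are illustrative), the values supplied at submission time replace them:

```r
# Key-value pairs substituted into the job definition's placeholders.
params <- list(
  inputFile = "s3://my-bucket/input.csv",
  mode      = "full"
)

# svc$submit_job(..., parameters = params)
```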

containerOverrides

An object whose properties override the defaults in the job definition. It names a container in the specified job definition and the overrides that container should receive. You can override the default command for a container (specified in the job definition or the Docker image) with a command override. You can also override existing environment variables on a container, or add new ones, with an environment override.
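A sketch of such an override, with illustrative values, might look like this. Note the resourceRequirements entries, which, per the description above, are the only way to override a job's vCPU and memory:

```r
# Hypothetical overrides: replace the container command, add one
# environment variable, and raise vCPU/memory via resourceRequirements.
container_overrides <- list(
  command = list("Rscript", "job.R", "--fast"),
  environment = list(
    list(name = "LOG_LEVEL", value = "debug")
  ),
  resourceRequirements = list(
    list(type = "VCPU",   value = "4"),
    list(type = "MEMORY", value = "8192")  # MiB, passed as a string
  )
)

# svc$submit_job(..., containerOverrides = container_overrides)
```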

nodeOverrides

A list of node overrides in JSON format that specify the node range to target and the container overrides for that node range.

This parameter isn't applicable to jobs that are running on Fargate resources; use containerOverrides instead.

retryStrategy

The retry strategy to use for failed jobs from this submit_job operation. When a retry strategy is specified here, it overrides the retry strategy defined in the job definition.
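A retry strategy override might be sketched as follows; the field names follow the Batch API, while the matching patterns and attempt count are illustrative:

```r
# Up to 3 attempts, retrying only when the status reason suggests a
# host-level failure and exiting on any other failure reason.
retry_strategy <- list(
  attempts = 3,
  evaluateOnExit = list(
    list(onStatusReason = "Host EC2*", action = "RETRY"),
    list(onReason = "*", action = "EXIT")
  )
)

# svc$submit_job(..., retryStrategy = retry_strategy)
```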

propagateTags

Specifies whether to propagate the tags from the job or job definition to the corresponding Amazon ECS task. If no value is specified, the tags aren't propagated. Tags can only be propagated to the tasks during task creation. For tags with the same name, job tags are given priority over job definition tags. If the total number of combined tags from the job and job definition is over 50, the job is moved to the FAILED state. When specified, this overrides the tag propagation setting in the job definition.

timeout

The timeout configuration for this submit_job operation. You can specify a timeout duration after which Batch terminates your jobs if they haven't finished. If a job is terminated due to a timeout, it isn't retried. The minimum value for the timeout is 60 seconds. This configuration overrides any timeout configuration specified in the job definition. For array jobs, child jobs have the same timeout configuration as the parent job. For more information, see Job Timeouts in the Amazon Elastic Container Service Developer Guide.
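For example, to terminate any attempt that runs longer than one hour (the duration here is illustrative; the minimum allowed is 60 seconds):

```r
# Timeout for each attempt, in seconds.
job_timeout <- list(attemptDurationSeconds = 3600)

# svc$submit_job(..., timeout = job_timeout)
```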

tags

The tags that you apply to the job request to help you categorize and organize your resources. Each tag consists of a key and an optional value. For more information, see Tagging Amazon Web Services Resources in Amazon Web Services General Reference.

eksPropertiesOverride

An object with properties that override the defaults for the job definition. This parameter can only be specified for jobs that run on Amazon EKS resources.

ecsPropertiesOverride

An object with properties that override the defaults for the job definition. This parameter can only be specified for jobs that run on Amazon ECS resources.