Submits an AWS Batch job from a job definition. Parameters specified during SubmitJob override parameters defined in the job definition.
batch_submit_job(jobName, jobQueue, arrayProperties, dependsOn,
jobDefinition, parameters, containerOverrides, nodeOverrides,
retryStrategy, timeout)
jobName
[required] The name of the job. It can be up to 128 characters long; the first character must be alphanumeric, and the name can contain uppercase and lowercase letters, numbers, hyphens, and underscores.
jobQueue
[required] The job queue into which the job is submitted. You can specify either the name or the Amazon Resource Name (ARN) of the queue.
arrayProperties
The array properties for the submitted job, such as the size of the array. The array size can be between 2 and 10,000. If you specify array properties for a job, it becomes an array job. For more information, see Array Jobs in the AWS Batch User Guide.
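As a sketch, an array job of 50 child jobs could be submitted as follows (the job name, queue, and job definition are illustrative placeholders, not values from this documentation):

```r
# Hypothetical names; substitute your own queue and job definition.
svc$submit_job(
  jobName = "array-example",
  jobQueue = "HighPriority",
  jobDefinition = "sleep60",
  arrayProperties = list(size = 50)  # spawns child jobs with indexes 0..49
)
```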
dependsOn
A list of dependencies for the job. A job can depend upon a maximum of 20 jobs. You can specify a SEQUENTIAL type dependency without specifying a job ID for array jobs so that each child array job completes sequentially, starting at index 0. You can also specify an N_TO_N type dependency with a job ID for array jobs. In that case, each index child of this job must wait for the corresponding index child of each dependency to complete before it can begin.
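To sketch both dependency types, the calls below submit a job that waits on a specific job ID, and an array job whose children run one after another (all identifiers are placeholders):

```r
# Depend on another job finishing first (placeholder job ID).
svc$submit_job(
  jobName = "dependent-example",
  jobQueue = "HighPriority",
  jobDefinition = "sleep60",
  dependsOn = list(
    list(jobId = "11111111-2222-3333-4444-555555555555")
  )
)

# SEQUENTIAL dependency on an array job: no job ID needed; each child
# starts only after the child at the previous index completes.
svc$submit_job(
  jobName = "sequential-array-example",
  jobQueue = "HighPriority",
  jobDefinition = "sleep60",
  arrayProperties = list(size = 10),
  dependsOn = list(
    list(type = "SEQUENTIAL")
  )
)
```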
jobDefinition
[required] The job definition used by this job. This value can be either a name:revision or the Amazon Resource Name (ARN) for the job definition.
parameters
Additional parameters passed to the job that replace parameter substitution placeholders set in the job definition. Parameters are specified as a key-value pair mapping. Parameters in a SubmitJob request override any corresponding parameter defaults from the job definition.
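For example, if a job definition's container command includes a placeholder such as Ref::inputfile, a concrete value can be substituted at submission time. The job definition name and parameter key below are hypothetical:

```r
# Assumes the job definition "process-file" references Ref::inputfile
# in its command; the value here replaces that placeholder.
svc$submit_job(
  jobName = "parameters-example",
  jobQueue = "HighPriority",
  jobDefinition = "process-file",
  parameters = list(inputfile = "s3://my-bucket/input.txt")
)
```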
containerOverrides
A list of container overrides in JSON format that specify the name of a container in the specified job definition and the overrides it should receive. You can override the default command for a container (that is specified in the job definition or the Docker image) with a command override. You can also override existing environment variables (that are specified in the job definition or Docker image) on a container, or add new environment variables to it, with an environment override.
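A minimal sketch of a command and environment override, using placeholder names:

```r
# Replaces the container's default command and injects an
# environment variable (all names are illustrative).
svc$submit_job(
  jobName = "override-example",
  jobQueue = "HighPriority",
  jobDefinition = "sleep60",
  containerOverrides = list(
    command = list("sleep", "120"),
    environment = list(
      list(name = "LOG_LEVEL", value = "debug")
    )
  )
)
```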
nodeOverrides
A list of node overrides in JSON format that specify the node range to target and the container overrides for that node range.
retryStrategy
The retry strategy to use for failed jobs from this SubmitJob operation. When a retry strategy is specified here, it overrides the retry strategy defined in the job definition.
timeout
The timeout configuration for this SubmitJob operation. You can specify a timeout duration after which AWS Batch terminates your jobs if they have not finished. If a job is terminated due to a timeout, it is not retried. The minimum value for the timeout is 60 seconds. This configuration overrides any timeout configuration specified in the job definition. For array jobs, child jobs have the same timeout configuration as the parent job. For more information, see Job Timeouts in the Amazon Elastic Container Service Developer Guide.
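Combining the two overrides above, a submission that retries up to three times and terminates each attempt after ten minutes might look like this (names are placeholders):

```r
svc$submit_job(
  jobName = "retry-timeout-example",
  jobQueue = "HighPriority",
  jobDefinition = "sleep60",
  retryStrategy = list(attempts = 3),
  timeout = list(attemptDurationSeconds = 600)  # must be at least 60
)
```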
svc$submit_job(
  jobName = "string",
  jobQueue = "string",
  arrayProperties = list(
    size = 123
  ),
  dependsOn = list(
    list(
      jobId = "string",
      type = "N_TO_N"|"SEQUENTIAL"
    )
  ),
  jobDefinition = "string",
  parameters = list(
    "string"
  ),
  containerOverrides = list(
    vcpus = 123,
    memory = 123,
    command = list(
      "string"
    ),
    instanceType = "string",
    environment = list(
      list(
        name = "string",
        value = "string"
      )
    )
  ),
  nodeOverrides = list(
    nodePropertyOverrides = list(
      list(
        targetNodes = "string",
        containerOverrides = list(
          vcpus = 123,
          memory = 123,
          command = list(
            "string"
          ),
          instanceType = "string",
          environment = list(
            list(
              name = "string",
              value = "string"
            )
          )
        )
      )
    )
  ),
  retryStrategy = list(
    attempts = 123
  ),
  timeout = list(
    attemptDurationSeconds = 123
  )
)
# This example submits a simple container job called example to the
# HighPriority job queue.
svc$submit_job(
  jobDefinition = "sleep60",
  jobName = "example",
  jobQueue = "HighPriority"
)