Batch
Using Batch, you can run batch computing workloads on the Amazon Web Services Cloud. Batch computing is a common way for developers, scientists, and engineers to access large amounts of compute resources. Batch uses the advantages of batch computing to remove the undifferentiated heavy lifting of configuring and managing the required infrastructure, while adopting a familiar batch computing software approach. You can use Batch to efficiently provision resources, work toward eliminating capacity constraints, reduce your overall compute costs, and deliver results more quickly.
As a fully managed service, Batch can run batch computing workloads of any scale. Batch automatically provisions compute resources and optimizes workload distribution based on the quantity and scale of your specific workloads. With Batch, there's no need to install or manage batch computing software. This means that you can focus on analyzing results and solving your specific problems instead.
batch(config = list(), credentials = list(), endpoint = NULL, region = NULL)
A client for the service. You can call the service's operations using syntax like svc$operation(...), where svc is the name you've assigned to the client. The available operations are listed in the Operations section.
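For example, a minimal sketch, assuming default credentials are available; the job queue name is a placeholder for illustration:

svc <- batch()
# List running jobs on a queue (the queue name is a placeholder)
svc$list_jobs(
  jobQueue = "my-job-queue",
  jobStatus = "RUNNING"
)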
Optional configuration of credentials, endpoint, and/or region.
credentials:
creds:
access_key_id: AWS access key ID
secret_access_key: AWS secret access key
session_token: AWS temporary session token
profile: The name of a profile to use. If not given, then the default profile is used.
anonymous: Set anonymous credentials.
endpoint: The complete URL to use for the constructed client.
region: The AWS Region used in instantiating the client.
close_connection: Immediately close all HTTP connections.
timeout: The time in seconds until a timeout exception is thrown when attempting to make a connection. The default is 60 seconds.
s3_force_path_style: Set this to true to force the request to use path-style addressing, i.e., http://s3.amazonaws.com/BUCKET/KEY.
sts_regional_endpoint: Set the STS regional endpoint resolver to regional or legacy (https://docs.aws.amazon.com/sdkref/latest/guide/feature-sts-regionalized-endpoints.html).
Optional credentials shorthand for the config parameter.
creds:
access_key_id: AWS access key ID
secret_access_key: AWS secret access key
session_token: AWS temporary session token
profile: The name of a profile to use. If not given, then the default profile is used.
anonymous: Set anonymous credentials.
Optional shorthand for complete URL to use for the constructed client.
Optional shorthand for AWS Region used in instantiating the client.
svc <- batch(
config = list(
credentials = list(
creds = list(
access_key_id = "string",
secret_access_key = "string",
session_token = "string"
),
profile = "string",
anonymous = "logical"
),
endpoint = "string",
region = "string",
close_connection = "logical",
timeout = "numeric",
s3_force_path_style = "logical",
sts_regional_endpoint = "string"
),
credentials = list(
creds = list(
access_key_id = "string",
secret_access_key = "string",
session_token = "string"
),
profile = "string",
anonymous = "logical"
),
endpoint = "string",
region = "string"
)
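As a concrete sketch of the configuration arguments above, the client below pins a region and raises the connection timeout; the profile name and region are placeholders, not recommended defaults:

svc <- batch(
  config = list(
    credentials = list(
      profile = "default"
    ),
    region = "us-east-1",
    timeout = 120
  )
)

# The credentials, endpoint, and region shorthands cover the common case
svc <- batch(
  credentials = list(profile = "default"),
  region = "us-east-1"
)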
cancel_job | Cancels a job in a Batch job queue
create_compute_environment | Creates a Batch compute environment
create_job_queue | Creates a Batch job queue
create_scheduling_policy | Creates a Batch scheduling policy
delete_compute_environment | Deletes a Batch compute environment
delete_job_queue | Deletes the specified job queue
delete_scheduling_policy | Deletes the specified scheduling policy
deregister_job_definition | Deregisters a Batch job definition
describe_compute_environments | Describes one or more of your compute environments
describe_job_definitions | Describes a list of job definitions
describe_job_queues | Describes one or more of your job queues
describe_jobs | Describes a list of Batch jobs
describe_scheduling_policies | Describes one or more of your scheduling policies
get_job_queue_snapshot | Provides a list of the first 100 RUNNABLE jobs associated with a single job queue
list_jobs | Returns a list of Batch jobs
list_scheduling_policies | Returns a list of Batch scheduling policies
list_tags_for_resource | Lists the tags for a Batch resource
register_job_definition | Registers a Batch job definition
submit_job | Submits a Batch job from a job definition
tag_resource | Associates the specified tags with the resource specified by resourceArn
terminate_job | Terminates a job in a job queue
untag_resource | Deletes the specified tags from a Batch resource
update_compute_environment | Updates a Batch compute environment
update_job_queue | Updates a job queue
update_scheduling_policy | Updates a scheduling policy
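As a sketch of a typical submit-and-inspect workflow using these operations, the job, queue, and job definition names below are hypothetical and must already exist in your account:

# Submit a job from a registered job definition (names are placeholders)
resp <- svc$submit_job(
  jobName = "example-job",
  jobQueue = "my-job-queue",
  jobDefinition = "my-job-definition"
)

# Check the status of the submitted job by its ID
svc$describe_jobs(
  jobs = list(resp$jobId)
)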
if (FALSE) {
svc <- batch()
# This example cancels a job with the specified job ID.
svc$cancel_job(
jobId = "1d828f65-7a4d-42e8-996d-3b900ed59dc4",
reason = "Cancelling job."
)
}