paws.database (version 0.7.0)

dynamodb: Amazon DynamoDB

Description

Amazon DynamoDB is a fully managed NoSQL database service that provides fast and predictable performance with seamless scalability. DynamoDB lets you offload the administrative burdens of operating and scaling a distributed database, so that you don't have to worry about hardware provisioning, setup and configuration, replication, software patching, or cluster scaling.

With DynamoDB, you can create database tables that can store and retrieve any amount of data, and serve any level of request traffic. You can scale up or scale down your tables' throughput capacity without downtime or performance degradation, and use the Amazon Web Services Management Console to monitor resource utilization and performance metrics.

DynamoDB automatically spreads the data and traffic for your tables over a sufficient number of servers to handle your throughput and storage requirements, while maintaining consistent and fast performance. All of your data is stored on solid state disks (SSDs) and automatically replicated across multiple Availability Zones in an Amazon Web Services Region, providing built-in high availability and data durability.

Usage

dynamodb(config = list(), credentials = list(), endpoint = NULL, region = NULL)

Value

A client for the service. You can call the service's operations using syntax like svc$operation(...), where svc is the name you've assigned to the client. The available operations are listed in the Operations section.
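For instance, a client can be constructed and an operation invoked as follows. This is a minimal sketch: it assumes AWS credentials are already configured (for example via environment variables or a shared credentials file), and the region shown is illustrative.

```r
library(paws.database)

# Construct a DynamoDB client; the region here is illustrative
svc <- dynamodb(config = list(region = "us-east-1"))

# Call an operation with svc$operation(...), e.g. list the
# table names associated with the current account and endpoint
resp <- svc$list_tables()
print(resp$TableNames)
```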

Arguments

config

Optional configuration of credentials, endpoint, and/or region.

  • credentials:

    • creds:

      • access_key_id: AWS access key ID

      • secret_access_key: AWS secret access key

      • session_token: AWS temporary session token

    • profile: The name of a profile to use. If not given, then the default profile is used.

    • anonymous: Set anonymous credentials.

  • endpoint: The complete URL to use for the constructed client.

  • region: The AWS Region used in instantiating the client.

  • close_connection: Immediately close all HTTP connections.

  • timeout: The time in seconds until a timeout exception is thrown when attempting to make a connection. The default is 60 seconds.

  • s3_force_path_style: Set this to true to force requests to use path-style addressing, for example http://s3.amazonaws.com/BUCKET/KEY.

  • sts_regional_endpoint: Set the STS regional endpoint resolver to regional or legacy. See https://docs.aws.amazon.com/sdkref/latest/guide/feature-sts-regionalized-endpoints.html.

credentials

Optional credentials shorthand for the config parameter.

  • creds:

    • access_key_id: AWS access key ID

    • secret_access_key: AWS secret access key

    • session_token: AWS temporary session token

  • profile: The name of a profile to use. If not given, then the default profile is used.

  • anonymous: Set anonymous credentials.

endpoint

Optional shorthand for the complete URL to use for the constructed client.

region

Optional shorthand for the AWS Region used in instantiating the client.
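The shorthand arguments map onto the corresponding entries of config, so the following two calls should construct equivalent clients (the region name is illustrative):

```r
library(paws.database)

# Shorthand form
svc1 <- dynamodb(region = "eu-west-1")

# Equivalent config form
svc2 <- dynamodb(config = list(region = "eu-west-1"))
```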

Service syntax

svc <- dynamodb(
  config = list(
    credentials = list(
      creds = list(
        access_key_id = "string",
        secret_access_key = "string",
        session_token = "string"
      ),
      profile = "string",
      anonymous = "logical"
    ),
    endpoint = "string",
    region = "string",
    close_connection = "logical",
    timeout = "numeric",
    s3_force_path_style = "logical",
    sts_regional_endpoint = "string"
  ),
  credentials = list(
    creds = list(
      access_key_id = "string",
      secret_access_key = "string",
      session_token = "string"
    ),
    profile = "string",
    anonymous = "logical"
  ),
  endpoint = "string",
  region = "string"
)

Operations

batch_execute_statement: This operation allows you to perform batch reads or writes on data stored in DynamoDB, using PartiQL
batch_get_item: The BatchGetItem operation returns the attributes of one or more items from one or more tables
batch_write_item: The BatchWriteItem operation puts or deletes multiple items in one or more tables
create_backup: Creates a backup for an existing table
create_global_table: Creates a global table from an existing table
create_table: The CreateTable operation adds a new table to your account
delete_backup: Deletes an existing backup of a table
delete_item: Deletes a single item in a table by primary key
delete_resource_policy: Deletes the resource-based policy attached to the resource, which can be a table or stream
delete_table: The DeleteTable operation deletes a table and all of its items
describe_backup: Describes an existing backup of a table
describe_continuous_backups: Checks the status of continuous backups and point in time recovery on the specified table
describe_contributor_insights: Returns information about contributor insights for a given table or global secondary index
describe_endpoints: Returns the regional endpoint information
describe_export: Describes an existing table export
describe_global_table: Returns information about the specified global table
describe_global_table_settings: Describes Region-specific settings for a global table
describe_import: Represents the properties of the import
describe_kinesis_streaming_destination: Returns information about the status of Kinesis streaming
describe_limits: Returns the current provisioned-capacity quotas for your Amazon Web Services account in a Region, both for the Region as a whole and for any one DynamoDB table that you create there
describe_table: Returns information about the table, including the current status of the table, when it was created, the primary key schema, and any indexes on the table
describe_table_replica_auto_scaling: Describes auto scaling settings across replicas of the global table at once
describe_time_to_live: Gives a description of the Time to Live (TTL) status on the specified table
disable_kinesis_streaming_destination: Stops replication from the DynamoDB table to the Kinesis data stream
enable_kinesis_streaming_destination: Starts table data replication to the specified Kinesis data stream at a timestamp chosen during the enable workflow
execute_statement: This operation allows you to perform reads and singleton writes on data stored in DynamoDB, using PartiQL
execute_transaction: This operation allows you to perform transactional reads or writes on data stored in DynamoDB, using PartiQL
export_table_to_point_in_time: Exports table data to an S3 bucket
get_item: The GetItem operation returns a set of attributes for the item with the given primary key
get_resource_policy: Returns the resource-based policy document attached to the resource, which can be a table or stream, in JSON format
import_table: Imports table data from an S3 bucket
list_backups: List DynamoDB backups that are associated with an Amazon Web Services account and weren't made with Amazon Web Services Backup
list_contributor_insights: Returns a list of ContributorInsightsSummary for a table and all its global secondary indexes
list_exports: Lists completed exports within the past 90 days
list_global_tables: Lists all global tables that have a replica in the specified Region
list_imports: Lists completed imports within the past 90 days
list_tables: Returns an array of table names associated with the current account and endpoint
list_tags_of_resource: List all tags on an Amazon DynamoDB resource
put_item: Creates a new item, or replaces an old item with a new item
put_resource_policy: Attaches a resource-based policy document to the resource, which can be a table or stream
query: You must provide the name of the partition key attribute and a single value for that attribute
restore_table_from_backup: Creates a new table from an existing backup
restore_table_to_point_in_time: Restores the specified table to the specified point in time within EarliestRestorableDateTime and LatestRestorableDateTime
scan: The Scan operation returns one or more items and item attributes by accessing every item in a table or a secondary index
tag_resource: Associate a set of tags with an Amazon DynamoDB resource
transact_get_items: TransactGetItems is a synchronous operation that atomically retrieves multiple items from one or more tables (but not from indexes) in a single account and Region
transact_write_items: TransactWriteItems is a synchronous write operation that groups up to 100 action requests
untag_resource: Removes the association of tags from an Amazon DynamoDB resource
update_continuous_backups: UpdateContinuousBackups enables or disables point in time recovery for the specified table
update_contributor_insights: Updates the status for contributor insights for a specific table or index
update_global_table: Adds or removes replicas in the specified global table
update_global_table_settings: Updates settings for a global table
update_item: Edits an existing item's attributes, or adds a new item to the table if it does not already exist
update_kinesis_streaming_destination: The command to update the Kinesis stream destination
update_table: Modifies the provisioned throughput settings, global secondary indexes, or DynamoDB Streams settings for a given table
update_table_replica_auto_scaling: Updates auto scaling settings on your global tables at once
update_time_to_live: The UpdateTimeToLive method enables or disables Time to Live (TTL) for the specified table

Examples

if (FALSE) {
svc <- dynamodb()
# This example reads multiple items from the Music table using a batch of
# three GetItem requests.  Only the AlbumTitle attribute is returned.
svc$batch_get_item(
  RequestItems = list(
    Music = list(
      Keys = list(
        list(
          Artist = list(
            S = "No One You Know"
          ),
          SongTitle = list(
            S = "Call Me Today"
          )
        ),
        list(
          Artist = list(
            S = "Acme Band"
          ),
          SongTitle = list(
            S = "Happy Day"
          )
        ),
        list(
          Artist = list(
            S = "No One You Know"
          ),
          SongTitle = list(
            S = "Scared of My Shadow"
          )
        )
      ),
      ProjectionExpression = "AlbumTitle"
    )
  )
)
}
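In the same vein, a single item could be written and then read back with put_item and get_item. This is a hedged sketch: the Music table and its attribute names mirror the example above and are illustrative, and the code assumes a live table and configured credentials (hence the if (FALSE) guard used throughout these examples).

```r
if (FALSE) {
svc <- dynamodb()

# Write one item to the Music table (table and attributes are illustrative)
svc$put_item(
  TableName = "Music",
  Item = list(
    Artist = list(S = "Acme Band"),
    SongTitle = list(S = "Happy Day"),
    AlbumTitle = list(S = "Songs About Life")
  )
)

# Read the same item back by its full primary key
resp <- svc$get_item(
  TableName = "Music",
  Key = list(
    Artist = list(S = "Acme Band"),
    SongTitle = list(S = "Happy Day")
  )
)
print(resp$Item$AlbumTitle$S)
}
```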