paws.database (version 0.7.0)

timestreamwrite: Amazon Timestream Write

Description

Amazon Timestream is a fast, scalable, fully managed time-series database service that makes it easy to store and analyze trillions of time-series data points per day. With Timestream, you can easily store and analyze IoT sensor data to derive insights from your IoT applications. You can analyze industrial telemetry to streamline equipment management and maintenance. You can also store and analyze log data and metrics to improve the performance and availability of your applications.

Timestream is built from the ground up to effectively ingest, process, and store time-series data. It organizes data to optimize query processing. It automatically scales based on the volume of data ingested and on the query volume to ensure you receive optimal performance while inserting and querying data. As your data grows over time, Timestream’s adaptive query processing engine spans across storage tiers to provide fast analysis while reducing costs.

Usage

timestreamwrite(
  config = list(),
  credentials = list(),
  endpoint = NULL,
  region = NULL
)

Value

A client for the service. You can call the service's operations using syntax like svc$operation(...), where svc is the name you've assigned to the client. The available operations are listed in the Operations section.
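
For example, a client created with default settings can call an operation that needs no arguments. This sketch assumes AWS credentials and a Region are already available through the usual mechanisms (environment variables, a shared credentials file, or an IAM role):

library(paws.database)

# Build a Timestream Write client with default configuration
svc <- timestreamwrite()

# list_databases() takes no required arguments; the response is a named list
resp <- svc$list_databases()
resp$Databases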

Arguments

config

Optional configuration of credentials, endpoint, and/or region.

  • credentials:

    • creds:

      • access_key_id: AWS access key ID

      • secret_access_key: AWS secret access key

      • session_token: AWS temporary session token

    • profile: The name of a profile to use. If not given, then the default profile is used.

    • anonymous: Set anonymous credentials.

  • endpoint: The complete URL to use for the constructed client.

  • region: The AWS Region used when instantiating the client.

  • close_connection: Immediately close all HTTP connections.

  • timeout: The time in seconds until a timeout exception is thrown when attempting to make a connection. The default is 60 seconds.

  • s3_force_path_style: Set this to true to force the request to use path-style addressing, i.e. http://s3.amazonaws.com/BUCKET/KEY.

  • sts_regional_endpoint: Set the STS regional endpoint resolver to regional or legacy. See https://docs.aws.amazon.com/sdkref/latest/guide/feature-sts-regionalized-endpoints.html for details.

credentials

Optional shorthand for the credentials element of the config parameter.

  • creds:

    • access_key_id: AWS access key ID

    • secret_access_key: AWS secret access key

    • session_token: AWS temporary session token

  • profile: The name of a profile to use. If not given, then the default profile is used.

  • anonymous: Set anonymous credentials.

endpoint

Optional shorthand for the complete URL to use for the constructed client.

region

Optional shorthand for the AWS Region used when instantiating the client.

Service syntax

svc <- timestreamwrite(
  config = list(
    credentials = list(
      creds = list(
        access_key_id = "string",
        secret_access_key = "string",
        session_token = "string"
      ),
      profile = "string",
      anonymous = "logical"
    ),
    endpoint = "string",
    region = "string",
    close_connection = "logical",
    timeout = "numeric",
    s3_force_path_style = "logical",
    sts_regional_endpoint = "string"
  ),
  credentials = list(
    creds = list(
      access_key_id = "string",
      secret_access_key = "string",
      session_token = "string"
    ),
    profile = "string",
    anonymous = "logical"
  ),
  endpoint = "string",
  region = "string"
)
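
For instance, a client scoped to a specific Region can be built either through config or through the shorthand arguments. The values below (profile name, Region, timeout) are placeholders, not defaults:

# Full configuration via the config argument (values are illustrative)
svc <- timestreamwrite(
  config = list(
    credentials = list(profile = "my-profile"),
    region = "us-east-1",
    close_connection = TRUE,
    timeout = 30
  )
)

# Equivalent client using the shorthand arguments
svc <- timestreamwrite(
  credentials = list(profile = "my-profile"),
  region = "us-east-1"
)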

Operations

create_batch_load_task: Creates a new Timestream batch load task
create_database: Creates a new Timestream database
create_table: Adds a new table to an existing database in your account
delete_database: Deletes a given Timestream database
delete_table: Deletes a given Timestream table
describe_batch_load_task: Returns information about the batch load task, including configurations, mappings, progress, and other details
describe_database: Returns information about the database, including the database name, time that the database was created, and the total number of tables found within the database
describe_endpoints: Returns a list of available endpoints to make Timestream API calls against
describe_table: Returns information about the table, including the table name, database name, retention duration of the memory store and the magnetic store
list_batch_load_tasks: Provides a list of batch load tasks, along with the name, status, when the task is resumable until, and other details
list_databases: Returns a list of your Timestream databases
list_tables: Provides a list of tables, along with the name, status, and retention properties of each table
list_tags_for_resource: Lists all tags on a Timestream resource
resume_batch_load_task: Resumes a batch load task
tag_resource: Associates a set of tags with a Timestream resource
untag_resource: Removes the association of tags from a Timestream resource
update_database: Modifies the KMS key for an existing database
update_table: Modifies the retention duration of the memory store and magnetic store for your Timestream table
write_records: Enables you to write your time-series data into Timestream
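
A typical ingestion workflow chains several of these operations: create a database, create a table with retention settings, and write records. The sketch below is illustrative only; all names and values are placeholders, and the nested argument structure follows the Timestream Write CreateTable and WriteRecords APIs:

svc <- timestreamwrite(region = "us-east-1")

svc$create_database(DatabaseName = "sensor_db")

svc$create_table(
  DatabaseName = "sensor_db",
  TableName = "sensor_readings",
  RetentionProperties = list(
    MemoryStoreRetentionPeriodInHours = 24,
    MagneticStoreRetentionPeriodInDays = 7
  )
)

# Each record carries dimensions, a measure, and a timestamp
svc$write_records(
  DatabaseName = "sensor_db",
  TableName = "sensor_readings",
  Records = list(
    list(
      Dimensions = list(list(Name = "device_id", Value = "sensor-001")),
      MeasureName = "temperature",
      MeasureValue = "23.5",
      MeasureValueType = "DOUBLE",
      # Time is the epoch timestamp as a string, in the units given by TimeUnit
      Time = format(round(as.numeric(Sys.time()) * 1000), scientific = FALSE),
      TimeUnit = "MILLISECONDS"
    )
  )
)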

Examples

if (FALSE) {
svc <- timestreamwrite()
# Illustrative call: a real create_batch_load_task request also requires
# data source and report configuration arguments (see its documentation).
svc$create_batch_load_task(
  TargetDatabaseName = "example_db",
  TargetTableName = "example_table"
)
}
