paws (version 0.1.12)

kinesis: Amazon Kinesis

Description

Amazon Kinesis Data Streams Service API Reference

Amazon Kinesis Data Streams is a managed service that scales elastically for real-time processing of streaming big data.

Usage

kinesis(config = list())

Value

A client for the service. You can call the service's operations using syntax like svc$operation(...), where svc is the name you've assigned to the client. The available operations are listed in the Operations section.

Arguments

config

Optional configuration of credentials, endpoint, and/or region.

Service syntax

svc <- kinesis(
  config = list(
    credentials = list(
      creds = list(
        access_key_id = "string",
        secret_access_key = "string",
        session_token = "string"
      ),
      profile = "string"
    ),
    endpoint = "string",
    region = "string"
  )
)
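
For example, a client scoped to a specific region can be created as follows; the profile name and region shown here are placeholders, not values required by the package:

svc <- kinesis(
  config = list(
    credentials = list(
      profile = "my-profile"  # assumes this profile exists in your AWS credentials file
    ),
    region = "us-east-1"
  )
)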

Operations

add_tags_to_stream: Adds or updates tags for the specified Kinesis data stream
create_stream: Creates a Kinesis data stream
decrease_stream_retention_period: Decreases the Kinesis data stream's retention period, which is the length of time data records are accessible after they are added to the stream
delete_stream: Deletes a Kinesis data stream and all its shards and data
deregister_stream_consumer: To deregister a consumer, provide its ARN
describe_limits: Describes the shard limits and usage for the account
describe_stream: Describes the specified Kinesis data stream
describe_stream_consumer: To get the description of a registered consumer, provide the ARN of the consumer
describe_stream_summary: Provides a summarized description of the specified Kinesis data stream without the shard list
disable_enhanced_monitoring: Disables enhanced monitoring
enable_enhanced_monitoring: Enables enhanced Kinesis data stream monitoring for shard-level metrics
get_records: Gets data records from a Kinesis data stream's shard
get_shard_iterator: Gets an Amazon Kinesis shard iterator
increase_stream_retention_period: Increases the Kinesis data stream's retention period, which is the length of time data records are accessible after they are added to the stream
list_shards: Lists the shards in a stream and provides information about each shard
list_stream_consumers: Lists the consumers registered to receive data from a stream using enhanced fan-out, and provides information about each consumer
list_streams: Lists your Kinesis data streams
list_tags_for_stream: Lists the tags for the specified Kinesis data stream
merge_shards: Merges two adjacent shards in a Kinesis data stream and combines them into a single shard to reduce the stream's capacity to ingest and transport data
put_record: Writes a single data record into an Amazon Kinesis data stream
put_records: Writes multiple data records into a Kinesis data stream in a single call (also referred to as a PutRecords request)
register_stream_consumer: Registers a consumer with a Kinesis data stream
remove_tags_from_stream: Removes tags from the specified Kinesis data stream
split_shard: Splits a shard into two new shards in the Kinesis data stream, to increase the stream's capacity to ingest and transport data
start_stream_encryption: Enables or updates server-side encryption using an AWS KMS key for a specified stream
stop_stream_encryption: Disables server-side encryption for a specified stream
update_shard_count: Updates the shard count of the specified stream to the specified number of shards
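
As a rough illustration of how these operations fit together, the sketch below creates a stream, writes one record, and reads it back. The stream name, partition key, and shard count are arbitrary placeholders, and in practice a new stream takes a short time to become ACTIVE before records can be written:

if (FALSE) {
svc <- kinesis()

# Create a stream with a single shard, then write one record to it.
svc$create_stream(StreamName = "my-stream", ShardCount = 1)
svc$put_record(
  StreamName = "my-stream",
  Data = charToRaw("hello"),
  PartitionKey = "key-1"
)

# Read the record back: find a shard, get an iterator for it, then fetch records.
shards <- svc$list_shards(StreamName = "my-stream")
iterator <- svc$get_shard_iterator(
  StreamName = "my-stream",
  ShardId = shards$Shards[[1]]$ShardId,
  ShardIteratorType = "TRIM_HORIZON"
)
result <- svc$get_records(ShardIterator = iterator$ShardIterator)
}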

Examples

if (FALSE) {
svc <- kinesis()
svc$add_tags_to_stream(
  StreamName = "my-stream",
  Tags = list(
    Department = "analytics"
  )
)
}