
SparkR (version 3.1.2)

localCheckpoint

Description

Returns a locally checkpointed version of this SparkDataFrame. Checkpointing can be used to truncate the logical plan, which is especially useful in iterative algorithms where the plan may grow exponentially. Local checkpoints are stored on the executors using the caching subsystem, and are therefore not reliable.

Usage

localCheckpoint(x, eager = TRUE)

# S4 method for SparkDataFrame
localCheckpoint(x, eager = TRUE)

Arguments

x

A SparkDataFrame

eager

whether to locally checkpoint this SparkDataFrame immediately

Value

a new locally checkpointed SparkDataFrame

See Also

Other SparkDataFrame functions: SparkDataFrame-class, agg(), alias(), arrange(), as.data.frame(), attach,SparkDataFrame-method, broadcast(), cache(), checkpoint(), coalesce(), collect(), colnames(), coltypes(), createOrReplaceTempView(), crossJoin(), cube(), dapplyCollect(), dapply(), describe(), dim(), distinct(), dropDuplicates(), dropna(), drop(), dtypes(), exceptAll(), except(), explain(), filter(), first(), gapplyCollect(), gapply(), getNumPartitions(), group_by(), head(), hint(), histogram(), insertInto(), intersectAll(), intersect(), isLocal(), isStreaming(), join(), limit(), merge(), mutate(), ncol(), nrow(), persist(), printSchema(), randomSplit(), rbind(), rename(), repartitionByRange(), repartition(), rollup(), sample(), saveAsTable(), schema(), selectExpr(), select(), showDF(), show(), storageLevel(), str(), subset(), summary(), take(), toJSON(), unionAll(), unionByName(), union(), unpersist(), withColumn(), withWatermark(), with(), write.df(), write.jdbc(), write.json(), write.orc(), write.parquet(), write.stream(), write.text()

Examples

# NOT RUN {
sparkR.session()
df <- createDataFrame(faithful)
# Replace df with a locally checkpointed copy, truncating its lineage
df <- localCheckpoint(df)
# }
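
The following sketch (not part of the official SparkR examples) illustrates the iterative use case from the Description: each transformation lengthens the lineage of df, and a periodic localCheckpoint() keeps the logical plan short. The column name "waiting", the loop bound, and the checkpoint interval are illustrative assumptions only.

# NOT RUN {
sparkR.session()
df <- createDataFrame(faithful)
for (i in 1:10) {
  # Each iteration adds another node to the logical plan
  df <- withColumn(df, "waiting", column("waiting") * 2)
  if (i %% 5 == 0) {
    # Truncate the plan; eager = TRUE materializes the checkpoint immediately
    df <- localCheckpoint(df, eager = TRUE)
  }
}
head(df)
# }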
