Description

This is needed to make a duplicate of a big.matrix, with the new copy optionally filebacked.
Usage

deepcopy(
  x,
  cols = NULL,
  rows = NULL,
  y = NULL,
  type = NULL,
  separated = NULL,
  backingfile = NULL,
  backingpath = NULL,
  descriptorfile = NULL,
  binarydescriptor = FALSE,
  shared = options()$bigmemory.default.shared
)
Arguments

x: a big.matrix.

cols: possible subset of columns for the deepcopy; could be numeric, named, or logical.

rows: possible subset of rows for the deepcopy; could be numeric, named, or logical.

y: optional destination object (matrix or big.matrix); if not specified, a big.matrix will be created.

type: preferably specified, "integer" for example.

separated: use separated column organization of the data instead of column-major organization; use with caution if the number of columns is large.

backingfile: the root name for the file(s) for the cache of x.

backingpath: the path to the directory containing the file-backing cache.

descriptorfile: the name of the descriptor file; we recommend specifying this for file-backing.
binarydescriptor: the flag to specify whether the binary RDS format should be used for the backingfile description, for subsequent use with attach.big.matrix; if NULL or FALSE, the dput() file format is used.
shared: TRUE by default, and always TRUE if the big.matrix is file-backed. For a non-filebacked big.matrix, shared = FALSE uses non-shared memory, which can be more stable for large (say, more than 50% of RAM) objects; shared-memory allocation can fail in such cases due to exhausted shared-memory resources in the system.
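Taken together, the file-backing arguments might be used as follows (a sketch; the file names "xcopy.bin" and "xcopy.desc" are illustrative, not required):

```r
library(bigmemory)

x <- as.big.matrix(matrix(1:30, 10, 3))

# Make a file-backed duplicate of x; the cache files land in tempdir() here.
y <- deepcopy(x,
              backingfile    = "xcopy.bin",
              backingpath    = tempdir(),
              descriptorfile = "xcopy.desc")

# The descriptor file permits later re-attachment in another session, e.g.:
# z <- attach.big.matrix("xcopy.desc", path = tempdir())
```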
Details

This is needed to make a duplicate of a big.matrix, because traditional syntax would only copy the object (the pointer to the big.matrix rather than the big.matrix itself). It can also make a copy of only a subset of columns.

Value

a big.matrix.
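The pointer-copy behavior described above can be observed directly (a minimal sketch):

```r
library(bigmemory)

x <- as.big.matrix(matrix(1:30, 10, 3))

# Plain assignment copies only the handle, not the underlying data:
y <- x
y[1, 1] <- 99L
x[1, 1]          # also 99 -- x and y point at the same data

# deepcopy() produces an independent matrix:
z <- deepcopy(x)
z[1, 1] <- 7L
x[1, 1]          # unaffected by the change to z
```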
Examples

library(bigmemory)

x <- as.big.matrix(matrix(1:30, 10, 3))
y <- deepcopy(x, cols = -1)  # Don't include the first column.
x
y
head(x)
head(y)
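Rows and columns can also be subset together, and the type changed in the copy (a sketch building on the example above):

```r
library(bigmemory)

x <- as.big.matrix(matrix(1:30, 10, 3))

# Copy only rows 1-5 of columns 2 and 3, converting to double storage:
y <- deepcopy(x, cols = 2:3, rows = 1:5, type = "double")
dim(y)  # 5 rows, 2 columns
```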