
readr (version 0.1.1)

Tokenizers

Description

Explicitly create tokenizer objects. Usually you will not call these functions directly, but will instead use one of the user-friendly wrappers like read_csv.
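
Tokenizers are normally consumed by readr's reading functions, but they can also be passed to lower-level helpers. A minimal sketch, assuming count_fields() is available in your version of readr (the inline data below is purely illustrative):

library(readr)

# A literal string containing newlines is treated as inline data
csv <- "a,b,c\n1,2,3\n4,5,NA\n"

# Build a CSV tokenizer explicitly, as read_csv() would do internally
tok <- tokenizer_csv(na = "NA")

# count_fields() uses the tokenizer to report the number of fields per record
count_fields(csv, tok)
# expected: 3 3 3 (three fields in each of the three records)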

Usage

tokenizer_delim(delim,
  quote = "\"", na = "NA", escape_double = TRUE, escape_backslash = FALSE)

tokenizer_csv(na = "NA")

tokenizer_tsv(na = "NA")

tokenizer_line()

tokenizer_log()

tokenizer_fwf(begin, end, na = "NA")

Arguments

delim
Single character used to separate fields within a record.
quote
Single character used to quote strings.
na
String to use for missing values.
escape_double
Does the file escape quotes by doubling them? i.e. if this option is TRUE, the value """" represents a field containing a single quote character, ".
escape_backslash
Does the file use backslashes to escape special characters? This is more general than escape_double, as backslashes can be used to escape the delimiter character, the quote character, or to add special characters like \n (see the sketch after these arguments).
begin, end
Begin and end offsets for each field. These are C++ offsets, so the first column is column zero, and the ranges are [begin, end) (i.e. inclusive-exclusive).
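
A brief sketch of how these arguments map onto tokenizer objects; the delimiter, escape settings, and column offsets below are illustrative values, not package defaults:

library(readr)

# Pipe-delimited data that escapes with backslashes rather than doubled quotes
tokenizer_delim(delim = "|", escape_backslash = TRUE, escape_double = FALSE)

# Fixed-width data with two fields spanning columns [0, 10) and [10, 20)
tokenizer_fwf(begin = c(0, 10), end = c(10, 20))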

Examples

tokenizer_csv()
