Please use api-job instead.
insert_query_job(query, project, destination_table = NULL,
  default_dataset = NULL, create_disposition = "CREATE_IF_NEEDED",
  write_disposition = "WRITE_EMPTY", use_legacy_sql = TRUE, ...)
query: SQL query string.

destination_table: (optional) destination table for large queries, either as a string in the format used by BigQuery, or as a list with project_id, dataset_id, and table_id entries (illustrated in the sketch after this list).

default_dataset: (optional) default dataset for any table references in query, either as a string in the format used by BigQuery or as a list with project_id and dataset_id entries.

create_disposition: behavior for table creation. Defaults to "CREATE_IF_NEEDED"; the only other supported value is "CREATE_NEVER". See the API documentation for more information.

write_disposition: behavior for writing data. Defaults to "WRITE_EMPTY"; other possible values are "WRITE_TRUNCATE" and "WRITE_APPEND". See the API documentation for more information.

use_legacy_sql: (optional) set to FALSE to enable BigQuery's standard SQL.
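
A minimal sketch of a call exercising these arguments, assuming the functions come from the bigrquery package; the project, dataset, and table names are placeholders, not part of this documentation:

library(bigrquery)  # assumed to be the package providing insert_query_job()

# "my-project", "scratch", and "shakespeare_sample" are placeholder names.
job <- insert_query_job(
  query = "SELECT word, word_count FROM [publicdata:samples.shakespeare] LIMIT 10",
  project = "my-project",
  destination_table = list(    # list form; the string form described above also works
    project_id = "my-project",
    dataset_id = "scratch",
    table_id = "shakespeare_sample"
  ),
  write_disposition = "WRITE_TRUNCATE",  # overwrite the destination table if it exists
  use_legacy_sql = TRUE                  # the [project:dataset.table] syntax above is legacy SQL
)

The call returns the job resource described below, not the query results themselves.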
Value: a job resource list, as documented at https://developers.google.com/bigquery/docs/reference/v2/jobs

See also: API documentation for the insert method: https://developers.google.com/bigquery/docs/reference/v2/jobs/insert
Other jobs: get_job, insert_extract_job, insert_upload_job, wait_for
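
A query job runs asynchronously, so a common follow-up is to poll it with the related functions listed above. This is a sketch under assumptions not stated on this page: that wait_for() accepts the job resource returned by insert_query_job() and returns the completed job, and that the resource is parsed as a nested list following the v2 jobs schema linked above.

# Continuing from the earlier sketch: block until the job finishes.
job <- wait_for(job)      # assumed to poll the job until it completes

# Fields from the v2 jobs resource (see the URL above) can then be inspected.
job$jobReference$jobId    # the server-assigned job id
job$status$state          # e.g. "DONE" once the job has completed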