thedata <- importKCL(site = "my1", year = 2009, pollutant = "all",
    met = FALSE, units = "mass")
site: the code of the site to import, e.g. "my1" for Marylebone Road.
Several sites can be imported at once using, for example, site =
c("my1", "kc1") to import Marylebone Road and North Kensington.

year: the year or years to import. To import a sequence of years use,
for example, year = 1990:2000. To import several specific years use
year = c(1990, 1995, 2000) for example.

pollutant: the pollutant(s) to import. The default "all" imports all
available pollutants for a site; to import only NOx and NO2, for
example, use pollutant = c("nox", "no2").

met: should meteorological data also be imported? The default is FALSE.
If TRUE, wind speed (m/s), wind direction (degrees) and temperature
(deg. C) are added from the Heathrow site. This is currently for
testing.

units: by default concentrations are returned in mass units (units =
"mass"). Use units = "volume" to use ppb etc. PM10 TEOM data are
multiplied by 1.3 and PM2.5 have no correction applied.

The importKCL function is currently released as a test version. NOTE -
information is required on the site codes for easy use.
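As a short illustration of the units argument, a call such as the
following (the site and year are chosen only for illustration) returns
NOx in volume units (ppb etc.) rather than mass units:

noxppb <- importKCL(site = "my1", year = 2009, pollutant = "nox",
    units = "volume")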
The importKCL function has been written to make it easy
to import data from the King's College London air pollution
networks. KCL have provided .RData files (R workspaces) of all
individual sites and years for the KCL networks. These files are updated on a
weekly basis. This approach requires a link to the Internet to work.
There are several advantages over the web portal approach where .csv
files are downloaded. First, it is quick to select a range of sites,
pollutants and periods (see examples below). Second, storing the data as
.RData objects is very efficient as they are about four times smaller
than .csv files --- which means the data downloads quickly and saves
bandwidth. Third, the function completely avoids any need for data
manipulation or setting time formats, time zones etc. Finally, it is
easy to import many years of data beyond the current limit of about
64,000 lines. The final point makes it possible to download several long
time series in one go. The function also has the advantage that the
proper site name is imported and used in openair
functions.
The site codes and pollutant names can be upper or lower case. The
function will issue a warning when data less than six months old are
downloaded, because these data may not yet have been ratified.
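For example, the following two calls should return the same data, since
only the case of the site code and pollutant name differs (the year is
illustrative):

thedata <- importKCL(site = "KC1", year = 2008, pollutant = "NOX")
thedata <- importKCL(site = "kc1", year = 2008, pollutant = "nox")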
The data are imported by stacking sites on top of one another and will
have field names date, site, code (the site code) and pollutant(s).
Sometimes it is useful to have columns of site data instead. This can be
done using the reshape function --- see the examples below.
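Because all sites are stacked in one data frame, rows for a single site
can also be picked out from the code column before any reshaping; a
minimal sketch, assuming data have been imported as in the examples
below and that the code values are upper case (e.g. "MY1"):

## keep only the Marylebone Road rows from a stacked import
my1only <- subset(thedata, code == "MY1")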
While the function is being developed, the following site codes should
help with selection. We will also make available other meta data such as
site type and location to make it easier to select sites based on other
information.
See also: importAURN, importADMS.
## import all pollutants from Marylebone Rd from 2000 to 2009
mary <- importKCL(site = "my1", year = 2000:2009)
## import nox, no2, o3 from Marylebone Road and North Kensington for 2000
thedata <- importKCL(site = c("my1", "kc1"), year = 2000,
pollutant = c("nox", "no2", "o3"))
## import met data too...
my1 <- importKCL(site = "my1", year = 2008, met = TRUE)
## reshape the data so that each column represents a pollutant/site
thedata <- importKCL(site = c("my1", "kc1"), year = 2008,
pollutant = "o3")
wide <- reshape(thedata, idvar = "date", timevar = "code",
    direction = "wide")
## wide now has columns o3.MY1 and o3.KC1 (plus reshaped site columns)
## can also get rid of columns, in this case site:
wide <- reshape(thedata, idvar = "date", timevar = "code",
    direction = "wide", drop = "site")
## now can export as a csv file:
write.csv(wide, file = "~/temp/thedata.csv")