
rNOMADS (version 2.5.3)

GribGrab: Download a grib file from the NOMADS server.

Description

This function interfaces with the programming API at https://nomads.ncep.noaa.gov/ to download NOMADS model data. The available models can be viewed by calling NOMADSRealTimeList. The data arrives in grib (gridded binary) format that can be read with ReadGrib.
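For example, the following minimal sketch lists the available real time models. The "grib" argument and the $abbrevs element are assumptions about NOMADSRealTimeList's interface, not taken from this page; check that function's documentation before relying on them.

library(rNOMADS)

#List models that serve real time data in grib format
model.list <- NOMADSRealTimeList("grib")
print(model.list$abbrevs)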

Usage

GribGrab(model.url, preds, levels, variables, 
    local.dir = NULL, file.names = NULL,
    model.domain = NULL, tidy = FALSE, verbose = TRUE,
    check.url = TRUE, download.method = NULL)

Value

grib.info$file.name

The path and file name of the grib file that was downloaded.

grib.info$url

The URL that the grib file was downloaded from.
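When several files are downloaded, GribGrab returns a list of such objects, which is why the Examples section below accesses the first one as grib.info[[1]]. A minimal sketch of inspecting the result, assuming grib.info came from a successful GribGrab call:

#Inspect the result of a GribGrab call (see Examples below)
grib.info[[1]]$file.name  #local path of the downloaded grib file
grib.info[[1]]$url        #URL the file was retrieved from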

Arguments

model.url

The address of a model download page, probably from CrawlModels.

preds

A vector of predictions (or model times) determined by the specific model from model.url.

levels

A list of model levels to download.

variables

A list of model variables to download.

local.dir

Where to save the grib file, defaults to the current directory.

file.names

What to name the grib file, defaults to "fcst.grb".

model.domain

A vector of latitudes and longitudes that specify the area to return a forecast for. This is a rectangle with elements: west longitude, east longitude, north latitude, south latitude (see the example following the argument list).

tidy

If TRUE, remove all files with the suffix ".grb" from local.dir prior to downloading a new grib file.

verbose

If TRUE, give information on connection status. Defaults to TRUE.

check.url

If TRUE, verify that the model URL is real and contains data. Defaults to TRUE.

download.method

Allows the user to set the download method used by download.file: "internal", "wget", "curl", or "lynx". If NULL (the default), R chooses the method itself.
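The sketch below illustrates the optional arguments above, in particular the west/east/north/south ordering of model.domain. The bounding box coordinates are arbitrary, and urls.out, my.pred, levels, and variables are assumed to have been constructed as in the Examples section:

#Restrict the download to (roughly) the continental United States
#Order: west longitude, east longitude, north latitude, south latitude
conus <- c(-125, -65, 50, 25)
grib.info <- GribGrab(urls.out[1], my.pred, levels, variables,
    model.domain = conus, download.method = "curl")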

Author

Daniel C. Bowman <danny.c.bowman@gmail.com>

References

https://nomads.ncep.noaa.gov/

See Also

CrawlModels, ParseModelPage, ReadGrib

Examples


#An example for the Global Forecast System 0.5 degree model

#Get the latest model url
urls.out <- CrawlModels(abbrev = "gfs_0p50", depth = 1)

#Get a list of forecasts, variables and levels
model.parameters <- ParseModelPage(urls.out[1])

#Figure out which one is the 6 hour forecast
#provided by the latest model run
#(this will be 6 to 12 hours ahead of the current date,
#depending on when the model last ran)

my.pred <- model.parameters$pred[grep("06$", model.parameters$pred)]

#What region of the atmosphere to get data for
levels <- c("2 m above ground", "800 mb") 

#What data to return
variables <- c("TMP", "RH") #Temperature and relative humidity

#Get the data
grib.info <- GribGrab(urls.out[1], my.pred, levels, variables)

#Extract the data
model.data <- ReadGrib(grib.info[[1]]$file.name, levels, variables)

#Reformat it
model.grid <- ModelGrid(model.data, c(0.5, 0.5))

#Show an image of world temperature at ground level
image(model.grid$z[2, 1,,])
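The final image() call plots the grid in array index coordinates. To place the same slice on geographic axes, a sketch is given below; it assumes that model.grid$x holds longitudes, model.grid$y holds latitudes, and that the last two dimensions of model.grid$z correspond to them, so check ModelGrid's documentation before relying on it.

#Same slice of the grid, but with longitude/latitude axes
image(model.grid$x, model.grid$y, model.grid$z[2, 1,,],
    xlab = "Longitude", ylab = "Latitude")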
