
rNOMADS (version 2.5.3)

CrawlModels: Get Available Model Runs

Description

This function determines which instances of a given model are available for download.

Usage

CrawlModels(abbrev = NULL, model.url = NULL, depth = NULL, verbose = TRUE)

Value

urls.out

A list of web page addresses, each of which corresponds to a model instance.

Arguments

abbrev

The model abbreviation; see NOMADSRealTimeList. Defaults to NULL.

model.url

A URL to use instead of an abbreviation from NOMADSRealTimeList. Defaults to NULL.

depth

How many model instances to return. This avoids downloading the entire model list (sometimes several hundred entries) when only the first few instances are required. Defaults to NULL, which returns everything.

verbose

Whether to print each link as it is discovered. Defaults to TRUE.

Author

Daniel C. Bowman danny.c.bowman@gmail.com

Details

This function calls WebCrawler, a recursive routine that discovers every link on the URL provided, follows each of those links in turn, and continues until it reaches a dead end, at which point it returns the URL. On the NOMADS model pages, each dead end corresponds to a model instance that can be examined with ParseModelPage or have data retrieved from it with GribGrab.
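The workflow described above can be sketched as follows. This is an illustrative sketch, not a definitive recipe: it requires an internet connection, and the model abbreviation and the fields returned by ParseModelPage depend on the current state of the NOMADS server.

```r
library(rNOMADS)

# Crawl the GFS 0.5 degree model page, keeping only the newest instance
urls.out <- CrawlModels(abbrev = "gfs_0p50", depth = 1, verbose = FALSE)

# Each dead-end URL is a model instance; inspect its contents
model.parameters <- ParseModelPage(urls.out[1])

# The parsed page lists available forecasts, levels, and variables,
# which can then be passed to GribGrab to retrieve data
head(model.parameters$pred)
```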

See Also

WebCrawler, ParseModelPage, NOMADSRealTimeList, GribGrab

Examples


#Get the latest 5 instances
#for the Global Forecast System 0.5 degree model

# Not run (requires an internet connection):
urls.out <- CrawlModels(abbrev = "gfs_0p50", depth = 5)
