Discover all links on a given web page, follow each one, and recursively scan every link found.
Return a list of web addresses whose pages contain no links.
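The recursion described above can be sketched in plain R. The sketch below is an illustration of the idea only, not the package's actual implementation; it assumes the xml2 package for link extraction, and GetLinks() and CrawlSketch() are hypothetical names introduced here.

library(xml2)

# Hypothetical helper: return the absolute URLs of all anchor tags on a page.
GetLinks <- function(url) {
  page  <- read_html(url)
  hrefs <- xml_attr(xml_find_all(page, ".//a"), "href")
  unique(url_absolute(hrefs[!is.na(hrefs)], url))
}

# Sketch of the crawl: visit pages breadth-first, collect those with no links.
CrawlSketch <- function(url, depth = NULL, verbose = TRUE) {
  leaves  <- character(0)              # pages found to contain no links
  visited <- character(0)              # pages already scanned (avoid cycles)
  queue   <- url                       # pages still to be visited
  while (length(queue) > 0) {
    current <- queue[1]
    queue   <- queue[-1]
    if (current %in% visited) next
    visited <- c(visited, current)
    if (verbose) cat(current, "\n")
    links <- tryCatch(GetLinks(current), error = function(e) character(0))
    if (length(links) == 0) {
      leaves <- c(leaves, current)     # no outgoing links: keep this address
      if (!is.null(depth) && length(leaves) >= depth) break
    } else {
      queue <- c(queue, links)
    }
  }
  as.list(leaves)                      # the documented function returns a list
}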
Usage
WebCrawler(url, depth = NULL, verbose = TRUE)
Value
urls.out
A list of web page addresses, each of which corresponds to a model instance.
Arguments
url
A URL to scan for links.
depth
The maximum number of links to return.
This avoids having to recursively scan hundreds of links.
Defaults to NULL, which returns everything.
verbose
Print out each link as it is discovered.
Defaults to TRUE.
Examples
# Find the first 10 model runs for the GFS 0.5x0.5 model
if (FALSE) {
  urls.out <- WebCrawler(
    "http://nomads.ncep.noaa.gov/cgi-bin/filter_gfs_0p50.pl", depth = 10)
}