
Download all CSV files from folders and subfolders with URL in R

I was trying to download the CSV files from this site: http://tiservice.hii.or.th/opendata/data_catalog/daily_rain/ . These are daily rainfall data, stored in one folder per year, with a subfolder for each month.

[screenshot: example of the folder structure]

Each month folder contains an individual file for each weather station. I want to be able to pull the data for a specific station (e.g. ABRT.csv) from all the subfolders within the parent directory.

I was able to do this with files in a local folder like this:

dir("climate_data/temperature/", recursive=TRUE, full.names=TRUE, pattern="ABRT\\.csv$")
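As an aside, in R regular expressions a literal dot must be escaped (`\\.`), otherwise `.` matches any single character. A minimal illustration with made-up file names:

```r
files <- c("ABRT.csv", "2012/201201/ABRT.csv", "ABRTXcsv")

# Escaped: "." only matches a literal dot
grepl("ABRT\\.csv$", files)  # TRUE TRUE FALSE

# Unescaped: "." also matches the "X" in "ABRTXcsv"
grepl("ABRT.csv$", files)    # TRUE TRUE TRUE
```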

Is it possible to do something similar directly from the URL in R (or from the terminal, if that would be easier)?

How about this:

eg <- expand.grid(2012:2021, 
                  c("01", "02", "03", "04", "05", "06", "07", "08", "09", "10", "11", "12"))
eg$url <- paste("http://tiservice.hii.or.th/opendata/data_catalog/daily_rain/",
                eg[,1], 
                "/", 
                paste(eg[,1], 
                      eg[,2], 
                      sep=""), 
                "/ABRT.csv", 
                sep="")
eg$dest <- paste(eg[,1], eg[,2], "ABRT.csv", sep="_")
# Note the loop index: use [i], not [1], or every iteration
# re-downloads the same first file
for(i in 1:nrow(eg)){
  curl::curl_download(eg$url[i], eg$dest[i])
}
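One caveat: if any station/month combination is missing on the server, `curl_download()` stops with an error and the loop aborts. A variant of the same idea (same assumed URL layout) that zero-pads the months programmatically and skips missing files could look like this; the actual download is gated behind a flag so the URL construction can be checked on its own:

```r
# Assumed URL scheme, as above: .../daily_rain/<year>/<year><month>/<station>
base    <- "http://tiservice.hii.or.th/opendata/data_catalog/daily_rain/"
station <- "ABRT.csv"

# Build one row per year/month; sprintf() zero-pads the months
eg <- expand.grid(year = 2012:2021, month = sprintf("%02d", 1:12))
eg$url  <- paste0(base, eg$year, "/", eg$year, eg$month, "/", station)
eg$dest <- paste(eg$year, eg$month, station, sep = "_")

run_downloads <- FALSE  # set TRUE to actually fetch the files
if (run_downloads) {
  for (i in seq_len(nrow(eg))) {
    # tryCatch() turns a missing file (HTTP error) into a message
    # instead of aborting the whole loop
    tryCatch(curl::curl_download(eg$url[i], eg$dest[i]),
             error = function(e) message("Skipping ", eg$url[i]))
  }
}
```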
