
How to improve speed when converting a big spatial polygon data frame to a raster stack

I'm working with a spatial polygon data frame (spdf) dataset that contains a time series of monthly climate data. What I'm trying to achieve is to convert the spdf to a raster stack at 1000 m resolution for further statistical analysis. I've written R code, but it is very slow and takes ages to convert a single column. I would appreciate any tips to make it faster.

library(rgdal)   # readOGR
library(raster)  # raster, rasterize, stack

hru <- readOGR("E:\\Tade\\HRU\\ubn_merge.shp", layer = "ubn_merge") # spatial polygons
spdf <- merge(hru, spdf.2000, by.x = "HRU", by.y = "HRU", all.x = TRUE, sort = FALSE) # spdf nrow=565 ncol=375
# convert sp to raster
hru.ras <- raster(hru, resolution = 1000) # template raster: hru extent at 1 km resolution
et.stack <- stack()                       # empty stack to collect the monthly layers
for (i in 1:(ncol(spdf@data) - 1)) {      # one layer per monthly column m1, m2, ... (all columns except HRU)
  et.ras <- rasterize(spdf, hru.ras, field = paste("m", i, sep = "")) # rasterize month i
  et.stack <- stack(et.stack, et.ras)
}

Thanks

As Forrest says (and you have experienced), rasterize is a bit slow, but you can do much better than what you have now. You do not need to use a loop at all:

r <- raster(spdf, resolution=1000)
et.ras <- rasterize(spdf, r, field=paste0("m", 1:ncol(spdf)))

This will create a single RasterLayer with a Raster Attribute Table. To create a RasterStack, do:

s <- deratify(et.ras)

If you were to use a loop, use rasterize only once to get the polygon IDs, and then use subs for the actual variables of interest.
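
For illustration, here is a minimal sketch of that approach, building on the spdf and hru.ras objects from the question, and assuming the polygon ID column is called HRU and the monthly columns are named m1, m2, ... (adjust the names to your data):

library(raster)

# rasterize only once, writing the polygon ID into each cell
id.ras <- rasterize(spdf, hru.ras, field = "HRU")

# then substitute the IDs with each month's values using subs()
month.cols <- grep("^m[0-9]+$", names(spdf), value = TRUE)
et.stack <- stack()
for (f in month.cols) {
  lut <- data.frame(id = spdf$HRU, value = spdf[[f]])  # lookup table: polygon ID -> monthly value
  et.stack <- stack(et.stack, subs(id.ras, lut, by = "id", which = "value"))
}
names(et.stack) <- month.cols

Since each polygon is only drawn once, the per-month work is just a table lookup, which is much cheaper than repeated calls to rasterize.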

And of course this saves you the pain of external dependencies.

As of March 2020, you can now consider this new package.

https://cran.r-project.org/web/packages/fasterize/vignettes/using-fasterize.html

fasterize() works just like rasterize(), but is 100-1000 times faster. I think this can help a lot of people.
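
For this question, a minimal sketch of how it might be used (assuming the spdf and hru.ras objects from the question, and monthly columns named m1, m2, ...; note that fasterize() takes sf polygons, so the sp object is converted first):

library(sf)
library(raster)
library(fasterize)

hru_sf <- st_as_sf(spdf)                                     # fasterize() works on sf objects
month.cols <- grep("^m[0-9]+$", names(hru_sf), value = TRUE) # monthly columns m1, m2, ...

# rasterize each monthly column and stack the resulting layers
layers <- lapply(month.cols, function(f) fasterize(hru_sf, hru.ras, field = f))
et.stack <- stack(layers)
names(et.stack) <- month.cols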
