I am working with 18M records of data in R. My computer does not have much RAM available, so I am trying the "ff" package to compensate. To keep the run time reasonable, I am also using the "foreach" package and running the job in parallel. The parallel "foreach" run fails on the full data, while smaller subsets (say, the first 100K rows) run correctly.
What I am trying to obtain is rolling daily averages of people's values based on dates: the average daily value over the past 7, 28, 91, etc. days. I am relatively new to R, so I do not understand its nuances. When I run this on the full data, it stops after an hour and gives the error:
Task 1 failed - object 'PersonID' not found
What can I do to use the "ff" package correctly together with the "foreach" package? It would also be great if there were some way to write the output into an ff data frame and then into SQL. The code is below:
library("ff")
library("ffbase")
library("RODBC")
myconn <- odbcConnect("NO SHOW")
data <- as.ffdf(sqlFetch(myconn, "NO SHOW"))
#data[data=="NULL"] <- NA
#persons <- unique(data$PersonID, incomparables=FALSE)
persons <- aggregate(Value ~ PersonID, data=data, FUN=length)$PersonID
rollingLength <- 7                                # number of rolling windows
rollingTimes <- c(7,28,91,182,364,728,100000000)  # window lengths in days (last = life-to-date)
valueCol <- 6                                     # column index of the daily value
sinceCol <- 4                                     # column index of days since the previous record
func <- function(stuff, id) {
  check <- subset(stuff, PersonID == id)
  tempvalue <- data.frame(matrix(, nrow = nrow(check), ncol = 7, byrow = TRUE))
  colnames(tempvalue) <- c("value7","value28","value91","value182","value364","value728","valueLTD")
  tempvalue[1,] <- c(NA,NA,NA,NA,NA,NA,NA)
  rollingTrips <- c(1,1,1,1,1,1,1)
  rollingSinceLast <- c(0,0,0,0,0,0,0)
  startIndex <- c(1,1,1,1,1,1,1)
  rollingvalues <- c(0,0,0,0,0,0,0)
  rollingvalues[1:rollingLength] <- check[1, valueCol]
  if (nrow(check) > 1) {
    for (r in 2:nrow(check)) {
      tempvalue[r,] <- rollingvalues / rollingTrips
      rollingvalues <- rollingvalues + check[r, valueCol]
      rollingTrips <- rollingTrips + 1
      rollingSinceLast <- rollingSinceLast + ifelse(is.na(check[r, sinceCol]), 0, check[r, sinceCol])
      # drop records that have fallen out of each finite window
      for (c in 1:(rollingLength - 1)) {
        while (rollingSinceLast[c] >= rollingTimes[c]) {
          rollingvalues[c] <- rollingvalues[c] - check[startIndex[c], valueCol]
          rollingTrips[c] <- rollingTrips[c] - 1
          rollingSinceLast[c] <- rollingSinceLast[c] - check[startIndex[c] + 1, sinceCol]
          startIndex[c] <- startIndex[c] + 1
        }
      }
    }
  }
  return(cbind(check, tempvalue))
}
library(foreach)
library(doParallel)
cl <- makeCluster(12)
registerDoParallel(cl)
strt <- Sys.time()
outdata <- foreach(id = persons, .combine = "rbind", .packages = "ff") %dopar% func(data, id)
print(Sys.time() - strt)
stopCluster(cl)
sqlSave(myconn, outdata)
odbcClose(myconn)
The foreach package's %dopar% needs explicit boundaries on the key value, so split your PersonID range into partitions yourself. Also, you should set the number of partitions to less than the value you pass to makeCluster(). If you don't, you get the message
file.access(filename, 0) == 0 is not TRUE
because the workers on the same cluster cannot all access the pre-saved ff package file.
An example of splitting PersonID:
split_min <- min(persons)   # 'persons' is the vector of IDs built above
split_max <- max(persons)
partition <- 12             # "partition < cluster" is good
chunk_width <- floor((split_max - split_min) / partition)
split_num <- matrix(0, partition, 2)
split_num[1, 1] <- split_min
split_num[1, 2] <- split_min + chunk_width
if (partition >= 3) {
  for (i in 2:(partition - 1)) {
    split_num[i, 1] <- split_num[i - 1, 2] + 1
    split_num[i, 2] <- split_num[i - 1, 2] + chunk_width
  }
}
split_num[partition, 1] <- split_num[partition - 1, 2] + 1
split_num[partition, 2] <- split_max
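As a side note (not in the original answer), the same partitioning can be sketched more compactly with base R's split() and cut(), avoiding the boundary matrix entirely; the IDs below are made up for illustration:

```r
# Divide the ID vector into 'partition' roughly equal-sized chunks.
persons <- c(3, 7, 12, 18, 25, 31, 44, 58)   # toy PersonIDs for illustration
partition <- 4
chunks <- split(persons, cut(seq_along(persons), breaks = partition, labels = FALSE))
# chunks[[i]] holds the IDs for worker i; length(chunks) == partition
```

Each foreach iteration can then take chunks[[i]] directly instead of re-subsetting persons by range.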
Then change the foreach statement:
outdata <- foreach(i = 1:partition, .combine = "rbind", .packages = "ff") %dopar% {
  IDs <- persons[persons >= split_num[i, 1] & persons <= split_num[i, 2]]
  do.call(rbind, lapply(IDs, function(z) func(data, z)))  # a plain for loop would return NULL
}
or,
outdata <- foreach(i = 1:partition, .combine = "rbind") %dopar% {
  require(ff)  # or require(ffbase)
  IDs <- persons[persons >= split_num[i, 1] & persons <= split_num[i, 2]]
  do.call(rbind, lapply(IDs, function(z) func(data, z)))
}
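For the second part of the question (getting the result into an ff data frame and then into SQL), here is a rough sketch, assuming outdata and myconn from the code above are still available; "NO SHOW" again stands in for the real table name:

```r
library(ff)
library(ffbase)
library(RODBC)

# Keep the combined result on disk as an ffdf rather than in RAM,
# then push it to the database one chunk of rows at a time.
out_ffdf <- as.ffdf(outdata)
for (i in chunk(out_ffdf)) {   # ffbase's chunk() yields row ranges
  block <- out_ffdf[i, ]       # only this chunk is materialised in RAM
  sqlSave(myconn, block, tablename = "NO SHOW", append = TRUE, rownames = FALSE)
}
```

This way only one chunk of rows ever sits in memory while writing.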
Good luck to you.