
Passing parameters to parallel R jobs

I am trying to run parallel R jobs using the multicore package. Each job runs the same script.R with different arguments.

The general idea is to define a function that takes the arguments and then calls source("script.R"). The problem is that I cannot pass the arguments to script.R: since the jobs run in parallel, the arguments cannot be defined in the global scope.
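To make this concrete, here is roughly the kind of wrapper I mean (a sketch, not my actual code; the names job_args, arg_sets and the contents of the argument lists are placeholders for whatever script.R expects):

```r
## Sketch of the wrapper described above and where it breaks down.
library(parallel)  # mclapply (also provided by the multicore package)

run_job <- function(job_args) {
  # script.R cannot see `job_args` here: with the default local = FALSE,
  # source() evaluates the script in the global environment, not in this
  # function's frame, so the arguments would have to be global variables,
  # which clashes with running the jobs in parallel.
  source("script.R")
}

arg_sets <- list(list(n = 100), list(n = 200), list(n = 300))
results <- mclapply(arg_sets, run_job, mc.cores = 2)
```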

Any help is welcome.

Since you are running parallel R instances, possibly even on different nodes/computers, using an external database to store the parameters might be a good option.

I would use Redis, as it is extremely fast and fully accessible from R, and for parallel runs its companion package: doRedis.
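For instance, a minimal doRedis run could look like the sketch below (assuming a Redis server on localhost; the queue name "jobs" is made up), with additional workers free to join the same queue from other hosts:

```r
## Minimal doRedis sketch: foreach tasks are pushed to a Redis work queue.
library(doRedis)
library(foreach)

registerDoRedis("jobs")                   # use the Redis queue "jobs" as the backend
startLocalWorkers(n = 2, queue = "jobs")  # local workers; more can be started on any host

results <- foreach(i = 1:4) %dopar% sqrt(i)

removeQueue("jobs")
```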

So you could have a Redis server (or even a replicated slave database on every host) from which the parameters are fetched. You could instantly update the parameters, even from outside of R, making them available to all workers, and you could easily add new workers to the task with doRedis.
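A sketch of that shared parameter store, assuming the rredis client and a Redis server on localhost (the key "script:params" and the parameter names are invented for illustration):

```r
## Store parameters in Redis so every worker, on any host, can read them.
library(rredis)

redisConnect("localhost", 6379)

# Publish (or instantly update) the parameters; this can also be done
# from another machine, or from redis-cli entirely outside of R.
redisSet("script:params", list(n = 1000, seed = 42))

# Any worker fetches the current parameter set like this:
params <- redisGet("script:params")
params$n

redisClose()
```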
