Export R package functions to parallel cluster within the R package
There are functions (like function1) within an R package I'm working on which rely on helper functions (like h_function1 and h_function2) in my package. I'm parallelizing to repeatedly call function1 within another function in my package.
Currently, in my package I'm using the code
parallel::clusterExport(cl, varlist=c("h_function1", "h_function2"), envir=environment())
parallel::parSapply(X=1:100, FUN=function1, cl=cl)
where cl is a cluster from the parallel package that was passed into the function. This however assumes that the helper functions h_function1 and h_function2 are loaded from my package into the local environment, which may not be true for some users.
What is a way to flexibly export the functions to the cluster? Is it possible to export all functions of the package by package name?
I usually do this with the following code:
parallel::clusterEvalQ(cl, library("<package-name>"))
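Note that clusterEvalQ() takes the cluster as its first argument; the expression is then evaluated on every worker, so the package (and all of its functions, including internal helpers visible to its exported functions) is available there without any clusterExport() call. A runnable sketch, using the always-available stats package as a stand-in for your package name:

```r
library(parallel)

# Make a small cluster and attach a package on every worker.
cl <- makeCluster(2)
clusterEvalQ(cl, library(stats))

# Each worker can now call any function the package provides;
# dnorm() stands in for function1 here.
res <- parSapply(cl, X = 1:100, FUN = function(i) dnorm(i / 100))

stopCluster(cl)
```

Because the package is loaded as a whole on each worker, its functions resolve their own helpers through the package namespace, which sidesteps the problem of the helpers not existing in the user's local environment.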