
Shipping R packages to the nodes in SparkR

Hi, I'd like to know whether there is a way to ship R packages with spark-submit and use them on the compute nodes in SparkR?

Zip the contents of the R package and ship the archive, e.g. as my_package.zip, to the cluster with the spark-submit command. This places the zip file in the working directory of the SparkR script.
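A minimal sketch of that shipping step, assuming the package directory, archive name, master URL, and script name shown here (all placeholders, not from the original answer):

```shell
# Create a tiny stand-in package directory so the example is
# self-contained; in practice this is your real package source.
mkdir -p my_package/R
printf 'hello <- function() "hello from my_package"\n' > my_package/R/hello.R

# Zip the package contents into the archive that will be shipped.
zip -r -q my_package.zip my_package

# Ship the archive with the job; --files copies my_package.zip into
# the working directory on the executors (master URL and script name
# are placeholders):
#   spark-submit --master yarn --deploy-mode cluster \
#     --files my_package.zip my_sparkr_script.R
ls my_package.zip
```

Because the archive lands next to the running script on each node, the R code can later refer to it by its bare file name.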

Then you can install the package within your R script:

 # repos = NULL tells install.packages to treat the argument as a
 # local archive rather than a package name to fetch from a repository.
 install.packages("my_package.zip", repos = NULL)
 library(my_package)

DISCLAIMER: If your package depends on other packages, you will need to ship and install them manually.
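For that dependency case, one hedged sketch of the manual approach: list every archive in a single comma-separated --files argument and install them inside the script in dependency order. All names here (dep_a, my_package, job.R) are placeholders.

```shell
# Ship the dependency archive together with the package archive:
#
#   spark-submit --master yarn --files dep_a.zip,my_package.zip job.R
#
# Then, inside job.R, install the dependency before the package
# that needs it:
#
#   install.packages("dep_a.zip", repos = NULL)
#   install.packages("my_package.zip", repos = NULL)
#   library(my_package)
echo "install order: dep_a.zip then my_package.zip"
```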

