Hi, I'd like to know whether there is a way to ship R packages with spark-submit and use them on the compute nodes from SparkR?
Zip the contents of the R package and ship it to the cluster as my_package.zip, e.g. via the --files option of spark-submit. This places the zip file in the working directory of the SparkR script on each node.
Then you can install the package within your R script:
install.packages("my_package.zip", repos = NULL)
library(my_package)
DISCLAIMER: If your package depends on other packages, you will need to ship and install those manually as well, in dependency order.