
Load Spark Scala script into Spark Shell

I am trying to load a Spark Scala script into the Spark shell using the `:load` command, where the location of the script is passed in a variable. It's not working:

val scriptLoc="/abc/spark"  
:load ${scriptLoc}/scriptName.scala

I even tried it like this, which didn't work either:

:load scriptLoc/scriptName.scala

Any help would be appreciated.

You can try:

spark-shell -i /path/to/file.scala
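The `:load` REPL command takes a literal path; it does not expand Scala `val`s or `${...}` placeholders. If the script location lives in a variable, build the path in the shell before launching `spark-shell`. A minimal sketch, using the hypothetical paths from the question and assuming `spark-shell` is on your `PATH`:

```shell
#!/bin/sh
# Hypothetical script location; adjust to your environment.
scriptLoc="/abc/spark"
script="${scriptLoc}/scriptName.scala"

# The shell expands the variable before spark-shell ever sees it.
echo "Would run: spark-shell -i ${script}"

# Uncomment to actually launch the REPL and execute the script:
# spark-shell -i "${script}"
```

With `-i`, the Scala file is interpreted line by line at REPL startup, exactly as if you had typed it into the shell yourself.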
