I have written a class for the spark-ml library that uses other classes from it. To be specific, my class is a wrapper around RandomForestClassifier. Now I want to be able to import this class from spark-shell.
So the question is: how do I build a package containing my own class so that it can be imported from spark-shell? Many thanks!
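For concreteness, a minimal sketch of what such a wrapper might look like (the package com.example.ml, the class RFWrapper, and its parameters are hypothetical, made up for illustration):

package com.example.ml

import org.apache.spark.ml.classification.{RandomForestClassificationModel, RandomForestClassifier}
import org.apache.spark.sql.DataFrame

// Thin wrapper that pre-configures a RandomForestClassifier and fits it
// on a DataFrame with the default "label" and "features" columns.
class RFWrapper(numTrees: Int = 20) {
  def fit(data: DataFrame): RandomForestClassificationModel =
    new RandomForestClassifier()
      .setNumTrees(numTrees)
      .fit(data)
}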
If you want to load an uncompiled file (e.g. Hello.scala), do the following in spark-shell:
scala> :load ./src/main/scala/Hello.scala
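Once the file is loaded, the class can be used directly in the same session. For example, with the hypothetical RFWrapper sketched above. One caveat: :load evaluates the file inside the REPL, which cannot handle a package clause, so either strip the package line from the file or use :paste -raw instead:

scala> :load ./src/main/scala/RFWrapper.scala
scala> val model = new RFWrapper(numTrees = 50).fit(trainingData)  // trainingData: a DataFrame with "label"/"features" columns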
Read the docs:
In the Spark shell, a special interpreter-aware SparkContext is already created for you, in the variable called sc. Making your own SparkContext will not work. You can set which master the context connects to using the --master argument, and you can add JARs to the classpath by passing a comma-separated list to the --jars argument. You can also add dependencies (e.g. Spark Packages) to your shell session by supplying a comma-separated list of Maven coordinates to the --packages argument. Any additional repositories where dependencies might exist (e.g. Sonatype) can be passed to the --repositories argument.
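In practice that means compiling your class into a JAR and handing it to spark-shell via --jars. A sketch, assuming an sbt project and the hypothetical names above (the JAR path depends on your project name and Scala version):

$ sbt package
$ spark-shell --jars target/scala-2.12/rf-wrapper_2.12-0.1.0.jar

scala> import com.example.ml.RFWrapper
scala> val model = new RFWrapper(numTrees = 50).fit(trainingData)

If the artifact is published to a Maven repository, --packages groupId:artifactId:version achieves the same without a local JAR.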