
How to import my own Scala package using spark-shell?

I have written a class for the spark-ml library that uses other classes from it. To be clear, my class is a wrapper for RandomForestClassifier. Now I want to be able to import this class from spark-shell.

So the question is: how do I make a package containing my own class so that it can be imported from spark-shell? Many thanks!

If you want to import an uncompiled file (e.g. Hello.scala), run the following in spark-shell:

scala> :load ./src/main/scala/Hello.scala
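
For illustration, here is a minimal sketch of what such a Hello.scala might contain; the class name RFWrapper, its constructor parameter, and the column names are hypothetical, not from the original question:

// Hypothetical contents of ./src/main/scala/Hello.scala:
// a small wrapper around RandomForestClassifier, loadable via :load.
import org.apache.spark.ml.classification.{RandomForestClassificationModel, RandomForestClassifier}
import org.apache.spark.sql.DataFrame

// Illustrative wrapper class; names and parameters are examples only.
class RFWrapper(numTrees: Int = 20) {
  private val rf = new RandomForestClassifier()
    .setNumTrees(numTrees)
    .setLabelCol("label")
    .setFeaturesCol("features")

  // Fit the underlying classifier on a DataFrame with "label" and "features" columns.
  def train(df: DataFrame): RandomForestClassificationModel = rf.fit(df)
}

After :load, the class is defined in the current shell session and can be used directly, e.g.:

scala> val model = new RFWrapper(numTrees = 50).train(trainingDF)

One caveat: the REPL's :load evaluates the file's statements in the session, so a file that starts with a package declaration will not load this way. For compiled packages, the usual route is to build a JAR and start the shell with it on the classpath, e.g. spark-shell --jars my-wrapper.jar, after which a normal import of the package works.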
