How to import my own Scala package using spark-shell?
I have written a class for the spark-ml library that uses other classes from it. To be clear, my class is a wrapper for RandomForestClassifier. Now I want to be able to import this class from spark-shell.
So the question is: how do I make a package containing my own class so that it can be imported from spark-shell? Many thanks!
If you want to import an uncompiled file (for example Hello.scala), do the following in the spark shell:
scala> :load ./src/main/scala/Hello.scala
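For illustration, here is a minimal sketch of what Hello.scala might contain; the class and parameter names below are assumptions, not from the original question. Once the file is loaded, the class defined in it can be used directly in the same session:

// ./src/main/scala/Hello.scala -- hypothetical wrapper around RandomForestClassifier
import org.apache.spark.ml.classification.RandomForestClassifier

class Hello(numTrees: Int) {
  // return a preconfigured classifier; the parameter choice is just an example
  def classifier: RandomForestClassifier =
    new RandomForestClassifier().setNumTrees(numTrees)
}

scala> :load ./src/main/scala/Hello.scala
scala> val rf = new Hello(numTrees = 10).classifier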
Read the docs:

In the Spark shell, a special interpreter-aware SparkContext is already created for you, in the variable called sc. Making your own SparkContext will not work. You can set which master the context connects to using the --master argument, and you can add JARs to the classpath by passing a comma-separated list to the --jars argument. You can also add dependencies (e.g. Spark Packages) to your shell session by supplying a comma-separated list of Maven coordinates to the --packages argument. Any additional repositories where dependencies might exist (e.g. Sonatype) can be passed to the --repositories argument.
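Concretely, a common workflow is to compile your class into a JAR and pass it to spark-shell with --jars. This is only a sketch; the project, JAR, package, and class names below are hypothetical:

# build the JAR containing your wrapper (paths/names are examples)
sbt package        # produces e.g. target/scala-2.12/myproject_2.12-0.1.jar

# start spark-shell with the JAR on the classpath
spark-shell --jars target/scala-2.12/myproject_2.12-0.1.jar

# inside the shell, import your class as usual
scala> import com.example.ml.MyRandomForestWrapper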