

Spark-shell not overriding method definition

I'm running the Spark 2 shell with Scala 2.11.8 and wondering about the following behavior, where each line is a block of code to be executed in the shell:

def myMethod()  = "first definition"

val tmp = myMethod(); val out = tmp

println(out) // prints "first definition"

def myMethod()  = "second definition" // override above myMethod

val tmp = myMethod(); val out = tmp 

println(out) // should be "second definition" but is "first definition"

So if I redefine myMethod, the implementation does not seem to be updated in this case. I figured out that the second-to-last statement ( val out = tmp ) causes this behavior; if it is moved into a separate block, the code works just fine.
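
For illustration, a minimal sketch of the working variant the question describes, with val out = tmp executed as its own block (a separate shell line) so the redefined myMethod is picked up:

def myMethod() = "second definition" // redefine myMethod

val tmp = myMethod()

val out = tmp // executed as a separate block

println(out) // now prints "second definition"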

So my question: is this the intended behavior or a bug?

This is indeed a spark-shell bug (or rather a Scala REPL bug), and it will be fixed in Spark 2.3: https://issues.apache.org/jira/browse/SPARK-20706
