
Scala case class ignoring import in the Spark shell

I hope there is an obvious answer to this question!

I've just upgraded to Spark v2.0 and have hit an odd problem with the spark-shell (Scala 2.11 build).

If I enter the following minimal Scala,

import java.sql.Timestamp

case class Crime(caseNumber: String, date: Timestamp, description: String, detail: String, arrest: Boolean)

I get the following error:

<console>:11: error: not found: type Timestamp

If I use the Java Timestamp class elsewhere, e.g. in a function, no errors are generated (as you would expect, because of the import).

If I fully qualify the name and use java.sql.Timestamp in the case class, it works!

Am I missing something obvious?

The Timestamp import simply isn't in scope where the case class is declared. To fix this you can enter both statements as a single block:

:paste
import java.sql.Timestamp
case class Crime(caseNumber: String, date: Timestamp, description: String, detail: String, arrest: Boolean)

or use the fully qualified name:

case class Crime(caseNumber: String, date: java.sql.Timestamp, description: String, detail: String, arrest: Boolean)
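The :paste fix works because both statements are compiled as one unit, so the unqualified Timestamp resolves against the import on the line above it. A minimal sketch of the pattern outside the shell (the Crime values here are made-up sample data, not from the question):

```scala
import java.sql.Timestamp

// Import and case class in the same compilation unit, as :paste produces.
case class Crime(caseNumber: String, date: Timestamp, description: String,
                 detail: String, arrest: Boolean)

object CrimeDemo extends App {
  // Timestamp.valueOf parses the JDBC escape format "yyyy-mm-dd hh:mm:ss".
  val c = Crime("HX000000", Timestamp.valueOf("2016-07-26 10:00:00"),
                "THEFT", "POCKET-PICKING", arrest = false)
  println(c)
}
```

Pasting the import and the case class together in one :paste block in the spark-shell behaves the same way.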
