
Zeppelin with Spark interpreter ignores imports declared outside of class/function definition

I'm trying to use some Scala code in Zeppelin 0.8.0 with the Spark interpreter:

%spark
import scala.beans.BeanProperty

class Node(@BeanProperty val parent: Option[Node]) {
}

But the import does not seem to be taken into account:

import scala.beans.BeanProperty
<console>:14: error: not found: type BeanProperty
                  @BeanProperty val parent: Option[Node]) {
                   ^

EDIT: I found out that the following code works:

class Node(@scala.beans.BeanProperty val parent: Option[Node]) {
}
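
As a quick check that the fully qualified annotation really generates the JavaBean getter, a usage sketch like the following (assumed to run in the same %spark paragraph as the class above) should behave as the comments describe:

val root = new Node(None)
val child = new Node(Some(root))
child.getParent // Option[Node] = Some(root), generated by @BeanProperty
root.getParent  // Option[Node] = None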

This also works fine:

def loadCsv(CSVPATH: String): DataFrame = {
    import org.apache.spark.sql.types._
    //[...] some code
    val schema = StructType(
        firstRow.map(s => StructField(s, StringType))
    )
    //[...] some code again
}

So I guess everything works fine if the import is done inside braces (a local scope) or the class is referenced directly by its fully qualified path.to.package.Class at the point of use.
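
For instance, the same local-scope trick should presumably also cover the original Node class, as long as the import, the class, and its uses all live in one block (a sketch, not verified in Zeppelin):

def demo(): Unit = {
    import scala.beans.BeanProperty
    // the import is in scope here, so the annotation resolves
    class Node(@BeanProperty val parent: Option[Node])
    println(new Node(None).getParent) // None
}
demo()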

QUESTION: How do I import outside of a class/function definition?

Importing by path.to.package.Class works well in Zeppelin. You can try it by importing and using java.sql.Date:

import java.sql.Date
val date = Date.valueOf("2019-01-01")

The problem is about the Zeppelin context. Zeppelin's Spark interpreter feeds each paragraph to the Scala REPL, which appears to wrap each top-level statement in its own synthetic scope; a top-level import is therefore not visible at the point where the annotation on the class is resolved. If you keep the import and the class together in one scope, for example inside an object, it works. Try the following code snippet in Zeppelin and you will see that it works fine:

object TestImport {
    import scala.beans.BeanProperty
    class Node(@BeanProperty val parent: Option[Node])
}
val testObj = new TestImport.Node(None)
testObj.getParent
// prints Option[Node] = None
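
Wrapping the import and the class in a single object keeps them in one scope, which seems to be why the annotation resolves here. If you would rather not write TestImport.Node everywhere, you can also import the wrapper's members afterwards (a small sketch along the same lines):

import TestImport._
val n = new Node(Some(testObj))
n.getParent // Option[Node] = Some(testObj)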

I hope it helps!
