I am trying to pass a case class as a variable to ScalaReflection to get its schema.
The code runs successfully when I use the case class name directly, but when I assign the case class to a variable and pass that to ScalaReflection, I get an error.
Here is my code:
import org.apache.spark.sql.catalyst.ScalaReflection
import org.apache.spark.sql.types.StructType
case class Emp (empId: Integer, empName: String)
val myschema = ScalaReflection.schemaFor[Emp].dataType.asInstanceOf[StructType]
println(myschema)
val empModel = Emp
val myschema2 = ScalaReflection.schemaFor[empModel].dataType.asInstanceOf[StructType]
Error: error: not found: type empModel
val myschema2 = ScalaReflection.schemaFor[empModel].dataType.asInstanceOf[StructType]
Any suggestion is helpful!
Type synonyms (aliases) should be introduced with the keyword type:
type empModel = Emp
val myschema2 = ScalaReflection.schemaFor[empModel].dataType.asInstanceOf[StructType]
In Scala, values and types live in different namespaces.
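A minimal sketch of that namespace split (no Spark needed; the names `companion` and `instance` are just for illustration): the identifier `Emp` resolves to the companion object in value position and to the class in type position.

```scala
case class Emp(empId: Integer, empName: String)

// Value position: `Emp` here is the auto-generated companion object.
val companion = Emp

// Type position (after the colon): `Emp` here is the class.
val instance: Emp = Emp(1, "a")

// So `val empModel = Emp` binds the companion object, a *value*,
// which is why `schemaFor[empModel]` fails: no *type* empModel exists.
```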
Top-level type (and val) definitions are not allowed in Scala 2, so they have to be placed inside an object:
object App {
type empModel = Emp
val myschema2 = ScalaReflection.schemaFor[empModel].dataType.asInstanceOf[StructType]
}
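If the goal is to pass "which case class" around rather than hard-code it, a common alternative is a generic helper with a TypeTag context bound, since ScalaReflection.schemaFor takes a TypeTag. A hedged sketch (the helper name `schemaOf` is my own; the concrete type still has to be known statically at the call site):

```scala
import scala.reflect.runtime.universe.TypeTag
import org.apache.spark.sql.catalyst.ScalaReflection
import org.apache.spark.sql.types.StructType

case class Emp(empId: Integer, empName: String)

object App {
  // Generic over T: callers supply the case class as a type parameter,
  // and the compiler materializes the TypeTag that schemaFor needs.
  def schemaOf[T: TypeTag]: StructType =
    ScalaReflection.schemaFor[T].dataType.asInstanceOf[StructType]

  val myschema2: StructType = schemaOf[Emp]
}
```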