I have a method:
```scala
object Object1 {
  def method1[T >: Null: ClassTag: TypeTag](input: String): T = {
    val m = new ObjectMapper() with ScalaObjectMapper
    m.registerModule(DefaultScalaModule)
    Option(input)
      .map(m.readValue[T](_, classTag[T].runtimeClass.asInstanceOf[Class[T]]))
      .orNull
  }
}
```
I want to pass the type to this method dynamically. I have tried:
```scala
import spark.implicits._

def getHome[T](s: Dataset[Sig], someString: String): Dataset[T] = {
  s.filter(s => s.sType == someString)
    .map(s => Object1.method1[T](s.sDetails))
}
```
But I am getting the error: `No implicits found for parameter evidence$6`. I have also imported the implicits. What is the best way to implement the above logic in Scala?
You are trying to pass an unconstrained type parameter `T` where the narrower `T >: Null: ClassTag: TypeTag` is required, and that can never compile: the call to `method1[T]` must supply `ClassTag` and `TypeTag` evidence implicitly, but a bare `T` carries none. You have to add the same (or stronger) type constraints to the type parameter of `getHome`:
```scala
def getHome[T >: Null: ClassTag: TypeTag](s: Dataset[Sig], someString: String): Dataset[T] = {
```

(Be aware that `Dataset.map` also requires an implicit `Encoder[T]`, so in practice `getHome` will likely need an `Encoder` context bound or implicit parameter as well.)
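The underlying rule is that implicit evidence must be threaded through every generic call in the chain. A minimal stdlib-only sketch of the same pattern, with hypothetical `nameOf`/`caller` helpers (no Spark or Jackson involved):

```scala
import scala.reflect.{ClassTag, classTag}

// A helper requiring ClassTag evidence, mirroring method1's constraints.
def nameOf[T: ClassTag]: String = classTag[T].runtimeClass.getSimpleName

// Without its own ClassTag bound, this would not compile:
//   def caller[T]: String = nameOf[T]   // error: No ClassTag available for T
// Adding the same bound forwards the caller's evidence to nameOf:
def caller[T: ClassTag]: String = nameOf[T]

println(caller[String]) // String
```

The compiler error in the question (`No implicits found for parameter evidence$6`) is exactly the commented-out failure above: the call site has no evidence to forward.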
It also feels like `method1` should be using the `Manifest` typeclass rather than `ClassTag` and `TypeTag`: `ScalaObjectMapper` exposes `readValue[T: Manifest](content)`, which resolves the target type from the implicit `Manifest` directly. This might simplify things, so check the docs for `ScalaObjectMapper`.
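The `Manifest` suggestion can be illustrated in plain Scala with a hypothetical `describe` helper (no Jackson involved): unlike a `ClassTag` alone, a `Manifest` also carries the type arguments, which is what a JSON mapper needs to rebuild generic containers.

```scala
// Manifest exposes both the erased runtime class and the type arguments.
def describe[T: Manifest]: String = {
  val m = manifest[T]
  val args = m.typeArguments
  if (args.isEmpty) m.runtimeClass.getSimpleName
  else args.map(_.runtimeClass.getSimpleName)
           .mkString(m.runtimeClass.getSimpleName + "[", ",", "]")
}

println(describe[String])              // String
println(describe[Map[String, String]]) // Map[String,String]
```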