
Is there any way to specify a type in Scala dynamically?

I'm new to Spark and Scala, so sorry for the stupid question. I have a number of tables:

table_a, table_b, ...

and a corresponding case class for each of these tables:

case class classA(...), case class classB(...), ...

Then I need to write a method that reads data from each of these tables and creates a dataset:

def getDataFromSource: Dataset[classA] = {
       val df: DataFrame = spark.sql("SELECT * FROM table_a")
       df.as[classA]
}

The same goes for the other tables and types. Is there any way to avoid this routine code - I mean a separate function for each table - and get by with just one? For example:

def getDataFromSource[T: Encoder](table_name: String): Dataset[T] = {
       val df: DataFrame = spark.sql(s"SELECT * FROM $table_name")
       df.as[T]
}

Then create a list of pairs (table_name, type_name):

val tableTypePairs = List(("table_a", classA), ("table_b", classB), ...)

Then call it using foreach:

tableTypePairs.foreach(tupl => getDataFromSource[what should I put here?](tupl._1))

Thanks in advance!

Something like this should work:

def getDataFromSource[T](table_name: String, encoder: Encoder[T]): Dataset[T] =
  spark.sql(s"SELECT * FROM $table_name").as(encoder)

val tableTypePairs = List(
  "table_a" -> implicitly[Encoder[classA]],
  "table_b" -> implicitly[Encoder[classB]]
)

tableTypePairs.foreach {
  case (table, enc) =>
    getDataFromSource(table, enc)
}

Note that this is a case of discarding a value, which is a bit of a code smell. Since Encoder is invariant, tableTypePairs isn't going to have a very useful type, and neither would something like:

tableTypePairs.map {
  case (table, enc) =>
    getDataFromSource(table, enc)
}
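If the goal is just to run something per table, one workaround is to do the typed work inside the loop, while the concrete T behind each Encoder is still visible to the compiler, rather than collecting the datasets into a list. A rough sketch (the loadAndWrite name, output path, and Parquet format are illustrative assumptions; it also assumes import spark.implicits._ is in scope so the implicitly[Encoder[...]] calls above resolve):

import org.apache.spark.sql.{Dataset, Encoder}

// Do the typed work per table while T is still known; writing to
// Parquet here is only an example of "something useful" per dataset.
def loadAndWrite[T](tableName: String, enc: Encoder[T]): Unit = {
  val ds: Dataset[T] = spark.sql(s"SELECT * FROM $tableName").as(enc)
  // ... any typed transformations on ds go here ...
  ds.write.mode("overwrite").parquet(s"/tmp/out/$tableName")
}

tableTypePairs.foreach { case (table, enc) => loadAndWrite(table, enc) }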

One option is to pass the Class to the method; this way the generic type T will be inferred:

def getDataFromSource[T: Encoder](table_name: String, clazz: Class[T]): Dataset[T] = {
       val df: DataFrame = spark.sql(s"SELECT * FROM $table_name")
       df.as[T]
}

tableTypePairs.foreach { case (tableName, clazz) => getDataFromSource(tableName, clazz) }

But then I'm not sure how you'll be able to exploit this list of Dataset without .asInstanceOf.
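Just to illustrate that last point (the names below are assumptions, and import spark.implicits._ is assumed to be in scope so the encoders resolve): once the per-table results are collected into a common collection, their element types are erased to Dataset[_], and recovering a concrete type needs a cast the compiler can't check:

// Collected results can only be typed as Dataset[_].
val datasets: Map[String, Dataset[_]] = Map(
  "table_a" -> getDataFromSource("table_a", classOf[classA]),
  "table_b" -> getDataFromSource("table_b", classOf[classB])
)

// Unsafe: nothing guarantees that "table_a" really holds classA rows.
val tableA: Dataset[classA] = datasets("table_a").asInstanceOf[Dataset[classA]]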
