I am trying to pass a fourth argument (targetFileCount) to a method, as below:
val config = ConfigFactory.load("market_opt_partition.properties")
val targetFileCount = config.getInt(Code)

writeArray1.par.foreach {
  case (df, path, tog, targetFileCount) => Utility.write(df, path, tog, targetFileCount)
}

object Utility {
  def write(sourceDf: DataFrame, path: String, toggle: String, targetFileCount: Int): Unit
}
But I am facing the error below:
Error:(368, 12) constructor cannot be instantiated to expected type;
found : (T1, T2, T3, T4)
required: (org.apache.spark.sql.DataFrame, String, String)
case (df, path, tog, targetFileCount) => Utility.write(df, path, tog, targetFileCount)
Error:(368, 67) not found: value df
case (df, path, tog, targetFileCount) => Utility.write(df, path, tog, targetFileCount)
Please let me know how to rectify this.
writeArray1 contains a Tuple3 of (org.apache.spark.sql.DataFrame, String, String), so pattern matching on four parameters cannot work.
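A minimal Spark-free sketch with plain tuples (hypothetical values standing in for the DataFrame) shows the same arity rule:

```scala
// A Tuple3, analogous to the elements of writeArray1 (placeholder values).
val writeArgs = List(("df-placeholder", "/tmp/out", "on"))

// A three-element pattern matches a Tuple3 and compiles fine:
val lines = writeArgs.map {
  case (df, path, tog) => s"$df -> $path ($tog)"
}

// A four-element pattern would NOT compile against a Tuple3:
//   case (df, path, tog, count) => ...
// error: constructor cannot be instantiated to expected type;
//   found   : (T1, T2, T3, T4)
//   required: (String, String, String)
```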
Another example:
val l = List(5)
l.map { case (a, b) => a.toString }
also yields the same error:
error: constructor cannot be instantiated to expected type;
found : (T1, T2)
required: Int
As said above, writeArray1.par contains Tuple3 elements of (org.apache.spark.sql.DataFrame, String, String), so pattern matching on four parameters cannot work. Use a three-element pattern instead, and pass targetFileCount from the enclosing scope:
val config = ConfigFactory.load("market_opt_partition.properties")
val targetFileCount = config.getInt(Code)

writeArray1.par.foreach {
  case (df, path, tog) => Utility.write(df, path, tog, targetFileCount)
}

object Utility {
  def write(sourceDf: DataFrame, path: String, toggle: String, targetFileCount: Int): Unit
}
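A self-contained sketch of the same shape (stub types and values, hypothetical, so it runs without Spark) shows how targetFileCount is captured from the enclosing scope rather than matched out of the tuple:

```scala
object Utility {
  // Stub standing in for the real write(sourceDf, path, toggle, targetFileCount).
  def write(df: String, path: String, toggle: String, targetFileCount: Int): String =
    s"wrote $df to $path (toggle=$toggle, files=$targetFileCount)"
}

val targetFileCount = 8 // would come from config.getInt(...) in the real code
val writeArray1 = List(("df1", "/tmp/a", "on"), ("df2", "/tmp/b", "off"))

// The three-element pattern matches each Tuple3; the fourth
// argument is closed over from the enclosing scope.
val results = writeArray1.map {
  case (df, path, tog) => Utility.write(df, path, tog, targetFileCount)
}
```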