How to pass a DataFrame to a Spark UDF?

I want to define a UDF whose function body looks up data in an external DataFrame. How can I do that? I tried passing the DataFrame to the UDF as an argument, but it does not work.

Sample code:

val countryDF = spark.read
  .option("inferSchema", "true")
  .option("header", "true")
  .csv("Country.csv")

val geo = (originString: String, dataFrame: DataFrame) => {
  // Look up the matching row in the passed-in DataFrame (countryDF)
  val row = dataFrame.where(col("CountryName") === originString)
  if (row != Nil){
    // return the value at column index 2
    row.getAs[String](2)
  }
  else{
    "0"
  }
}
val udfGeo = udf(geo)

val cLatitudeAndLongitude = udfGeo(countryTestDF.col("CountryName"), lit(countryDF))

countryTestDF = countryTestDF.withColumn("Latitude", cLatitudeAndLongitude)

If you want to use a UDF, you have to work on columns, not on a DataFrame object. You have to create a new column that takes the output of the UDF.

def geo(originString: String, countryName: String): Int = {
  if (countryName == originString) 1 else 0
}

val geoUDF = udf(geo _)

val newData = countryDF.withColumn("isOriginOrNot", geoUDF(col("originString"), col("CountryName")))
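
If the goal is really to look up values from countryDF inside the UDF (as in the original question), a common pattern is to collect the small lookup DataFrame into a Map on the driver and capture it in the UDF's closure, optionally broadcasting it to the executors. Below is a minimal sketch, assuming countryDF fits in memory and has "CountryName" and "Latitude" columns (the exact column names are an assumption based on the question's code):

import org.apache.spark.sql.functions.{col, udf}

// Collect the small lookup DataFrame into a driver-side Map
// (assumption: countryDF has "CountryName" and "Latitude" string columns and is small).
val latByCountry: Map[String, String] = countryDF
  .select("CountryName", "Latitude")
  .collect()
  .map(r => r.getString(0) -> r.getString(1))
  .toMap

// Broadcast the Map so each executor gets a single read-only copy.
val latByCountryB = spark.sparkContext.broadcast(latByCountry)

// The UDF captures the broadcast value; no DataFrame is passed as a column.
val latUDF = udf((originString: String) => latByCountryB.value.getOrElse(originString, "0"))

val result = countryTestDF.withColumn("Latitude", latUDF(col("CountryName")))

For larger lookup tables, a join (or broadcast join) between countryTestDF and countryDF is usually preferable to a UDF.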
