
Using Filter Condition While Joining Spark Dataframes: Spark/Scala

Can someone please suggest how to use a filter while joining two DataFrames in Spark/Scala? I am trying the code below.

    var name="abcd"
    var last_name="xyz"

    val df3 = df1.join(df2, df1("id") === df2("id"))
      .filter(df1("name") === '${name}')
      .filter(df1("last_name") === '${last_name}')
      .drop(df1("name"))
      .drop(df2("name"))

But I am getting multiple errors:

(screenshot of the compile errors)

Spark is not like Java's JDBC API, where string values must be wrapped in single quotes inside the WHERE condition. Try using the name variable directly, without any quotes or the $ sign:

    var name = "abcd"
    var last_name = "xyz"

    // === compares a Column to a plain Scala value, so the variables are
    // passed in directly; no quotes or string interpolation are needed
    val df3 = df1.join(df2, df1("id") === df2("id"))
      .filter(df1("name") === name && df1("last_name") === last_name)
      .drop(df1("name"))
      .drop(df2("name"))
