
How to call a Spark UDF from Scala using reflection?

I am building a Spark application that depends on a Java library. The Java interface exposed is:

String doSomething(String, Map<String,String>)

I have created a UDF as follows:

import org.apache.spark.sql.expressions.UserDefinedFunction
import org.apache.spark.sql.functions.udf

def myfunc(properties: Map[String, String]): UserDefinedFunction = udf((data: String) => {
    ...
    // if doSomething expects a java.util.Map, convert properties with .asJava
    doSomething(data, properties)
})

This function can be called as myfunc(properties)(data) from the Spark shell, where properties is a Map and data is of type Column.
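For context, this is how the UDF would typically be invoked without reflection (a minimal sketch; the DataFrame df and its string column "data" are assumptions, not from the original):

import org.apache.spark.sql.functions.col

// apply the UDF returned by myfunc to a Column of the assumed DataFrame df
val result = df.withColumn("out", myfunc(Map("key" -> "value"))(col("data")))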

The issue is that I need to call this via reflection from a Scala file. I need to do something like this:

val c = Class.forName("package.class")
val m = c.getMethod("myfunc", classOf[Map[String, String]])
m.invoke(c.newInstance, someMap)

m.invoke returns the UDF object itself. How and where do I pass the Column parameter? Or is there another way to pass the properties map to the Spark UDF so that it can be called directly via reflection?

Try

m.invoke(c.newInstance, someMap).asInstanceOf[UserDefinedFunction].apply(data)

for data of type Column.
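Putting it together, here is a minimal end-to-end sketch. "package.class" and myfunc are the names from the question; someMap's contents, df, and its column "data" are assumptions:

import org.apache.spark.sql.expressions.UserDefinedFunction
import org.apache.spark.sql.functions.col

val someMap = Map("key" -> "value") // assumed properties

val c = Class.forName("package.class")
val m = c.getMethod("myfunc", classOf[Map[String, String]])

// invoke returns the UserDefinedFunction as AnyRef; cast it back,
// then apply it to a Column like any other UDF
val myUdf = m.invoke(c.getDeclaredConstructor().newInstance(), someMap)
  .asInstanceOf[UserDefinedFunction]

val result = df.withColumn("out", myUdf(col("data")))

Note that c.getDeclaredConstructor().newInstance() replaces the deprecated c.newInstance from the question; both work here.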
