
Spark cast column to sql type stored in string

The request is simple: I need help adding a column to a DataFrame. The column has to be empty, its type is from ...spark.sql.types, and the type has to be defined from a string.

I can probably do this with ifs or a match, but I'm looking for something more elegant, something that does not require writing a case for every type in org.apache.spark.sql.types.

If I do this for example:

df = df.withColumn("col_name", lit(null).cast(org.apache.spark.sql.types.StringType))

It works as intended, but I have the type stored as a string,

var the_type = "StringType"

or var the_type = "org.apache.spark.sql.types.StringType"

and I can't get it to work by defining the type from the string.

For those interested, here are some more details: I have a set of (col_name, col_type) tuples, both as strings, and I need to add columns with the correct types for a later union between two DataFrames.

I currently have this:

for (i <- set_of_col_type_tuples) yield {
  val the_type = Class.forName("org.apache.spark.sql.types." + i._2)
  df = df.withColumn(i._1, lit(null).cast(the_type))
  df
}

if I use

val the_type = Class.forName("org.apache.spark.sql.types."+i._2)

I get

error: overloaded method value cast with alternatives:
  (to: String)org.apache.spark.sql.Column <and>
  (to: org.apache.spark.sql.types.DataType)org.apache.spark.sql.Column
 cannot be applied to (Class[?0])

if I use

val the_type = Class.forName("org.apache.spark.sql.types."+i._2).getName()

It's a string so I get:

org.apache.spark.sql.catalyst.parser.ParseException:
mismatched input '.' expecting {<EOF>, '('}(line 1, pos 3)

== SQL ==
org.apache.spark.sql.types.StringType
---^^^

EDIT: Just to be clear, the set contains tuples like ("col1","IntegerType"), ("col2","StringType"), not ("col1","int"), ("col2","string"), so a simple cast(i._2) does not work.

Thank you.

You can use the overloaded cast method, which takes a String argument:

val stringType : String = ...
column.cast(stringType)

def cast(to: String): Column

Casts the column to a different data type, using the canonical string representation of the type.
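For example (a minimal sketch, assuming a DataFrame df is in scope): this overload accepts the canonical SQL names of the types, such as "int" or "string", which is why it works for those strings but not for Scala class names like "IntegerType":

```scala
import org.apache.spark.sql.functions.lit

// Canonical SQL type names work directly with cast(to: String):
val df2 = df
  .withColumn("col1", lit(null).cast("int"))
  .withColumn("col2", lit(null).cast("string"))
```

Since the asker's strings are class names ("IntegerType", "StringType"), they first need to be mapped to DataType instances, as shown below.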

You can also scan for all Data Types:

import org.apache.spark.sql.types.{DataType, DataTypes}

val types = classOf[DataTypes]
  .getDeclaredFields()
  .filter(f => java.lang.reflect.Modifier.isStatic(f.getModifiers()))
  .map(f => f.get(null).asInstanceOf[DataType]) // static fields: the receiver is ignored, so null works

Now types is an Array[DataType]. You can translate it to a Map:

val typeMap = types.map(t => (t.getClass.getSimpleName.replace("$", ""), t)).toMap

and use in code:

column.cast(typeMap(yourType))
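Putting it together with the asker's loop (a sketch; set_of_col_type_tuples and df are assumed to exist as in the question), a foldLeft avoids reassigning the mutable df variable:

```scala
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions.lit
import org.apache.spark.sql.types.{DataType, DataTypes}

// Collect the static DataType fields of DataTypes (StringType, IntegerType, ...)
val types = classOf[DataTypes]
  .getDeclaredFields()
  .filter(f => java.lang.reflect.Modifier.isStatic(f.getModifiers()))
  .map(f => f.get(null).asInstanceOf[DataType])

// Singleton classes end in "$" (e.g. "StringType$"), so strip it
val typeMap: Map[String, DataType] =
  types.map(t => (t.getClass.getSimpleName.replace("$", ""), t)).toMap

// Add one null column per (name, typeName) tuple, e.g. ("col1", "IntegerType")
val result: DataFrame = set_of_col_type_tuples.foldLeft(df) {
  case (acc, (colName, typeName)) =>
    acc.withColumn(colName, lit(null).cast(typeMap(typeName)))
}
```

Note that typeMap only covers the simple types exposed as fields of DataTypes; parameterized types such as DecimalType(p, s) or ArrayType would still need separate handling.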
