
Change the dataType of an Array[StructField] element at an index (spark-shell / Scala)

I created an Array[StructField] of length 8 in spark-shell. Now I want to change the datatype of one of the fields. Code:

val fields = header.map(field_name => StructField(field_name, IntegerType, true))

'header' is a schema string I created.

In Python/PySpark, the following code worked to change the column datatype at index 5:

fields[5].dataType = StringType()

How do I achieve this in Scala/spark-shell? I tried the following two snippets, but neither worked:

fields(5).dataType = StringType

fields(5).update(1, StringType)

I just got started with Scala. Thank you, and I appreciate any help.

Try:

fields(5) = fields(5).copy(dataType=StringType)
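StructField is a case class, so its dataType field is immutable and cannot be reassigned the way it can in PySpark; instead you create a modified copy with .copy and store it back into the (mutable) Array. A minimal sketch of the whole flow, assuming the standard spark-shell imports and an example header of eight made-up column names:

```scala
import org.apache.spark.sql.types._

// Hypothetical column names standing in for the asker's header string.
val header = Array("c0", "c1", "c2", "c3", "c4", "c5", "c6", "c7")

// Build the schema with every field typed as IntegerType, as in the question.
val fields = header.map(fieldName => StructField(fieldName, IntegerType, nullable = true))

// fields(5).dataType = StringType fails because StructField is immutable.
// Instead, copy the field with a new dataType and overwrite the array slot:
fields(5) = fields(5).copy(dataType = StringType)

// The updated array can then be wrapped in a StructType schema as usual.
val schema = StructType(fields)
```

Note that `val fields` still allows `fields(5) = ...`: `val` only fixes the reference, and Scala Arrays are mutable, so element assignment is legal.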
