Is there a way in Spark SQL to add a new column to a DataFrame whose value is a list built from the other columns? Essentially this is the same as the concat function, except that the concatenation returns a list of strings instead of one big string separated by a delimiter.
NB: I'm using the Python API.
You can use the built-in array function. In Scala this would look like:
import org.apache.spark.sql.functions.{array, col}

df.withColumn("col_arr", array(df.columns.map(c => col(c)): _*))