
Scala Spark: Select first 15 columns from a DataFrame

I am trying to get the first 15 columns from a DataFrame that contains more than 500 columns, but I don't know how to do it because this is my first time using Scala Spark.

I searched but only found how to select columns by name, for example:

val df2 = df.select("firstColName", "secondColName")

How can I do this by index?

Thanks in advance!

Scala example:

df.selectExpr(df.columns.take(15): _*)
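This works because `df.columns` returns an `Array[String]` of column names, `take(15)` keeps the first 15 of them, and `: _*` expands the array into the varargs that `selectExpr` expects. A minimal plain-Scala sketch of those mechanics (no Spark needed; the `columns` array and `select` helper below are stand-ins for the DataFrame API):

```scala
// Stand-in for df.columns: an array of column names (here 20 hypothetical columns).
val columns: Array[String] = (1 to 20).map(i => s"col$i").toArray

// take(15) returns the first 15 names; if the array had fewer than 15
// elements, it would simply return all of them without failing.
val first15: Array[String] = columns.take(15)

// Stand-in for a varargs method like selectExpr(exprs: String*).
def select(cols: String*): Seq[String] = cols

// `: _*` splices the array into the varargs call,
// equivalent to select("col1", "col2", ..., "col15").
val selected: Seq[String] = select(first15: _*)

println(selected.size)  // 15
println(selected.head)  // col1
```

In real Spark code, an equivalent alternative to `selectExpr` is `df.select(df.columns.take(15).map(col): _*)`, using `org.apache.spark.sql.functions.col` to turn each name into a `Column`.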
