
Scala Spark -> Select first 15 columns from a DataFrame

I am trying to get the first 15 columns from a DataFrame that contains more than 500 columns, but I don't know how to do it because this is my first time using Scala Spark.

I searched but didn't find anything relevant, only how to get columns by name, for example:

val df2 = df.select("firstColName", "secondColeName")

How can I do this by index?

Thanks in advance!

Scala example:

df.selectExpr(df.columns.take(15): _*)
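This works because `df.columns` returns the column names as an `Array[String]` in schema order, and `take(15)` keeps the first 15 of them. An equivalent approach uses `select` with `col` instead of `selectExpr` (which avoids re-parsing each name as an expression). A minimal self-contained sketch, assuming a local Spark session and a small toy DataFrame in place of the real 500-column one:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

object SelectFirstColumns {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[1]")
      .appName("select-first-n-columns")
      .getOrCreate()
    import spark.implicits._

    // Toy DataFrame with four columns; a real one would have 500+
    val df = Seq((1, 2, 3, 4)).toDF("a", "b", "c", "d")

    // df.columns is an Array[String] in schema order; take(n) keeps the first n.
    // map(col) turns each name into a Column, and ": _*" splats the sequence
    // into the varargs parameter of select.
    val firstTwo = df.select(df.columns.take(2).map(col): _*)

    firstTwo.show()
    spark.stop()
  }
}
```

If `n` exceeds the number of columns, `take` simply returns all of them, so no bounds check is needed.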
