
Read a CSV file into a DataFrame and access it using Scala

I have a CSV file containing data such as the following:

a1, 1
a2, 2
a3, 3

I want to get 1 as the output when I filter for a1, e.g. filter(a1)._2.

The exact syntax depends on your version of Spark. In Spark 2.4.3 you would do this:

val df: DataFrame = sparkSession.sqlContext.read
  .option("header", "false")
  .csv("/path/to/some.csv")

From there you can apply DataFrame operations to filter your data:

df.select($"_c1").filter($"_c0" === "a1").show
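If you only need the key-to-value lookup and not Spark itself, the same idea can be sketched in plain Scala. This is a minimal illustration, not the answer's Spark approach: the object and method names (CsvLookup, parse) are hypothetical, and the rows are inlined rather than read from a file.

```scala
// Hypothetical plain-Scala sketch of the same lookup, without Spark.
object CsvLookup {
  // Parse "key, value" lines into a Map, trimming the stray spaces
  // present in the sample data ("a1, 1" etc.).
  def parse(lines: Seq[String]): Map[String, String] =
    lines.map { line =>
      val Array(k, v) = line.split(",").map(_.trim)
      k -> v
    }.toMap

  def main(args: Array[String]): Unit = {
    val rows  = Seq("a1, 1", "a2, 2", "a3, 3")
    val table = parse(rows)
    println(table("a1")) // prints 1
  }
}
```

The Spark version above is preferable once the data is large or already lives in a DataFrame pipeline; the Map-based sketch is only reasonable for small files that fit in memory.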
