How to apply a map function on a Dataset in Spark (Java)

My CSV file:

YEAR,UTILITY_ID,UTILITY_NAME,OWNERSHIP,STATE_CODE,AMR_METERING_RESIDENTIAL,AMR_METERING_COMMERCIAL,AMR_METERING_INDUSTRIAL,AMR_METERING_TRANS,AMR_METERING_TOTAL,AMI_METERING_RESIDENTIAL,AMI_METERING_COMMERCIAL,AMI_METERING_INDUSTRIAL,AMI_METERING_TRANS,AMI_METERING_TOTAL,ENERGY_SERVED_RESIDENTIAL,ENERGY_SERVED_COMMERCIAL,ENERGY_SERVED_INDUSTRIAL,ENERGY_SERVED_TRANS,ENERGY_SERVED_TOTAL
2011,34,City of Abbeville - (SC),M,SC,880,14,,,894,,,,,,,,,,
2011,84,A & N Electric Coop,C,MD,135,25,,,160,,,,,,,,,,
2011,84,A & N Electric Coop,C,VA,31893,2107,0,,34000,,,,,,,,,,
2011,97,Adams Electric Coop,C,IL,8334,190,,,8524,,,,,0,,,,,0
2011,108,Adams-Columbia Electric Coop,C,WI,33524,1788,709,,36021,,,,,,,,,,
2011,118,Adams Rural Electric Coop, Inc,C,OH,7457,20,,,7477,,,,,,,,,,
2011,122,Village of Arcade,M,NY,3560,498,100,,4158,,,,,,,,,,
2011,155,Agralite Electric Coop,C,MN,4383,227,315,,4925,,,,,,,,,,

Here is the Spark code I use to read the CSV file:

import java.io.IOException;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class ReadFile8 {

    public static void main(String[] args) throws IOException {

        SparkSession session = new SparkSession.Builder().appName("CsvReader").master("local").getOrCreate();

        // Data read from the local file system
        Dataset<Row> file8Data = session.read()
                .format("com.databricks.spark.csv")
                .option("header", "true")
                .load("file:///home/kumar/Desktop/Eletricaldata/file8_2011.csv");

        // Register the DataFrame as a SQL temporary view
        file8Data.createOrReplaceTempView("EletricalFile8Data");
        file8Data.show();
    }
}
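Since the code above registers the DataFrame as the temporary view EletricalFile8Data but never queries it, here is a minimal sketch of running a Spark SQL query against that view (the selected columns and the STATE_CODE filter are only illustrative, taken from the CSV header):

// Query the temporary view registered above; column names come from the CSV header
Dataset<Row> scUtilities = session.sql(
        "SELECT YEAR, UTILITY_NAME, AMR_METERING_TOTAL FROM EletricalFile8Data WHERE STATE_CODE = 'SC'");
scUtilities.show();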

How can I apply a map function and a flatMap function on this Dataset in Spark using Java?

You can use the following code as an example:

Dataset<Integer> years = file8Data.map((MapFunction<Row, Integer>) row -> row.<Integer>getAs("YEAR"), Encoders.INT());
Dataset<Integer> newYears = years.flatMap((FlatMapFunction<Integer, Integer>) year -> {
  return Arrays.asList(year + 1, year + 2).iterator();
}, Encoders.INT());
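One caveat: because the CSV is loaded with header set to true but without inferSchema, every column, including YEAR, comes back as a string, so reading it with row.<Integer>getAs("YEAR") can fail at runtime with a cast error. Below is a self-contained sketch that parses the value explicitly; the class name MapFlatMapExample is mine, the path and column name are taken from the question, and it uses the built-in csv reader of Spark 2.x rather than the databricks format, which should behave the same here:

import java.util.Arrays;

import org.apache.spark.api.java.function.FlatMapFunction;
import org.apache.spark.api.java.function.MapFunction;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Encoders;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class MapFlatMapExample {

    public static void main(String[] args) {

        SparkSession session = SparkSession.builder().appName("CsvReader").master("local").getOrCreate();

        // Without inferSchema, all CSV columns are read as strings
        Dataset<Row> file8Data = session.read()
                .option("header", "true")
                .csv("file:///home/kumar/Desktop/Eletricaldata/file8_2011.csv");

        // map: extract the YEAR column and parse the string into an Integer
        Dataset<Integer> years = file8Data.map(
                (MapFunction<Row, Integer>) row -> Integer.parseInt(row.<String>getAs("YEAR")),
                Encoders.INT());

        // flatMap: emit two values (year + 1 and year + 2) for every input year
        Dataset<Integer> newYears = years.flatMap(
                (FlatMapFunction<Integer, Integer>) year -> Arrays.asList(year + 1, year + 2).iterator(),
                Encoders.INT());

        years.show();
        newYears.show();
    }
}

Alternatively, adding .option("inferSchema", "true") when reading the file lets Spark detect YEAR as an integer column, in which case row.<Integer>getAs("YEAR") from the answer above should work as written.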

If Encoders.INT() is not working, try this instead:

Encoders$.MODULE$.INT()
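For context, Encoders$ is the class the Scala compiler generates for the org.apache.spark.sql.Encoders companion object, so Encoders$.MODULE$.INT() is just another route to the same integer encoder. A sketch of the same map call using it:

// Same map call as above, but obtaining the encoder through the generated singleton
// (requires import org.apache.spark.sql.Encoders$)
Dataset<Integer> years = file8Data.map(
        (MapFunction<Row, Integer>) row -> Integer.parseInt(row.<String>getAs("YEAR")),
        Encoders$.MODULE$.INT());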
