
Apache Spark Streaming Custom Receiver (Text File) using Java

I am new to Apache Spark.

I need to read log files from a local/mounted directory. An external source writes files into that directory. For example, the external source writes log lines into combined_file.txt, and once it has finished writing the file, it creates a new file with the prefix 0_, e.g. 0_combined_file.txt. I then need to read the completed combined_file.txt log file and process it. So I am trying to write a custom receiver that checks whether writing of a log file into the local/mounted directory has finished, and then reads the completed file.

Here is my code:

@Override
public void onStart() {
    Runnable th = () -> {
        while (true) {
            try {
                Thread.sleep(1000L);
                File dir = new File("/home/PK01/Desktop/arcflash/");
                // A file prefixed with 0_ marks the corresponding data file as complete.
                File[] completedFiles = dir.listFiles((dirName, fileName) ->
                        fileName.toLowerCase().startsWith("0_"));
                // metaDataFile --> 0_test.txt
                // dataFile     --> test.txt
                for (File metaDataFile : completedFiles) {
                    String compFileName = metaDataFile.getName();
                    compFileName = compFileName.substring(2); // strip the 0_ prefix
                    File dataFile = new File("/home/PK01/Desktop/arcflash/" + compFileName);
                    if (dataFile.exists()) {
                        byte[] data = new byte[(int) dataFile.length()];
                        // requires java.io.FileInputStream
                        try (FileInputStream fis = new FileInputStream(dataFile)) {
                            fis.read(data);
                        }
                        store(new String(data)); // push the whole file content as one record
                        dataFile.delete();
                        metaDataFile.delete();
                    }
                }
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    };
    new Thread(th).start(); // the thread must be started, not just created
}
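For reference, the onStart() above is assumed to live inside a Receiver&lt;String&gt; subclass, roughly like the following minimal sketch (the class name FileCompletionReceiver and the storage-level choice are assumptions, not from the question):

import org.apache.spark.storage.StorageLevel;
import org.apache.spark.streaming.receiver.Receiver;

// Hypothetical wrapper class for the onStart() shown above.
public class FileCompletionReceiver extends Receiver<String> {

    public FileCompletionReceiver() {
        // Spill received blocks to disk when memory is short, replicated once.
        super(StorageLevel.MEMORY_AND_DISK_2());
    }

    @Override
    public void onStart() {
        // ... polling loop from the question goes here ...
    }

    @Override
    public void onStop() {
        // In a fuller implementation, a stop flag checked by the polling loop
        // would be set here so the thread can exit cleanly.
    }
}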

I am trying to process the data as follows:

JavaReceiverInputDStream<String> data = jssc.receiverStream(receiver);
data.foreachRDD(fileStreamRdd -> {
    processOnSingleFile(fileStreamRdd.flatMap(streamBatchData -> {
        return Arrays.asList(streamBatchData.split("\\n")).iterator();
    }));
});
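For completeness, a minimal sketch of the streaming-context setup these snippets appear to assume (the app name, master, and the 5-second batch interval are placeholders):

import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaReceiverInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class StreamingApp {
    public static void main(String[] args) throws InterruptedException {
        SparkConf conf = new SparkConf()
                .setAppName("file-receiver-demo")
                .setMaster("local[2]"); // at least 2 threads: one for the receiver, one for processing
        JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(5));

        FileCompletionReceiver receiver = new FileCompletionReceiver(); // hypothetical class from the sketch above
        JavaReceiverInputDStream<String> data = jssc.receiverStream(receiver);
        data.foreachRDD(rdd -> {
            // per-batch processing goes here
        });

        jssc.start();
        jssc.awaitTermination();
    }
}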

But I am getting the exception below:

18/01/19 12:08:39 WARN RandomBlockReplicationPolicy: Expecting 1 replicas with only 0 peer/s.
18/01/19 12:08:39 WARN BlockManager: Block input-0-1516343919400 replicated to only 0 peer(s) instead of 1 peers
18/01/19 12:08:40 ERROR Executor: Exception in task 0.0 in stage 1.0 (TID 1)
java.lang.OutOfMemoryError: Java heap space
    at com.esotericsoftware.kryo.io.Output.<init>(Output.java:60)
    at org.apache.spark.serializer.KryoSerializer.newKryoOutput(KryoSerializer.scala:91)
    at org.apache.spark.serializer.KryoSerializerInstance.output$lzycompute(KryoSerializer.scala:308)
    at org.apache.spark.serializer.KryoSerializerInstance.output(KryoSerializer.scala:308)
    at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:312)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:383)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
18/01/19 12:08:40 ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[Executor task launch worker for task 1,5,main]
java.lang.OutOfMemoryError: Java heap space
    at com.esotericsoftware.kryo.io.Output.<init>(Output.java:60)
    at org.apache.spark.serializer.KryoSerializer.newKryoOutput(KryoSerializer.scala:91)
    at org.apache.spark.serializer.KryoSerializerInstance.output$lzycompute(KryoSerializer.scala:308)
    at org.apache.spark.serializer.KryoSerializerInstance.output(KryoSerializer.scala:308)
    at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:312)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:383)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
18/01/19 12:08:40 WARN TaskSetManager: Lost task 0.0 in stage 1.0 (TID 1, localhost, executor driver): java.lang.OutOfMemoryError: Java heap space
    at com.esotericsoftware.kryo.io.Output.<init>(Output.java:60)
    at org.apache.spark.serializer.KryoSerializer.newKryoOutput(KryoSerializer.scala:91)
    at org.apache.spark.serializer.KryoSerializerInstance.output$lzycompute(KryoSerializer.scala:308)
    at org.apache.spark.serializer.KryoSerializerInstance.output(KryoSerializer.scala:308)
    at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:312)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:383)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)

Can anyone help me resolve the error here?

Any help will be greatly appreciated.

18/01/19 12:08:40 ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[Executor task launch worker for task 1,5,main] java.lang.OutOfMemoryError: Java heap space

The above shows that you are running into an out-of-memory error. Increase the memory explicitly when submitting the Spark job.
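For example, with spark-submit the driver and executor heap sizes can be raised via the standard flags (the 4g values and the class/jar names below are placeholders to adjust for your deployment):

spark-submit \
  --class com.example.StreamingApp \
  --master local[2] \
  --driver-memory 4g \
  --executor-memory 4g \
  your-app.jar

Also note that store(new String(data)) materializes each entire file in memory as a single record, so very large files will drive up peak heap usage regardless of the limits; storing the file line by line instead would reduce that pressure.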
