Spark streaming one log file with java doesn't generate any output
I want to stream a log file with Java and Spark Streaming. My code is simple:
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.streaming.Seconds;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

String base = "c:/test";
SparkConf conf = new SparkConf().setAppName("First_App").setMaster("local[2]");
JavaStreamingContext ssc = new JavaStreamingContext(conf, Seconds.apply(1));
JavaDStream<String> line = ssc.textFileStream(base);
// Note: map returns a new DStream; the mapped stream below is never used.
line.map(new Function<String, Integer>()
{
    @Override
    public Integer call(String v1) throws Exception
    {
        System.out.println(v1);
        return v1.length();
    }
});
line.print();
ssc.start();
ssc.awaitTermination();
In c:/test there is a log file that a logger keeps appending to. Its contents are:
INFO:Data=Do Save Entity
INFO:Data=Do Delete Entity
But when I run the application, only the following is printed to the console:
18/02/18 19:55:30 INFO JobScheduler: Added jobs for time 1518971130000 ms
18/02/18 19:55:30 INFO JobScheduler: Starting job streaming job 1518971130000 ms.0 from job set of time 1518971130000 ms
18/02/18 19:55:30 INFO JobScheduler: Finished job streaming job 1518971130000 ms.0 from job set of time 1518971130000 ms
18/02/18 19:55:30 INFO JobScheduler: Total delay: 0.291 s for time 1518971130000 ms (execution: 0.002 s)
-------------------------------------------
Time: 1518971130000 ms
-------------------------------------------
18/02/18 19:55:30 INFO FileInputDStream: Cleared 0 old files that were older than 1518971070000 ms:
18/02/18 19:55:30 INFO ReceivedBlockTracker: Deleting batches:
18/02/18 19:55:30 INFO InputInfoTracker: remove old batch metadata:
18/02/18 19:55:31 INFO FileInputDStream: Finding new files took 16 ms
18/02/18 19:55:31 INFO FileInputDStream: New files at time 1518971131000 ms:
-------------------------------------------
Time: 1518971131000 ms
-------------------------------------------
This output just keeps repeating. My goal is simple: stream the log file and print its contents to the console. That is only temporary, of course, because in the end I want to save the data to a database.
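One thing worth knowing here: Spark's textFileStream (backed by FileInputDStream) only picks up files that *appear* in the monitored directory after the streaming context starts, and it expects each file to arrive whole, typically by being written elsewhere and then moved or renamed into the directory. A file that already existed at startup, or one that a logger keeps appending to in place, is never (re-)read, which matches the empty batches in the log above. A minimal sketch of the write-then-atomic-move pattern in plain java.nio (directory and file names are illustrative, not from the question):

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class AtomicDrop {
    // Write the file in a staging directory first, then move it into the
    // directory Spark monitors in a single step, so Spark only ever sees
    // a complete new file.
    public static Path drop(Path stagingDir, Path watchedDir,
                            String name, String content) throws IOException {
        Files.createDirectories(stagingDir);
        Files.createDirectories(watchedDir);
        Path tmp = stagingDir.resolve(name);
        Files.write(tmp, content.getBytes(StandardCharsets.UTF_8));
        // ATOMIC_MOVE ensures the file never appears half-written in the
        // watched directory (throws if the filesystem cannot do it atomically,
        // e.g. when moving across filesystems).
        return Files.move(tmp, watchedDir.resolve(name),
                StandardCopyOption.ATOMIC_MOVE);
    }

    public static void main(String[] args) throws IOException {
        Path staging = Files.createTempDirectory("staging");
        Path watched = Files.createTempDirectory("watched");
        Path dropped = drop(staging, watched, "app.log",
                "INFO:Data=Do Save Entity\n");
        System.out.println(Files.readAllLines(dropped).get(0));
    }
}
```

With this pattern, each batch of log lines lands in c:/test as a fresh file, and the stream in the question should start printing them.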