
Getting MapReduce job result from Java API without permanent storage on HDFS

What I need seems like a pretty common task, but I haven't been able to find a ready-to-use solution.

  • I have a Hadoop MapReduce job which produces its result as a set of files.
  • In case it matters, the job has several mappers and no reducers, but I'd like a solution for the general case with all stages.
  • I'd like to collect the job result and use it right after the job finishes.
  • However, I don't want to store the job results permanently on HDFS, as FileOutputFormat does, so a temporary job output location looks like an option for me (a sketch of that idea follows below).

Is there any common approach for such cases?
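
For illustration, here is a minimal sketch of the temporary-output idea, assuming a map-only job with the default identity mapper; the /tmp/job-... directory name is an arbitrary placeholder. The driver writes to a throwaway directory, reads the part-* files back through the FileSystem API once the job completes, and deletes the directory afterwards.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class TempOutputDriver {

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "temp-output");
        job.setJarByClass(TempOutputDriver.class);
        job.setNumReduceTasks(0); // map-only job, as in the question; default Mapper passes records through
        FileInputFormat.addInputPath(job, new Path(args[0]));

        // Write to a throwaway directory that is removed once the results are read.
        Path tmpOut = new Path("/tmp/job-" + System.currentTimeMillis()); // placeholder location
        FileOutputFormat.setOutputPath(job, tmpOut);

        if (!job.waitForCompletion(true)) {
            System.exit(1);
        }

        FileSystem fs = FileSystem.get(conf);
        try {
            // Collect the part-* files produced by the map tasks.
            for (FileStatus status : fs.listStatus(tmpOut)) {
                if (!status.getPath().getName().startsWith("part-")) {
                    continue; // skip _SUCCESS and other markers
                }
                try (BufferedReader reader = new BufferedReader(
                        new InputStreamReader(fs.open(status.getPath()), StandardCharsets.UTF_8))) {
                    String line;
                    while ((line = reader.readLine()) != null) {
                        // use the result in-process here
                        System.out.println(line);
                    }
                }
            }
        } finally {
            fs.delete(tmpOut, true); // clean up so nothing stays on HDFS
        }
    }
}

The try/finally block makes sure the temporary directory is removed even if reading the results fails.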

For some kinds of jobs, counters turned out to be one solution. It is far from universal, but for some simple cases it can help a lot:
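
A minimal sketch of the counters approach, assuming a map-only job (the Result.MATCHING_LINES counter and the "ERROR" filter are illustrative placeholders): the mapper accumulates the result in a custom counter, NullOutputFormat keeps anything from being written to HDFS, and the driver reads the value from job.getCounters() after waitForCompletion returns.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Counter;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.NullOutputFormat;

public class CounterResultJob {

    // Counter used to carry the "result" back to the driver.
    public enum Result { MATCHING_LINES }

    public static class CountingMapper
            extends Mapper<LongWritable, Text, Text, Text> {

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            // Instead of emitting output, accumulate the result in a counter.
            if (value.toString().contains("ERROR")) {
                context.getCounter(Result.MATCHING_LINES).increment(1);
            }
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "counter-result");
        job.setJarByClass(CounterResultJob.class);
        job.setMapperClass(CountingMapper.class);
        job.setNumReduceTasks(0);                         // map-only, as in the question
        job.setOutputFormatClass(NullOutputFormat.class); // nothing is written to HDFS

        FileInputFormat.addInputPath(job, new Path(args[0]));

        if (!job.waitForCompletion(true)) {
            System.exit(1);
        }

        // Read the result straight from the job's counters after completion.
        Counter matches = job.getCounters().findCounter(Result.MATCHING_LINES);
        System.out.println("Matching lines: " + matches.getValue());
    }
}

Note that counters are meant for small numeric aggregates, and Hadoop caps the number of counters per job, so this does not scale to large or structured results.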

