
Can an OpenMP C++ program be used as a mapper/reducer function in Hadoop?

Can we combine OpenMP and MapReduce, something like this:

MapReduce is used to distribute the data set among different computers.
Then each computer runs a mapper/reducer function that takes advantage of multiprocessing using OpenMP.

Is this possible? (I couldn't find anything substantial in a Google search.)
If it is possible, would there be any advantage to it?

P.S. I'm using the Hadoop Streaming utility.
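
To make the question concrete, here is a rough sketch of what I have in mind (my own illustration, assuming a word-count job; names and the batching scheme are just examples). A Hadoop Streaming mapper is simply an executable that reads lines from stdin and writes "key\tvalue" lines to stdout, so the OpenMP parallelism would only apply inside one map task:

    // mapper.cpp -- sketch of an OpenMP-parallel word-count mapper for Hadoop Streaming.
    // Build: g++ -O2 -fopenmp mapper.cpp -o mapper
    #include <iostream>
    #include <sstream>
    #include <string>
    #include <vector>
    #include <omp.h>

    int main() {
        std::ios::sync_with_stdio(false);

        // Hadoop Streaming feeds this task's input split via stdin; buffer it first.
        std::vector<std::string> lines;
        std::string line;
        while (std::getline(std::cin, line)) {
            lines.push_back(line);
        }

        // Tokenize the lines in parallel. Each thread appends to its own output
        // buffer so that writes to stdout are never interleaved between threads.
        std::vector<std::string> outputs(omp_get_max_threads());
        #pragma omp parallel
        {
            std::string &out = outputs[omp_get_thread_num()];
            #pragma omp for schedule(static)
            for (std::size_t i = 0; i < lines.size(); ++i) {
                std::istringstream iss(lines[i]);
                std::string word;
                while (iss >> word) {
                    out += word;
                    out += "\t1\n";   // emit key<TAB>value, one pair per line
                }
            }
        }

        // Flush each thread's buffer sequentially to stdout.
        for (const std::string &out : outputs) {
            std::cout << out;
        }
        return 0;
    }

The compiled binary (and a similar one for the reducer) would then be shipped to the cluster and wired in through the streaming job's -files and -mapper/-reducer options.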

The point of Hadoop is to have processing nodes deal with data locality automatically and transparently for you.

If I understand you correctly, you want to use Hadoop just for storage and data distribution, and then do your map/reduce work in OpenMP. While this should be possible, you will end up losing one of the main design advantages of Hadoop.

This approach does not make a lot of sense. I suggest either sticking to the Hadoop framework, or looking at one of the alternatives if you don't like it.
