
How many MapReduce Jobs can be run simultaneously

I want to know how many MapReduce jobs can be submitted/run simultaneously in a single-node Hadoop environment. Is there a limit?

From a configuration standpoint, there's no limit I'm aware of. You can set the number of map and reduce slots to whatever you want. Practically, though, each slot has to spin up a JVM capable of running some Hadoop code, which requires some amount of memory, so eventually you would run out of memory on your machine. You might also have to configure job queues cleverly in order to run a ton at the same time.

Now, what is possible is a very different question than what is a good idea...
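
Concretely, the per-node slot counts this answer refers to live in mapred-site.xml on each TaskTracker (Hadoop 1.x / MRv1). A minimal Java sketch that reads them back, assuming a typical config path of /etc/hadoop/conf/mapred-site.xml (the path is an assumption, adjust for your install):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;

public class SlotConfigCheck {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        // Assumed config location; adjust for your installation.
        conf.addResource(new Path("/etc/hadoop/conf/mapred-site.xml"));
        // TaskTracker-level concurrency limits in MRv1; both default to 2.
        int mapSlots = conf.getInt("mapred.tasktracker.map.tasks.maximum", 2);
        int reduceSlots = conf.getInt("mapred.tasktracker.reduce.tasks.maximum", 2);
        System.out.println("map slots per node:    " + mapSlots);
        System.out.println("reduce slots per node: " + reduceSlots);
    }
}

Raising those values only helps while the node still has memory to spare for the extra task JVMs, which is exactly the limit described above.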

You can submit as many jobs as you want; they will be queued, and the scheduler will run them based on FIFO (by default) and the available resources. How many jobs Hadoop actually executes at once will depend on the factors John described above.
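
To illustrate the queueing behaviour, here is a hedged sketch of a driver that submits several jobs without waiting on any of them, so they queue and the scheduler (FIFO by default) runs them as slots free up. The paths, job count, and identity map/reduce setup are invented for the example, and the newer mapreduce API (Job.getInstance) is assumed:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class MultiJobSubmit {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        for (int i = 0; i < 5; i++) {
            Job job = Job.getInstance(conf, "queued-job-" + i);
            job.setJarByClass(MultiJobSubmit.class);
            // No Mapper/Reducer set, so the identity defaults run; TextInputFormat
            // emits (LongWritable offset, Text line), so declare matching output types.
            job.setOutputKeyClass(LongWritable.class);
            job.setOutputValueClass(Text.class);
            job.setNumReduceTasks(1);                                         // reducers requested per job
            FileInputFormat.addInputPath(job, new Path("/data/in"));          // hypothetical input path
            FileOutputFormat.setOutputPath(job, new Path("/data/out-" + i));  // hypothetical output path
            job.submit();  // returns immediately; does not wait for completion
        }
    }
}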

The number of Reducer slots is set when the cluster is configured. This will limit the number of MapReduce jobs based on the number of Reducers each job requests. Mappers are generally more limited by the number of DataNodes and the number of processors per node.
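
On the per-job side of that: reducers are requested explicitly, while the number of map tasks follows from the number of input splits. A small sketch, again with a made-up input path and the newer mapreduce API assumed, that inspects both before submission:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;

public class TaskCountCheck {
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "task-count-check");
        FileInputFormat.addInputPath(job, new Path("/data/in"));  // hypothetical path
        // One map task is created per input split.
        int mapTasks = new TextInputFormat().getSplits(job).size();
        // Reducers are whatever the job asks for, bounded in practice by reduce slots.
        job.setNumReduceTasks(4);
        System.out.println("map tasks this job would run: " + mapTasks);
    }
}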
