
How to write an sbatch script to handle multiple jobs in SLURM

I have two executables that need to run: a.out and b.out.

(1) I want to run a.out on two nodes, with one a.out process on each node.

(2) I want to run b.out on the same two nodes as in (1), but with two b.out processes on each node.

My naive attempt is shown below:

#!/bin/bash
#SBATCH --nodes 2
#SBATCH --ntasks-per-node 2

srun a.out
srun b.out

But this does not meet the requirements above. How should I write the sbatch script to achieve that?

I apologize if this is a stupid question. Thanks for your time.

Suppose we have 3 nodes and want to run job A first and then job B. Job A should run with 1 process per node, and job B with 5 processes per node.

The sbatch file sbatch_input.sh can then be written as:

#!/bin/bash
#SBATCH --nodes 3
#SBATCH --ntasks-per-node 5
#SBATCH -p cnall

# job A: one process per node (--nodes 3 with --ntasks=3 places one task on each node)
srun --nodes 3 --ntasks=3 hostname
echo "-------------------------------"
# job B: five processes per node (15 tasks in total)
srun --nodes 3 --ntasks-per-node=5 hostname

Run the script with sbatch:

sbatch ./sbatch_input.sh

The output looks like this:

c05b01n06
c05b01n08
c05b01n07
-------------------------------
c05b01n06
c05b01n08
c05b01n07
c05b01n06
c05b01n06
c05b01n06
c05b01n06
c05b01n08
c05b01n08
c05b01n08
c05b01n08
c05b01n07
c05b01n07
c05b01n07
c05b01n07

I did not reuse the number 2 from the question here, to avoid any misunderstanding.
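
Following the same pattern, a minimal sketch for the original two-node case could look like the script below. This is only an illustration, assuming a.out and b.out sit in the submission directory; add a partition with -p if your cluster requires one.

#!/bin/bash
#SBATCH --nodes 2
#SBATCH --ntasks-per-node 2

# step 1: a.out, one process per node (2 tasks spread over 2 nodes)
srun --nodes 2 --ntasks=2 ./a.out
# step 2: b.out, two processes per node (4 tasks in total)
srun --nodes 2 --ntasks-per-node=2 ./b.out

The allocation requests 2 tasks per node so that the larger step (b.out) fits; the smaller step simply asks srun for fewer tasks than the allocation provides.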

  • References
    • https://slurm.schedmd.com/srun.html
    • How do the terms "job", "task", and "step" relate to each other?
