I have many .sh scripts in a single folder and would like to run them one after another. A single script can be executed as:
bash wget-some_long_number.sh -H
Assume my directory is /dat/dat1/files
How can I run bash wget-some_long_number.sh -H
one after another?
I understand something along these lines should work:
for i in *.sh;...do ....; done
Use this:
for f in *.sh; do
bash "$f"
done
If you want to stop the whole execution when a script fails:
for f in *.sh; do
bash "$f" || break # execute successfully or break
# Or more explicitly: if this execution fails, then stop the `for`:
# if ! bash "$f"; then break; fi
done
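Conversely, if you want every script to run even when some of them fail, you can collect the failures and report them at the end. A minimal self-contained sketch (the throwaway directory and the script names `a-ok.sh`/`b-bad.sh` are invented for the demo):

```shell
#!/bin/bash
# Demo setup: a throwaway directory with one passing and one failing script.
dir=$(mktemp -d)
printf '#!/bin/sh\nexit 0\n' > "$dir/a-ok.sh"
printf '#!/bin/sh\nexit 3\n' > "$dir/b-bad.sh"
cd "$dir"

failed=()
for f in *.sh; do
    bash "$f" || failed+=("$f")   # record the failure, but keep going
done

if ((${#failed[@]})); then
    printf 'failed: %s\n' "${failed[@]}"
fi
```

Here the loop prints `failed: b-bad.sh` at the end instead of aborting when `b-bad.sh` returns a non-zero status.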
If you want to run, e.g., x1.sh, x2.sh, ..., x10.sh:
for i in $(seq 1 10); do
bash "x$i.sh"
done
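As a side note, bash's own brace expansion produces the same number sequence without spawning `seq`. A sketch, with `echo` standing in for the real `bash "x$i.sh"` call:

```shell
# {1..10} expands to 1 2 ... 10 inside bash itself (no external command).
for i in {1..10}; do
    echo "would run: x$i.sh"    # replace echo with: bash "x$i.sh"
done
```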
To abort on the first failure and propagate the failed script's exit code (responding to @VespaQQ):
#!/bin/bash
set -e
for f in *.sh; do
bash "$f"
done
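To see that the failing script's status actually propagates through `set -e`, here is a self-contained demonstration (the scripts and their exit codes are invented for the demo):

```shell
# a.sh exits 7; with set -e the wrapper stops there and exits 7 itself,
# so b.sh is never run.
dir=$(mktemp -d)
printf '#!/bin/sh\nexit 7\n' > "$dir/a.sh"
printf '#!/bin/sh\necho "never reached"\n' > "$dir/b.sh"

rc=0
bash -c 'set -e; cd "$1"; for f in *.sh; do bash "$f"; done' _ "$dir" || rc=$?
echo "wrapper exited with: $rc"
```

The wrapper exits with status 7, the code of the first failing script.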
There is a much simpler way: you can use the run-parts command to execute all the scripts in a folder:
run-parts /path/to/folder
I ran into this in a situation where I couldn't use a loop; run-parts also works well from cron.
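One caveat worth knowing (this describes the Debian run-parts; other implementations may differ): the files must be executable, and by default run-parts only runs files whose names contain letters, digits, underscores, and hyphens, so names with a dot like `wget-some_long_number.sh` are silently skipped unless you pass `--regex`. A self-contained sketch using `--test`, which lists what would run without executing anything:

```shell
# Skip the demo entirely if run-parts is not installed on this system.
command -v run-parts >/dev/null 2>&1 || exit 0

dir=$(mktemp -d)
printf '#!/bin/sh\necho hello\n' > "$dir/demo.sh"
chmod +x "$dir/demo.sh"

run-parts --test "$dir"                    # prints nothing: 'demo.sh' has a dot
run-parts --test --regex='\.sh$' "$dir"    # lists demo.sh as runnable
```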
foo () {
    bash "$1" -H    # quote "$1"; pass -H to the script, not to bash itself
    #echo "$1"
    #cat "$1"
}
cd /dat/dat1/files #change directory
export -f foo #export foo
parallel foo ::: *.sh #equivalent to putting a & in between each script
This uses GNU parallel to execute everything in the directory, with the added benefit of running much faster. And it isn't limited to script execution: you can put any command in the function and it will work.
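As an aside, for the simple case you don't need the exported function at all, and `-j` controls how many scripts run at once (`-j1` makes it strictly sequential, like the loop). A sketch, assuming GNU parallel (not the moreutils variant) is installed; the demo directory and script names are invented:

```shell
# Skip the demo entirely if GNU parallel is not installed on this system.
command -v parallel >/dev/null 2>&1 || exit 0

dir=$(mktemp -d)
printf 'echo one\n' > "$dir/1.sh"
printf 'echo two\n' > "$dir/2.sh"
cd "$dir"

# Run at most 4 scripts at a time; 'bash' is applied to each *.sh file.
parallel -j 4 bash ::: *.sh
```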