Run all shell scripts in folder
I have many .sh scripts in a single folder and would like to run them one after another. A single script can be executed as:
bash wget-some_long_number.sh -H
Assume my directory is /dat/dat1/files
How can I run bash wget-some_long_number.sh -H one after another?
I understand something along these lines should work:
for i in *.sh; do ...; done
Use this:
for f in *.sh; do
    bash "$f"
done
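A note on the loop above: the glob expands in sorted (alphabetical) order, and if no .sh files match, bash passes the literal string *.sh into the loop. A slightly more defensive sketch, using bash's nullglob option so an empty folder simply skips the loop (the /dat/dat1/files path and -H flag are taken from the question):

```shell
#!/bin/bash
# Run every .sh script in sorted glob order. With nullglob set, an
# unmatched pattern expands to nothing instead of the literal "*.sh".
shopt -s nullglob
for f in /dat/dat1/files/*.sh; do
    bash "$f" -H
done
```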
If you want to stop the whole execution when a script fails:
for f in *.sh; do
    bash "$f" || break  # execute successfully or break
    # Or, more explicitly: if this execution fails, then stop the `for`:
    # if ! bash "$f"; then break; fi
done
If you want to run, e.g., x1.sh, x2.sh, ..., x10.sh:
for i in $(seq 1 10); do
    bash "x$i.sh"
done
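A small alternative, assuming bash: brace expansion generates the same numbered sequence without spawning an external seq process.

```shell
# {1..10} is expanded by bash itself, so no external seq is needed.
for i in {1..10}; do
    bash "x$i.sh"
done
```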
To preserve the exit code of a failed script (responding to @VespaQQ):
#!/bin/bash
set -e
for f in *.sh; do
    bash "$f"
done
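With set -e, the wrapper stops at the first failure and exits with that script's status, but it does not say which script failed. If you also want that reported, a sketch without set -e that checks each status explicitly (the error message wording is my own):

```shell
#!/bin/bash
# Stop at the first failing script, name it on stderr, and propagate
# its exact exit code to the caller.
for f in *.sh; do
    bash "$f"
    rc=$?
    if [ "$rc" -ne 0 ]; then
        echo "aborted: $f exited with status $rc" >&2
        exit "$rc"
    fi
done
```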
There is a simpler way: you can use the run-parts command to execute all scripts in a folder:
run-parts /path/to/folder
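One caveat worth checking on your system: the Debian version of run-parts skips filenames containing a dot by default, so *.sh files like the one in the question may be silently ignored. Its --test option previews which files would run, and --regex overrides the name filter (both are Debian run-parts options; behavior may differ on other distributions):

```shell
# Preview which files run-parts would execute (prints, does not run).
run-parts --test /dat/dat1/files

# Override the default name filter so .sh files are included.
run-parts --regex '\.sh$' /dat/dat1/files
```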
I ran into this problem when I couldn't use loops; run-parts also works with cron.
foo () {
    bash -H "$1"
    #echo "$1"
    #cat "$1"
}
cd /dat/dat1/files  # change directory
export -f foo       # export foo so parallel's subshells can see it
parallel foo ::: *.sh  # equivalent to putting a & in between each script
This uses GNU parallel, which executes everything in the directory, with the added benefit of running much faster. It isn't limited to script execution, either: you can put any command in the function and it will work.
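If running every script at once would overload the machine, parallel's -j flag caps how many jobs run concurrently. A sketch assuming the same /dat/dat1/files layout as above:

```shell
cd /dat/dat1/files
# Run at most 4 scripts at a time; parallel groups each job's output
# so lines from different scripts don't interleave.
parallel -j 4 bash ::: *.sh
```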