
Multiple files as arguments in a bash script

I have a script that exports log files; I usually get between one and three separate files, depending on the size of the log for a particular day. After the export I run a report generator to produce an HTML document from those logs. My question is how to call the report-generating command depending on the number of files. I know I can use if statements and do:

./generateReport -i file1 -o output.html

./generateReport -i file1 file2 -o output.html

./generateReport -i file1 file2 file3 -o output.html

Is there a way to loop over the files and include them as input?
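One common bash idiom for this (a sketch; `generateReport` and the file names are placeholders from the question, and the real call would drop the leading `echo`) is to collect the files into an array and expand it in a single invocation:

```shell
#!/usr/bin/env bash
# Sketch: gather however many export files exist into an array
inputFiles=( file1 file2 file3 )   # in practice this could come from a glob

# "${inputFiles[@]}" expands to one word per file, so one invocation
# covers the 1-, 2-, and 3-file cases alike. The echo just shows the
# command line that would run:
echo ./generateReport -i "${inputFiles[@]}" -o output.html
```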

Do you need the flags at all?

./generateReport file1 file2 file3 > output.html

The following will collect an array of inputFiles, and a single variable with the output file name:

inputFiles=( )
outputFile=
while (( $# )); do
  if [[ $1 = -o ]]; then
    outputFile=$2; shift
  elif [[ $1 = -i ]]; then
    inputFiles+=( "$2" ); shift
  else
    inputFiles+=( "$1" )
  fi
  shift
done
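For example, feeding that parser the arguments from the question (a self-contained check, with the loop reproduced verbatim and sample arguments set via `set --`):

```shell
#!/usr/bin/env bash
# Simulate a command line like: script -i file1 file2 -o output.html file3
set -- -i file1 file2 -o output.html file3

inputFiles=( )
outputFile=
while (( $# )); do
  if [[ $1 = -o ]]; then
    outputFile=$2; shift
  elif [[ $1 = -i ]]; then
    inputFiles+=( "$2" ); shift
  else
    inputFiles+=( "$1" )
  fi
  shift
done

echo "inputs: ${inputFiles[*]}"   # inputs: file1 file2 file3
echo "output: $outputFile"        # output: output.html
```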

...then, you could do something like this:

# redirect stdout to the output file, if one was given
[[ $outputFile ]] && exec >"$outputFile"

# loop over the input files and process each one
for inputFile in "${inputFiles[@]}"; do
  process "$inputFile"
done
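To see the `exec` redirection in action, here is a minimal self-contained sketch (`report.txt` is just an illustrative name):

```shell
#!/usr/bin/env bash
outputFile=report.txt

# After this exec, every echo/printf in the rest of the script
# writes to report.txt instead of the terminal
[[ $outputFile ]] && exec >"$outputFile"

echo "this line lands in report.txt rather than on the terminal"
```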

Try filename expansion:

./generateReport -i file? -o output.html

or use find for all log files created in the last 24 hours:

./generateReport -i $(find . -name "file*" -mtime -1) -o output.html
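Note that the output of a command substitution undergoes word splitting, so filenames containing spaces would break this. A safer sketch (assuming bash 4.4+ for `mapfile -d ''` and a `find` that supports `-print0`; the `echo` stands in for the real call):

```shell
#!/usr/bin/env bash
# Collect the recent log files NUL-delimited, so spaces in names survive
mapfile -d '' -t logs < <(find . -name 'file*' -mtime -1 -print0)

# Expand one word per file, as with the array approach above
echo ./generateReport -i "${logs[@]}" -o output.html
```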
