Bash subshell to file

I'm looping over a large file; for each line I run some commands, and when they finish I want their entire output appended to a file.

Since there's nothing stopping me from running multiple commands at once, I tried running each iteration in the background with &.

It doesn't work as expected: the output is appended to the file as each job finishes, not in the order the lines appear in the input file.

#!/bin/bash
while read -r line; do
  (
    echo -e "$line\n-----------------"
    trivy image --severity CRITICAL "$line"
    # or any other command that might take 1-2 seconds
    echo "============="
  ) >> vulnerabilities.txt &
done <images.txt

Where am I wrong?

Consider using GNU Parallel to get lots of things done in parallel. In your case:

parallel -k -a images.txt trivy image --severity CRITICAL > vulnerabilities.txt

The -k keeps the output in input order. Add --bar or --eta for progress reports. Add --dry-run to see what it would do without actually doing anything. Add -jN to control the number of jobs run in parallel at any one time.
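If GNU Parallel isn't an option, the same effect can be had in plain bash: the jobs still run concurrently, but each one writes to its own temporary file, and the files are concatenated in input order only after `wait` confirms every background job has finished. This is a sketch, not a drop-in replacement; the trivy call is commented out so the structure is visible, and the sample `printf` input stands in for the real contents of images.txt.

```shell
#!/bin/bash
# Sketch (plain bash, no GNU Parallel): run jobs concurrently but keep
# the output in input order by giving each job its own temp file and
# concatenating the files only after every job has exited.

# Sample input standing in for the real images.txt from the question:
printf '%s\n' alpine:3.19 ubuntu:22.04 debian:12 > images.txt

tmpdir=$(mktemp -d)
i=0
while read -r line; do
  printf -v slot '%06d' "$i"           # zero-pad so the glob sorts numerically
  (
    echo -e "$line\n-----------------"
    # trivy image --severity CRITICAL "$line"   # the slow command goes here
    echo "============="
  ) > "$tmpdir/$slot" &
  i=$((i + 1))
done < images.txt

wait                                    # block until all background jobs exit
cat "$tmpdir"/* >> vulnerabilities.txt  # lexical glob order == input order
rm -rf "$tmpdir"
```

The key difference from the original loop is that no two jobs ever share a file descriptor on vulnerabilities.txt, so their output can never interleave; ordering is restored by the final `cat` rather than by the jobs themselves.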
