
Iterative zipping process through 10 files in bash

Hi, I was wondering if anybody could help with my coding. In theory I know how to do this manually using basic commands, but to save time and make it more efficient I'd like to use a .sh script, as I might be working with large amounts of data.

So the idea is I have 10 files, named file_1.txt through file_10.txt, and I want to treat file_1 as a sort of 'training' file. I therefore append file_2 to file_1, then use gzip to compress the result and save it elsewhere. I imagine it would look something like this:

cat file_1.txt file_2.txt | gzip -c > newfile.gz

The next step is to do the same with file_3 appended to file_1 and zip that as a new file, then file_4 to file_1, etc. (you get the idea).

Once it has run through _2 to _10 with file_1, I'd like to do the same with file_2, appending 1, 3, 4, 5, 6, 7, 8, 9, and 10 each individually and zipping, and so on.

Hence at the end I should have 90 separate zip files (plus each file zipped just by itself, i.e. file_1.gz, file_2.gz, etc., as a reference to compare my appended files against).

Thanks in advance for any tips or help.

P.S. It doesn't matter if any code provided isn't really efficient, for example split into 10 different loops one after the other; if it's clear then it's fine. I'm only analysing the results, not reusing the code per se. Thanks!

Use loops:

for i in {1..10} ; do
    for j in {1..10} ; do
        # skip the case where a file would be appended to itself
        if (( i != j )) ; then
            cat file_$i.txt file_$j.txt | gzip -c > newfile-$i-$j.gz
        fi
    done
done

This creates 90 gzip files excluding each file gzipped by itself. Creation of the additional 10 gzips is left as an exercise to the reader.
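For completeness, the remaining ten single-file archives could be produced with a similar loop. This is a minimal sketch, assuming the same file_N.txt naming and the same gzip -c / redirect style as above, so the original .txt files are left untouched:

# Gzip each file on its own as the reference set (file_1.gz, file_2.gz, ...).
for i in {1..10} ; do
    gzip -c file_$i.txt > file_$i.gz
done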
