
How to append multiple files into one file as a whole using Perl?

I have multiple chunk files on my Unix system, say file1.csv, file2.csv, ..., file10.csv. I want to append all of these files into one file, in the same order, without reading them line by line.

# Line-by-line reading, shown here for reference:
open(my $fh, '<:encoding(UTF-8)', $filename)
  or die "Could not open file '$filename': $!";

while (my $row = <$fh>) {
  chomp $row;
  print "$row\n";
}

The above sample code reads a file line by line, but my chunk files contain huge amounts of data, so I can neither read them line by line nor store a whole file in a string using Stringify().

I'm looking for a Perl equivalent of the shell command below.

cat file1.csv >> file.csv

Please let me know how I can do a similar copy (appending to the end of a target file) in Perl. Thanks!

As Shawn mentioned above, you can read a block at a time. You can experiment with different block sizes to see what is fastest on your system. For example:

use strict;
use warnings;

my $BLOCK_SIZE = 1024 * 1024 * 2;    # try reading 2 megabytes at a time
my $outfile = "output.csv";
my @input_files = map { "file$_.csv" } 1 .. 10;    # the chunk files, in order

open(my $out, ">>", $outfile) or die "Can't append to output file $outfile: $!\n";
foreach my $input_file (@input_files) {
    open(my $in, "<", $input_file) or die "Can't read from input file $input_file: $!\n";
    my $data;
    my $blocks_read = 0;
    while (read($in, $data, $BLOCK_SIZE)) {    # read returns the byte count, 0 at EOF
        print $out $data;
        $blocks_read++;
    }
    close $in;
    print "Read $blocks_read blocks from $input_file\n";
}
close $out;
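If you would rather not write the read loop yourself, the core File::Copy module can do the buffered copying for you: its copy function accepts an already-open filehandle as the destination, so opening the target in append mode has the same effect as cat file1.csv >> file.csv. A minimal sketch, assuming the chunk files are named file1.csv through file10.csv as in the question:

use strict;
use warnings;
use File::Copy;

# Open the target once in append mode, then let File::Copy::copy
# stream each chunk file into it using its own internal buffering.
open(my $out, ">>", "output.csv")
    or die "Can't append to output file output.csv: $!\n";

foreach my $input_file (map { "file$_.csv" } 1 .. 10) {    # assumed names from the question
    copy($input_file, $out)
        or die "Copy of $input_file failed: $!";
}
close $out;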
