What is the best way to feed a huge CSV file to an LZ4 compression API? The following isn't optimal for very large files, because it reads the entire file into memory at once:
byte[] data = file.getBytes();
You can try https://github.com/flanglet/kanzi . Its block compressor takes an InputStream and splits the input into blocks (you specify the block size on the command line).
From the Wiki, you can do something like this:
java -cp kanzi.jar kanzi.app.BlockCompressor --input=myFile.csv --output=myFile.knz --overwrite --block=8M --transform=lz --entropy=none
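If you prefer to stay with LZ4 itself, the same constant-memory idea works with any compressing OutputStream: copy the file through a fixed-size buffer instead of calling getBytes(). The sketch below uses the JDK's GZIPOutputStream as a stdlib stand-in so it runs anywhere; with the lz4-java library you would wrap the output in net.jpountz.lz4.LZ4BlockOutputStream instead (an assumption about which LZ4 API you are using).

```java
import java.io.*;
import java.util.Arrays;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class StreamCompress {

    // Streams 'in' through a compressing OutputStream in 64 KiB chunks,
    // so memory use stays constant no matter how large the CSV is.
    // GZIPOutputStream is a stdlib stand-in; with lz4-java you would
    // wrap 'out' in net.jpountz.lz4.LZ4BlockOutputStream instead.
    static void compressStream(InputStream in, OutputStream out) throws IOException {
        try (OutputStream comp = new GZIPOutputStream(out)) {
            byte[] buf = new byte[64 * 1024];
            int n;
            while ((n = in.read(buf)) != -1) {
                comp.write(buf, 0, n);
            }
        }
    }

    public static void main(String[] args) throws IOException {
        // Simulated CSV data; in practice pass a BufferedInputStream
        // over a FileInputStream for the real file.
        byte[] csv = "a,b,c\n1,2,3\n".repeat(1000).getBytes();
        ByteArrayOutputStream compressed = new ByteArrayOutputStream();
        compressStream(new ByteArrayInputStream(csv), compressed);

        // Round-trip to verify the compressed bytes decode back to the input.
        byte[] back = new GZIPInputStream(
                new ByteArrayInputStream(compressed.toByteArray())).readAllBytes();
        System.out.println(Arrays.equals(back, csv));      // prints "true"
        System.out.println(compressed.size() < csv.length); // prints "true"
    }
}
```

The peak heap use is just the 64 KiB buffer plus the compressor's internal state, instead of the whole file.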