
How to disable boost::iostreams buffer when reading through a filter chain

I have some code that looks approximately like this:

boost::iostreams::filtering_istreambuf in;
in.push(Lz4DecompressionFilter());
in.push(AesDecryptionFilter());
in.push(file_source("somefile"));

I already have meta-data that stores the length of the result:

std::vector<char> buf(resultLength /* retrieved from a meta-data server */);
std::streamsize ret = in.sgetn(buf.data(), buf.size());

By adding trace points, I observed that the Lz4 and Aes filters only get reads of 128 bytes each. Also, if I replace file_source with a custom device, that device only gets reads of 4096 bytes.

Since I know exactly how large the reads should be, is there a way to disable buffering in iostreams entirely and just pass the read straight down the filter chain? I know I can change the buffer sizes, but I am interested in disabling them completely.

  • Standard streams by definition use a buffer abstraction. This is largely because some of the functions exposed necessitate the presence of a buffer (peek/putback).

  • How would compression and encryption still function without buffering? Compression and block ciphers both require operating on (sometimes even fixed-size) chunks.

  • Re:

    Also, if I replace file_source with a custom device, it only gets reads of 4096 bytes.

    What behaviour would you have expected instead? Do you expect infinite size reads?

  • Using blocks of >4k is highly unusual in stream-oriented processing. In that case, did you just want to copy all the input into one large buffer first (perhaps using array_sink; see the sketch after this list)?
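
One way to realize that suggestion is boost::iostreams::copy together with a back_insert_device, which drains the whole chain into a single buffer in one call (an array_sink over a pre-allocated buffer would work the same way). This is only a minimal sketch: Lz4DecompressionFilter and AesDecryptionFilter are the asker's custom filters from the question, not part of Boost, and resultLength is assumed to come from the meta-data server.

#include <boost/iostreams/filtering_streambuf.hpp>
#include <boost/iostreams/device/file.hpp>
#include <boost/iostreams/device/back_inserter.hpp>
#include <boost/iostreams/copy.hpp>
#include <cstddef>
#include <vector>

std::vector<char> readAll(std::size_t resultLength)
{
    namespace io = boost::iostreams;

    io::filtering_istreambuf in;
    in.push(Lz4DecompressionFilter());   // asker's custom filter
    in.push(AesDecryptionFilter());      // asker's custom filter
    in.push(io::file_source("somefile"));

    std::vector<char> buf;
    buf.reserve(resultLength);           // size is known, so avoid reallocations

    // copy() pulls from the chain until EOF and appends everything to buf
    io::copy(in, io::back_inserter(buf));
    return buf;
}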

Really, it looks like you just wanted to increase the buffer size, yes.
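
If sizing rather than disabling the buffers is acceptable, push() takes an optional buffer_size argument (followed by a pback_size) that overrides the defaults observed in the question (128 bytes per filter, 4096 bytes for the device). A hedged sketch, again reusing the asker's filter names and assuming resultLength is available:

#include <boost/iostreams/filtering_streambuf.hpp>
#include <boost/iostreams/device/file.hpp>

namespace io = boost::iostreams;

io::filtering_istreambuf in;

// push(t, buffer_size, pback_size): the second argument sets the size of the
// internal buffer associated with this filter or device in the chain.
in.push(Lz4DecompressionFilter(), resultLength);      // asker's custom filter
in.push(AesDecryptionFilter(), resultLength);         // asker's custom filter
in.push(io::file_source("somefile"), resultLength);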
